
Conference Report


D-Lib Magazine
September/October 2009

Volume 15 Number 9/10

ISSN 1082-9873

Report on OAI 6

CERN Workshop on Innovations in Scholarly Communication, Geneva 17-19 June 2009


Elena Giglia
University of Turin



During the three days of OAI 6 in Geneva last June, you could almost breathe the atmosphere of openness, dialogue and debate; the interest and passion for the most recent innovations; and the excitement of hearing about the most advanced positions on new technologies and paradigms of scientific communication. The enthusiasm generated by the workshop is evident in the attendees' photographs.

OAI 6 provided a well-balanced mix of plenary sessions – videos and slideshows of which are online – and small breakout groups, where discussions continued, new proposals were put forward, and information about new projects and useful applications emerged.

During the six plenary sessions some of the central topics regarding the changing landscape of scientific communication were debated. These are discussed below.

Plenary 1: Compound objects

Exhibiting his unique clarity and vision, Herbert Van de Sompel (Los Alamos National Laboratory, USA) opened the workshop by advocating a global reconsideration and re-imagining of scholarly communication. He stressed that "the current scholarly communication system is nothing but a scanned copy of the paper-based system". It should be noted that in his speech Van de Sompel never used the term "article", but instead used the term "scholarly record". As usual, with his vision he is already a step ahead of most stakeholders in this area.

Current indicators of changes in scholarly communication include:

  • An increased number of scientific records with a machine-actionable substrate. This not only allows and fosters text mining and the semantic Web but calls for the creation of new machine-agents that can filter information and combine disparate findings in new ways, furthering knowledge;
  • An increase in the integration of datasets into the scholarly record. This has been accompanied by implementation of the functions of scholarly communication for data, focusing on "certification", "archiving", and "awareness";
  • Exposure of process and its integration in the scholarly record. Traditionally this has been achieved via citation analysis, but it is now being rethought and mapped via usage data.

The myExperiment initiative is one example of a new and innovative shared and collaborative science environment. And the OAI-ORE (Object Reuse and Exchange) interoperability framework also fits well into the Web-oriented flow of scholarly communication, providing resource aggregation.
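To make the idea of resource aggregation concrete, the following is a purely schematic sketch of what an ORE resource map expresses: a resource map describes an aggregation (the compound scholarly object), which in turn aggregates the individual resources (text, data, provenance). The URIs are hypothetical, and this JSON-like rendering is an illustration only, not one of the official ORE serializations (Atom, RDF/XML, RDFa).

```python
import json

# The real ORE vocabulary namespace; everything below it is invented
# for illustration.
ORE = "http://www.openarchives.org/ore/terms/"

resource_map = {
    "@id": "http://example.org/rem/article-123",          # hypothetical URI
    ORE + "describes": "http://example.org/agg/article-123",
}
aggregation = {
    "@id": "http://example.org/agg/article-123",
    ORE + "aggregates": [
        "http://example.org/article-123.pdf",     # the text
        "http://example.org/dataset-123.csv",     # the underlying dataset
        "http://example.org/provenance-123.xml",  # process/provenance record
    ],
}

print(json.dumps([resource_map, aggregation], indent=2))
```

The point of the model is that the aggregation has its own URI, so a dataset, an article and a process record can be cited, exchanged and preserved as one compound object rather than as loose files.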

Three presentations in this first plenary showed concrete applications of the OAI-ORE protocol:

  1. In results visualization and topology navigation in JSTOR (Robert Sanderson, University of Liverpool, UK);
  2. In the creation of "enhanced publications" offering durable access to value-added services and datasets (Maarten Hoogerwerf, DANS, The Netherlands); and
  3. In simplification of the data publishing workflow and the acquisition/integration of metadata with the help of SWORD [Simple Web-service Offering Repository Deposit] (Tim DiLauro, Johns Hopkins University, USA).

Plenary 2: Mandates and preservation

This session, convened by David Prosser (SPARC Europe), dealt with fundamental topics such as economic sustainability, long-term preservation, and mandates.

John Houghton (Centre for Strategic Economic Studies, Victoria University, Australia) presented the results of the JISC survey on Economic Implications of Alternative Publishing Models, which sought to determine whether Open Access would prove cost-effective by quantifying costs, benefits, and return on investment. Based on the system-wide savings in an Open Access scenario in the UK, it should be possible to meet the costs of alternative publishing models from within current budgetary allocations.

Tom Cochrane (Queensland University of Technology (QUT), Australia) discussed the key issue in the pioneering QUT experience with institutional mandates, which was to understand the core business of each research community, and the motives and drivers of researchers themselves. A big incentive in motivating authors to deposit their works in their institutional repository has been the repository usage statistics, which show an increase in downloads and thereby an increase in authors' prestige and visibility. Other possible benefits include correct interpretation of copyright as an enabler of scientific dissemination, instead of as a barrier or a concern for researchers, and the new, alternative metrics of impact that can lead to better research evaluation.

Wouter Spek (Alliance for Permanent Access, EU) presented the PARSE.insight project, funded by the European Commission, the aim of which is to set a roadmap and provide recommendations for developing the e-infrastructure, in order to maintain the long-term accessibility and usability of scientific digital information in Europe.

Andreas Rauber (Vienna University of Technology, Austria) outlined an accountable preservation plan using PLATO, open source software that identifies the requirements of a specific preservation project and ensures that each step of the plan is taken into account.

Plenary 3: Use and re-use

This session, chaired by William Nixon (University of Glasgow, UK), brought the copyright issue to the foreground, as seen from both an institutional repository manager's view and a publisher's view.

Morag Greig (University of Glasgow, UK) showed the shifts in the last few years both in publishers' attitudes and in authors' concerns over copyright – helped by services like SHERPA/RoMEO – pointing out how the main concern for an author now is actually about the version available in the repository and the possible loss of citations.

David Hoole (Nature Publishing Group), speaking on behalf of one of the most prestigious traditional publishers, emphasized that the balancing of rights has to evolve along with technological innovation and keep pace with new communication paradigms. He presented the Nature Manuscript Deposition Service and the Nature Licence to Publish. The Nature Licence to Publish is an exclusive licence, but it allows some forms of re-use. In October 2009, Nature plans to launch Nature Communications, a rapid peer-reviewed publishing service that provides an Open Access option and Creative Commons licenses. While awaiting the results of the European PEER project on the effects of large-scale self-archiving, Nature is open and willing to try original forms of re-use.

Sophia Ananiadou (National Centre for Text Mining, UK) gave a practical demonstration of the techniques of term identification, semantic tagging, automatic summarization, fact extraction, and disambiguation, which can lead to a very enriched user experience in information retrieval.
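As a toy illustration of just one of the techniques mentioned above – term identification – the sketch below picks candidate terms by simple frequency over a small invented text. Real text-mining systems such as those demonstrated here use far richer linguistic and statistical evidence; this only conveys the general idea.

```python
import re
from collections import Counter

# A tiny, deliberately naive stopword list for the example.
STOPWORDS = {"the", "of", "and", "in", "to", "a", "is", "for", "with", "at"}

def candidate_terms(text, top_n=3):
    """Return the top_n most frequent content words as candidate terms."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return [term for term, _ in counts.most_common(top_n)]

# Invented example text, not taken from the demonstration.
abstract = ("Text mining of the scholarly record enables fact extraction. "
            "Mining the record at scale supports semantic tagging of the record.")
print(candidate_terms(abstract))  # most frequent content words first
```

Even this crude frequency count surfaces "record" and "mining" as candidate terms; semantic tagging, summarization and disambiguation then build on such candidates to enrich retrieval.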

Alexander Lerchl (Jacobs University, Bremen, DE) emphasized the absolute need for raw data to be published within the scientific record, as a matter of reproducibility on one side and of scientific integrity on the other. His two case studies demonstrated how free access to raw data made it possible to discover fabricated or falsified data.

Plenary 4: Embedding

Frank Scholze (University of Stuttgart, DE) led this session, for which the topic was resource integration.

Martin van Luijt (Utrecht University Library, The Netherlands) gave an inspiring example of a "library on the move", evolving from a traditional library into an innovative partner in science. The library's DSpace installation was at first just a place to store and find knowledge; later, in a bottom-up process tailored to real users' needs, the library developed ways to support the workflows of the actual production and presentation of scientific knowledge products, via the Virtual Knowledge Centre. The Centre offers a multitude of tools for communication and collaboration, access to a professional network of experts and scientists, and a gateway to a wide range of relevant information, scientific resources and data storage facilities – all in a "one-stop shopping" way.

Peter Burnhill (University of Edinburgh, UK) illustrated the JORUM project for a national, open learning materials repository in the UK, to be integrated within courses, and Travis Brooks (SLAC National Accelerator Laboratory) spoke about the first steps of INSPIRE, a new platform for High Energy Physics built upon the successful SPIRES features and content, and the arXiv experience. INSPIRE exploits new technologies and Web 2.0 features to enable user-generated content and further encourage community knowledge building. Particularly useful in this regard is the author profile tool, with links to co-authors, preferred journals, and most-used keywords.

Plenary 5: Community building

Who better than Thomas Krichel (Long Island University, USA, and Novosibirsk State University, RU), founder of RePEc, could have chaired the session on cohesion and community building?

Christian Zimmermann (University of Connecticut, USA) highlighted the key factors that made RePEc the leading Web community among economists. With RePEc, the reduced time needed for the dissemination of pre-prints – in a field where delays in publication were notorious – worked as a catalyst to generate a virtuous circle among authors, institutions and publishers, and made participation in RePEc a "must" for economists. To create author loyalty, new value-added services have been implemented, such as provision of usage statistics, citation tracking, and ranking, which together have created a positively competitive climate.

James Pringle (Thomson Reuters) presented the new ResearcherID profile service, while Jim Pitman (University of California Berkeley, USA) carried the expectations of the mathematical community with the Bibliographic Knowledge Network. The Bibliographic Knowledge Network is aimed at creating open network bibliographic data stores and associated services, and demonstrating the value of open bibliographic data by application of machine learning and graphical visualization tools for knowledge discovery.

Plenary 6: Quality assurance

Johan Bollen (Los Alamos National Laboratory, USA) was the undisputed leading actor of this last session, chaired again by Frank Scholze (University of Stuttgart, DE). In his charming and captivating way, Bollen dealt with two of the most debated topics in scholarly communication: impact and metrics. He presented the results of the MESUR project, which he carried out with Herbert Van de Sompel, focusing on usage data as a real-time, detailed recording of each user's activity – not only scholars' – and on network metrics, which outline the context and framework of scientific activity better than traditional citation counts can. The stimulating map of science created from the over one billion collected usage events suggests unexpected links and activities, and reveals novel connections between non-contiguous disciplines [1]. A very useful and innovative, interactive version of the map is now available on the MESUR/Services website: by clicking a dot on the map, you can easily follow all the interactions. In such a dynamic network, do citations still play as determinant a role as they do in a paper-based environment? The comparison of 39 metrics (37 in the map), partly citation-based and partly usage-based, and of their correlations in a metric map gives evidence of various aspects of impact and prestige [2]. Impact Factor, the most used citation metric, appears in a marginal position, closer to indicators of popularity than of prestige. The map demonstrates that Impact Factor is only one, and actually only a partial, measure of impact. Many other usage-based metrics better represent the online era, with its widespread access to information resources, in the effort to arrive at a more accurate, pro-active evaluation of scientific impact.
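The metric map rests on a simple building block: pairwise correlations between metrics, computed across the same set of journals. The sketch below computes one such Pearson correlation between an invented citation-based score and an invented usage-based score for five hypothetical journals; MESUR did this for all pairs of 39 metrics and then projected the resulting matrix onto a map.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented values for five hypothetical journals.
citation_scores = [2.1, 5.4, 1.3, 8.0, 3.2]   # Impact-Factor-like scores
usage_scores = [120, 400, 90, 350, 500]       # download-count-like scores

r = pearson(citation_scores, usage_scores)
print(round(r, 3))  # one entry of a 39 x 39 correlation matrix
```

A metric whose column of correlations resembles no other metric's ends up isolated on the map – which is exactly the marginal position the Impact Factor occupies among the 39 measures.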

Ulrich Poeschl (Max Planck Institute für Chemie, DE) reopened the question of the other canon of scholarly communication, i.e., peer review. He pointed out the actual inefficiency of the traditional system of peer review, which has failed as a seal of quality (as demonstrated by cases of fraud or carelessness) and, moreover, is inadequate for today's highly diverse and rapidly evolving world of science. To answer the conflicting needs of science – rapid publication on one side, thorough revision and discussion on the other – the method supported by the experience of "Atmospheric Chemistry and Physics" consists of a three-step process. First, rapid publication of "discussion papers" that have been selected by editors, and are fully citable and permanently archived. Second, a phase of public peer review and interactive discussion, published alongside the paper, anonymous or not. Third, review completion and publication of the final paper. The journal's consistently high Impact Factor attests to its good reputation among scientists; the open discussion has also led to high-quality papers, with a rejection rate of only 10-20%. This is a clear example of how the advantages of open access and open peer review can be efficiently and flexibly combined with the strengths of traditional publishing and peer review. Due to its high quality and impact, high efficiency, and low cost, the same or similar concepts have recently also been adopted in other disciplines, such as the life sciences and economics.

Closing Keynote

In his keynote speech at the close of OAI 6, Paul Ayris (University College London, UK) summarized the main issues with which the workshop dealt: an arduous task, because of the richness and the importance of the workshop themes.

Two final notes on OAI 6

First, from the workshop on Intellectual Property, Wilma Mossink (SURF Foundation, NL) mentioned that the JISC-SURF Licence to Publish is under revision, focusing mostly on the six-month embargo.

And last, from the breakout group on the future of scholarly publishing – chaired by David Prosser (SPARC Europe) at his best – many suggestions emerged on:

  • The form of the "scientific record" of the future: shorter, more interactive, surely with datasets;
  • The new parallel channels of communication such as wikis and blogs, to be taken into consideration in the research evaluation framework;
  • The new expected rules for research evaluation: Heather Morrison (Simon Fraser University, Canada) stressed that a change is needed and expected, as those who set the rules for evaluation are the same researchers who feel these rules to be inadequate;
  • The paradoxes of an inelastic market that has to cope with the world economic crisis and budgetary cuts;
  • The need for an open, transparent method of peer review to foster science and not to delay it;
  • The need for new metrics for assessing impact in the Web age.

In the breakout group, someone cautioned everyone to consider the economic and cultural implications of a future of worldwide Open Access. Someone else replied: What revolution has ever been carried out that considered and put on paper in advance all its possible effects, adverse or not?


[1] Bollen J, Van de Sompel H, Hagberg A, Bettencourt L, Chute R, et al. Clickstream Data Yields High-Resolution Maps of Science. PLoS ONE 2009; 4(3): e4803.

[2] Bollen J, Van de Sompel H, Hagberg A, Chute R. A Principal Component Analysis of 39 Scientific Impact Measures. PLoS ONE 2009; 4(6): e6022.

Copyright © 2009 Elena Giglia


