Robert A. Arko
Kathryn M. Ginger
Kim A. Kastens
Formal, structured annotation metadata provides a mechanism by which digital libraries can offer additional information about library resources beyond what is included in the master resource metadata record. Annotations can capture, organize, and convey information that might otherwise be lost in the ephemera of emails or list servers, such as users' opinions about the usefulness of a resource or suggestions for adapting the resource for classroom use. Annotations can codify the professional judgment of third parties who are neither resource creator nor library builder, for example by marking resources judged to take an advocacy position on controversial issues. Annotations can also flag resources of interest to a specialized subset of users, conveying information relevant only to that sub-audience.
Although the value of annotations in digital libraries has long been recognized from a theoretical perspective, annotations are just beginning to demonstrate their practical value to real-world users. In this article, we describe a fully implemented annotation system within the Digital Library for Earth System Education (DLESE). The DLESE Community Review System (CRS) captures feedback from teachers and learners who have used a DLESE resource, aggregates this information into formats of interest to other users and potential users of the resource, and disseminates the results as annotations. This article describes how CRS annotations are created, disseminated, and used to generate displays for end-users. In the collaboration reported here, Arko and Kastens at Lamont-Doherty Earth Observatory (LDEO) were the annotation providers, while Ginger and Weatherley at the University Corporation for Atmospheric Research (UCAR) maintained the library collections and primary search portal. We conclude with some lessons learned and suggestions for other creators of digital library annotation systems.
How are annotations useful in digital libraries?
Merriam-Webster defines "annotate" as "to make or furnish critical or explanatory notes or comment". In the digital library context, annotations can serve broadly to create new information resources, to interpret existing ones, to access resources in new ways, and to support the effective use of resources.
Annotations can strengthen a digital library for education in several ways:
Why haven't annotations been more widely adopted?
While annotations are valuable for the reasons outlined above, they still face hurdles that have slowed their adoption in digital libraries. No widely accepted framework for annotation metadata (analogous to Dublin Core for resource metadata) or widely used tools for the creation of annotation metadata have yet emerged, though the Annotea project and others are working to address those issues. There is no standard protocol for registering annotations so that a resource record can "track forward" to locate all of its associated annotation records. Most digital library builders do not yet incorporate annotations into their public interfaces. As a consequence, users haven't become accustomed to seeing annotation information in digital libraries, and haven't begun to think up new, creative uses for annotations.
The rest of this paper describes steps we have taken to overcome these hurdles and leverage the strengths of the annotation mechanism.
Annotations in the DLESE Community Review System
The Digital Library for Earth System Education (DLESE) is a distributed effort involving educators, students, and scientists working together to improve the quality, quantity, and efficiency of teaching and learning about the Earth system at all levels [10, 11, 12]. DLESE provides access to high-quality educational resources and data sets; support services for educators and learners; and communication networks that facilitate interactions and collaborations among its users. DLESE was a founding member of the National Science Digital Library (NSDL).
The DLESE Metadata Framework
DLESE has deployed a robust metadata infrastructure that uses library catalog records of different types to describe educational resources, collections, and annotations:
The DLESE Annotation Framework
The DLESE Annotation framework version 1.0.00 defines an annotation as "additional content or metadata that is appropriate to associate with an educational resource". The framework has the following features and requirements:
The framework defines the following fields (+ indicates required):
The list of annotation types is expected to grow in response to community needs; other suggested types include Historical Perspective and Critical Thinking.
As noted above, the framework allows only one annotation per annotation metadata record. Some judgment is needed to determine what constitutes "one annotation." For example, each comment about a resource (from a particular author at a particular date and time) could be cataloged as a separate annotation, or all the comments about a resource could be aggregated into a single annotation.
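To make the one-annotation-per-record rule concrete, the sketch below parses a single aggregated annotation record. The element names (`itemID`, `annotatedResource`, and so on) are illustrative only, not the actual DLESE Annotation framework schema:

```python
# Illustrative sketch only: element names are hypothetical, not the actual
# DLESE Annotation framework schema. It shows the "one annotation per
# record" rule: each record points at exactly one resource and carries
# exactly one annotation body.
import xml.etree.ElementTree as ET

RECORD = """<annotationRecord>
  <itemID>CRS-000-000-000-001</itemID>
  <annotatedResource>DLESE-000-000-000-050</annotatedResource>
  <annotationType>Teaching Tip</annotationType>
  <contributor>classroom educator</contributor>
  <date>2005-09-01</date>
  <content>Works well as a two-day lab with a glossary pre-reading.</content>
</annotationRecord>"""

REQUIRED = {"itemID", "annotatedResource", "annotationType", "content"}

def validate(xml_text: str) -> dict:
    """Parse one annotation record and check that required fields exist."""
    root = ET.fromstring(xml_text)
    fields = {child.tag: (child.text or "").strip() for child in root}
    missing = REQUIRED - fields.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    return fields

record = validate(RECORD)
print(record["annotationType"], "->", record["annotatedResource"])
```

Under the aggregation option described above, the `content` element would simply hold all comments for the resource concatenated into one body, while the record still identifies a single annotated resource.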
The "Annotated Collection" Concept
Annotations may serve to group resources into collections. From the library user's perspective, such "annotated collections" are valuable because they allow users to refine search to include only resources having annotations related to a specific interest.
The DLESE Collection System (DCS) has been extended to allow the GUI-based creation of annotated collections. DLESE currently includes three annotated collections from different providers: the Community Review System (CRS) collection (described below), the Climate Change Collection (CCC), and the Journal of Earth System Science Education (JESSE) collection.
About the CRS
The DLESE Community Review System (CRS) is a library component that gathers feedback from educators and learners who have used DLESE resources. The goals of the CRS are to:
The CRS gathers feedback from educators and learners via a web-based recommendation engine. The CRS gathers two types of feedback: Comments/Teaching Tips and Structured Reviews.
A user who wishes to submit a comment, teaching tip, or review can reach the appropriate website via three routes: follow a link from the DLESE Discovery System resource description (Figure 1), click the "DLESE: Submit Feedback" button available on some resources, or follow the "Review or Comment on a Resource" link from the CRS home page. 
Community feedback from the CRS is a natural candidate for casting as annotations because it is contributed by a wide variety of individuals other than the resource creator and grows incrementally over time. The CRS decided early in its design phase to cast all of its publicly available content as annotations.
Dissemination of annotations to library builders via programmatic interface
CRS annotation content is stored in a PostgreSQL relational database at LDEO. This content is exported nightly into a collection of XML metadata records via automated shell scripts and validated against the XML schema published as part of the DLESE Annotation framework. The XML records are distributed online via a public server at LDEO using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). This is in contrast to other annotation providers whose review content is not public, or whose review content is public but not available via a programmatic interface. The records are harvested by the DLESE Program Center (DPC) at UCAR and used by the DLESE Discovery System to create the "Submit a Review" and "See Reviews, Teaching Tips..." links in the resource descriptions (Figure 1).
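The harvesting side of this pipeline can be sketched as follows. The base URL, metadata prefix, and set name are placeholders (not the actual LDEO endpoint), and the response is a canned sample rather than a live OAI-PMH exchange, so the request-building and parsing logic can be shown self-contained:

```python
# Sketch of an OAI-PMH harvest of annotation records. BASE_URL, the
# metadata prefix, and the set name are placeholders; the response below
# is a canned sample, not a live exchange.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

BASE_URL = "http://example.org/oai"          # placeholder provider endpoint
OAI = "{http://www.openarchives.org/OAI/2.0/}"

def list_records_url(metadata_prefix: str, set_spec: str) -> str:
    """Build a standard OAI-PMH ListRecords request URL."""
    query = urlencode({
        "verb": "ListRecords",
        "metadataPrefix": metadata_prefix,
        "set": set_spec,
    })
    return f"{BASE_URL}?{query}"

SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example.org:CRS-0001</identifier>
        <datestamp>2005-09-01</datestamp></header>
    </record>
    <record>
      <header><identifier>oai:example.org:CRS-0002</identifier>
        <datestamp>2005-09-02</datestamp></header>
    </record>
  </ListRecords>
</OAI-PMH>"""

def harvested_identifiers(response_xml: str) -> list:
    """Pull record identifiers out of a ListRecords response."""
    root = ET.fromstring(response_xml)
    return [h.findtext(f"{OAI}identifier")
            for h in root.iter(f"{OAI}header")]

url = list_records_url("nsdl_anno", "crs")   # prefix and set are illustrative
ids = harvested_identifiers(SAMPLE_RESPONSE)
print(url)
print(ids)
```

A real harvester would also follow OAI-PMH resumption tokens for large result sets and validate each harvested record against the published annotation schema, as described above.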
The total body of resources that have at least one CRS annotation constitutes a permanent collection within DLESE: the CRS Annotated Collection. Because the DLESE Discovery System supports search by collection, users can search for only those resources for which DLESE has received user feedback (Figure 2). By combining this with DLESE's other search capabilities, a user could find, for example, all DLESE resources that (a) are about earthquakes, (b) are intended for high school students, and (c) have CRS feedback. This model is extensible to any library portal, such as NSDL, that can harvest XML records and validate against a published schema.
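The combined-facet search above amounts to intersecting a subject filter, a grade-level filter, and membership in the annotated collection. A toy sketch (the records and field names are invented, not DLESE's actual index):

```python
# Toy illustration of combining search facets: subject keyword, grade
# level, and membership in the CRS Annotated Collection. The records and
# field names are invented, not DLESE's actual search index.
RESOURCES = [
    {"title": "Virtual Earthquake", "subject": "earthquakes",
     "grade": "high school", "collections": {"crs"}},
    {"title": "Glacier Effects", "subject": "glaciers",
     "grade": "middle school", "collections": set()},
]

def search(records, subject, grade, collection):
    """Return titles matching all three facets at once."""
    return [r["title"] for r in records
            if r["subject"] == subject
            and r["grade"] == grade
            and collection in r["collections"]]

print(search(RESOURCES, "earthquakes", "high school", "crs"))
```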
Dissemination of annotations to end-users via graphical interfaces
The Web-based displays produced from CRS annotation content are:
These CRS web displays are discoverable by end-users via a master menu on the CRS home page; the DLESE Discovery System (DDS) resource description page; and the "DLESE: Read Feedback" link embedded directly into some resources. The annotation metadata records are also discoverable through DLESE's Web search services, which can be used to create customized, distributed library interfaces.
In implementing our annotation service, we have observed the following:
We gratefully acknowledge Mary Marlino and the entire team of dedicated professionals at the DLESE Program Center (DPC) in Boulder for their support in this collaborative effort. We thank Neil Holzman and Dale Chayes at Lamont-Doherty Earth Observatory for their time and insight in building our technical infrastructure.
This work was supported by the National Science Foundation through grant awards DUE00-85827, DUE02-26292, EAR03-05092, and EAR04-44680. This is Lamont-Doherty Earth Observatory contribution number 6887.
Notes and References
 Nuernberg, P., R. Furuta, J. Leggett, C. Marshall, F. Shipman (1995). Digital Libraries: Issues and Architectures. In Proc. of the 2nd Conference on the Theory and Practice of Digital Libraries (DL'95), July 11-13, 1995, Austin, Texas, USA. <http://www.csdl.tamu.edu/DL95/papers/nuernberg/nuernberg.html>.
 Agosti, M., N. Ferro, I. Frommholz, U. Thiel (2004). Annotations in Digital Libraries and Collaboratories: Facets, Models and Usage. In Proc. of the 8th European Conference on Research and Advanced Technology for Digital Libraries (ECDL 2004), September 12-17, 2004, Bath, UK. LNCS 3232, Springer-Verlag, Berlin/Heidelberg, Germany, 244-255. <http://www.springerlink.com/openurl.asp?genre=issue&issn=0302-9743&volume=3232>.
 Kastens, K. (2004). Making DLESE into "the" Source for Pedagogical Content Knowledge Pertaining to the Earth & Environment. DLESE Quality Workshop 2004. October 27, 2004. <http://swiki.dlese.org/quality/uploads/1/Geo_PCK_source.pdf>.
 "What Are the Physical Effects of Glaciers?" (online educational resource) <http://www.dlese.org/dds/catalog_NASA-Edmall-446.htm>.
 Manduca, C. and D. Mogk (2000). The Digital Library for Earth System Education (DLESE): A Community Plan. Final Report to the National Science Foundation (NSF), Grant 99-06648. June 2000. <http://www.dlese.org/documents/plans/CommPlanFinal_secure.pdf>.
 Marlino, M., T. Sumner, D. Fulker, C. Manduca, D. Mogk (2001). The Digital Library for Earth System Education: Building Community, Building the Library. Communications of the ACM, v.44 n.5, p.80-81, May 2001. <http://doi.acm.org/10.1145/374308.374356>.
 Wright, M., M. Marlino, T. Sumner (2002). Meta-Design of a Community Digital Library. D-Lib Magazine, May 2002, 8(5). <http://www.dlib.org/dlib/may02/wright/05wright.html>.
 Lagoze, C., W. Arms, et al. (2002). Core Services in the Architecture of the National Science Digital Library (NSDL). In Proceedings of the 2nd ACM/IEEE-CS Joint Conference on Digital Libraries (JCDL 2002), July 14-18, 2002, Portland, Oregon, USA. ACM Press, New York, NY, 201-209. <http://doi.acm.org/10.1145/544220.544264>.
 Kastens, K. (2004). Concerning Annotations and Annotation Collections in the context of DLESE. February 13, 2004. <http://www.dlese.org/documents/papers/KK_Annotations_WP.pdf>.
 Kastens, K. (2005). The DLESE Community Review System: Gathering, Aggregating, and Disseminating User Feedback about the Effectiveness of Web-based Educational Resources. Journal of Geoscience Education, 53(1), 37-43. <http://www.nagt.org/files/nagt/jge/abstracts/Kastens_v53n1.pdf>. Note: Until summer 2005, the CRS gathered specialist reviews as well as community reviews, and had an additional goal: to identify excellent resources and advance them into the DLESE Reviewed Collection. Specialist reviews are now being handled by Science Education Solutions: <http://www.dlese-project.org/2005workstatement.html#pr>.
 Kastens, K. and N. Holzman (2006). The Digital Library for Earth System Education provides Individualized Reports for Teachers on the Effectiveness of Educational Resources in their own Classrooms. D-Lib Magazine, Jan. 2006, 12(1). <http://www.dlib.org/dlib/january06/kastens/01kastens.html>.
 DLESE Reviewed Collection (DRC) Best Practices <http://www.dlese.org/Metadata/collections/drc-best-practices.htm>.
 Example of the DLESE Discovery System (DDS) resource description page for the "Virtual Earthquake" educational resource:
 Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) <http://www.openarchives.org/OAI/openarchivesprotocol.html>.
 Example of the CRS "Available Reports" page for the "Virtual Earthquake" educational resource: <http://crs.dlese.org/annotations/?id=DLESE-000-000-000-050>.
 Weatherley, J. (2005). A web service framework for embedding discovery services in distributed library interfaces. In Proceedings of the 5th ACM/IEEE-CS Joint Conference on Digital Libraries (JCDL 2005), June 7-11, 2005, Denver, Colorado, USA. ACM Press, New York, NY, 42-43. <http://doi.acm.org/10.1145/1065385.1065394>.
 Davis, L. (2005). Usability Study of the DLESE Community Review System. September 9, 2005. <http://www.dlese.org/documents/reports/crs_usability_report2005.pdf>.
 Kastens, K., B. DeFelice, et al. (2005). Questions & Challenges Arising in Building the Collection of a Digital Library for Education: Lessons from Five Years of DLESE. D-Lib Magazine, Nov. 2005, 11(11). <http://www.dlib.org/dlib/november05/kastens/11kastens.html>.
Copyright © 2006 Robert A. Arko, Kathryn M. Ginger, Kim A. Kastens, John Weatherley