
D-Lib Magazine
May 2006

Volume 12 Number 5

ISSN 1082-9873

Using Annotations to Add Value to a Digital Library for Education

 

Robert A. Arko
Lamont-Doherty Earth Observatory
Columbia University
<arko@ldeo.columbia.edu>

Kathryn M. Ginger
University Corporation for Atmospheric Research
<ginger@ucar.edu>

Kim A. Kastens
Lamont-Doherty Earth Observatory and Department of Earth & Environmental Science
Columbia University
<kastens@ldeo.columbia.edu>

John Weatherley
University Corporation for Atmospheric Research
<jweather@ucar.edu>


Introduction

Formal, structured annotation metadata gives digital libraries a mechanism for conveying additional information about library resources, above and beyond what is included in the master resource metadata record. Annotations can be used to capture, organize, and convey information that might otherwise be lost in the ephemera of emails or list servers, such as users' opinions about the usefulness of a resource or suggestions for adapting the resource for use in a classroom. Annotations can codify the professional judgment of third parties, who are neither resource creator nor library builder, for example, by marking resources that are judged to take an advocacy position on controversial issues. Annotations can also flag resources that are of interest to a specialized subset of users, conveying information relevant only to that sub-audience.

Although the value of annotations in digital libraries has long been recognized from a theoretical perspective [1], annotations are just beginning to demonstrate their practical value to real-world users. In this article, we describe a fully implemented annotation system within the Digital Library for Earth System Education (DLESE) [2]. The DLESE Community Review System (CRS) [3] captures feedback from teachers and learners who have used a DLESE resource, aggregates this information into formats of interest to other users and potential users of the resource, and disseminates the results as annotations. This article describes how CRS annotations are created, disseminated, and used to generate displays for end-users. In the collaboration reported here, Arko and Kastens at Lamont-Doherty Earth Observatory (LDEO) were the annotation providers, while Ginger and Weatherley at the University Corporation for Atmospheric Research (UCAR) maintained the library collections and primary search portal. We conclude with some lessons learned and suggestions for other creators of digital library annotation systems.

Annotations

How are annotations useful in digital libraries?

Merriam-Webster defines "annotate" as "to make or furnish critical or explanatory notes or comment" [4]. In the digital library context, annotations can serve broadly to create new information resources, to interpret existing ones, to access resources in new ways, and to support the effective use of resources [5].

Annotations can strengthen a digital library for education in several ways:

  • Engage the community. Annotations offer a comparatively fast and easy way for library users to contribute to the library, based on their own experience and expertise, without becoming experts in metadata and cataloging.
  • Capture diffuse and ephemeral information. Annotations capture ephemeral insights and feedback from users that are otherwise lost in transient media like conversation, emails, or list servers. When this information is expressed in structured records, it is preserved in the library and becomes discoverable. In a digital library for education, such insights may contain gems of pedagogical content knowledge [6], i.e., knowledge of how to teach with a given educational resource.
  • Increase flexibility. The suite of annotations associated with a particular resource can grow and change rapidly without requiring constant, staff-intensive revision of the master resource record. This increases the library's flexibility and responsiveness. There is no limit to the number or variety of possible annotations for a given resource.
  • Codify the professional judgment of third parties. Annotations provide an organized mechanism for individuals and groups to delineate whether and how resources align with criteria in which that group has specific expertise. For example, a group with expertise in history of science could use the annotation mechanism to flag resources that present now-outdated viewpoints that were valid at some time in history.
  • Serve specialized sub-audiences. Annotations enable communities of users to highlight resources of interest to specific audiences. Within such an "annotated collection", each annotation can carry specialized information of interest only to that audience. For example, one could imagine a New York City educators' annotated collection within a larger library, in which a general resource about physical effects of glaciers [7] is annotated to indicate where students can observe glacial striations on a field trip to Central Park.
  • Disseminate outcomes of a review process. The outcomes could be an overall categorical rating of a resource (e.g., 1 to 5), a narrative summary, scores from a set of rubrics, or some other format.

Why haven't annotations been more widely adopted?

While annotations are valuable for the reasons outlined above, they still face hurdles that have slowed their adoption in digital libraries. No widely accepted framework for annotation metadata (analogous to Dublin Core [8] for resource metadata) or widely used tools for the creation of annotation metadata have yet emerged, though the Annotea project [9] and others are working to address those issues. There is no standard protocol for registering annotations so that a resource record can "track forward" to locate all of its associated annotation records. Most digital library builders do not yet incorporate annotations into their public interfaces. As a consequence, users have not become accustomed to seeing annotation information in digital libraries, and have not begun to think up new, creative uses for annotations.

The rest of this paper describes steps we have taken to overcome these hurdles and leverage the strengths of the annotation mechanism.

Annotations in the DLESE Community Review System

About DLESE

The Digital Library for Earth System Education (DLESE) is a distributed effort involving educators, students, and scientists working together to improve the quality, quantity, and efficiency of teaching and learning about the Earth system at all levels [10, 11, 12]. DLESE provides access to high-quality educational resources and data sets; support services for educators and learners; and communication networks that facilitate interactions and collaborations among its users. DLESE was a founding member of the National Science Digital Library (NSDL) [13].

The DLESE Metadata Framework

DLESE has deployed a robust metadata infrastructure [14] that uses library catalog records of different types to describe educational resources, collections, and annotations:
  • A resource metadata record contains the fundamental information to uniquely identify and describe an educational resource. Required fields include catalog number, title, URL, description, subjects (topics), technical requirements, resource type, audience (grade range), copyright, cost, resource creator and cataloger, language, terms of use, metadata framework, creation and accession dates, and record status. Optional fields include spatial and temporal coverage, science and geography standards, keywords, interactivity type and level, size and duration of the resource, and typical learning time.
  • A collection metadata record describes a group of resources (activities, modules, annotations, etc.) organized around a theme, organization, topic, audience, learning strategy, or some other criterion that can be articulated.
  • An annotation metadata record contains additional information about an educational resource, separate from the resource record, that may be contributed by any resource creator or user. The DLESE Annotation Framework grew out of the original NSDL Annotation Model [15] and is consistent with the NSDL technical architecture [16, 17], which specifically identifies "Annotation" as a core service.

The DLESE Annotation Framework

The DLESE Annotation Framework, version 1.0.00, defines an annotation as "additional content or metadata that is appropriate to associate with an educational resource" [18]. The framework has the following features and requirements:

  • Only one annotation is allowed per annotation record, and only one earth system resource may be annotated by that record.
  • An annotation metadata record is associated with the earth system resource being annotated using the resource's catalog number (not the URL).
  • An annotation metadata record may either encapsulate the annotation entirely or refer to an external location (URL) where the substance of the annotation resides.
  • Any number of annotation records may exist for a given earth system resource.

The framework defines the following fields (+ indicates required):

+ Service – the name of the service, organization, or person making the annotation accessible
+ Record ID – the identification number of the annotation record
    Title – the full name of the annotation
    Status – CONTROLLED VOCABULARY: { In progress, Completed, Retired }
+ Date created – the date the annotation record was created
    Date contributed – the date the annotation record was contributed to the library
    Date modified – the date the annotation record was last updated
+ Item ID – the catalog number of the metadata record of the resource being annotated
+ Contributor – the person or organization authoring the annotation
+ Type – CONTROLLED VOCABULARY: the kind of annotation: { Assessment strategy, Bias, Challenging audience, Comment, Editor's summary, Educational standard, Example, Misconception, Quantitative information, Review, See also, Skill, Teaching tip }
+ Content – the substance of the annotation, given as text, a URL, or an overall rating of the resource
    Format – CONTROLLED VOCABULARY: { Audio, Graphical, Text, Video }
    Context – the page (expressed as a URL) of the resource to which the annotation applies directly
    More info – additional information provided in the form of one or more XML documents
    Share – indicates whether a contributor's name and email address are to be displayed in DLESE user interfaces

The list of annotation types is expected to grow in response to community needs; other suggested types include Historical Perspective and Critical Thinking [19].

As noted above, the framework requires that only one annotation be allowed per annotation metadata record. Some judgment is needed to determine what constitutes "one annotation." For example, each comment about a resource (from a particular author at a particular date and time) could be cataloged as a separate annotation, or all the comments about a resource could be aggregated into a single annotation.
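
As a concrete illustration, the following minimal sketch builds an annotation record with the required fields, using Python's standard xml.etree.ElementTree module. The element names, identifiers, and flat record layout are hypothetical simplifications for illustration; the authoritative structure is defined by the XML schema published with the framework [18].

    # A minimal sketch of an annotation record with the required fields.
    # Element names and identifiers are hypothetical; the published XML
    # schema [18] defines the authoritative structure.
    import xml.etree.ElementTree as ET

    record = ET.Element("annotationRecord")
    ET.SubElement(record, "service").text = "DLESE Community Review System (CRS)"
    ET.SubElement(record, "recordID").text = "CRS-000-000-000-001"  # hypothetical
    ET.SubElement(record, "dateCreated").text = "2006-05-01"
    # Item ID ties the record to the resource's catalog number, not its URL.
    ET.SubElement(record, "itemID").text = "DLESE-000-000-000-050"
    ET.SubElement(record, "contributor").text = "A. Educator"  # hypothetical
    ET.SubElement(record, "type").text = "Teaching tip"  # controlled vocabulary
    # Content may encapsulate the annotation text itself, or carry a URL.
    ET.SubElement(record, "content").text = (
        "Assign this activity as homework before the in-class discussion.")

    print(ET.tostring(record, encoding="unicode"))

Because the record references the resource by catalog number rather than by URL, the association between annotation and resource does not break if the resource's URL changes.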

The "Annotated Collection" Concept

Annotations may serve to group resources into collections. From the library user's perspective, such "annotated collections" are valuable because they allow users to refine a search to include only resources having annotations related to a specific interest.

The DLESE Collection System (DCS) [20] has been extended to allow the GUI-based creation of annotated collections. DLESE currently includes three annotated collections from different providers: the Community Review System (CRS) collection (described below), the Climate Change Collection (CCC) [21], and the Journal of Earth System Science Education (JESSE) collection [22].

About the CRS

The DLESE Community Review System (CRS) is a library component that gathers feedback from educators and learners who have used DLESE resources. The goals of the CRS are to [23]:
  • provide feedback from resource users to resource creators, allowing creators to iteratively improve their resources;
  • provide information for prospective users, based on other users' experience, that will help them make informed decisions about whether to use a resource, or will help them use the resource more effectively;
  • provide geoscience educators with an overview of their own students' assessment of how well a DLESE resource served as a learning activity; and
  • provide education professors with a tool to assess their students' insights into their own learning processes and their ability to critique digital learning resources [24].

The CRS gathers two types of feedback from educators and learners via a web-based interface: Comments/Teaching Tips and Structured Reviews.

  • A Comment is an informal remark that states a fact or expresses an opinion about the resource. Closely related, a Teaching Tip is an informal remark that conveys advice about how the resource can be used effectively.
  • In a Structured Review, the user rates the resource on rubrics designed to assess the resource's ease of use, quality of documentation, pedagogical effectiveness, effectiveness at motivating or inspiring learners, scientific accuracy, and robustness as a digital resource [25]. In addition, users have an opportunity to report how well the resource worked with specific audiences, such as learners who have visual impairments.

A user who wishes to submit a comment, teaching tip, or review can reach the appropriate website via three routes: follow a link from the DLESE Discovery System resource description (Figure 1), click the "DLESE: Submit Feedback" button available on some resources, or follow the "Review or Comment on a Resource" link from the CRS home page [26].

Community feedback from the CRS is a good candidate to be cast as annotations because it is contributed by a wide variety of individuals other than the resource creator, and grows incrementally over time. The CRS decided early in its design phase to cast all of its publicly available content as annotations.

Dissemination of annotations to library builders via programmatic interface

CRS annotation content is stored in a PostgreSQL relational database backend at LDEO. This content is exported nightly into a collection of XML metadata records via automated shell scripts, and validated against the XML schema published as part of the DLESE Annotation Framework [27]. The XML records are distributed online via a public server at LDEO using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) [28]. This is in contrast to other annotation providers whose review content is not public, or whose review content is public but not available via a programmatic interface. The records are harvested by the DLESE Program Center (DPC) at UCAR, and used by the DLESE Discovery System [29] to create the "Submit a Review" and "See Reviews, Teaching Tips..." links in the resource descriptions (Figure 1).
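
In outline, the harvesting side of this exchange uses the standard OAI-PMH ListRecords verb and follows resumption tokens until the record list is exhausted. The Python sketch below illustrates that loop; the endpoint URL and metadataPrefix value are hypothetical placeholders, since only the protocol itself [28] is standardized.

    # Sketch of an OAI-PMH harvest loop (ListRecords + resumptionToken).
    # The endpoint URL and metadataPrefix are hypothetical placeholders.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    ENDPOINT = "http://annotations.example.edu/oai"  # hypothetical endpoint
    OAI = "{http://www.openarchives.org/OAI/2.0/}"   # OAI-PMH XML namespace

    harvested = []
    params = {"verb": "ListRecords", "metadataPrefix": "dlese_annot"}
    while True:
        url = ENDPOINT + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as response:
            page = ET.parse(response).getroot()
        harvested.extend(page.iter(OAI + "record"))
        token = page.find(".//" + OAI + "resumptionToken")
        if token is None or not (token.text or "").strip():
            break  # an empty or absent token ends the list
        # Per the protocol, follow-up requests carry only the token.
        params = {"verb": "ListRecords", "resumptionToken": token.text.strip()}

    print(len(harvested), "annotation records harvested")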

Figure 1: DLESE Discovery System resource description page.

The total body of resources that contain at least one CRS annotation constitutes a permanent collection within DLESE: the CRS Annotated Collection. Because the DLESE Discovery System supports search by collection, users can search for only those resources for which DLESE has received user feedback (Figure 2). By combining this with DLESE's other search capabilities, a user could find, for example, all DLESE resources that (a) are about earthquakes, (b) are intended for high school students, and (c) have CRS feedback. This model is extensible to any library portal, such as NSDL, that can harvest XML records and validate against a published schema.
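
Under this model, membership in the CRS Annotated Collection can be derived mechanically: any resource whose catalog number appears in at least one harvested annotation record belongs to the collection. A minimal sketch, assuming records were saved to disk by a harvester and reusing the hypothetical element names from the earlier example:

    # Sketch: derive annotated-collection membership from harvested records.
    # The directory name and the "itemID" element name are hypothetical.
    import xml.etree.ElementTree as ET
    from pathlib import Path

    annotated_items = set()
    for path in Path("harvested").glob("*.xml"):
        record = ET.parse(path).getroot()
        item_id = record.findtext("itemID")  # catalog number of the resource
        if item_id:
            annotated_items.add(item_id)

    # Resources in annotated_items form the annotated collection that the
    # Discovery System can expose as a searchable facet.
    print(len(annotated_items), "resources have at least one annotation")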

Figure 2: DLESE Discovery System search page with CRS Annotated Collection selected.

Dissemination of annotations to end-users via graphical interfaces

The Web-based displays produced from CRS annotation content are:

  • Tally of Rubrics – the quantitative scores from the community reviewers, aggregated into bar graphs (Figure 3; a sketch of this aggregation follows the list).
  • Challenging Audiences – a table summarizing the usefulness of the resource for specific categories of learners, such as learners with limited English (Figure 4).
  • Comments & Teaching Tips – verbatim remarks and suggestions directly from educators and learners (Figure 5).
  • Editor's Summary – a narrative summary integrating all available review information for the resource.
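
The Tally of Rubrics display is, at heart, a frequency count of reviewers' scores on each rubric. A minimal sketch of that aggregation follows; the rubric names come from the Structured Review described above, while the numeric scale and input format are assumed for illustration.

    # Sketch: tally structured-review scores into bar-graph-ready counts.
    # The (rubric, score) input pairs and the numeric scale are hypothetical.
    from collections import Counter, defaultdict

    reviews = [
        ("Ease of use", 4), ("Ease of use", 5), ("Ease of use", 4),
        ("Pedagogical effectiveness", 3), ("Scientific accuracy", 5),
    ]

    tallies = defaultdict(Counter)
    for rubric, score in reviews:
        tallies[rubric][score] += 1

    for rubric in sorted(tallies):
        for score in sorted(tallies[rubric]):
            bar = "#" * tallies[rubric][score]   # crude text bar graph
            print(f"{rubric:26s} {score}: {bar}")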

Figure 3: CRS "Tally of Rubrics" display page.

 

Figure 4: CRS "Challenging Audiences" display page.

 

Figure 5: CRS "Comments and Teaching Tips" display page.

These CRS web displays are discoverable by end-users via a master menu on the CRS home page; the DLESE Discovery System (DDS) resource description page; and the "DLESE: Read Feedback" link embedded directly into some resources [30]. The annotation metadata records are also discoverable through DLESE's Web search services [31], which can be used to create customized, distributed library interfaces.
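
For example, a distributed interface can retrieve the full record for a single resource, including its annotation links, using the GetRecord query pattern documented in reference [30]:

    # Sketch: query the DLESE Search Web Service for one resource record.
    # The URL pattern follows the GetRecord example in reference [30].
    import urllib.parse
    import urllib.request

    SERVICE = "http://www.dlese.org/dds/services/ddsws1-0"
    query = urllib.parse.urlencode({
        "verb": "GetRecord",
        "id": "DLESE-000-000-000-050",  # the "Virtual Earthquake" resource
    })
    with urllib.request.urlopen(SERVICE + "?" + query) as response:
        print(response.read().decode("utf-8"))  # XML record for the resource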

Lessons Learned

In implementing our annotation service, we have observed the following:

  • A properly designed metadata framework allows multiple annotation providers to share their content with multiple library builders. DLESE's annotation structure has benefited from the division of work among several partner institutions, which forced us to develop a generalized, portable, well-documented metadata framework.
  • Annotations are more easily integrated into libraries if the annotation content is encapsulated entirely within the metadata record, that is, if accessing the content does not require resolving an embedded URL to some external location. Our original implementation of CRS annotation records used only embedded URLs. This reduced maintenance overhead, because the records were simple to construct and did not require updating, but it gave the DLESE Discovery System little flexibility in how the content was displayed.
  • At present, the "See reviews, teaching tips, related resources, etc." link from the DLESE Discovery System leads to an intermediate page [32] that provides further links to whatever review or comment information is available for that resource. Our Usability Study [33] showed that this page presents a confusing barrier to users. Our lesson learned is to minimize the number of clicks in the user interface from a resource to its annotations. For the future, we plan to display the annotation content in-line within the Discovery System description, when possible, rather than linking to it from a separate intermediate page.
  • Many CRS Comment and Teaching Tip annotations refer to a specific location, section, or figure within the resource, but most educational resources are not designed to permit precise internal navigation. This suggests that future developers could benefit from designing educational resources that permit navigation using a standard protocol such as XPath [34] (see the sketch following this list).
  • For an annotation service that deals with user feedback, human intervention and judgment are still essential to ensure quality [35]. The CRS employs an experienced earth science educator who vets each Review or Comment for inappropriate content, and occasionally initiates or facilitates further discussion among the commenter or reviewer, the resource creator, and library staff about issues raised in the review or comment.
  • User education about what annotations are, and how to use them, is a continuing issue.
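
The XPath suggestion above can be made concrete: if a resource were published as well-formed XHTML, an annotation's Context field could carry an XPath expression locating the exact section or figure under discussion, which a client could resolve as sketched below. The document structure and the pointer are hypothetical.

    # Sketch: resolving an XPath-style pointer into a well-formed resource.
    # The document fragment and the id value are hypothetical.
    import xml.etree.ElementTree as ET

    resource = ET.fromstring("""
    <html>
      <body>
        <div id="section2">
          <p>Glacial striations form where rock debris is dragged ...</p>
        </div>
      </body>
    </html>""")

    # ElementTree supports a subset of XPath; an annotation could point here
    # directly instead of saying "see the second section" in free text.
    target = resource.find(".//div[@id='section2']/p")
    print(target.text.strip())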

Acknowledgments

We gratefully acknowledge Mary Marlino and the entire team of dedicated professionals at the DLESE Program Center (DPC) in Boulder for their support in this collaborative effort. We thank Neil Holzman and Dale Chayes at Lamont-Doherty Earth Observatory for their time and insight in building our technical infrastructure.

This work was supported by the National Science Foundation through grant awards DUE00-85827, DUE02-26292, EAR03-05092, and EAR04-44680. This is Lamont-Doherty Earth Observatory contribution number 6887.

Notes and References

[1] Nuernberg, P., R. Furuta, J. Leggett, C. Marshall, F. Shipman (1995). Digital Libraries: Issues and Architectures. In Proc. of the 2nd Conference on the Theory and Practice of Digital Libraries (DL'95), July 11-13, 1995, Austin, Texas, USA. <http://www.csdl.tamu.edu/DL95/papers/nuernberg/nuernberg.html>.

[2] Digital Library for Earth System Education (DLESE) <http://www.dlese.org/>.

[3] DLESE Community Review System (CRS) <http://crs.dlese.org/>.

[4] Merriam-Webster's Collegiate Dictionary, 10th Edition <http://www.m-w.com/dictionary/annotating>.

[5] Agosti, M., N. Ferro, I. Frommholz, U. Thiel (2004). Annotations in Digital Libraries and Collaboratories – Facets, Models and Usage. In Proc. of the 8th European Conference on Research and Advanced Technology for Digital Libraries (ECDL 2004), September 12-17, 2004, Bath, UK. LNCS 3232, Springer-Verlag, Berlin/Heidelberg, Germany, 244-255. <http://www.springerlink.com/openurl.asp?genre=issue&issn=0302-9743&volume=3232>.

[6] Kastens, K. (2004). Making DLESE into "the" Source for Pedagogical Content Knowledge Pertaining to the Earth & Environment. DLESE Quality Workshop 2004. October 27, 2004. <http://swiki.dlese.org/quality/uploads/1/Geo_PCK_source.pdf>.

[7] "What Are the Physical Effects of Glaciers?" (online educational resource) <http://www.dlese.org/dds/catalog_NASA-Edmall-446.htm>.

[8] Dublin Core Metadata Initiative (DCMI) <http://www.dublincore.org/>.

[9] World-Wide Web Consortium (W3C) Semantic Web Advanced Development (SWAD) Live Early Adoption and Demonstration (LEAD) project: Annotea <http://www.w3.org/2001/Annotea/>.

[10] Manduca, C. and D. Mogk (2000). The Digital Library for Earth System Education (DLESE): A Community Plan. Final Report to the National Science Foundation (NSF), Grant 99-06648. June 2000. <http://www.dlese.org/documents/plans/CommPlanFinal_secure.pdf>.

[11] Marlino, M., T. Sumner, D. Fulker, C. Manduca, D. Mogk (2001). The Digital Library for Earth System Education: Building Community, Building the Library. Communications of the ACM, v.44 n.5, p.80-81, May 2001. <http://doi.acm.org/10.1145/374308.374356>.

[12] Wright, M., M. Marlino, T. Sumner (2002). Meta-Design of a Community Digital Library. D-Lib Magazine, May 2002, 8(5). <http://www.dlib.org/dlib/may02/wright/05wright.html>.

[13] National Science Digital Library (NSDL) <http://nsdl.org/>.

[14] DLESE Metadata frameworks for Resources, Collections, and Annotations <http://www.dlese.org/Metadata/>.

[15] NSDL Annotation and Review Services <http://annotations.comm.nsdlib.org/>.

[16] Fulker, D. and G. Janee (2002). Components of an NSDL Architecture: Technical Scope and Functional Model. <http://arxiv.org/abs/cs.DL/0201027>.

[17] Lagoze, C., W. Arms, et al (2002). Core Services in the Architecture of the National Science Digital Library (NSDL). In Proceedings of the 2nd ACM/IEEE-CS Joint Conference on Digital Libraries (JCDL 2002), July 14-18, 2002, Portland, Oregon, USA. ACM Press, New York, NY, 201-209. <http://doi.acm.org/10.1145/544220.544264>.

[18] Op. Cit. [14]

[19] Kastens, K. (2004). Concerning Annotations and Annotation Collections in the context of DLESE. February 13, 2004. <http://www.dlese.org/documents/papers/KK_Annotations_WP.pdf>.

[20] DLESE Collection System (DCS) <http://dcs.dlese.org/preview/>.

[21] Climate Change Collection (CCC) <http://serc.carleton.edu/climatechange/>.

[22] Journal of Earth System Science Education (JESSE) <http://jesse.usra.edu/>.

[23] Kastens, K. (2005). The DLESE Community Review System: Gathering, Aggregating, and Disseminating User Feedback about the Effectiveness of Web-based Educational Resources. Journal of Geoscience Education, 53(1), 37-43. <http://www.nagt.org/files/nagt/jge/abstracts/Kastens_v53n1.pdf>. Note: Until summer 2005, the CRS gathered specialist reviews as well as community reviews, and had an additional goal: to identify excellent resources and advance them into the DLESE Reviewed Collection. Specialist reviews are now being handled by Science Education Solutions: <http://www.dlese-project.org/2005workstatement.html#pr>.

[24] Kastens, K. and N. Holzman (2006). The Digital Library for Earth System Education provides Individualized Reports for Teachers on the Effectiveness of Educational Resources in their own Classrooms. D-Lib Magazine, Jan. 2006, 12(1). <http://www.dlib.org/dlib/january06/kastens/01kastens.html>.

[25] DLESE Reviewed Collection (DRC) Best Practices <http://www.dlese.org/Metadata/collections/drc-best-practices.htm>.

[26] Example of the DLESE Discovery System (DDS) resource description page for the "Virtual Earthquake" educational resource: <http://www.dlese.org/dds/catalog_DLESE-000-000-000-050.htm>.
The CRS "Review or Comment on a Resource" page builds a menu of reviewable resources via live queries to the DLESE Search Web Service: <http://crs.dlese.org/submit/menu.html>.

[27] Op. Cit. [14]

[28] Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) <http://www.openarchives.org/OAI/openarchivesprotocol.html>.

[29] DLESE Discovery System (DDS) <http://www.dlese.org/dds/>.

[30] Example of the CRS "Available Reports" page for the "Virtual Earthquake" educational resource: <http://crs.dlese.org/annotations/?id=DLESE-000-000-000-050>.
Example of the "Reviews, teaching tips, and related resources" search result page from the DLESE Discovery System for same: <http://www.dlese.org/dds/view_resource.do?reviews=DLESE-000-000-000-050>.
Example of DLESE Search Web Service query for same: <http://www.dlese.org/dds/services/ddsws1-0?verb=GetRecord&id=DLESE-000-000-000-050>.

[31] Weatherley, J. (2005). A web service framework for embedding discovery services in distributed library interfaces. In Proceedings of the 5th ACM/IEEE-CS Joint Conference on Digital Libraries (JCDL 2005), June 7-11, 2005, Denver, Colorado, USA. ACM Press, New York, NY, 42-43. <http://doi.acm.org/10.1145/1065385.1065394>.

[32] Op. Cit. [26]

[33] Davis, L. (2005). Usability Study of the DLESE Community Review System. September 9, 2005. <http://www.dlese.org/documents/reports/crs_usability_report2005.pdf>.

[34] XML Path Language (XPath) 2.0 <http://www.w3.org/TR/xpath20/>.

[35] Kastens, K., B. DeFelice, et al (2005). Questions & Challenges Arising in Building the Collection of a Digital Library for Education: Lessons from Five Years of DLESE. D-Lib Magazine, Nov. 2005, 11(11). <http://www.dlib.org/dlib/november05/kastens/11kastens.html>.

Copyright © 2006 Robert A. Arko, Kathryn M. Ginger, Kim A. Kastens, John Weatherley

doi:10.1045/may2006-arko