Authentication tools
Annotated Bibliography - Tools for Authenticating Digital Content
References
(2008). Why Authentication Procedures Matter for US and UK Public Legal Resources on the Web. Legal Information Management, 8, 35-42.
This article, published in Legal Information Management, provides an overview of the state of authentication of legal resources provided by the US federal government, state governments, and the government of the United Kingdom. The author defines authentic resources in this context as electronic resources recognized as official by the government, or as non-official surrogates for official resources which may or may not exist. The author demonstrates a lack of authentication of these types of resources by state and federal governments. The article shows that the material provided by governmental public legal resources in the United States and the United Kingdom has in many cases not been authenticated, either to ensure it can be maintained or to check whether it has previously been altered. The author makes special note that the UK no longer publishes any official new codifications but instead provides only a database that is legally unofficial and unauthenticated. The article then looks at ways the problem of authentication in public online legal resources might be addressed. The author includes a description of InterPARES and discusses its implications for solving this problem. The author also discusses the benefits of computational methods in solving the problem of authentication, at least from the perspective of ensuring a lack of alteration, and provides an example of how a public/private key system can be used to verify that a document has not been altered (a sketch of this approach follows this entry). Finally, the author highlights the Ohio Supreme Court's online decisions database which, although the legal resources it provides are explicitly not official, offers a good example of how to correctly implement an authentication system. This article provides a good overview of the state of authentication as it pertains to public government resources, along with good examples of best practices and a framework for future work on the topic.
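A minimal sketch of the public/private key verification the author describes, assuming Python and the third-party cryptography package; the library choice, key handling, and sample text are illustrative and not taken from the article:

```python
# Illustrative sketch: signing a document and verifying it has not been altered.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The publisher generates a key pair once; the private key stays with the publisher.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"Text of an official statute or court decision."  # hypothetical content

# The publisher signs the document with the private key.
signature = private_key.sign(
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Any reader can verify the document against the published public key;
# verify() raises InvalidSignature if even one byte of the document was altered.
public_key.verify(
    signature,
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("Signature verified: document unaltered.")
```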
Adams, S. (2010). Preserving Authenticity in the Digital Age. Library Hi Tech, 28(4), 595-604.
This article, published in Library Hi Tech by Sharon Adams, addresses how authenticity is now assessed for digital records in an increasingly digital environment. Adams opens the article with a discussion and definition of authenticity. She then discusses how authenticity differs between physical and digital objects, identifying the main distinction: a physical object is not considered an authentic original if it is a reproduction, while a digital object can be considered authentic when reproduced if it retains the key aspects of the original. The article then speaks to the possibility of loss in file translation before highlighting the need to see digital object authentication as an ongoing process that takes their dynamic nature into account. The article then cites an advantage of digital objects: because they all consist of binary code, software can automatically check whether they meet authentication standards (a simple fixity-check sketch follows this entry). Adams then highlights three automated resources for authenticating digital resources, using them to illustrate the adaptive nature of digital content, and concludes the article with a call for a fundamental shift toward ongoing authentication to meet the needs of a rapidly changing curation world. This article provides a good overview of the theoretical groundwork that goes into defining what the "authenticity" of digital objects should mean and how we should frame our work to authenticate them.
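As a simple illustration of the automated, software-based checking Adams points to (not drawn from the article itself), a fixity check recomputes an object's checksum and compares it with the value recorded at ingest; the file name and stored digest below are hypothetical:

```python
# Illustrative sketch: automated fixity checking of a digital object.
# The file path and the stored checksum are hypothetical examples.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

stored_checksum = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
current_checksum = sha256_of(Path("object_0001.tiff"))

if current_checksum == stored_checksum:
    print("Object passes the fixity check; bitstream is unchanged.")
else:
    print("Checksum mismatch: the object may have been altered or corrupted.")
```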
Altman, M. M. (2009). Transformative effects of NDIIPP, the case of the Henry A. Murray archive. Library Trends, 57(3), 338-351.
This article, published in Library Trends by Micah Altman, outlines the improvements made to the Henry A. Murray Research Archive through the Archive's enrollment in the National Digital Information Infrastructure and Preservation Program's (NDIIPP) Data-PASS program. The article begins with a brief outline of the Archive's mission and its integration into the Institute for Quantitative Social Science (IQSS) and the Harvard-MIT Data Center (HMDC), which led directly to the Archive participating in and implementing the improvements from the Data-PASS program. Altman then describes the evolution of the Henry A. Murray Archive and how it first adopted the software components of the Data-PASS project. The article then describes how Data-PASS was used to develop a structure for more efficient and secure practices for the Archive's handling of new data, as well as of the large amount of data the Archive had to transfer from its analog collection. The article states that this process allowed for the automation of all stages of the processing workflow, including ingestion and authentication. Altman then discusses several key acquisitions made during the Data-PASS project and highlights the specific improvements made by adopting the new software and data handling/intake standards at the Archive. The article concludes with a discussion of the Henry A. Murray Archive and its acquisition procedures. This article provides a good overview of the steps the Henry A. Murray Research Archive took to implement new authentication and ingest procedures through a national initiative, NDIIPP Data-PASS, and provides a good description of how these changes to its processes were applied and what effects they had on the Archive.
Gilliland-Swetland, A., & Eppard, P. (2000). Preserving the authenticity of contingent digital objects: The InterPARES project. D-Lib Magazine, 6(7-8). doi:10.1045/july2000-eppard
This article, published in D-Lib Magazine by Anne J. Gilliland-Swetland and Phillip B. Eppard, describes the goals of, case studies involved in, and conclusions drawn from InterPARES, a multinational project with the goal of developing the knowledge essential to the long-term preservation and authentication of digital objects. The article opens with a discussion of what authenticity means for electronic records and then describes InterPARES and its international efforts to study and define authenticity in the electronic context. Gilliland-Swetland and Eppard describe the process known as diplomatics, a method for determining authenticity going back to the eighteenth century, and connect the application of diplomatics to the determination of the authenticity of electronic resources. They describe one of the major goals of InterPARES as analyzing the elements of documents and, from this typology, developing requirements for the authenticity of electronic documents. The article then describes some of the elements used in diplomatics before briefly describing the four case studies, approached through grounded theory, that the InterPARES project used to draw its conclusions about which elements truly matter to the authenticity of electronic resources. The article concludes with the elements that do and do not matter: the InterPARES studies determined that the physical carrier does not matter as much as intellectual fixity. The InterPARES project seeks to gain the knowledge needed to maintain the integrity of electronic resources and to recreate them authentically. The article closes by stating that diplomatic analysis has been used to determine the authenticity of records in the past, and that the InterPARES project seeks to better understand the nature of electronic records and the elements that could be used to determine their authenticity in the future. This article is certainly valuable for understanding InterPARES and its approach to authentication; it is especially noteworthy for outlining both the diplomatics approach and InterPARES's adaptation of that approach in developing authentication standards for the modern world.
Gorraiz, J., & Gumpenberger, C. (2010). Going beyond Citations: SERUM — a new Tool Provided by a Network of Libraries. Liber Quarterly: The Journal Of European Research Libraries, 20(1), 80-93.
In this article published in Liber Quarterly, Juan Gorraiz and Christian Gumpenberger propose the use of Standardized Electronic Resource Usage Metrics (SERUM) by academic libraries. This system uses global download data to track demand for documents among library users. In the article, Gorraiz and Gumpenberger argue for the establishment of a global network of librarians to ensure the integrity and authenticity of the downloaded works tracked and of the tracking data itself. They propose that this method, SERUM, would be better at tracking user demand than Journal Citation Reports (JCR) and would provide greater benefit to users, libraries, publishers, and authors if supported by a large group of librarians and publishers. The article argues that download count is a better proxy for usage and that, if the authenticity of the data is established, it can help provide libraries with a better understanding of patron desires and needs. The article provides a good overview of the SERUM system, which offers libraries both a better way to track their acquisitions and a way to authenticate large amounts of usage data.
Joint, N. N. (2009). Recent trends in authentication and national information management policy in the UK. Library Review, 58(6), 405-413.
In this article published in Library Review, Nicholas Joint provides an overview and discussion of the evolution and current state of authentication standards in digital libraries in the UK. The article opens with a brief history of user access in UK university library systems and describes the BIDS system, which creates a distinctive user access and authentication situation for all UK university digital libraries. The author surveys the contrasting approaches taken by other countries to user authentication in digital libraries and then describes the large accidental benefit of universal access and authentication that the UK's system offers. Joint then calls for an overhaul of the current system to allow for authentication with less violation of privacy, and for authentication to be given a greater role in the UK's national information strategy.
Kärberg, T. (2013). Digital preservation of knowledge in the public sector: A pre-ingest tool. Archival Science. doi:10.1007/s10502-013-9211-z
In this article published online through Archival Science, Tarvo Kärberg presents a case study involving the Estonian National Archives' use of the Universal Archiving Model (UAM) in the preparation, transfer, and preservation of digital records. The article notes that most major frameworks in the archiving world do not take into account, or prescribe steps or actions for, the pre-ingest/data preparation stage of the archiving process. Kärberg points out that there has been a recent trend among public archives to treat pre-ingest as part of the digital preservation process, and suggests that the Electronic Records Management Systems (ERMS) used by most curatorial and preservation organizations today provide a good place in the electronic resource life cycle to perform pre-ingest functions. The article states that the pre-ingest stage of preservation is one of the best places for the enrichment of information and the addition of metadata, because it is important to provide structure and detail to records at the earliest possible stage. The article discusses how an ERMS produces records for the UAM to read by translating the XML generated by the ERMS through XSLT into an XML format that matches the UAM standards; because of this, individual UAMs can check whether ingested data matches the archival standards of a given institution (see the sketch following this entry). The article describes a ten-step data preparation, authentication, and transfer project undertaken by the Estonian Office of the Minister for Population and Ethnic Affairs. This case study found that it was possible to reuse metadata from ERMSs or other pre-ingest sources in a valid and authentic manner, and that these pre-ingest tools can be used flexibly to provide detailed configuration before transfer. Kärberg concludes the article with a discussion of how the case study shows the ability to add descriptive value during the pre-ingest phase of a digital curation and preservation project while maintaining authenticity and providing future users of the resource with richer data. The author also discusses how this can be technically accomplished through the ERMS and UAM to allow for easier and richer transfer of digital resources between institutions, and how deeper validation and authentication of those resources can be accomplished through the standards developed for pre-ingest processing of electronic resources. The article is a good, if detailed, description of applying tools already available in new ways to improve ingest and authentication.
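A minimal sketch, in Python with the lxml library, of the ERMS-to-UAM translation and validation step described above; the stylesheet, schema, and file names are assumptions for illustration, not artifacts of the actual UAM:

```python
# Illustrative sketch of the ERMS-to-UAM translation step: XML exported from an
# ERMS is transformed with an XSLT stylesheet and then validated against the
# receiving institution's schema. All file names here are hypothetical.
from lxml import etree

# XML produced by the records management system (ERMS export).
erms_record = etree.parse("erms_export.xml")

# XSLT stylesheet that maps the ERMS structure onto the UAM transfer format.
stylesheet = etree.XSLT(etree.parse("erms_to_uam.xslt"))
uam_record = stylesheet(erms_record)

# Validate the transformed record against the archive's schema before transfer.
uam_schema = etree.XMLSchema(etree.parse("uam_schema.xsd"))
if uam_schema.validate(uam_record):
    print("Record conforms to the archival transfer standard.")
else:
    print("Validation errors:", uam_schema.error_log)
```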
Park, E., & Oh, S. (2012). Examining Attributes of Open Standard File Formats for Long-term Preservation and Open Access. Information Technology & Libraries, 31(4), 44-65.
In this study published in Information Technology and Libraries, Eun G. Park and Sam Oh examine the common attributes of file formats used to establish selection criteria for valid or authentic files in open-standard file-format selection. The individual elements of these formats were systematically reviewed to reveal which matter most in open-standard file-format selection. The article separates standard file formats into two main types: preservation formats and access formats. It then defines some other relevant categories of file standards present in the professional world, including standard file formats, open standards, and the Library of Congress's seven standard requirements for file-format characteristics. The study points out that many of these standards as they exist in the real world are poorly and unsystematically defined, and seeks to better define and establish open-standard file-format criteria. Park and Oh did this by reviewing the literature concerning the key attributes of the three ISO standard file formats (PDF, XML, and PDF/A). They found that the most common criteria for judging a file format were functionality, metadata, openness, interoperability, and independence. They also found that, among the attributes discussed in the literature, authenticity was paramount, especially for archives and records management, and that stability, traceability, and integrity all matter when authenticity is used to judge file formats. Park and Oh conclude the article with a discussion of how it remains difficult to assess the appropriateness of a file format, even for a specific task. They suggest that this study can provide a framework from which to create appropriate strategies when selecting file formats for long-term preservation, but concede that the question of which file format to use in a given preservation project can only be determined at a local level. The article includes a detailed table in the appendix outlining the attributes examined in the literature. The article provides a good overview of the characteristics required in selecting file formats that support authenticity, and the framework it lays out can be used in determining the support a given file format might provide for a given institution.
Rudersdorf, A. A. (2012). Digital preservation ingest can be a "CINCH". Library Hi Tech, 30(3), 449-456.
This article, published in Library Hi Tech and written by Amy Rudersdorf, provides an overview of the Capture, Ingest, and Checksum (CINCH) tool, designed to address the need to automate the transfer of digital content between providers and repositories. The article states that CINCH provides automated virus checking, automated checksums, and automated authentication of incoming resources, and is aimed at small and medium-sized preservation organizations, especially those mandated to take in certain material. CINCH gives small to medium-sized organizations the means to ingest documents without much modification: it provides a history of each document from ingest, checks whether files are what they purport to be, checks for nefarious agents, and can certify that ingested files are ready for preservation. Unlike other tools of its type, CINCH takes care of the collection and packaging of content from providers. Rudersdorf states that it is also repository neutral and open source, available for free download through North Carolina's statewide online library service, NC LIVE. The article goes on to explain how CINCH works, stating that its main goals are simplicity and ease of use: a simple manifest file (.csv) is used to point to submitters' content and ingest it into CINCH for processing. When processing, CINCH first performs a fixity check on the downloaded files and then checks them for viruses. It then throws out any corrupted files and adds their filenames to an errors manifest. After this it extracts metadata from the files, performs a hash value comparison, and checks the current collection for duplicates, saving any found duplicates in a separate file for manual acceptance or deletion by archive administrators (a sketch of this style of manifest-driven checking follows this entry). Rudersdorf goes on to state that future goals for CINCH include greater metadata extraction abilities, such as integrating OpenNLP to allow for subject heading generation. This article describes a useful tool, available for free to small and medium-sized archives, for authenticating, ingesting, and managing material from third parties.
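A rough sketch of the kind of manifest-driven fixity and duplicate checking described above, written in Python for illustration; it is not the CINCH code itself, and the manifest columns and file paths are assumptions:

```python
# Rough sketch of a manifest-driven ingest check in the spirit of the workflow
# the article describes (not CINCH itself). The manifest layout (filename and
# expected_sha256 columns) and the file paths are hypothetical.
import csv
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

seen_hashes = {}   # digest -> first filename seen, used to spot duplicates
errors = []        # files that failed the fixity check

with open("manifest.csv", newline="") as manifest:
    for row in csv.DictReader(manifest):
        path = Path(row["filename"])
        actual = sha256_of(path)
        if actual != row["expected_sha256"]:
            errors.append(path.name)          # corrupted or altered in transfer
        elif actual in seen_hashes:
            print(f"Duplicate of {seen_hashes[actual]}: {path.name}")
        else:
            seen_hashes[actual] = path.name   # accepted for further processing

print("Files failing fixity check:", errors)
```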
Walker, C. H. (2010). Record authentication as a barrier: Reflections on returning to CONSER. Cataloging and Classification Quarterly, 48(2-3), 161-168.
This article by Christopher H. Walker, published in Cataloging and Classification Quarterly, recounts the author's personal experience with CONSER, an authorized serials cataloging group within OCLC's WorldCat system. CONSER members are authorized to create bibliographic and metadata records that OCLC considers authentic for any serial publication, and are entitled to make changes to those records, while non-members are not. The article recounts the author's move from an institution that was a member of CONSER to one that was not, and the many errors he came across in records that he could not fix or replace while at the non-member institution. Walker argues, based on this experience, that authentication may be a barrier to sharing information and expertise. This article offers a perspective showing that authentication can become a barrier to good curation and preservation practice if too many restrictive and unnecessary walls are put in the way of providers in order to maintain it. It provides a good example of how protecting authenticity blindly may not only hinder good curation, records management, and preservation practices but also get in the way of actual authentication.