The Virtual Library of Science and Technology in the American venture, like the Virtual Library of the Old Toledo School of Translators, is a new project from the Ignacio Larramendi Foundation that has been carried out using current technology and the latest information coding. In a library environment this means the use of the MARC 21 format in its latest update (April 2014), as well as Resource Description and Access (RDA), whose documentation is maintained by the Joint Steering Committee for Development of RDA.
Linked Open Data (LOD) technology has naturally been used, in the form specified in the Europeana Data Model (EDM). The MARC repository of the Polymath Virtual Library automatically generates RDF descriptions according to the latest versions of the model, 5.2.4 and 5.2.5. The EDM model has had a great influence on the way this new Virtual Library has been carried out, which, like all the others, has been implemented with DIGIBIB software, now in version 9, produced and developed by DIGIBÍS. It is worth mentioning that during the period of preparation of the bibliographic records and digital objects, the Spanish Agency for International Development Cooperation (AECID in Spanish), among other institutions, began to use this same software for its digital library.
One of the EDM case studies presents precisely one of the libraries created with the DIGIBIB software, which gives a clear idea of the suitability of the Ignacio Larramendi Polymath Virtual Library for the European and, by extension, American projects. As is well known, the Digital Public Library of America (DPLA) also uses this namespace, which has enabled DIGIBIS to create an app that allows joint searches in Europeana and in the DPLA. In addition, EDM version 5.2.5 adds two new RDA elements to the seven already included in previous versions.
It is important to note that in this library, as in the previous ones prepared by the Ignacio Larramendi Foundation, and in general in all those implemented with DIGIBIB, the information coded in the 37X fields of the MARC 21 format is automatically converted to the elements of the RDA namespace adopted by EDM. The systematic use of these fields to structure the relationships between authors is fundamental for the Polymath Virtual Library.
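The conversion described above can be sketched schematically. EDM describes agents with properties from the RDA Group 2 element set (the rdaGr2 namespace), and each MARC 21 37X field/subfield can be mapped onto one of those properties. The field-to-property pairs and the agent URI below are illustrative assumptions, not the actual DIGIBIB mapping table.

```python
# Sketch: converting MARC 21 37X authority fields to RDA Group 2
# properties as RDF triples. The mapping entries are illustrative
# assumptions; a real installation would use the full MARC/RDA alignment.

RDAGR2 = "http://rdvocab.info/ElementsGr2/"

# Hypothetical mapping: (MARC tag, subfield) -> rdaGr2 property URI
MARC_37X_TO_RDAGR2 = {
    ("370", "a"): RDAGR2 + "placeOfBirth",
    ("370", "b"): RDAGR2 + "placeOfDeath",
    ("374", "a"): RDAGR2 + "professionOrOccupation",
    ("375", "a"): RDAGR2 + "gender",
}

def marc37x_to_triples(agent_uri, fields):
    """fields: list of (tag, subfield, value) tuples from an authority record.
    Returns N-Triples statements for every mapped field; unmapped fields
    are simply skipped."""
    triples = []
    for tag, sub, value in fields:
        prop = MARC_37X_TO_RDAGR2.get((tag, sub))
        if prop:
            triples.append(f'<{agent_uri}> <{prop}> "{value}" .')
    return triples

# Example authority data (hypothetical)
record = [("370", "a", "Toledo"), ("374", "a", "Missionary")]
for t in marc37x_to_triples("http://example.org/agent/1", record):
    print(t)
```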
Further information can be read in the article Data Aggregation and Dissemination of Authority Records through Linked Open Data in a European Context, by Xavier Agenjo, Francisca Hernández and Andrés Viedma, published in Cataloging & Classification Quarterly, Volume 50, No. 8, 2012.
One of the essential aspects of the Polymath Virtual Libraries is the special treatment of authority records in accordance with the new cataloguing rules, Resource Description and Access (RDA). The fundamental task is not only to establish a preferred heading for an author, in our case a translator/commentator, but also to provide as much information as possible about their biography and historical and cultural context. This gives enormous unity and completeness to the records and the database.
In this Library, unlike the Virtual Library of the Old Toledo School of Translators, the authors are the creators of the works collected, not their translators or commentators. The practically encyclopaedic records on these authors, whom we call polygraphs, present numerous links to library resources that have already been adapted to RDA and EDM/LOD, thanks to the granularity of the MARC 21 format and its malleability in adapting to new regulations and standards.
One of the LOD sources that is systematically used is the VIAF (Virtual International Authority File), the international file of authorised name forms and their variants, which is very important in a library like this one.
Thus, each author is identified by a web address with special technical characteristics (a URI) and is linked to the VIAF database. In this way, by clicking on the links, the user can browse the different information resources that VIAF gathers, obtaining information that is often very rich: co-authors, timelines, written works, attributed works, etc.
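A minimal sketch of how this linking can be expressed: the local authority URI is declared equivalent to the external ones (VIAF, and others mentioned below) with owl:sameAs statements, which is what lets a user or a machine hop from our record to the external dataset. The identifiers below are placeholders, not the real ones for any author in the library.

```python
# Sketch: tying a local authority URI to external datasets such as VIAF
# via owl:sameAs links (N-Triples syntax). All URIs are placeholders.
OWL_SAME_AS = "http://www.w3.org/2002/07/owl#sameAs"

def same_as_triples(local_uri, external_uris):
    """Emit one owl:sameAs triple per external URI."""
    return [f"<{local_uri}> <{OWL_SAME_AS}> <{ext}> ." for ext in external_uris]

links = [
    "http://viaf.org/viaf/0000000",                # placeholder VIAF identifier
    "http://dbpedia.org/resource/Example_Author",  # placeholder DBpedia resource
]
for triple in same_as_triples("http://example.org/authority/1", links):
    print(triple)
```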
DBpedia, the semantic structuring of Wikipedia, has also been used systematically, which allows very rich navigation through all kinds of concepts. See, for example, Friar Bernardino de Sahagún. Other sources used to enrich the information are the Linked Data Service for Authorities and Vocabularies of the Library of Congress, the Faceted Application of Subject Terminology (FAST) of OCLC, the Gemeinsame Normdatei of the Deutsche Nationalbibliothek, the Bibliothèque Nationale de France catalogue, and the authority records of the Biblioteca Nacional de España (BNE).
One of the peculiarities based on the data used is the generation of timelines and geographical locations, as is already done with the DIGIMUS software, also from DIGIBÍS. At the time of writing, this functionality had not yet been implemented in this virtual library.
Although the use of all these sources may seem redundant, it is not always so, and we have preferred to err on the side of thoroughness. Thus, for example, the BNE and VIAF files are not always at the same level: there are times when an author included in VIAF is not in the BNE, and vice versa. In addition, VIAF always allows access to WorldCat Identities through the "about" entry, where really extensive information can be found, as happens, for example, with Carlos Sigüenza y Góngora.
All the subject headings allow navigation between concepts thanks to the integration of the Subject Headings List published in SKOS (a representation for knowledge organisation systems that is a W3C recommendation) by the Subdirectorate General for Library Coordination of the Ministry of Education, Culture and Sport, which in turn is linked to the Subject Headings List in Galician (LEMAG) and to the Llista de encapçalaments de matèria de la Biblioteca de Catalunya (LEMAC). These SKOS records are also linked to other lists of subject headings, such as the already mentioned Library of Congress Subject Headings (LCSH), the Répertoire d'autorité-matière encyclopédique et alphabétique unifié (RAMEAU) and the Gemeinsame Normdatei.
As can be seen, the polygraph records try to provide as much information as possible about the polygraphs, and special care has been taken to link them to open data environments such as those already described. It should be added that the "More information" section of these records includes a link to Wikipedia and that each of them can also be reached from Wikipedia itself, using the Wikipedian in Residence methodology. We therefore consider the cataloguing of an author or polygraph finished when a link to their record has been added to the "External links" section of the corresponding Wikipedia entry. An example can be seen in the record of Bernardino de Sahagún in our Virtual Library and in Wikipedia.
In short, browsing can lead us to data located throughout the Web: from a consultation to the database, the user can move around the Web, guided by the links established with the datasets created and maintained by prestigious and powerful international institutions such as those mentioned above.
One of the greatest successes of Europeana has been the creation of an API that allows the entire Europeana database to be consulted from any application that installs it. Thanks to DIGIBÍS, the Ignacio Larramendi Foundation has this API in its Polymath Virtual Library.
In this way, if we consult an author or search for one of their works in our Library, we will almost instantly obtain a considerably higher number of results, since the search is carried out both in our own catalogue and in Europeana. As Europeana's collections grow, so will the number of results.
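The mechanics behind this combined result count can be sketched with Europeana's public Search API, which is queried with an API key (wskey) and a query string. The endpoint and parameters follow the public Search API; the key and the query below are placeholders.

```python
# Sketch: building a request to the Europeana Search API, the kind of
# call that lets a local search also return Europeana results.
from urllib.parse import urlencode

def europeana_search_url(query, api_key, rows=12):
    """Return a Europeana Search API request URL.
    `rows` limits the number of results per page."""
    base = "https://api.europeana.eu/record/v2/search.json"
    return base + "?" + urlencode({"wskey": api_key, "query": query, "rows": rows})

# Placeholder key and query; a real application would also read the
# JSON response and merge it with the local catalogue's results.
url = europeana_search_url('who:"Bernardino de Sahagún"', "YOUR_API_KEY")
print(url)
```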
Since, thanks to the Data Exchange Agreement (DEA), the metadata of digital objects accessible in Europeana can be reused through its API, DIGIBIS has developed a powerful harvester, DIGIHUB, for its library management programme, DIGIBIB. It allows records located in Europeana to be downloaded into a working file and, after being improved bibliographically, incorporated into any other database.
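The harvesting step that a tool like DIGIHUB performs rests on the standard OAI-PMH protocol: issue a ListRecords request against a repository and read the records out of the XML response. The sketch below uses an inline sample response rather than a live repository, and a real harvester would also follow resumption tokens to page through large result sets.

```python
# Sketch: a minimal OAI-PMH harvesting step. The sample response is
# inline; a real harvester would fetch it over HTTP.
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

OAI = "{http://www.openarchives.org/OAI/2.0/}"

def list_records_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    """Build a standard OAI-PMH ListRecords request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec
    return base_url + "?" + urlencode(params)

sample = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><header><identifier>oai:example:1</identifier></header></record>
    <record><header><identifier>oai:example:2</identifier></header></record>
  </ListRecords>
</OAI-PMH>"""

def harvest_identifiers(xml_text):
    """Extract record identifiers from a ListRecords response."""
    root = ET.fromstring(xml_text)
    return [h.text for h in root.iter(OAI + "identifier")]

print(harvest_identifiers(sample))
```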
This rigorous process involves a great deal of effort and provides an enormous wealth of information that is very useful for the user. Identifying the works correctly is made harder by the frequent errors observed in the assignment of uniform titles, and even in the transcription of the titles themselves, in the metadata provided by the different institutions. If the headings are complex in themselves, the titles are much more so, which is why an enormous amount of research is needed to unify the records in cataloguing terms.
No less than 612 works by 25 authors have been incorporated into this new virtual library. The main heading is coded with a 1XX field of the MARC 21 format, and 3XX fields are used to add semantic content and group the authors into different categories. This enrichment has been carried out, among other sources, with the Virtual International Authority File (VIAF).
This is a truly costly and meticulous task, one that requires broad erudition, extensive and specialised professional training, and access to an exhaustive bibliography that is not usually available online.
Probably no researcher into the transmission of science during the American venture has ever had the works themselves as readily available as they are about to be through the Virtual Library of Science and Technology in the American venture.
The Digital Public Library of America has been operational since 18 April 2013. It is important to note that the DPLA will provide a large amount of digital information, and it would be strange, given the importance of the participants in the project, if it did not become, like Europeana, a very important source of bibliographical, archival and museological information.
It is too early to say, but it is more than likely that the DPLA will provide important digitized collections that will help to increase the digital resources of the Virtual Library of Science and Technology in the American venture. Over the years, American libraries, archives and museums, and especially the great American magnates, have acquired magnificent bibliographic pieces in Europe and Spain, pieces that have finally ended up in institutions that have probably at some point digitized them.
It should never be forgotten that the first edition of La Celestina is held in a North American institution, nor the reasons why that institution has this unique copy. If this happened with the first edition of the Comedia de Calisto y Melibea, it is likely to have happened as well with manuscripts linked to science and technology in the American venture, with first editions, second editions or reprints of those translations that were so widespread in Europe. It is certain that this is so, and that is why the DPLA is so important: it will allow us to search in a single place, through a single point of consultation, information distributed across a multitude of institutions.
The fact that the DPLA has adopted the Europeana Data Model (EDM) as its Linked Open Data (LOD) model - as can be seen in its Metadata Application Profile - is very important for Europeana. The purpose of this introduction is not to assess the importance of this decision for Europeana, but it should be put on record that it will make it possible to access new content.
The similarity of the two data structures has allowed DIGIBÍS to develop a pioneering application that makes it possible to consult these two immense databases simultaneously: with a single search, results from Europeana and the DPLA are obtained on the same screen. The application is offered by the DPLA on its Apps Library page.
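The idea behind such a joint search can be sketched simply: both services expose public search APIs, so one query string can be sent to each and the two result sets displayed together. The endpoints follow the public Europeana Search API and DPLA API; the keys are placeholders and the merging of results is left schematic.

```python
# Sketch: one query, two public search APIs (Europeana and DPLA).
from urllib.parse import urlencode

def joint_search_urls(query, europeana_key, dpla_key):
    """Build the two request URLs for a single joint search."""
    return {
        "europeana": "https://api.europeana.eu/record/v2/search.json?"
                     + urlencode({"wskey": europeana_key, "query": query}),
        "dpla": "https://api.dp.la/v2/items?"
                + urlencode({"api_key": dpla_key, "q": query}),
    }

# Placeholder keys; a real application would fetch both URLs and
# interleave the parsed results on one screen.
urls = joint_search_urls("Sahagún", "EUROPEANA_KEY", "DPLA_KEY")
for name, url in urls.items():
    print(name, url)
```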
One of the purposes guiding the creation of these Virtual Libraries is to give the greatest possible visibility to the documents - whether printed works or manuscripts, maps or photographs - and to Hispanic thought over 2,000 years. There is no point in digitizing bibliographic materials if they are then stored in a repository completely disconnected from the Web, or on a supplementary storage unit.
To achieve the highest possible visibility, digital materials are encoded according to a broad set of standardized metadata. The experience of the Project Director of the Ignacio Larramendi Foundation as the person in charge of the Collective Catalogue of Bibliographic Heritage, of the computerization of the National Library with its Ariadne system, and of many other projects of this institution has clearly defined a line of work focused on standardisation. The FHL Virtual Libraries have followed the same vein. This principle of seeking visibility on the basis of standardisation has been a norm in the interaction between the Ignacio Larramendi Foundation and DIGIBÍS, the company owned by the Foundation.
Thus, many of the specifications prepared to develop the software for the FHL Virtual Libraries have become part of the development of the DIGIBIB Library Management System and the DIGIARCH Archive Management System; vice versa, the development of these two systems, and of others in the pipeline, will undoubtedly benefit the FHL Virtual Libraries, as has happened in the specific case of the Virtual Library of the Old School of Translators of Toledo.
The initiative on digital libraries has been known since the Lisbon Agenda, but it was the creation of Europeana and its incorporation into the Digital Agenda for Europe that gave the large European digital library its legal backing. Europeana has been developing its functional specifications, data model and regulatory environment in coordination with the W3C Library Linked Data Incubator Group (W3C LLD). The final report of this international working group was based on the analysis of a series of case studies, including the Ignacio Larramendi Polygraph Virtual Library, now named the Polymath Virtual Library. The FHL Virtual Libraries have sought to adhere to Europeana's data model, the EDM, from the outset.
As the Europeana Data Model has evolved, it has been implemented in our virtual libraries. In fact, the latest EDM specification, 5.2.5, dates from May 2014, just when we finished the project for the Virtual Library of Science and Technology in the American venture.
It has already been mentioned that the Virtual Library of Francisco Sánchez, the Sceptic is one of the case studies of the use of EDM for libraries, and there is therefore nothing unusual in continuing to adjust to it. However, based on long experience, it was decided to keep the MARC 21 format as the backbone of the application and to transform these records to EDM through an automatic mapping process.
The harsh experience of the author of this introduction, who in the mid-1980s had to learn how to create MARC records for exchange without specifications, has been decisive in guiding the project towards standardisation and towards having the transformations between data models carried out by the software in a way that is transparent to the user. Therefore, the data entry of the FHL Virtual Libraries is designed for complete cataloguing that can later generate records in ISO 2709, but also in EDM.
At the same time, the new MARC 21 format fields have been incorporated to adapt to the RDA cataloguing rules. We believe that the most important step, or the one with the most future, has been to introduce the fields needed to generate links to the value vocabularies and datasets recommended by the W3C LLD and Europeana. These recommendations were an invaluable guide when designing the relationship model for the Virtual Library of the Old Toledo School of Translators, without which much of that work would quite possibly have been done blindly.
It is not a question of giving a full explanation of the use of RDA and the MARC 21/RDA format, since this was already done in a paper presented at the IFLA congress in Puerto Rico and, a year later, in parallel with the progress of the FHL Virtual Libraries projects, in the paper already mentioned that was published in Cataloging & Classification Quarterly.
It is evident that the availability of data as Linked Open Data must give rise to new and more powerful functionalities for a more human-friendly consultation of this data, since downloading vocabularies or datasets, or consulting them using SPARQL, cannot be called user-friendly.
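To illustrate the point: retrieving even one piece of information about an author directly from a dataset such as DBpedia requires composing a SPARQL query and sending it to a public endpoint, as sketched below. This is workable for developers, but hardly something a general reader would call user-friendly. The query and endpoint follow DBpedia's public SPARQL service; the specific resource chosen is illustrative.

```python
# Sketch: what "consulting a dataset using SPARQL" looks like in
# practice - a query string plus an endpoint URL.
from urllib.parse import urlencode

QUERY = """PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?abstract WHERE {
  <http://dbpedia.org/resource/Bernardino_de_Sahagún> dbo:abstract ?abstract .
  FILTER (lang(?abstract) = "en")
}"""

def sparql_url(endpoint, query):
    """Build a GET request URL for a SPARQL endpoint, asking for JSON."""
    return endpoint + "?" + urlencode({"query": query, "format": "json"})

# A real client would fetch this URL and parse the JSON bindings.
url = sparql_url("https://dbpedia.org/sparql", QUERY)
print(url)
```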
The Polymath Virtual Library's file of authority and bibliographic records is registered in The Data Hub and is one of the few such cases included in this resource. It goes without saying that these records, like all those created with DIGIBÍS applications, and in particular with DIGIBIB version 9, on which they are built, dynamically and transparently feed two OAI-PMH repositories, one for authority records and the other for bibliographic records. At the same time, they provide EDM records, which gives them broad visibility throughout the Web - the primary objective of the FHL Virtual Libraries.
When accessing an author's digitized works in this library, we may at first find only a small number of them. This number is automatically increased thanks to the Europeana API, which we have implemented in all our virtual libraries. At the time of writing, Europeana already holds more than 33 million digitized works, and the number is growing steadily.
In this Virtual Library we have also added the possibility of searching by the date of an electronic resource and by the date of the original resource. Sometimes there will be no digitization date in a bibliographic record, which occurs when the data provider (a library or archive) does not supply it to Europeana. In these cases a text appears instead: [Unidentified date of publication]. In any case, the date of creation of the electronic resource has been identified whenever possible. Similarly, if the date of the original is available, it appears in the "Notes" section of the bibliographical record, where a brief description and bibliographical analysis of the digitized original material can also be read.
Europeana is the source from which we obtain the records we offer, reviewed and re-catalogued, in this Virtual Library of Science and Technology in the American venture, records that are significantly enriched by links to other bibliographic resources on the Web.