22:3 (2007:09) 22nd Conference (2007): Vision Session: The Evolution of Reading and Writing in the Networked Era
August 30, 2007 at 4:40 pm | Posted in Conference Reports, Vision Sessions
The Evolution of Reading and Writing in the Networked Era
Bob Stein, USC Annenberg Center, Institute for the Future of the Book
Reported by Mary Bailey
In the early days of books, when professors made notes in the margins and students added their own notes as they read the same copies, an ongoing conversation was created. Bob Stein proposes that the future of the book is an ongoing conversation in the margin of the electronic book. This turns the world of authorship and copyright on end: the book as we know it, published in a definitive form and never to be changed, would no longer exist.
An MIT project in 1981 began adding an audio/visual component to books. Designed to enhance the book by answering the questions a reader might have as they read, it allowed the reader to control the speed, to reread sections, and to stop and think about what had happened in the book. In effect, it was user driven rather than producer driven.
Moving to 2004 and our remix culture, we are now talking about networked books, with comments added by readers. Stories could change before they are told. Books could be written in chapters, with comments added before the next chapter is written, thus creating an entirely new writing process and possibly a new form of authorship. Software called Sophie has been developed that enables not only the writing and comment components but also audio versions, an interactive glossary, running commentaries on musical selections, and more.
Consider blogs. We think, we write, we create, and others comment. We think, we write again, and others write again (we hope). A new creation appears. However, who is the author or creator now? Is the author speaking, are those commenting also authors, or is the book now speaking? The book becomes dynamic and is no longer limited to text and static photos or illustrations; it now contains video and links to other sites.
If the work is always in process, will there ever be a version for copyright? Will there ever be a final authoritative version? Will copyright be necessary or will it become another piece of history? Will the original article become the least important piece and the discussion more interesting than the book or article?
The challenge of the future will be how to deal with the changes. Bob Stein asks, “Given the vast amount of information and conversation available on any subject, should it be a goal to enable a single individual to master it? What will it mean to be ‘human’ in the age of digital networking? What is the definitive version or does anyone care?”
In Bob Stein’s future, reading is no longer a solitary pastime; the book becomes an interactive work developed by all who are interested.
22nd CONFERENCE (2007)
Publishing 101 – The Basics of Academic Publishing
Zachary Rolnik, Now Publishers
Reported by Lisa C. Gomes
This half-day preconference comprised useful information for everyone involved in the serials industry. Zachary Rolnik of Now Publishers has twenty-plus years’ experience in the serials publishing industry, which made him uniquely qualified to teach this session. He discussed the history of serials publishing and continued with a review of the market and the factors affecting it. Rolnik also reviewed the range of publishers, from commercial houses through society and university presses.
Mr. Rolnik focused his discussion on serials publishing in the Scientific, Technical, and Medical (STM) market, since that is his background. Market analysts have identified scientific publishing as the fastest-growing media sub-sector, driven by the “publish or perish” mentality. However, according to the analysts, three primary changes underway affect this market. First, there is a cyclical slowdown due to library budget cuts. Second, the scales are tipped toward the larger publishers: the large companies already have the business, and ninety-five percent of the market is based on annual renewals. Finally, the majority of money is spent on the Web interface for e-journals, which again tips the market toward the larger publishers, as they can spend more on their online platforms.
Typically, it is difficult for publishers to generate revenue from new journals. Therefore, large publishers have increased their focus on acquiring other, smaller publishers and on entering into agreements with societies to license their content. In the meantime, small and medium publishers develop niche markets and remain author-centric.
There is also a new group of publishers entering the market that focus on current trends in the industry, such as updatable content, open access, licensing versus copyright, community and subject focus, alternate sales options, and so on. This group of publishers is often responsible for the most innovation in the market. Some examples of these newer publishers include Mr. Rolnik’s company Now Publishers, the Berkeley Electronic Press (BE Press), and the Social Science Research Network (SSRN).
Mr. Rolnik also compared book and journal publishing. Whereas book publishing is a one-time process, journal publishing requires a long-term commitment. The process of choosing a topic is also quite different. In book publishing, the topic is either commissioned or the author already has a book they would like to publish. In contrast, choosing the subject for a journal requires market research to identify an underserved niche or field, a process that can be time-consuming.
The complicated structure of journal publishing requires many different roles within publishing companies, so a good portion of this preconference was dedicated to the organizational structure of a typical publisher. The publishing or acquisitions department’s primary role is to identify topics, trends, authors, and editors. Other areas that Mr. Rolnik reviewed included: manufacturing and production, which turn the articles into the publication; marketing and/or public relations, which handles traditional marketing avenues but may also include website development and getting the journal listed in different indices; sales; business development; fulfillment; customer service; accounting/finance; and technology.
22:3 (2007:09) 22nd Conference (2007): Preconference: SCCTP Integrating Resources Cataloging Workshop
August 30, 2007 at 2:20 pm | Posted in Conference Reports, Preconferences
22nd CONFERENCE (2007)
SCCTP Integrating Resources Cataloging Workshop
Joseph Hinger, St. John’s University
Reported by Selina Lin
Using the manual prepared by Steven J. Miller of the University of Wisconsin-Milwaukee Libraries in 2003 and revised in February 2005, Joseph Hinger updated parts of the course as necessary for this workshop. The workshop was taught over two days and divided into six sessions. Day one covered core sessions 1-3: Introduction, Original Cataloging, and Updating Integrating Resources’ Records; day two covered optional sessions 4-6: Copy Cataloging, Record Modification and Maintenance, Case Studies, and Updating Loose-leafs. The workshop emphasized electronic integrating resources, since they present more challenges and catalogers are already more familiar with updating loose-leafs. Session 7, Selection of Online Resources and Options for Providing Access, was omitted due to time constraints and its lesser relevance.
With the advent of HTTP around 1991, many publications began to appear in electronic format by 1995. These early electronic publications were treated as computer files, with leader/06 type of record code “m”, regardless of their contents. As the Internet evolved and online databases and websites became prevalent, coupled with dissatisfaction with the then-current rules for serials and loose-leafs, the need to change OCLC and MARC to accommodate these emerging resources became self-evident. The 1997 Crystal Graham/Jean Hirons paper “Issues Related to Seriality,” a major effort to harmonize AACR, ISSN, and ISBD practices, paved the way for the eventual revision of AACR2 and other changes in 2002. The new concepts of “continuing resources” and “integrating resources” were born. On December 1, 2002, LC implemented the new AACR2 rules and LCRIs; OCLC and RLG also implemented most of the new 006/008 codes. Leader/07 bibliographic level code “i” was added to MARC to represent integrating resources.
An integrating resource (IR) is defined as “a bibliographic resource that is added to or changed by means of updates that do not remain discrete and are integrated into the whole.” An integrating resource may be finite or continuing. Updating websites, updating databases, and updating loose-leafs are all integrating resources. However, online and loose-leaf format resources may be monographic, serial, or integrating. LCRI 1.0 provides guidance in making the decision. If the resource is basically complete, but may be corrected in some parts, treat it as a monograph. If it is likely to be updated over time, treat it as a serial or integrating resource.
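The LCRI 1.0 decision described above can be sketched as a simple rule. This is a hypothetical simplification for illustration only; the actual decision involves cataloger judgment and more conditions than three booleans can capture. The leader/07 codes in the return values are the standard MARC bibliographic level codes.

```python
def classify_resource(basically_complete: bool,
                      updates_expected: bool,
                      updates_remain_discrete: bool) -> str:
    """Rough sketch of the LCRI 1.0 decision for online/loose-leaf resources.

    Invented helper for illustration; real cataloging practice
    weighs many more factors.
    """
    if basically_complete and not updates_expected:
        # Basically complete, perhaps corrected in parts: monograph.
        return "monograph (leader/07 'm')"
    if updates_remain_discrete:
        # Updated via successive discrete parts (issues): serial.
        return "serial (leader/07 's')"
    # Updates merge into the whole without remaining discrete:
    # integrating resource.
    return "integrating resource (leader/07 'i')"
```

For example, an updating database is continually revised and its updates do not remain discrete, so the rule lands on integrating resource, whereas a journal's successive issues keep it a serial.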
Hinger continued the workshop with detailed information on each core session.
22nd ANNUAL CONFERENCE (2007)
Metadata Standards and Applications
Diane Hillmann, Cornell University; Rhonda Marker, Rutgers University
Reported by Deanna Briggs
Diane Hillmann and Rhonda Marker instructed approximately forty students in the Metadata Standards and Applications preconference session. The class was developed by Hillmann for the Library of Congress and the Association for Library Collections & Technical Services in early 2007. Many preconference participants said their desire to attend stemmed from an impending project to develop a digital repository. As expected, most attendees were catalogers in some capacity.
The class covered a variety of metadata topics, including metadata relationship models, interoperability, application profiles, and more. Hillmann and Marker explained early in the session that working with metadata standards and applications requires the metadata specialist to take a broad view of metadata and consider how it must function. For instance, one function of metadata is to manage documents; the metadata specialist should therefore look at the items requiring management in aggregate, to make the best choices for the collection as a whole. The presenters stressed how important it is for specialists to frequently examine websites and digital libraries and mentally deconstruct them, asking how each site applies metadata in bulk to collections to meet its functional goals. To illustrate this point, the class completed an exercise examining several digital library sites, including BirdSource, a database-driven site.
The presenters continued to expand on this aggregate view of metadata creation, storage, management, and distribution. They discussed the pros and cons of different metadata creation and storage models, and they remarked how important it is to maximize human resource efficiency in any project. For example, on the distribution side, a project might achieve some efficiency by harvesting metadata, but doing so may require additional human resources to normalize the harvested metadata for interoperability. Again, Hillmann and Marker used examples to show these principles in action, as in the case of the Country Walkers site, which uses its metadata to draw potential customers in through ease of browsing, by destination, for instance.
No metadata information session would be complete without mentioning metadata relationship models and specific metadata standards. In this context, Hillmann provided the class with an update on the status of RDA, and the class discussed relationships in UNIMARC, Dublin Core, and FRBR. The presenters noted that most metadata standards do not explicitly reference content standards but simply provide guidance on content management. Some of the specific standards discussed included MARC21, Dublin Core, MODS, IEEE-LOM, and ONIX for Books.
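To make one of the standards mentioned above concrete, a minimal unqualified Dublin Core record can be assembled in a few lines. The namespace URI is the standard one for the Dublin Core Metadata Element Set; the title and creator values here are invented, and `make_dc_record` is a hypothetical helper, not part of any library discussed in the session.

```python
import xml.etree.ElementTree as ET

# Standard namespace for the Dublin Core Metadata Element Set 1.1.
DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)

def make_dc_record(fields: dict) -> ET.Element:
    """Build a simple Dublin Core record: one child element per field."""
    record = ET.Element("record")
    for name, value in fields.items():
        el = ET.SubElement(record, f"{{{DC_NS}}}{name}")
        el.text = value
    return record

# Invented example values, for illustration only.
rec = make_dc_record({
    "title": "Issues Related to Seriality",
    "creator": "Hirons, Jean",
    "type": "Text",
})
xml_text = ET.tostring(rec, encoding="unicode")
```

The flat, fifteen-element structure of simple Dublin Core is exactly what makes it easy to share but also what forces the kind of documented local practice the presenters advocated.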
The next lesson was metadata interoperability and distribution. As expected, OAI-PMH, OpenURL, and crosswalks were the focus of this section. Hillmann and Marker alerted attendees to the importance of documenting an institution’s specific practices and interpretations of any one standard, to enable appropriate sharing of metadata. The presenters also raised the issue of documentation in the lesson on application profiles, including the many benefits of documenting the terms in an application profile.
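As a concrete illustration of the harvesting model discussed in this lesson, an OAI-PMH request is just an HTTP GET with a `verb` parameter. The sketch below only builds the request URL; the endpoint is a placeholder, not a real repository, and `list_records_url` is an invented helper.

```python
from typing import Optional
from urllib.parse import urlencode

# Hypothetical repository base URL; a real harvester would point at an
# institution's actual OAI-PMH endpoint.
BASE_URL = "https://repository.example.edu/oai"

def list_records_url(metadata_prefix: str = "oai_dc",
                     set_spec: Optional[str] = None) -> str:
    """Build an OAI-PMH ListRecords request URL.

    oai_dc (unqualified Dublin Core) is the one metadata format every
    OAI-PMH repository is required to support.
    """
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec  # optional selective harvesting by set
    return f"{BASE_URL}?{urlencode(params)}"
```

The response is XML that the harvester must then parse and, as the presenters noted, usually normalize before the records interoperate cleanly with local metadata.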
The preconference also covered vocabularies and data quality. While it is important to document and register your vocabulary, Hillmann and Marker also emphasized that the choice of a vocabulary should be situation-specific, especially because there are so many different vocabularies. Similarly, the presenters noted that data quality should be evaluated at the community level, as different communities may find different levels of data quality acceptable for their purposes.
In summary, the course was an excellent whirlwind tour of the world of metadata standards.
22nd CONFERENCE (2007)
Reported by Susan Markley
The opening program for NASIG’s 22nd Annual Conference began with the introduction of the 2007 NASIG award recipients, followed by a warm welcome from Hannelore Rader, Dean of the University of Louisville Libraries. She spoke briefly about the university, with its diverse student and faculty population, and the varied services the campus libraries offer. Rader was followed by the delightful keynote speaker, Louisville historian and professor Tom Owen.
Dr. Owen began his presentation by telling the audience about a 5-year experiment in which the urban city government was “married” to some suburban governments in an effort to improve services to all populations. This was followed by a fascinating history of the community from its earliest roots.
For those who delight in discovering the lively history of a city, Owen introduced the audience to George Rogers Clark, who founded the settlement that became Louisville in 1778. Clark was the preeminent American military leader on the northwestern frontier during the American Revolutionary War. Louisville developed as a “necessity of war,” protecting scattered settlements against the British army and the native tribes whose attacks the British were encouraging. Clark’s successful attacks on British troops and forts eventually played a part in the ceding of the entire Northwest Territory to the United States after the war.
The city was actually named after the French king Louis XVI in gratitude for his help in the American Revolution with arms, officers, and equipment. The region’s distilleries likewise took up his family name: Bourbon.
Kentucky was originally part of Virginia but broke off in 1792. Considered a border state, it sat on the dividing line between North and South. Although the state did not join the Confederacy, it aligned with the southern states after the Civil War because of strong economic ties.
Dr. Owen ended his presentation with a quick mention of some interesting local sites and some now-famous local citizens.
His keynote address was just the right introduction to the start of our 22nd annual conference.