Robert Darnton, Director of the University Library at Harvard, has a very interesting piece in the June 12th issue of the NY Review of Books (which is not the New York Times Book Review, in case you were wondering). He begins with an overview of the technological evolution of books: written language circa 4000 B.C.E., alphabetic script circa 1000 B.C.E., scrolls around the start of the common era, the codex (the bound, flip-page book) in the 3rd century C.E., movable type in China around the year 1000, metallic movable type in Korea two centuries later, and Gutenberg in the 1450s. He uses this timeline to argue for the inherent instability of texts and the lasting value of books. That value lies in the format's many variations: size and materials carry economic and social information, and variant texts record the many authorial and editorial intentions and accidents. All of this is valuable to a historian.
Needless to say, since 1969 and the early experiments in linking up computers, the pace of change in the world of texts has been mind-boggling. As of January 2005, there were 60 million blogs, for instance; by July, 70 million. Google is now in the mix, striking deals with Oxford, the New York Public Library, Michigan, Stanford, and Harvard to digitize their collections.
Much has been written about a "post-literate" or "hyper-literate" generation, a cohort largely innocent of books. Amongst academics and college students, the web has displaced much of the library's traditional function: a sanctum, and a place where you could find information available nowhere else. Darnton argues that libraries still serve that second function, and that crucial information will be lost not by the act of digitizing texts but by the growing belief that an authoritative text is online and that the search, as it were, can end there.
He presents several points:
1) Google cannot possibly put all texts, and their variants, online. Someone will choose the "true" text; someone will be making the decision to leave out thousands of chapbooks, DIY editions, trade paperbacks, etc.
2) Sixty percent of the books currently being digitized exist in only one of the above libraries, and those holdings are a mere fraction of what is available in research libraries across the U.S. Even if Google manages to digitize 90 percent of this vast holding, many millions of books may vanish. And if we neglect our libraries because we think everything is on Google, they will vanish too.
3) Copyright is a huge problem. It extends 70 years past the death of the author, so a book whose author died in 1960, for example, stays protected until 2030. This means hundreds of millions of texts can be read only a few lines at a time, the snippet-sized "fair use" asserted by Google.
4) Companies vanish. Google will too. Then what?
5) Human error, computer error. Errors compounding errors.
6) Hardware and software become obsolete. This happens really fast, doesn't it?
7) Since Google plans to digitize many versions of the same book, how will they be prioritized? And how would anyone know which versions have been left out? A quote from the article: "Google employs hundreds, if not thousands, of engineers but, as far as I know, not a single bibliographer."
8) The size and texture of a book cannot be digitized. The article is worth reading for the details here.
As someone who loves the smell of binder's glue and old bookshops, and who considers human language the ultimate moving target, count me as a Darntonian. I still recall the gasp that escaped me when I stumbled across James Joyce's so-called "Buffalo Notebooks" at the BC library (the facsimiles of his notes for Finnegans Wake); it was like first apprehending the true dimensions of an iceberg.
Ten years ago I would have asserted that the record album would remain a vital format. I don't think so anymore. The album, whether LP, CD, or MP3, is crumbling back into individual songs. A book, however, holds unique information about its production in a way that the very standardized music formats of past and present never have. Maybe the book will last... or at least decline in proportion to its longevity. Text may be unstable, but ink on paper is far less so.