Monday, July 23, 2018

Microfilm Lasts Half a Millennium

I recently acquired a decommissioned microfilm reader. My university bought the reader for $16,000 in 1998, but its value has depreciated to $0 in their official bookkeeping records. Machines like it played a central role in both research and secret-agent tasks of the last century. But this one had become an embarrassment.

The bureaucrats wouldn’t let me store the reader in a laboratory that also houses a multi-million-dollar information-display system. They made me promise to “make sure no VIPs ever see it there.” After lots of paperwork and negotiation, I finally had to transport the machine myself. Unlike a computer—even an old one—it was heavy and ungainly. It would not fit into a car, and it could not be carried by two people for more than a few feet. Even moving the thing was an embarrassment. No one wanted it, but no one wanted me to have it around either.

And yet, the microfilm machine is still widely used. It has centuries of lasting power ahead of it, and new models are still being manufactured. It’s a shame that no intrigue will greet their arrival, because these machines continue to prove essential for preserving and accessing archival materials.  


The first micrographic experiments, in 1839, reduced a daguerreotype image by a factor of 160. By 1853, the format was already being assessed for newspaper archives. The processes continued to be refined during the 19th century. Even so, microfilm was still considered a novelty when it was displayed at the 1876 Centennial Exposition in Philadelphia.

The contemporary microfilm reader has multiple origins. Bradley A. Fiske filed a patent for a “reading machine” on March 28, 1922: a pocket-sized, hand-held device that could be held up to one eye to magnify columns of tiny print on a spooling paper tape. But the apparatus that gained traction was G. L. McCarthy’s 35mm scanning camera, which Eastman Kodak introduced as the Recordak in 1935, specifically to preserve newspapers. By 1938, universities began using it to microfilm dissertations and other research papers. During World War II, microphotography became a tool for espionage and for carrying military mail, and soon came the recognition that massive, cross-referenced archives of information gave agencies an advantage. Libraries adopted microfilm by 1940, after realizing that they could not physically house an increasing volume of publications, including newspapers, periodicals, and government documents. As the war concluded in Europe, a coordinated effort by the U.S. Library of Congress and the U.S. State Department also put many international newspapers on microfilm as a way to better understand quickly changing geopolitical situations. Collecting and cataloguing massive amounts of information, in microscopic form, from all over the world in one centralized location led to the idea of a centralized intelligence agency in 1947.

It wasn’t just spooks and archivists, either. Excited by the changing future of reading, in 1931, Gertrude Stein, William Carlos Williams, F. T. Marinetti, and 40 other avant-garde writers ran an experiment for Bob Brown’s microfilm-like reading machine. The specially processed texts, called “readies,” produced something between an art stunt and a pragmatic solution for libraries needing more shelf space and better delivery systems. Over the past decade, I have redesigned the readies for 21st-century reading devices like smartphones, tablets, and computers.

By 1943, 400,000 pages had been transferred to microfilm by the U.S. National Archives alone, and the originals were destroyed. Millions more were reproduced and destroyed worldwide in an effort to protect the content from the ravages of war. In the 1960s, the U.S. government offered microfilm documents, especially newspapers and periodicals, for sale to libraries and researchers; by the end of the decade, copies of nearly 100,000 rolls (with about 700 pages on each roll) were available.
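To get a sense of the scale of that collection, a back-of-the-envelope calculation using the figures cited above (roughly 100,000 rolls at about 700 pages each) puts the total at around 70 million pages:

```python
# Rough estimate of the microfilmed page count available by the
# end of the 1960s, using the approximate figures cited above.
rolls = 100_000        # rolls offered for sale to libraries and researchers
pages_per_roll = 700   # approximate pages per roll

total_pages = rolls * pages_per_roll
print(f"{total_pages:,} pages")  # 70,000,000 pages
```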


Their longevity was another matter. As early as May 17, 1964, as reported in The New York Times, microfilm appeared to degrade, with “microfilm rashes” consisting of “small spots tinged with red, orange, or yellow” appearing on the surface. An anonymous executive in the microfilm market was quoted as saying they had “found no trace of measles in our film but saw it in the film of others and they reported the same thing about us.” The acetate in the film stock was decaying after decades of use and improper storage, and the decay also created a vinegar smell—librarians and researchers sometimes joked about salad being made in the periodical rooms. The problem was solved by the early 1990s, when Kodak introduced polyester-based microfilm, which promised to resist decay for at least 500 years.

Microfilm got a competitor when National Cash Register (NCR), a company now known for introducing magnetic-strip and electronic data-storage devices in the late 1950s and early ’60s, marketed Carl O. Carlson’s microfiche reader in 1961. This storage system placed more than 100 pages on one four-by-six-inch sheet of film in a grid pattern. Because microfiche was introduced much later than microfilm, it played a reduced role in newspaper preservation and government archives; it was more widely used in emerging computer data-storage systems. Eventually, electronic archives replaced microfiche almost entirely, while its cousin microfilm remained separate.

Microfilm’s decline intensified with the development of optical character recognition (OCR) technology, which was initially used to search microfilm: in the 1930s, Emanuel Goldberg designed a system that could read characters on film and translate them into telegraph code. At MIT, a team led by Vannevar Bush designed a Microfilm Rapid Selector capable of finding information rapidly on microfilm. Ray Kurzweil further improved OCR, and by the end of the 1970s, he had created a computer program, later bought by Xerox, that was adopted by LexisNexis, which sells software for electronically storing and searching legal documents.

By the 1980s and ’90s, OCR was fast replacing microfilm as the go-to search and retrieval mechanism for business and legal documents, but parallel to that decline, microfilm emerged in a recurring role in mystery and horror movies, as seen in Ryan Creed’s YouTube compilation video of “Hot Chicks Looking at Microfilm in Horror Movies.” Microfilm had become part of a campy joke about discovering dark, salacious secrets.


Microfilm machines trained people’s eyes to read differently: A blur of rapidly advancing images replaced flipping through pages, a precursor to the transition from reading books to surfing the web. Once we adjusted to the nonlinear reading devices, we wanted to jump around instead of advancing through page after page. When Adobe introduced the portable document format (PDF) in 1993, allowing facsimile-like scans to be available in electronic and, later, searchable OCR forms, microfilm fell further out of favor as a storage and retrieval system.

Today’s digital searches allow a reader to jump directly to a desired page and story, eliminating one downside of microfilm. But there’s a trade-off: Digital documents usually omit the context. The surrounding pages in the morning paper, or the rest of the issue of a magazine or journal, vanish when a single, specific article can be retrieved directly. That context includes more than a happenstance encounter with an abutting news story. It also includes advertisements, the position and size of one story in relation to others, and even the overall design of the page at the time of its publication. A digital search might retrieve what you are looking for (it also might not!), but it can obscure the historical context of that material.

Digital searches also turn search activity into data that someone else can surveil, compare, quantify, and visualize. The user’s own thinking becomes the object of search and retrieval, not just the documents that user hopes to find. None of this happens when using a microfilm machine. A library can record what materials a user requests or checks out, but the microfilm reader itself cannot track what someone looks at when using the machine. It is not networked to all other searches. No entity, corporate or governmental, uses algorithms to analyze microfilm readers’ habits and predilections. The microfilm reader does not read you, your emotions, or your political or consumerist desires.

Recently, as revelations about the extent of data collection and analysis online have come out, people have become more aware of the spy-like function of online services. Literal espionage has also become more visible, as cyberattacks impact corporations, infrastructure, and even elections. But nobody considers microfilm a viable alternative. Despite its spy-craft credentials, people aren’t even fascinated with microfilm as an object of retro nostalgia. It doesn’t have the hipster cachet of a typewriter or letterpress typesetting or a record player, for example; no one is making earrings from the keys or knobs of microfilm machines.


There’s a reason for that: Those keys and knobs are still in use. Microfilm machines haven’t been mined for their decontextualized parts, and they are not yet truly obsolete. The devices are still in widespread use, and their mechanical simplicity could help them last longer than any of the current electronic technologies. As the web comic xkcd once observed, microfilm has better lasting power than websites, which often vanish, or CD-ROMs, for which most computers don’t have readers anymore.

The xkcd comic gets a laugh because it seems absurd to suggest microfilm as the most reliable way to store archives, even though it will remain reliable for 500 years. Its lasting power keeps it a mainstay in research libraries and archives. But as recent cutting-edge technologies approach ever more rapid obsolescence, past (and passed-over) technologies like the microfilm machine won’t go away. They’ll remain, steadily doing the same work they have done for the last century, for at least five more, provided the libraries that house them stay open and the humans who would read and interpret their contents survive.



from The Atlantic http://bit.ly/2mA48sj