Internet Archive Reaches Semifinals in MacArthur Foundation’s Competition for $100 Million Grant

by Wendy Hanamura

The Internet Archive headquarters: a temple to universal access to knowledge.

At the Internet Archive, we believe that libraries can be instruments of change.

So we are proud to announce that the Internet Archive is one of eight groups named semifinalists today in 100&Change, a global competition for a single $100 million grant from the John D. and Catherine T. MacArthur Foundation. The competition seeks bold solutions to critical problems of our time. Here’s how we propose creating transformative, lasting change:

Our vision empowers libraries to unlock their rich analog collections for a new generation of learners, enabling free, long-term, public access to knowledge.

In today’s digital world, a new generation explores knowledge largely through their computers and phones. So as digital librarians, we worry when millions of books, representing a century of knowledge, are still not accessible online to scholars, journalists, students, and the public. Libraries have been stymied by huge costs, restrictions on eBooks, and missing technology. The legal path forward has not been clear. All of this means libraries haven’t been able to meet the digital demands of a new generation. And access to libraries is still not universal or equitable.

Our plan provides libraries and learners with free digital access to four million books. With our partners, we will curate, digitize, and enable digital lending of these volumes to any library in the country that owns the physical book. We plan to start with the books most widely held and used in libraries and classrooms. The scale of the project will help reduce digitization costs by 50 percent or more. How do we know this can work? We’ve been prototyping this model for six years at Open Library, digitizing 540,000 modern books originating from 100 partners. Through Open Library, we lend books to the public in a manner that respects the rights of authors and publishers, in a process that mirrors the traditional way libraries circulate physical books.

What makes this a game-changer? Today, the Internet Archive already offers public access to 2.5 million books in the public domain, and 540,000 modern works. We need to be bigger and bolder. At the Internet Archive, we only lend one copy at a time, so in order to serve more learners, we seek thousands of libraries to join us. That can happen if we build the technical infrastructure that allows libraries everywhere to leverage those digital books. Plus, this is an issue of dollars-and-cents. Libraries should never pay to digitize a book more than once. Right now libraries pay an average of $17.50 for each interlibrary loan of a physical book. As books become electronic, those funds can be directed to more urgent needs. And above all, this grant will help all libraries become digital libraries, releasing the tremendous value in the collections they have curated over centuries.

With so many brilliant, effective thinkers applying to 100&Change, it always felt as if our chances were one in a hundred—and indeed they were! There was robust participation: 7,069 competition registrants submitted 1,904 proposals. Of those, 801 passed an initial administrative review and were evaluated by a panel of expert judges who each provided ratings on four criteria: meaningfulness, verifiability, durability, and feasibility. MacArthur’s Board of Directors made the final selection. To be one of eight semifinalists from 801 qualified applicants is a tremendous honor.

Eileen Alfaro, San Francisco fifth-grader. One day she could be carrying 4 million eBooks under her arm.

And as we work hard to hone our plans in the months ahead, here’s what propels us forward: Eileen Alfaro, the Internet Archive’s brightest rising star. Every day after school, this San Francisco fifth-grader does her homework at the Internet Archive, while her mother Roxana works. A straight-A student, Eileen loves nothing more than reading. We can put four million of the best books into her hands. Forever. For free.

Our proposal? Making libraries instruments of change for a new generation of learners like Eileen.

A summary of the Internet Archive’s solution, an overview video of its project, and a MacArthur video describing our proposal are available at www.macfound.org/InternetArchive.


7 Responses to Internet Archive Reaches Semifinals in MacArthur Foundation’s Competition for $100 Million Grant

  1. Pingback: Internet Archive Reaches Semifinals in MacArthur Foundation’s Competition for $100 Million Grant | LJ INFOdocket

  2. Congratulations! Your work is so valuable for us! Thank you!

  3. Nice post, and good work — you are doing amazing work for users. Thanks a lot for sharing this kind of information. :)

  4. Philipp says:

    Whoa, congratulations to you guys for making it there! Wishing you good luck, and I really hope you can get the prize!

  5. Our future is defined by the course of our actions; the digitization work that Open Library is doing for us is very important.

  6. Nicholas says:

    I don’t know what the other semifinalists are, but I couldn’t imagine a more deserving organization than the Internet Archive. Great job for all you have done, and good luck for your ambitious goals! I’m with you 100% of the way!

  7. Congrats on becoming a semifinalist — scanning and legally sharing digital versions of library books across a network of libraries is a great idea. That approach will increase access to many proprietary printed works and is well worth doing. But much recent user-generated content from the last decade was born digital yet is now effectively hidden from systematic free access by being behind robots.txt files and unclear copyright status. This is content posted to websites by people with the intent to widely share their contribution, and who generally retain the copyright to share it elsewhere. An additional option for helping libraries make otherwise proprietary web content accessible is a browser plugin (and supporting infrastructure) to make it easy for people to reshare their various website submissions via their local libraries as part of a social semantic web. Maybe the IA could make it happen as a cultural change in how people interact with big proprietary web services, or even just the IA blog — piggybacking such a system on top of the infrastructure to share digitized versions of printed books proposed for this grant?

    Some more details from a Knight News Challenge entry on libraries I submitted last year about this idea:
    https://www.newschallenge.org/challenge/how-might-libraries-serve-21st-century-information-needs/submissions/libraries-as-distributed-digital-knowledge-repositories
    “There are [too] many single points of failure on the internet for collections of important knowledge. For example, years of posts to Facebook, Reddit, Slashdot, MetaFilter, or SoylentNews would all be lost if those websites were to be shut down. We have an answer to that challenge.
    While the Internet Archive is backing up some of the internet, it is another single point of failure. We propose developing data standards, software applications, coordination protocols, and hardware specifications so every local library in the world can participate in backing up part of the internet. While that brings up many copyright concerns, we have an approach to deal with that.
    We propose making browser addon applications for major web browsers. This browser addon would make it easy for people posting content to any website on the internet to send a copy for safekeeping to their local library (or other access gateway). From there, the content would be distributed across the distributed library network. Any previously published content they have written could also be added to this system using that browser app. The content would be sent in a standardized form for indexing and linking with other content using semantic information. Users would specify a Creative Commons license or similar free license for their content when they contributed the content. Each data item would be assigned a unique hash for its content to help ensure its integrity and retrievability (similar to how the Git source control system stores information).
    Each local library might only store terabytes of information (likely using Apache Hadoop and perhaps Apache Accumulo or similar software). But, together as a network, thousands of local libraries could store the world’s knowledge in a reliable distributed way. Even one library would have the absolutely most important data for that locality, and any few libraries would have most of the popular data across the network. …”
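    [Editor’s note: the Git-style content addressing mentioned in the quote above can be sketched in a few lines of Python. This is an illustrative sketch, not part of any proposed system; the function name is hypothetical, but the hashing scheme shown is the one Git actually uses for blob objects.]

    ```python
    import hashlib

    def git_blob_hash(content: bytes) -> str:
        """Compute a Git-style content hash for a data item.

        Git prepends the header "blob <size>\\0" to the raw bytes and
        takes the SHA-1 digest, so every distinct piece of content gets
        a unique, verifiable identifier.
        """
        header = b"blob %d\x00" % len(content)
        return hashlib.sha1(header + content).hexdigest()

    # Identical content always yields the identical hash, so any library
    # node in a distributed network could independently verify the
    # integrity of a replicated item before serving it.
    print(git_blob_hash(b"test content\n"))
    # d670460b4b4aece5915caf5c68d12f560a9fe3e4 (matches `git hash-object`)
    ```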

Comments are closed.