The World Wide Web turns 25 this week. On March 12, 1989, Tim Berners-Lee wrote a short paper called “Information Management: A Proposal” that invented the web. Berners-Lee was prompted to do this by the need to make things more efficient at the European Organization for Nuclear Research (CERN), the atom-smashing lab where he worked. The complexity of projects, combined with frequent staff turnover and general human inefficiency, meant that things got lost. Large-scale experiments were difficult to coordinate, files hard to find, information sometimes just plain gone. As a result, work had to be repeated and atoms resmashed, sometimes more than once.
To fix this, Berners-Lee sketched a non-hierarchical system of files stored on linked computers. Anybody could access any file, any time, or jump from file to file, not following predetermined pathways, but in any order. The system would be open and unregulated, and since the goal was to share information, not hide it, Berners-Lee didn’t care that much about locking up the data or protecting intellectual property:
[C]opyright enforcement and data security…are of secondary importance…. [I]nformation exchange is still more important than secrecy.
Berners-Lee initially called his system “the Mesh.” He later changed that to the World Wide Web, a name which stuck. It took a couple of years for the web to leap off the page and become an actual information storehouse. CERN’s first web site went live in 1991. Before the decade ended, the web had become indispensable, not just to atomic scientists, but to everyone.
The web is often confused with the older internet, which is pushing 45. The ’net, launched in 1969, is a way of connecting computers so they can talk with one another. The World Wide Web harnesses the internet so users can access hyperlinked documents stored on remote web sites, and for many people it is the most important aspect of the information age. Today we speak of the web and the ’net interchangeably. Hence, the interwebs.
The web connects. But not everybody thinks connectivity is so great. The telegraph, the nineteenth century’s internet, pioneered instant communication across vast distances, but Henry David Thoreau complained in Walden (1854):
We are in great haste to construct a magnetic telegraph from Maine to Texas; but Maine and Texas, it may be, have nothing important to communicate.
Despite Thoreau’s skepticism, the telegraph thrived—it turned out that, given a connection from somewhere to somewhere else, people had lots to communicate. But the telegraph giant Western Union saw no use for an even newer gadget, the telephone, which turned 138 this week, because there was no app for writing down conversations. That didn’t stop telephone users, who soon were making more calls and sending fewer telegrams. Perhaps they thought not writing things down was a plus.
Early critics of the web warned that digitizing information was dehumanizing. In 1995, the writer Kirkpatrick Sale smashed a computer with a sledgehammer at New York City’s Town Hall, in front of a paying crowd. Even more extreme, Unabomber Theodore Kaczynski showed his hatred of dehumanizing technology by blowing people up. Publisher Bill Henderson simply urged people to join his Lead Pencil Club, where they signed a promise to abandon their computers and go back to writing with pencils. But resistance was futile. Internet use in the United States jumped from 14% in 1995, when the web was young, to 87% today. According to a Pew Research Center survey released on the eve of the web’s birthday, 90% of users now think the web is a good thing. It’s not clear why the other 10% are still online.
The internet requires some technical skill. In contrast, the web is easy to use. Anyone who can type with two fingers and click a mouse can find information online. In fact, the web’s so easy and so fast that we seldom check to see if that online information is valid. We trade convenience for accuracy, and if we can’t find what we’re looking for on the web, we assume it either doesn’t exist or isn’t worth searching for.
We trade convenience for privacy as well. The web is our window on the world. Sitting at home, we can see anything, go anywhere, buy whatever we want, with just a few clicks. But our every click and keystroke is seen by data brokers hoping to sell us things we don’t really need, and government snoops waiting for us to break the law. The window works two ways, except we can’t see who’s watching us.
Tim Berners-Lee’s open, unregulated web still exists. But woven into it there’s an anti-web where corporate and government controls increasingly limit what we do online, where copyright and secrecy threaten information sharing, where profit and digital rights management have become the name of the game.
The web remains a force for good in the world. Social activists praise the #twitterrevolution for the Arab Spring, the Green Revolution, and the overthrow of autocrats. But the web becomes a counterrevolutionary technology when autocrats use it to track dissidents, throw them in jail, or make them disappear. Perhaps that’s not surprising: the equally revolutionary printing press offered people a means for liberation and the expansion of knowledge, but it also served as a vehicle for propaganda and a tool for population control.
After twenty-five years, the web has become so much a part of life’s infrastructure that we can’t go off the grid. Even if we unplug for a day or a week or a month, too many aspects of our lives are mediated online. How would we work? Eat? Play? Buy things? Pay bills? Communicate? And then there’s the growing internet of things for us to come to terms with, but fortunately, no one’s celebrating its birthday, at least not yet. Face it, we can’t do without the web. If we washed up on a desert isle, the first thing we’d ask is, “Do they have free wi-fi?”