Pick any topic. VocaCast researches it, writes it, and reads it to you.
The internet wasn't born in a flash of innovation. It emerged from a deliberate quest to solve a Cold War problem: how do you build a communication system that survives catastrophic damage? In the 1960s, researchers grappled with that question and landed on an elegant solution called packet switching — chopping data into small, independent pieces that could route themselves around network failures. This approach became the foundation for the first major implementation: ARPANET. [1]
The Advanced Research Projects Agency (later renamed DARPA) initiated ARPANET in 1969. [2] It connected just four university nodes at first, but it proved something radical: that computers separated by physical distance could reliably exchange information across a wide-area network. The proof of concept had moved from theoretical to tangible.
But here's where the story shifts. ARPANET worked. Yet it operated under a critical assumption: that a single organization controlled the whole thing. This worked fine when there was only one network. By the early 1970s, though, researchers realized the real breakthrough would come not from perfecting ARPANET itself, but from connecting it to other networks. In 1973, DARPA started a research program to develop communication protocols that would allow computers to communicate across multiple, linked packet networks. [3] This was the pivot — from one network to a network of networks.
That work crystallized in 1974 when Vinton Cerf and Robert Kahn published their foundational paper, "A Protocol for Packet Network Intercommunication." [4] TCP/IP wasn't just another protocol. It was designed from the ground up for the inter-net — a fundamentally different architecture than the original ARPANET protocols, which assumed a network run by a single entity. [8] By 1975, the approach was testable. A two-network communications test using an early version of TCP/IP was conducted between Stanford University and University College London. [5] It worked.
Yet adoption wasn't instant. ARPANET still ran on its older protocols. The real turning point came on January 1, 1983, when DARPA required that TCP/IP replace the Network Control Protocol for all machines connected to ARPANET. [6] That single mandate — a forced transition across the entire network — is often cited as the beginning of the modern internet. [7] It unified what had been fragmented, and suddenly, the architecture existed to link not just four universities, but eventually, the world.
But there's a catch. The internet existed in the late 1980s—a network of researchers and institutions trading files through obscure protocols. It was powerful, yes, but it might have stayed that way forever, confined to the technical elite, if not for what happened next.
The invention that changed everything was almost mundane in its conception. An English computer scientist named Tim Berners-Lee invented the World Wide Web in 1989 [9]. He wasn't trying to revolutionize communication. He was trying to solve a specific problem at his workplace: how to share information across different computers without everyone having to learn separate systems.
While working at CERN in 1990, Berners-Lee created the first web server and the first graphical web browser, which he called WorldWideWeb [10]. This wasn't just code—it was a complete ecosystem. By December 1990, he and his team had built all the necessary tools for a working Web, including HTTP, HTML, the first browser, and the first web server [11]. The infrastructure was elegant: a simple protocol for transferring documents, a markup language for formatting them, and a browser to view them.
Here's what made this revolutionary: before the Web, the internet felt like a filing cabinet. After it, the internet became a conversation. But adoption didn't happen overnight. The early 1990s were years of deliberate groundwork. To accelerate development, Berners-Lee took a crucial step—he developed the libwww library in 1992, an API package designed as a starting point for other developers to build their own web browsers [14]. By releasing this toolkit, he transformed the Web from a closed invention into an open platform.
That decision—to build tools for others—turned what could have been a proprietary system into something collaborative and explosive. The impact rippled outward, and the recognition followed: Berners-Lee was named the recipient of the ACM A.M. Turing Award on April 4, 2017 [15]. The Web had moved from the margins to the center of human communication—all because one engineer chose to build the scaffolding that let others build higher.
Thanks for listening to this VocaCast briefing. Until next time.