The smartphone in your pocket contains more computing power than the most advanced supercomputers of the early 1990s. This is your VocaCast briefing on smartphones for Thursday, April 23.
We start with the constraints that came before, then trace how separate technologies converged into one device.
In the 1980s and 1990s, mobile computing meant choosing between devices — you carried a handheld calculator for math, a personal digital assistant for organizing information and connecting to networks, or an early mobile phone for calls. [1] [2] Personal Digital Assistants, or PDAs, were designed specifically to store and retrieve information while maintaining network connectivity. [2] These machines were bulky, and each did one thing well. Then BlackBerry's emergence in the late 1990s marked the beginning of the smart-device era, hinting that phones could become something more.
But the real transformation required infrastructure. The growth of the internet in the 1990s, enabled by telecommunications networks connecting homes and offices, laid the groundwork for what was coming. [3] The internet's rapid uptake created demand for mobile access — not just calling, but data. Meanwhile, mobile networks themselves were evolving through successive generations, from 1G to 2G, then to 3G, and on to 4G LTE. [4] The arrival of 3G was particularly significant: it made the mobile internet practical, letting people browse and transmit data wirelessly rather than just place voice calls.
Device advances in the 2000s proved crucial to the final breakthrough. The 1990s had already seen faster computers and widening internet access, which drove improvements in user interaction technologies like drop-down menus, widgets, pen-based interfaces, distinctive icons, and touchscreen interfaces. [5] But it was the convergence of mobile networks, processing power, and battery life that made the smartphone possible. Mobile chips came to deliver more processing power than the most advanced supercomputers of the early 1990s, shrinking supercomputer capability into your palm. [4] Battery advances let mobile devices power complex functions on less energy, removing the constraint that had made early portable computers impractical.
By the late 1990s and early 2000s, the stage was set. The gradual convergence of mobile phones and PDAs led directly to the smartphone revolution — one device replacing several.
That foundation in mobile technology created the conditions for something transformative. The release of the iPhone in 2007 had a profound impact on the dominant design of smartphones and on consumer expectations. [6] What Apple did was fundamentally reshape what people believed a phone could be: not just a communications device, but a portal to applications, media, and services. This moment revealed a crack in the position of the existing market leader. Symbian OS had dominated the pre-iPhone era with remarkable reach. [7] By 2007, Symbian had powered over 100 million devices on the strength of its scalable, cross-platform software, and around mid-2007 it was running on approximately 65 percent of smartphones.
The architecture itself was sophisticated — built around a microkernel, optimized for ARM processors, and written predominantly in C++, allowing efficient resource management on devices with limited capabilities. [7] [8] Symbian OS delivered impressive battery life and ran on modest hardware, yet it was criticized for responding too late to iOS and Android. [9] The problem wasn't engineering; it was adaptation. What changed was not just the phone itself, but the ecosystem around it.
Sophisticated applications and high-resolution touchscreens, running on operating systems like Android and iOS, became significant differentiators for smartphones. [10] To turn this into a sustainable market, though, you needed developers building software at scale. The Apple App Store launched in 2008 and quickly became an 'Eldorado for developers' as a new smartphone application ecosystem took shape. [11] That same window opened for Android. The Android operating system was open-sourced in October 2008, [10] and almost immediately the first commercially available Android device, the HTC Dream, went on sale on October 22, 2008. [12] Open source mattered: Android's open-source model fostered rapid innovation and a vast developer community, driving its widespread adoption.
Where Symbian had demanded tight control and managed licensing, Android invited anyone to build on it. [13] The difference was almost philosophical — one system tried to preserve itself; the other was designed to grow.
To wrap up, the smartphones in your pocket today carry processing power that would have filled entire rooms just decades ago. Modern mobile chips now include diverse AI accelerators like GPUs and NPUs, enabling on-device AI that processes information directly on your phone rather than sending it to distant servers. [13] This shift matters enormously. When your phone handles AI tasks locally, your data stays private and responses arrive instantly, without the lag of round-tripping information to the cloud. That privacy-first architecture says something fundamental about where mobile technology has landed.
The dream of the smartphone wasn't just a device that could call someone — it was a truly intelligent companion, one that understands what you need without compromising your trust. We're living in that moment now.