In the autumn of 1908, Henry Ford unveiled the Model T, a clattering marvel of steel and ingenuity that would redefine the contours of modern life (the moving assembly line that made it cheap enough for the masses followed in 1913). Streets once carved for horse-drawn carriages were soon widened for automobiles; cities sprawled as suburbs bloomed, and the very rhythm of human existence—work, leisure, travel—shifted irrevocably. Just five years earlier, in 1903, the Wright Brothers had defied gravity at Kitty Hawk, their rickety plane fluttering aloft for a mere 12 seconds. Yet by the 1930s, air travel was stitching the globe together, shrinking distances that had once felt insurmountable. These were not mere gadgets; they were societal earthquakes, inventions that redrew the boundaries of what it meant to be human.
Fast forward to 2025, and the question looms large: where is our Model T? Where is our Wright Flyer? For all the dazzling advancements of the past few decades—smartphones, social media, streaming platforms—there’s a growing sense that we’re living in an era of technological stagnation, a period of refinement rather than reinvention. The iPhone, launched in 2007, sparked an app-driven revolution, yes, but it remains, at its core, a phone, a device whose design has barely evolved since the iPhone 12. Solar panels power more homes than ever, but they’re a refinement of century-old electrical generation, not a new paradigm. Even artificial intelligence, for all its promise, often feels like a souped-up version of 1980s neural networks, powered by better chips and bigger data. We’re tweaking the old, not birthing the new. And that absence stings.
This isn’t just a nostalgic lament for the past. The stakes are high. The 1890s to 1970s—a golden age of innovation—gave us not just cars and planes, but transistors, antibiotics, and the Apollo program. These breakthroughs didn’t merely improve life; they redefined it, unlocking possibilities that previous generations couldn’t have dreamed of. A child born in 1900 could, by 1970, drive across the country, fly across the ocean, and watch a man walk on the moon—all within their lifetime. Compare that to someone born in 1990: the internet already existed, cars and planes were commonplace, and the most transformative shift might be the smartphone—a device that, while revolutionary in its own right, builds on the bones of computers and telephones rather than creating something wholly new.
Why does this matter? Because humanity thrives on leaps, not steps. The combustion engine didn’t just make travel faster; it reshaped economies, birthed the suburbs, and altered social structures. The internet, born as ARPANET at the end of the 1960s but exploding in the 1990s, didn’t just improve communication; it created digital economies, remote work, and globalized culture. These were seismic shifts, felt viscerally by every layer of society. Today, we’re hungry for that jolt again—a breakthrough that doesn’t just polish what we have but reimagines what we can be. And yet, as we scan the horizon, the candidates feel either too incremental or too far off.
Consider the smartphone, often hailed as the defining invention of our era. When Steve Jobs unveiled the iPhone in 2007, it was a revelation: a pocket-sized computer, camera, and phone rolled into one, with a touchscreen that made clunky BlackBerrys obsolete. Within a decade, it had birthed a trillion-dollar app economy—Uber, Instagram, TikTok—redefining how we work, date, and protest. But look closer, and the cracks emerge. The iPhone built on existing tech: computers, the internet, mobile phones. Its design has stagnated since 2020, with companies now competing on marginal upgrades—better cameras, faster chips—echoing the pre-iPhone era of flip phones, where firms battled over aesthetics, not substance. The smartphone reshaped daily life, but it’s no car or plane; it’s an evolution, not a new species.
What about artificial intelligence, the darling of tech headlines? Since the 2010s, AI has surged, with models like ChatGPT (launched by OpenAI in late 2022) writing essays, generating art, and even coding. But dig into its roots, and you’ll find concepts from the 1980s—multilayer neural networks trained by backpropagation—supercharged by modern computing power and vast datasets. AI is transformative in scope, automating jobs and reshaping industries, but it’s not a new paradigm; it’s a better computer, not a moon landing. And its societal impact, while growing, lacks the universal tangibility of a car: a fifth grader can grasp driving, but try explaining a large language model to them.
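To see how little the core recipe has changed, here’s a minimal sketch of the 1980s idea itself: a tiny two-layer network learning XOR by backpropagation, in plain Python with NumPy. The architecture, learning rate, and seed are illustrative choices, not anything from a production system; today’s models run essentially this same loop, scaled up by many orders of magnitude.

```python
# A minimal backpropagation demo: a 2 -> 4 -> 1 sigmoid network
# learning XOR. This is the 1980s-era recipe in miniature; modern AI
# adds GPUs, billions of parameters, and web-scale data, but the
# forward/backward loop below is conceptually unchanged.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network predictions

    # Backward pass: propagate the output error toward the inputs,
    # layer by layer -- the "backpropagation of errors".
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
```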
Then there’s augmented reality (AR), another potential contender. Companies like Meta are racing to perfect AR glasses—wearables that overlay digital info on the real world. Imagine walking through Paris, your glasses translating signs, showing historical overlays, or letting a virtual friend join you at a café. Startups like Mojo Vision have even prototyped smart contact lenses with micro-displays. AR’s consumer roots stretch to the early 2010s, with devices like Google Glass, and it’s gaining steam: surgeons use AR to see inside patients, and kids play games like Pokémon GO in real spaces. By 2030, it could be as ubiquitous as smartphones, merging digital and physical realities in a way that feels new, not just better. But AR builds on screens and chips, and it’s not fully here—more prototype than product. It’s promising, but not yet the societal rewrite of the airplane.
Space exploration offers another glimmer of hope, particularly Elon Musk’s Starship. SpaceX’s reusable rocket, in development since the 2010s, aims to make space travel routine, with plans for Mars colonies by the 2030s. If successful, Starship could be our Apollo moment—humanity becoming multiplanetary, cities redesigned for spaceports, and a new frontier opened. Millions tune into Starship test launches, a testament to our collective yearning for something bold. But it’s still in testing, crashing as often as it soars, and its impact remains speculative. Like the Wright Brothers’ 1903 flight, it’s a spark, not a system—not yet shrinking the world like airlines did.
Genome editing, via CRISPR, is another frontier. Since Jennifer Doudna and Emmanuelle Charpentier’s landmark 2012 work, CRISPR has let us rewrite DNA, potentially curing genetic diseases or even slowing aging. It’s not just better medicine; it’s a new capability, like electricity was, with the power to reshape health and longevity. But ethical concerns—designer babies, playing God—loom large, and its benefits aren’t mainstream yet. It’s transformative in labs, not living rooms, lacking the universal feel of a car’s impact.
Then there’s a stranger frontier: Cortical Labs’ biological computing. Based in Australia, Cortical Labs has, since 2021, been merging human neurons with silicon chips, culminating in the CL1, a bio-computer launched in 2025. Their earlier DishBrain system taught roughly 800,000 neurons (ant-brain scale) to play Pong, reportedly learning faster and using less energy than conventional AI. You can buy a CL1 for $35,000 or program against living neurons remotely via their Cortical Cloud. This isn’t a better processor; it’s a new kind of thinking machine, blending biology and tech. If scaled, it could lead to AI that learns like humans, or even brain interfaces that upload knowledge, Matrix-style. Imagine downloading a language in seconds, reshaping education and work. But it’s early: 800,000 neurons are a far cry from a human brain’s 86 billion, and the CL1 is a lab tool, not a household name. Ethical risks—like brain-hacking—loom, and its mass success is uncertain. It’s post-1990s and genuinely novel, but not yet the societal jolt we’re waiting for.
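What makes DishBrain conceptually interesting is the closed loop: when the culture hits the ball, it receives structured, predictable stimulation; when it misses, it gets unpredictable noise, and the tissue reorganizes to keep its inputs predictable. The toy Python simulation below illustrates only that feedback asymmetry; the “culture” is reduced to a single gain parameter, and nothing here reflects Cortical Labs’ actual hardware, protocol, or API.

```python
# A toy caricature of DishBrain's closed-loop idea (NOT Cortical Labs'
# real system): success yields predictable feedback that leaves the
# controller untouched, while failure injects unpredictable noise that
# perturbs it. Behavior therefore drifts toward whatever keeps the
# inputs predictable -- here, hitting the ball.
import random

random.seed(42)

# The "culture" is stood in for by one number: a gain mapping the
# ball's vertical position to the paddle's position. gain == 1.0
# tracks the ball perfectly.
gain = random.uniform(-2.0, 2.0)

def rally(gain: float) -> bool:
    """One simulated rally: does the paddle intercept the ball?"""
    ball = random.uniform(-1.0, 1.0)
    paddle = gain * ball
    return abs(paddle - ball) < 0.2   # paddle half-height of 0.2

hits = 0
for episode in range(2000):
    if rally(gain):
        hits += 1                     # predictable feedback: no change
    else:
        gain += random.gauss(0, 0.1)  # unpredictable noise: perturb

    if (episode + 1) % 500 == 0:
        print(f"episode {episode + 1}: gain={gain:+.2f}, hits={hits}")

# The gain tends to settle near 1.0 and the hit rate climbs, loosely
# mirroring how the culture's rally length grew over minutes of play.
```

Even this caricature shows why the result turned heads: no gradients, no labels, just an asymmetry between predictable success and noisy failure.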
So where does this leave us? The 1890s to 1970s were a golden age because they started from near-scratch. Combustion engines, airplanes, transistors, and moon landings were raw leaps, built with sheer brainpower, not layered on complex systems. Today’s innovations—smartphones, AI, AR, bio-computers—stack on existing tech, so they feel less visceral, even if their impacts are vast. The smartphone created the gig economy, but it’s still a phone. AI automates jobs, but it’s still a computer. Starship could take us to Mars, but it’s still a rocket, not a reality. We’re in a cycle of refinement, not reinvention, and that gap fuels our hunger for something more.
But there’s hope on the horizon. If Cortical Labs scales its bio-computers, we might merge minds and machines, uploading knowledge as naturally as breathing. If Starship lands on Mars, we’ll become a multiplanetary species, a leap as big as the car. If AR glasses become the new smartphone, reality itself will gain a digital layer, changing how we live and learn. These aren’t here yet, but they echo the early days of past breakthroughs—the Model T before paved highways, the Wright Flyer before airports. The 20th century taught us that big leaps take time to ripple, and we may be too close to today’s sparks to see their full blaze.
Still, the ache remains. We want a breakthrough that doesn’t just improve but redefines—a flying car, a moon base, a cure for aging. We want to feel the ground shift beneath us, as it did for our grandparents. The 1890s to 1970s spoiled us with density: a car, a plane, a computer, a lunar landing, all in one lifetime. Today, we’re building on their foundations, and that makes our leaps feel smaller, even if they’re not. Perhaps the next big thing is closer than we think—lurking in a lab, a launchpad, or a line of code. Until then, we wait, yearning for the day humanity once again defies the impossible.
About the Author
QuantumX is just a regular Joe who's also a QuantumCage observer.
Sources:
- Ford, Henry. My Life and Work. Doubleday, 1922.
- Isaacson, Walter. Steve Jobs. Simon & Schuster, 2011.
- Cortical Labs. "CL1 Biological Computer: Technical Overview." Cortical Labs, 2025.
- SpaceX. "Starship Development Timeline." SpaceX, 2025.
- Doudna, Jennifer A. A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution. Houghton Mifflin Harcourt, 2017.
- Kurzweil, Ray. The Singularity Is Near. Viking, 2005.
- Meta AI. "Orion AR Glasses: A New Reality." Meta, 2024.
- Google Quantum AI. "Quantum Supremacy and Beyond." Google, 2023.