
Technology Is Changing Faster Than Ever (But Only in One Place)

March 28, 2026 · trend-commentary · ai-coding · developer-tools
  • The Adoption Curve Went Vertical
  • AI Compute Is Outrunning Moore's Law
  • The Agent Horizon Nobody Is Talking About
  • Your Skills Have a 2.5-Year Half-Life Now
  • The Counterargument Nobody Wants to Hear
  • What to Actually Do About It

Technology adoption has gone vertical. ChatGPT hit 100 million users in 2 months. AI training compute doubles every 5 months. But zoom out from silicon and the picture changes completely. One domain is sprinting. Everything else is walking.

The Adoption Curve Went Vertical

The telephone took 75 years to reach 100 million users. Electricity took 46. The internet compressed that to 7 years, which felt impossibly fast at the time. Then came ChatGPT. Two months. The AI adoption rate shattered every precedent in the history of consumer technology, and the numbers kept climbing. By mid-2025, ChatGPT had 700-800 million weekly active users, roughly 10% of the global population.

[Chart: Time to 100 Million Users]

The scale here is hard to internalize. Instagram took 2.5 years. TikTok took 9 months. ChatGPT did it in 60 days. According to Epoch AI's research, AI may hit 90% adoption in roughly 3 years, a milestone that took the internet 23 years to reach. 40% of US businesses now pay for AI products. That number is projected to reach 80% by 2028. OpenAI's token volume grew 50x between November 2023 and October 2024. Not 50%. Fifty times.
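That 50x figure is easier to internalize as a growth rate. Here is a quick back-of-envelope sketch in Python, treating November 2023 to October 2024 as 11 months (both the interval and the 50x multiple come from the paragraph above):

```python
import math

# Back-of-envelope: what growth rate does "50x in 11 months" imply?
growth_factor = 50
months = 11

monthly_multiplier = growth_factor ** (1 / months)
doubling_months = math.log(2) / math.log(monthly_multiplier)

print(f"Implied monthly growth: {monthly_multiplier - 1:.0%}")  # ~43%
print(f"Implied doubling time:  {doubling_months:.1f} months")  # ~1.9
```

Sustained 43% month-over-month growth means the volume doubles roughly every two months. Almost nothing in consumer technology has ever compounded that fast for a full year.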

AI Compute Is Outrunning Moore's Law

Moore's Law gave us 2x every 2 years, steady and predictable. The foundation of five decades of computing progress. AI compute growth has obliterated that pace. Training compute for frontier models is growing 4.5x per year, doubling every 5.2 months. Since 2020, frontier model compute has increased roughly 10,000x. That is not a typo.

The money tells the same story. Gartner estimated $475 billion in AI datacenter spending for 2025, up 42% from 2024. Big tech is pouring capital into infrastructure at a rate that makes the dot-com buildout look restrained. And algorithmic efficiency is improving 3x per year on top of the hardware gains, so effective AI compute growth is even steeper than the raw numbers suggest.

I spent years working with hardware ecosystems that move slowly: GPU driver support, platform compatibility, thermal constraints. Physical atoms resist change. Bits do not. That gap explains a lot about where we are.
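The compounding arithmetic is worth checking by hand. A minimal Python sketch using the figures above (note the rounded figures land slightly apart: 4.5x per year actually implies a doubling time closer to 5.5 months than 5.2):

```python
import math

# Sanity-check the compounding claims in this section.
hardware_per_year = 4.5   # frontier training compute growth (Epoch AI estimate)
algo_per_year = 3.0       # algorithmic efficiency gains

# Doubling time implied by 4.5x/year growth
doubling_months = 12 * math.log(2) / math.log(hardware_per_year)
print(f"Doubling time: {doubling_months:.1f} months")       # ~5.5

# Years of 4.5x/year growth needed to accumulate 10,000x
years_to_10k = math.log(10_000) / math.log(hardware_per_year)
print(f"10,000x takes: {years_to_10k:.1f} years")           # ~6.1 (2020 -> ~2026)

# Effective rate when hardware and algorithms compound together
print(f"Effective rate: {hardware_per_year * algo_per_year:.1f}x per year")  # 13.5x
```

The last line is the one that matters: hardware and algorithms multiply, they do not add. 13.5x per year of effective improvement has no precedent in the semiconductor era.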

The Agent Horizon Nobody Is Talking About

Here is the metric that keeps me up at night. Researchers at METR have been tracking what they call the "AI agent task horizon": the longest task an AI agent can complete autonomously. In 2022, that horizon was about 30 seconds. Simple lookups. Basic completions. By early 2025, it had reached roughly 14 hours.

The doubling time is about 7 months, and recent data suggests it is accelerating toward every 4 months. Extrapolate that forward and AI agents handle full 8-hour workday tasks reliably by 2027, and month-long projects by 2028-2029.

The benchmarks back this up. Claude scored 72.5% on OSWorld in 2026, up from 28% just one year earlier. GPT-5.2 hit 40.3% on FrontierMath, 10x better than previous models. These are not incremental improvements. They are vertical jumps.
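To see where those doubling times lead, here is a toy extrapolation. The start date, the 14-hour starting horizon, and the 7-month doubling period are illustrative values lifted from the figures above, not METR's exact data points:

```python
from datetime import date

# Toy extrapolation of the agent task horizon, assuming exponential growth.
# start, start_hours, and doubling_months are illustrative assumptions.
def horizon_hours(on: date, start: date = date(2025, 3, 1),
                  start_hours: float = 14.0,
                  doubling_months: float = 7.0) -> float:
    months_elapsed = (on.year - start.year) * 12 + (on.month - start.month)
    return start_hours * 2 ** (months_elapsed / doubling_months)

for when in (date(2026, 1, 1), date(2027, 1, 1), date(2028, 1, 1)):
    print(when.year, f"~{horizon_hours(when):,.0f} hours")  # ~38, ~124, ~406
```

Swap doubling_months to 4 and the curve steepens dramatically. That sensitivity is exactly why the 4-versus-7-month question matters so much.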

AI Pace: Common Questions

Is technology really changing faster than ever, or does it just feel that way?
Both. AI and compute are genuinely on an exponential curve steeper than anything before. But most physical-world technology (energy, transport, construction) is advancing at roughly the same pace it has for decades. The perception of universal acceleration comes from conflating software speed with everything else.

How does AI compute growth compare to Moore's Law?
AI training compute doubles every 5.2 months, roughly 4.5x per year. That is about 5x faster than Moore's Law. Combined with 3x annual gains in algorithmic efficiency, the effective improvement rate dwarfs anything the semiconductor era produced.

How long until AI agents can handle serious autonomous work?
Based on METR's research, the AI agent task horizon is doubling every 4-7 months. If that trend holds, 8-hour autonomous tasks arrive around 2027 and month-long projects become feasible by 2028-2029. These are projections, not guarantees, but the trend line has been remarkably consistent.

Your Skills Have a 2.5-Year Half-Life Now

The technology skills half-life has collapsed. According to TechJury's industry analysis, the average technical skill now loses half its value in 2.5 years. That is down from roughly 5 years a decade ago.

I can feel this in my own work. The JetBrains 2025 developer survey found that 29% of all code is now AI-generated, up 45% year over year. Skills I spent months building (memorizing API surfaces, debugging obscure syntax, scaffolding boilerplate) are worth less every quarter.

The framework wars are ending. Not because React won or Svelte lost. Because syntax stopped mattering as much as system design. When an AI can generate a working component in any framework in seconds, your value shifts from "knows the API" to "knows what to build and why." That is a fundamental change in what it means to be a developer.
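Exponential decay is easy to underestimate, so it helps to plug the number in. A minimal sketch of what a 2.5-year half-life means for a skill learned today:

```python
# Remaining value of a skill with a 2.5-year half-life:
# value(t) = 0.5 ** (t / half_life)
half_life_years = 2.5

for years in (1, 2.5, 5, 10):
    remaining = 0.5 ** (years / half_life_years)
    print(f"After {years:>4} years: {remaining:.0%} of original value")
# ~76% after 1 year, 50% at 2.5, 25% at 5, ~6% at 10
```

A quarter of the value gone in the first year alone. That is the tax on memorizing anything an AI can regenerate on demand.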

The Counterargument Nobody Wants to Hear

Northwestern economist Robert Gordon makes a compelling case that we are fooling ourselves. His "Special Century" thesis argues that the period from 1870-1970 produced every technology that defines modern life:

  • Indoor plumbing
  • Electrification
  • Internal combustion engines
  • Antibiotics
  • Air conditioning
  • Interstate highways
  • Commercial aviation

Since 1970, computing is the only domain with genuinely transformative breakthroughs. We fly in planes designed 50 years ago. Our cars still run (mostly) on combustion. Construction productivity has actually declined. The global cybersecurity workforce is 3.7 million professionals short by 2026 estimates, not because the field is new, but because we still cannot train people fast enough for decade-old threats.

Gordon's point is not that AI is unimpressive. His point is that one fast domain does not mean everything is fast. And he might be right. The "technology is changing faster" narrative applies almost exclusively to software and the compute underneath it.

What to Actually Do About It

If the acceleration is real but narrow, the playbook for developers is clear. Stop optimizing for knowledge that decays in months. Start building the things that compound. Three concrete shifts:

  1. Bet on system design over syntax. Learn distributed systems, data modeling, security architecture. These change slowly. The framework you memorized will be obsolete before you master it. The ability to design a system that handles 10x traffic will not.
  2. Build judgment, not just knowledge. AI can generate code. It cannot yet decide what to build, when to ship, or which tradeoff to accept. Product instinct and engineering judgment are appreciating assets.
  3. Treat AI tools as infrastructure, not novelty. Stop "trying out" AI coding assistants. Integrate them into your daily workflow the way you integrated version control and CI/CD. They are not optional extras anymore.

The pattern I see: developers who adapted early are pulling ahead fast. Not because they are using AI to write more code, but because they are using it to think at a higher level of abstraction. Less time on syntax. More time on architecture and product decisions.

Technology is changing faster than ever. Just not everywhere. The developers who thrive will be the ones who understand where the acceleration is real, where it is not, and how to position themselves on the right side of that divide.
