We Normalized Broken Software—and Physics Won’t Bail Us Out

Added Oct 9, 2025

Software quality has deteriorated into a normalized crisis, with routine memory leaks, OS-level regressions, and catastrophic outages stemming from basic engineering lapses. AI has become a force multiplier for these failures, producing more vulnerabilities while eroding the junior talent pipeline and deepening reliance on brittle abstractions. Big Tech’s massive infrastructure spend is a stopgap that won’t overcome energy and physical limits; only a return to fundamentals and efficiency will avert collapse.

Key Points

  • Software reliability and efficiency have collapsed across mainstream apps and OS updates, with severe memory leaks and recurring regressions now accepted as routine.
  • A single unchecked input or missing bounds check can trigger global outages and multi‑billion‑dollar losses, as the 2024 CrowdStrike incident showed; see the sketch after this list.
  • AI tools magnify existing quality problems: they produce more vulnerabilities, enable faster harm by inexperienced developers, and are often overtrusted by management.
  • Compounded abstraction layers impose multiplicative overhead (five layers that each add 30% cost roughly 1.3^5 ≈ 3.7× in total), now colliding with physical limits on energy, cooling, and grid capacity that money alone can't fix.
  • Big Tech’s $364B infrastructure spend is framed as a non-solution: it masks fundamental engineering failures while the collapse of the junior-developer pipeline jeopardizes future expertise.
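
The bounds-check failure mode above is easy to make concrete. The sketch below is hypothetical C, not CrowdStrike's actual code: a privileged parser trusts a count field read from an untrusted content file and indexes past the end of a fixed array; one bounds check turns that crash into a rejected input.

```c
/* Hypothetical sketch of the failure class above -- NOT CrowdStrike's
 * actual code. A privileged parser trusts a count field taken from an
 * untrusted content file and indexes past the end of a fixed array. */
#include <stdint.h>
#include <stdio.h>

#define MAX_FIELDS 20

typedef struct {
    uint32_t field_count;            /* read straight from the input file */
    uint32_t offsets[MAX_FIELDS];
} content_record;

/* Buggy: if the file claims field_count > MAX_FIELDS, the loop reads
 * past offsets[]. In a kernel-privileged component that is a crash. */
uint32_t process_unchecked(const content_record *rec) {
    uint32_t sum = 0;
    for (uint32_t i = 0; i < rec->field_count; i++)
        sum += rec->offsets[i];      /* out-of-bounds read when count is too large */
    return sum;
}

/* Fixed: one bounds check turns a machine crash into a rejected input. */
int process_checked(const content_record *rec, uint32_t *out_sum) {
    if (rec->field_count > MAX_FIELDS)
        return -1;                   /* malformed input: reject, don't crash */
    uint32_t sum = 0;
    for (uint32_t i = 0; i < rec->field_count; i++)
        sum += rec->offsets[i];
    *out_sum = sum;
    return 0;
}

int main(void) {
    content_record bad = { .field_count = 99 };  /* malformed record */
    uint32_t sum;
    if (process_checked(&bad, &sum) != 0)
        fprintf(stderr, "record rejected: field_count out of bounds\n");
    return 0;
}
```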

Sentiment

The Hacker News discussion is predominantly skeptical of the article's central claim of a *recent* 'historic decline' in software quality. Commenters acknowledge some egregious failures, but the general consensus is that these concerns are perennial, that 'perfect software' is an unrealistic ideal given business realities, and that perceived declines often reflect evolving hardware capabilities or long-standing challenges rather than a new crisis. A cynical undertone runs through the thread about how little commercial value is placed on software quality.

In Agreement

  • Egregious failures like the CrowdStrike outage indicate a real lack of basic engineering safeguards and competence.
  • Even granting real-world trade-offs, there is a wide gap between reasonable software quality and failures as severe as a 32GB memory leak, and that gap should be a cause for concern; see the sketch after this list.
  • Software quality often plays a minimal role in commercial engineering, as businesses prioritize making money, and sometimes even benefit from a continuous need for fixes (e.g., subscription models).
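
The 32GB figure above sounds exotic, but the pattern behind leaks of that scale is mundane: a long-lived process retains a small amount of state per request and never releases it, so memory grows without bound. A minimal hypothetical C sketch of that pattern, not taken from any specific application:

```c
/* Hypothetical sketch of how a multi-gigabyte leak accumulates: a
 * long-lived process caches one entry per request and never evicts
 * or frees anything. Illustration of the pattern, not a real app. */
#include <stdlib.h>
#include <string.h>

typedef struct entry {
    long request_id;
    char payload[4096];              /* ~4 KiB retained per request */
    struct entry *next;
} entry;

static entry *cache_head = NULL;     /* grows monotonically; nothing frees it */

void handle_request(long request_id, const char *data) {
    entry *e = malloc(sizeof *e);
    if (!e) return;
    e->request_id = request_id;
    strncpy(e->payload, data, sizeof e->payload - 1);
    e->payload[sizeof e->payload - 1] = '\0';
    e->next = cache_head;            /* prepended to the cache, never evicted */
    cache_head = e;
}

int main(void) {
    /* 8 million requests x ~4 KiB each is roughly 32 GB resident over
     * the process lifetime. Don't run the full loop on a small machine. */
    for (long id = 0; id < 8L * 1000 * 1000; id++)
        handle_request(id, "response body");
    return 0;
}
```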

Opposed

  • Arguments about declining software quality and resource bloat are not new; these concerns have been raised repeatedly for decades, predating the article's suggested timeline.
  • Current high resource consumption (e.g., gigabytes of RAM) is largely a result of increased hardware availability and capacity, rather than a decline in engineering quality compared to eras with limited resources.
  • Chasing a "platonic ideal of perfect software" is unrealistic; software must exist in the real world with trade-offs dictated by business objectives, economics, and the transient nature of many applications.
  • The article's timeline for the 'decline' is wrong: many commenters date any 'downward spiral', and the complaints about it, much earlier, making this an ongoing challenge rather than a recent collapse.
  • The perceived 'decline' is often 'false nostalgia,' as past software was not inherently better but simply had different constraints.