We Normalized Broken Software—and Physics Won’t Bail Us Out

Software quality has deteriorated into a normalized crisis, with routine memory leaks, OS-level regressions, and catastrophic outages stemming from basic engineering lapses. AI has become a force multiplier for these failures, producing more vulnerabilities while eroding the junior talent pipeline and deepening reliance on brittle abstractions. Big Tech’s massive infrastructure spend is a stopgap that won’t overcome energy and physical limits; only a return to fundamentals and efficiency will avert collapse.
Key Points
- Software reliability and efficiency have collapsed across mainstream apps and OS updates, with severe memory leaks and recurring regressions normalized as routine.
- A single unchecked input or missing bounds check can trigger global outages and multi‑billion‑dollar losses, as shown by the 2024 CrowdStrike incident.
- AI tools magnify existing quality problems: they produce more vulnerabilities, let inexperienced developers do damage faster, and are often overtrusted by management.
- Compounded abstraction layers impose multiplicative overhead, now colliding with physical limits—energy, cooling, and grid capacity—that money alone can’t fix.
- Big Tech’s $364B infrastructure spend is framed as a non-solution that masks fundamental engineering failures while a junior-developer pipeline collapse jeopardizes future expertise.
Sentiment
The community is deeply divided. While many sympathize with the general concern about declining software quality, the article itself is treated with significant skepticism. The top comments focus on detecting AI patterns in the writing rather than engaging with the thesis, and the post was flagged. Experienced engineers are split between those who see a real and worsening problem driven by market incentives and AI-amplified incompetence, and those who argue this is the same complaint recycled every decade with selective nostalgia for a past that was equally buggy. The discussion generates more heat than light, with the irony of a potentially AI-written article decrying AI being the dominant takeaway.
In Agreement
- Software quality has visibly declined in consumer apps, OS updates, and web services, with the CrowdStrike outage as a landmark example of missing basic engineering safeguards
- Market concentration allows dominant companies like Microsoft and CrowdStrike to ship poor-quality software with impunity because switching costs trap users
- The elimination of dedicated QA roles and the shift to beta-testing on end users via 'Insiders' programs have degraded release quality
- AI and vibe coding amplify existing incompetence by letting less-skilled developers produce damage faster without understanding what they're building
- Replacing junior developers with AI tools will destroy the career pipeline that produces experienced senior engineers
- VC culture prioritizes prototyping speed and exit value over sound engineering, and since VC-backed companies set industry norms, this infects the entire ecosystem
- The ability to push constant updates has removed the incentive to ship reliable software, since bugs can always be 'fixed later'
Opposed
- Software has always been buggy—Windows 95/98 crashed constantly, BSODs were routine, and early web development was a chaotic mess; claims of decline are false nostalgia
- The article itself appears to be AI-generated or AI-assisted, which deeply undermines its credibility as a critique of AI-driven quality collapse
- Modern tooling like Git, CI/CD, automated testing, and better frameworks has actually raised baseline software quality compared to the SSH-into-production era
- The energy consumption argument is overblown—data centers use a tiny fraction of global energy and the concern is a red herring
- Safety-critical software is better regulated and higher quality than ever, with standards like DO-178C; the examples of deadly software bugs are decades old
- Software quality is fundamentally an economic tradeoff, not a collapse—companies rationally choose velocity over perfection because that is what markets reward
- Abstraction layers trade performance for genuine gains in developer productivity, safety, and iteration speed; framing them as purely wasteful is misleading