New Payroll Data Ties AI to Early-Career Job Losses in Exposed Fields

A new Stanford study using ADP payroll data finds a 13% employment decline since ChatGPT's release for 22–25-year-olds in highly AI-exposed jobs, while older workers and less-exposed occupations saw steady or rising employment. The effect is stronger where AI automates tasks rather than augments them, and it persists even within firms, suggesting targeted displacement rather than economy-wide weakness. Thompson concludes this is the strongest evidence so far that AI is already reshaping early-career white-collar work, warranting curricular changes and continued scrutiny.
Key Points
- New Stanford analysis of ADP payroll data shows a roughly 13% employment decline since ChatGPT's release for 22–25-year-olds in highly AI-exposed roles, especially software development and customer service.
- Employment has risen or held steady for older workers and for less-exposed occupations such as home health aides, indicating the effect is concentrated, not economy-wide.
- Within the same firms, highly AI-exposed jobs decline relative to less-exposed jobs, suggesting the pattern is not driven solely by firm-level shocks like rates or macro conditions.
- Jobs where AI use is primarily automation show youth employment declines; where AI use is primarily augmentation, similar declines are not observed.
- Younger workers’ tasks overlap more with what LLMs can replicate (codified, short-horizon, easily evaluated work), while older workers rely more on tacit, strategic capabilities; colleges should teach AI tool use and emphasize capabilities AI lacks.
Sentiment
The Hacker News community is broadly skeptical of the article's claim that AI is the primary driver of early-career job losses. While commenters acknowledge AI may be a contributing factor, the dominant view is that macroeconomic forces — particularly the end of ZIRP, Section 174 tax changes, post-COVID overhiring corrections, and the Musk effect — are far more significant. Several commenters argue AI is being used as a convenient corporate narrative to justify cost-cutting decisions that were already underway. Methodological critiques of the underlying Stanford paper further erode confidence in the causal claims.
In Agreement
- Some commenters report firsthand accounts of companies proactively cutting staffing projections in 2022 in anticipation of AI efficiency, adjusting hiring to prioritize AI and big data backgrounds over traditional roles.
- AI has clearly displaced real work in specific fields like translation, copywriting, illustration, and customer service, where good-enough AI output has replaced human workers on routine tasks.
- Companies are pausing junior hiring because AI tools can handle much of the entry-level work, and investor pressure to show AI efficiency is actively discouraging new headcount.
- Even if AI is not directly replacing workers yet, the massive capital investment in AI infrastructure is diverting funds that would otherwise go toward hiring and training new workers.
- The fact that employment in exposed occupations has not recovered even after Section 174 tax changes were reversed suggests AI is now sustaining the job losses that macro factors initially triggered.
Opposed
- The timeline does not add up: employment declines in AI-exposed fields began in 2022, before LLMs and AI tools were widely deployed or capable of replacing workers at scale.
- The end of ZIRP, Section 174 tax code changes requiring R&D expense capitalization, and post-COVID overhiring corrections are far more plausible explanations for the observed job losses, especially in software engineering.
- The Musk effect from Twitter's dramatic layoffs showed CEOs that companies were dramatically overstaffed, triggering industry-wide headcount reductions unrelated to AI capability.
- Offshoring and outsourcing, accelerated by COVID normalizing remote work, are the real drivers of job losses. BPO industry growth data shows these jobs are moving overseas, not disappearing.
- A simple demographic aging model demonstrates that the paper's age-based employment patterns can emerge from uniform hiring declines without any AI-specific effects, suggesting the methodology may be fundamentally flawed.
- The paper examines a narrow slice of the job market during an extremely confounded economic period, making causal attribution to AI essentially impossible.
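The demographic-model critique above can be illustrated with a toy calculation. This is a hypothetical sketch, not the commenter's actual model: all numbers are assumptions chosen only to show the mechanism. The idea is that a hiring cut applied uniformly across ages still produces a larger percentage employment decline for young cohorts, simply because a larger share of their employment stock consists of recent hires.

```python
# Toy cohort model (illustrative assumptions, not data from the paper):
# each cohort's employment = incumbent stock + new hires, and a single
# uniform hiring cut is applied to every age band.

def employment(incumbents, annual_hires, hiring_cut):
    """Employment after one year: incumbents stay, new hires are
    reduced by a uniform `hiring_cut` fraction."""
    return incumbents + annual_hires * (1 - hiring_cut)

cohorts = {
    # age band: (incumbent stock, normal annual hires) -- made-up numbers
    "22-25": (100, 60),   # young: much of the stock is fresh hiring
    "35-49": (100, 10),   # older: mostly incumbents, little new hiring
}

CUT = 0.30  # the same 30% hiring reduction for every age band

for band, (inc, hires) in cohorts.items():
    before = inc + hires
    after = employment(inc, hires, CUT)
    decline = (before - after) / before
    print(f"{band}: employment falls {decline:.1%}")
```

Under these assumed numbers, the young cohort's employment falls several times more in percentage terms than the older cohort's, even though the hiring cut is identical, which is the confound the critique says can mimic an age-specific AI effect.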