Thirty-three months after ChatGPT's release, the Yale Budget Lab's current-state tracker finds no economy-wide AI-driven employment disruption in the United States. The occupational mix is shifting only about one percentage point faster than it did during early internet adoption, and real usage is concentrated in coding-heavy roles rather than spread across white-collar fields like marketing. Task-based exposure measures and real usage signals correlate only weakly, so exposure does not yet equal displacement. Unemployment dynamics show no clear AI fingerprint to date [S1][S2][S3][S4]. See the Budget Lab's ongoing research for details.
AI tools eliminating jobs: economy-wide evidence to date
The Yale Budget Lab combines BLS labor microdata with exposure metrics and vendor usage data to assess AI's labor-market impact since late 2022. The latest read shows aggregate stability and a sizable gap between where AI could be applied and where it is actually used [S1].
Executive snapshot
- Economy-wide disruption: No discernible aggregate AI-driven employment disruption 33 months after ChatGPT's release [S1].
- Occupational churn: The occupational mix is shifting on a path only about 1 percentage point faster than during early internet adoption [S1].
- Exposure vs usage: OpenAI's exposure metric and Anthropic's usage index show limited alignment; usage is heavily concentrated in Computer and Mathematical occupations, with Arts/Design/Media also overrepresented [S1][S2][S3].
- Unemployment signal: Unemployed workers' prior occupations had about 25-35% of tasks performable by genAI, with no clear upward trend across durations; occupation-level AI usage shows no relationship with changes in employment or unemployment [S1].
- Sectoral nuance: Information, Financial Activities, and Professional/Business Services show larger shifts, but these trends predate ChatGPT's release [S1].
Implication for marketers: Near-term job displacement risk appears limited. The practical lever is targeted adoption and workflow redesign, not headcount cuts.
Method and source notes on AI labor impact
- What was measured: Changes in the U.S. occupational mix and unemployment dynamics by duration, linked to task-level exposure and real adoption since November 2022, benchmarked against historical tech shifts and sectoral patterns [S1].
- Data and methods (a minimal data-linkage sketch follows this list):
- BLS Current Population Survey (about 60,000 households monthly) for occupational shares and unemployment by duration [S4].
- OpenAI's exposure metric (O*NET task-level susceptibility) to proxy theoretical impact by occupation [S3].
- Anthropic's usage index for cross-occupation AI adoption intensity [S2].
- Limitations:
- Usage data is single-provider and may undercount private or enterprise deployments; mapping to occupations may rely on self-classification [S2].
- Exposure scores reflect task susceptibility, not adoption or outcomes, and may overstate short-run impact [S3].
- Aggregate stability can mask firm-level restructuring; CPS data has sampling error and classification lag [S4].
- The tracker is descriptive, not predictive; effects may materialize with a lag [S1].
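To make the exposure-versus-usage comparison concrete, here is a minimal sketch that joins occupation-level exposure scores and a usage index by SOC code and checks their rank correlation. The file names and column names (exposure_share, usage_share) are hypothetical placeholders; this illustrates the general approach, not the Budget Lab's actual pipeline.

```python
# Illustrative sketch, not the Budget Lab's pipeline: join occupation-level
# exposure scores with a usage index by SOC code and check their rank correlation.
# File names and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import spearmanr

exposure = pd.read_csv("openai_exposure_by_soc.csv")  # columns: soc_code, exposure_share
usage = pd.read_csv("anthropic_usage_by_soc.csv")     # columns: soc_code, usage_share

merged = exposure.merge(usage, on="soc_code", how="inner")

# Rank correlation is a sensible first pass: both series are bounded shares with
# skewed distributions (usage is heavily concentrated in a few SOC groups).
rho, pval = spearmanr(merged["exposure_share"], merged["usage_share"])
print(f"Spearman rho = {rho:.2f} (p = {pval:.3f}) across {len(merged)} occupations")
```

A low rank correlation in a check like this corresponds to the exposure-usage gap described in the findings: occupations can look highly exposed on paper while showing little measured adoption.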
Findings on exposure, real usage, and employment dynamics
- No aggregate displacement signal: Yale reports no discernible disruption in the broader labor market despite widespread availability of genAI tools [S1].
- Pace of change: The occupational mix is shifting only about 1 percentage point faster than during early internet adoption; see the churn-index sketch after this list [S1].
- Usage concentration: Real usage is dominated by Computer and Mathematical occupations, with Arts/Design/Media also overrepresented. Adoption remains concentrated in coding and adjacent creative workflows [S1][S2].
- Exposure-usage gap: OpenAI's exposure metric and Anthropic's usage index measure different constructs and correlate only weakly. High exposure does not imply near-term adoption or displacement [S1][S2][S3].
- Sectoral patterns: Information, Financial Activities, and Professional/Business Services show above-average occupational shifts, largely continuing pre-ChatGPT trends [S1].
- Unemployment dynamics: Across durations, unemployed workers tended to come from occupations with roughly 25-35% of tasks performable by genAI, with no clear upward trend over time. Occupation-level AI usage shows no relationship with employment or unemployment changes to date [S1].
- Historical context: Broad workplace disruption typically unfolds over decades. Prior general-purpose technologies like office computing exhibited long adoption lags before transforming workflows [S1].
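To ground the "pace of change" comparison, the sketch below computes a standard occupational dissimilarity index: half the sum of absolute changes in occupation employment shares between two periods, i.e. the fraction of workers who would have to switch occupation groups to restore the earlier mix. The numbers are toy values, not CPS estimates, and the tracker's exact construction may differ.

```python
# Minimal sketch of an occupational dissimilarity (churn) index. Toy numbers only;
# they are not CPS estimates, and the tracker's exact construction may differ.
import pandas as pd

def dissimilarity_index(base: pd.Series, now: pd.Series) -> float:
    """Half the sum of absolute changes in occupation shares: the fraction of
    workers who would need to switch occupation groups to restore the base mix."""
    aligned = pd.concat([base, now], axis=1, keys=["base", "now"]).fillna(0.0)
    return 0.5 * (aligned["now"] - aligned["base"]).abs().sum()

# Hypothetical employment shares by major occupation group at two dates.
shares_2022 = pd.Series({"computer_math": 0.04, "office_admin": 0.12, "sales": 0.10, "other": 0.74})
shares_2025 = pd.Series({"computer_math": 0.05, "office_admin": 0.11, "sales": 0.10, "other": 0.74})

print(f"Occupational mix shift: {dissimilarity_index(shares_2022, shares_2025):.3f}")  # 0.010 = 1 point
```

Comparing this index over a rolling window against the same window length from the early internet era is the kind of benchmark the "about 1 percentage point faster" finding summarizes.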
Interpretation and implications for marketing leaders
- Likely: Exposure is not outcome. High task exposure in marketing does not imply short-run displacement without adoption, integration, and process change. Monitor real usage and performance gains before structural workforce moves [S1][S2][S3].
- Likely: Focus investment on task-level augmentation (drafting, QA, analytics support, basic code for tagging and feeds) and measurable productivity lift, not headcount assumptions [S1][S2].
- Tentative: Entry-level roles built around narrowly defined, high-exposure tasks may see faster task reconfiguration. Skills in prompt-driven analysis, data hygiene, and tool orchestration are prudent hedges [S1].
- Tentative: Adoption is strongest in coding-adjacent tasks. Marketing teams may see the biggest near-term return in analytics engineering, experimentation frameworks, and content tooling tied to dev workflows [S2].
- Speculative: If multi-provider usage broadens beyond coding roles and enterprise integrations deepen, the exposure-usage gap could narrow, increasing pressure on content-heavy workflows. Track cross-platform usage indices and BLS occupation shifts quarterly [S1][S2].
Contradictions and gaps to monitor
- Company announcements vs aggregates: Publicized AI-related layoffs exist, but tracked AI-cited job cuts have been a small share of total U.S. layoffs; firm-level anecdotes are not yet reflected in aggregate dynamics [S6][S1].
- Measurement gaps: Single-vendor usage data under-represents private models and internal tools; exposure scores are static and may not reflect redesigned tasks. Enterprise-grade usage telemetry remains limited [S2][S3].
- Lag risk: General-purpose technologies often show multi-year lags between exposure and labor effects. Current stability does not preclude medium-term reorganization [S1].
- Occupational coding noise: CPS codes and self-reported roles can blur boundaries between marketing, product, and analytics, softening occupation-level signals [S4].
- International scope: Findings are U.S.-centric; adoption and displacement could differ in other labor markets with different regulations and industry mixes [S1].
Data appendix: referenced metrics
- Aggregate disruption: No discernible economy-wide AI employment disruption identified over 33 months post-ChatGPT [S1].
- Occupational churn pace: On a path roughly 1 percentage point faster than during early internet adoption [S1].
- Usage concentration: Heaviest usage in Computer and Mathematical occupations; Arts/Design/Media above workforce share [S1][S2].
- Unemployment signal: Prior occupations of unemployed workers had about 25-35% of tasks performable by genAI, with no clear upward trend [S1].
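As an illustration of the unemployment-side metric, the sketch below averages an O*NET-based exposure score over unemployed workers' prior occupations, grouped by unemployment duration. The input files, column names, and duration buckets are hypothetical stand-ins for CPS microdata fields and an exposure score like [S3]; a roughly flat profile across buckets is what "no clear upward trend" refers to.

```python
# Illustration only: average genAI task exposure of unemployed workers' prior
# occupations, by unemployment duration. File names, columns, and buckets are
# hypothetical stand-ins for CPS microdata and an O*NET-based exposure score.
import pandas as pd

unemployed = pd.read_csv("cps_unemployed.csv")   # columns: prior_soc, duration_weeks, weight
exposure = pd.read_csv("exposure_by_soc.csv")    # columns: soc_code, share_tasks_genai

unemployed["duration_bin"] = pd.cut(
    unemployed["duration_weeks"],
    bins=[0, 5, 14, 26, 52, float("inf")],
    labels=["<5", "5-14", "15-26", "27-52", "52+"],
)

merged = unemployed.merge(exposure, left_on="prior_soc", right_on="soc_code", how="left")
merged["weighted_exposure"] = merged["share_tasks_genai"] * merged["weight"]

# Weighted mean exposure by duration bucket; a flat profile across buckets is the
# "no clear upward trend" pattern described in the tracker.
by_duration = merged.groupby("duration_bin", observed=True)[["weighted_exposure", "weight"]].sum()
print((by_duration["weighted_exposure"] / by_duration["weight"]).round(3))
```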
Method and source IDs
- [S1] Yale Budget Lab - "Evaluating the Impact of AI on the Labor Market: The Current State of Affairs." Ongoing monthly analysis using BLS CPS, OpenAI task exposure, and Anthropic usage; compares occupational mix shifts and unemployment dynamics; latest update referenced in 2025.
- [S2] Anthropic - Anthropic Economic Index. Aggregated Claude usage by occupation and industry over time; adoption and intensity indicators across SOC groups; 2024-2025 releases.
- [S3] OpenAI (Eloundou, Manning, Mishkin, Rock, 2023) - "GPTs are GPTs: An Early Look at the Labor Market Impact Potential of Large Language Models." Task-level exposure estimates using O*NET; occupation exposure scoring.
- [S4] U.S. Bureau of Labor Statistics - Current Population Survey (CPS). Monthly labor force microdata (about 60,000 households), unemployment by duration, occupation coding.
- [S5] Indeed Hiring Lab - Generative AI exposure analyses of U.S. job postings. Task-based exposure estimates by occupation and industry; methodology based on posting-level task content.
- [S6] Challenger, Gray & Christmas - Monthly Job Cuts Reports (2023-2024). Counts of layoff announcements, including categories citing AI; context for scale relative to total cuts.