Coding was supposed to be a safe career. Now, software job postings are down 35%, bootcamps are shutting down, and Amazon just cut 16,000 roles. Here’s what actually works — and what the ‘just upskill’ crowd won’t tell you.

Everyone is having the wrong conversation about AI and jobs.
LinkedIn is full of “learn prompt engineering” and “upskill to AI” advice. There are now over 2,000 prompt engineering courses online, and Udemy’s AI enrollments have grown fivefold to 11 million learners. Career coaches are selling courses on “thriving in the AI economy” — the AI career coaching market alone hit $4.2 billion last year. Your company’s HR department just sent an email about “embracing AI as a tool for growth,” even though only 7.5% of employees report receiving extensive AI training, and 56% of workers globally say they’ve received no skills development at all.
Most of this advice is wrong. Not maliciously. Just disconnected from what’s actually happening on the ground.
Here’s what I mean. Nobody gets fired by AI on a Tuesday. What happens is quieter than that. AI erodes your tasks one by one until your role is hollow. You gradually become the person who reviews what the AI already did. Your title stays the same. Your job doesn’t. And the most ironic advice of all — “learn to code” — is especially bad right now, because coding is one of the most automatable white-collar skills we have in 2026. But we’ll get to that.
The question worth asking isn’t “will AI take my job?” It’s more like: which of my tasks will AI absorb over the next couple of years, and what’s left of my role once that happens?
Track Task Erosion, Not Job Titles
The Bureau of Labor Statistics tracks occupations. AI doesn’t care about occupations. It eats tasks — specific, repeatable tasks inside occupations. That distinction matters enormously, and most people miss it.
McKinsey’s 2025 report “Agents, Robots, and Us” found that current AI could theoretically automate 57% of US work hours, with AI agents alone covering 44%. That’s not a projection for 2030. That’s what today’s technology can do if companies fully redesign their workflows around it. Whether they actually will, and how fast, is a different question — but the capability is already here.
“Automate” doesn’t mean “eliminate,” though. It means transform. And the transformation happens task by task, not job by job.
Paralegals are a good example. AI isn’t replacing paralegals. It’s replacing document review, contract analysis, and case law research — tasks that used to eat 60–70% of a paralegal’s week. What remains? Client interaction, judgment calls on case strategy, courtroom prep. The title survives. The actual work is fundamentally different.
Same story with financial analysts. AI handles data gathering, trend identification, report drafting. What’s left is the stuff that requires a human who actually knows the client: stakeholder communication, strategic interpretation, the ability to look at a model and say “this doesn’t make sense because the CFO just changed their accounting methodology last quarter.” That kind of contextual knowledge doesn’t live in any training dataset.
Or radiologists. AI reads scans faster and often more accurately than humans do. But complex cases where the scan doesn’t tell the whole story, patient communication, treatment planning, the liability of signing off on a diagnosis — those stay human. At least for now.
The pattern, if you’re looking for one: routine cognitive tasks get absorbed first. Judgment, relationships, tolerance for ambiguity — those survive longer. I’d be careful about saying “survive forever,” because AI judgment is improving faster than most people realize. But the timeline is measured in years, not months.
Goldman Sachs estimates 300 million jobs globally are “exposed” to AI automation, with 6–7% of workers potentially displaced outright. Anthropic CEO Dario Amodei went further, warning that AI could eliminate roughly 50% of entry-level white-collar jobs within five years and push unemployment as high as 10–20%. It’s worth noting that Amodei runs one of the companies building the technology he’s warning about, so take the specific timelines with a grain of salt. But the directional warning is hard to dismiss from someone who sees the capability curve from the inside. Even if he’s off by half, the implications are enormous.
“Exposed” doesn’t mean “eliminated” for most people, though. It means the job you’re doing in two years will look different from the one you’re doing now.
Here’s a practical exercise worth doing this week. List every task you do in a typical week. For each one, ask honestly: could an AI do this 80% as well as I can? The tasks where the answer is “yes” — those are your vulnerable tasks. The ones where the answer is “no” — that’s your moat. Most people who actually do this exercise are surprised by how much of their week falls into the first category.
The Canary in the Coal Mine: What’s Happening to Coders Right Now
If you want to see where task erosion leads when it accelerates, look at software engineering. It’s the clearest case study we have, and the irony is brutal.
For a decade, “learn to code” was the universal career advice. Bootcamps promised six-figure salaries in 12 weeks. Parents pushed CS degrees. Politicians cited coding as the path to the middle class. And for a while, it worked. I watched it work. I’m an engineer myself — I benefited from that wave.
Now coding is one of the first white-collar professions getting hollowed out by AI.
The numbers are stark. Software developer job postings on Indeed are down 35% from five years ago — a five-year low. Revelio Labs data shows postings shrank by over 70% between Q1 2024 and Q1 2025. Entry-level tech hiring at major US companies has dropped more than 50% compared to pre-2020. Stanford’s Digital Economy Lab found entry-level software engineering postings dropped 67% between 2023 and 2024 alone. The tech industry shed 245,000 jobs in 2025, and 2026 is tracking worse — 59,000 cuts in Q1 alone, averaging 704 jobs lost per day.
And it’s not just jobs disappearing. Pay is eroding for the jobs that remain. The Dice 2025 Tech Salary Report found a record 59% of tech professionals feel underpaid — up from 54% in 2023. Less than half got any raise in 2024, down from 55% the year before. Motion Recruitment’s 2026 guide found senior software developers took a 10% year-over-year hit to base compensation. Mid-level SQL developers dropped 7%. Meanwhile, LLM developers averaged $209,000. Generalist coding skills are deflating while AI-specialist skills inflate. Real wages for engineers at Big Tech are down an estimated 7% even as those companies posted $327 billion in combined profits and CEO pay climbed 35%. I’ll let you sit with that one for a second.
Globally, average developer salaries have dropped 9–15%, with high-paying Big Tech positions seeing cuts as steep as 30%. The salary growth that defined the 2010s has flatlined — developer pay now rises at roughly the rate of inflation, not above it.
Amazon cut 16,000 corporate roles in January 2026 — its largest single layoff since October 2025, when it axed 14,000. Meta, Google, Epic Games — all restructuring to fund AI pivots. These aren’t pandemic corrections anymore. The pandemic ended years ago. This is something else.
And the companies that aren’t firing? They’re not hiring either. That’s the part nobody talks about.
Shopify’s CEO Tobi Lutke posted a memo — publicly — saying that before requesting new headcount, teams must demonstrate AI can’t do the work first. “Reflexive AI usage is now a baseline expectation at Shopify,” he wrote. That’s a hiring policy at a $100 billion company. Salesforce CEO Marc Benioff told Bloomberg that “AI is doing 30–50% of the work” at Salesforce, and the company might not hire software engineers in 2025 at all.
The math isn’t complicated. GitHub’s controlled experiment found developers finished coding tasks 55% faster with Copilot — 1 hour 11 minutes versus 2 hours 41 minutes. Across the industry, 78% of developers report improved productivity with AI assistants, and 41% of all code produced globally is now AI-assisted. If each developer is that much more productive, you need fewer of them. That’s just arithmetic.
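To make that arithmetic concrete, here’s a back-of-the-envelope sketch using the task times from the GitHub study. The headcount figure is purely illustrative — it assumes output stays constant and productivity gains translate directly into staffing, which is a simplification, not a claim about any company’s actual plans.

```python
# Illustrative arithmetic only: task times from the GitHub Copilot study
# (2h41m unassisted vs 1h11m assisted). Everything downstream is a
# simplifying assumption — constant output, gains fully converted to headcount.
old_minutes = 2 * 60 + 41   # 161 minutes per task, unassisted
new_minutes = 1 * 60 + 11   # 71 minutes per task, with an AI assistant

time_ratio = new_minutes / old_minutes        # ≈ 0.44 of the original time
devs_per_100 = round(time_ratio * 100)        # developers needed per 100, output held constant

print(f"Time per task falls to {time_ratio:.0%} of baseline")
print(f"~{devs_per_100} developers could produce what 100 did before")
```

Real organizations don’t convert productivity into layoffs this cleanly — some of the gain goes to building more — but the direction of the pressure is exactly what the hiring-freeze data shows.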
Layoffs make headlines. A job that never gets posted doesn’t show up in any displacement statistic. But the effect is identical — fewer people in the role, fewer entry points for the next generation. The hiring freeze is the displacement mechanism nobody tracks.
The bootcamp pipeline is collapsing alongside it. Kenzie Academy, Rithm School, Code Fellows, Momentum, Codeup — all shut down in recent years. When SNHU closed Kenzie, a university spokeswoman explicitly cited AI as a factor. Enrollments dropped roughly 20% year-over-year starting in 2022. Then in December 2024, 2U — the company that powered university-branded bootcamps at dozens of schools — shut down its bootcamp programs entirely and filed for bankruptcy. The entry-level coding job — the one that was supposed to be the on-ramp to a tech career — is evaporating.
The roles that survive in engineering are the ones AI can’t touch yet: architecture, where you need to understand the business to make system-level tradeoffs. Platform engineering, where infrastructure decisions require organizational context that doesn’t fit in a prompt. Staff-level engineers who own outcomes rather than code output.
The roles getting compressed: feature implementation, bug fixes, boilerplate, CRUD endpoints, test writing. Exactly the tasks AI coding tools handle well. Exactly the tasks junior and mid-level engineers spend most of their time on.
If your value proposition is “I write code,” you’re in a tough spot. That’s a task, not a role. If your value proposition is “I understand the business domain and translate that into technical decisions the team — including AI tools — can execute,” you’re in a different position entirely.
The broader lesson for non-coders: if the “safe” career is getting disrupted this fast, yours probably is too. Software engineering is just ahead of the curve. I wish I had a more comforting way to say that.
The “AI-Adjacent” Strategy
So what actually works? I keep coming back to the same idea: position yourself next to AI, not in front of it.
Being AI-replaceable means doing the thing AI can do — writing reports, crunching data, generating code. Being AI-adjacent means doing the thing that makes AI useful. Deciding what to analyze. Interpreting what comes back. Figuring out what to do with the output. Dealing with the humans affected by those decisions.
This isn’t about becoming a “prompt engineer.” That’s a transitional role — like being a “Google search expert” in 2005. As AI gets better at understanding what you actually want, the specialized skill of talking to it becomes less special. I give it maybe two more years before “prompt engineering” sounds as dated as “webmaster.”
What AI-adjacent looks like depends on your field, and I think the specifics matter more than the abstraction. In marketing, the copywriting is increasingly automated, but the brand strategy and audience insight that tells the AI what to write? That requires someone who actually understands the market. In finance, building the model is table stakes now, but interpreting the model for the board and making the call — that’s where the value sits. In legal, the contract review is getting automated fast, but advising the client on the risk the AI flagged requires judgment that comes from years of practice. In engineering, I’ve watched this firsthand: writing code is becoming commodity work, but architecting the system and owning the tradeoffs nobody else wants to own — that’s still deeply human.
The thread running through all of these: domain expertise combined with judgment. AI generates. Humans decide what to do with what it generates. At least for now.
Here’s where I think this is heading. As AI agents get more capable — handling multi-step workflows, making decisions, executing tasks on their own — the role that emerges is something closer to AI manager. HubSpot CTO Dharmesh Shah put it simply: “The goal is to build with the machine.” OpenText’s chief strategy officer Tom Jenkins went further — in the near future, professionals will manage hundreds of AI agents the way managers today oversee teams of people. The shift is from doing to directing. Defining objectives, evaluating outputs, course-correcting when the AI gets it wrong, owning the outcomes.
That’s not a futuristic abstraction. It’s already happening. I know companies where one senior analyst now oversees AI-generated reports that used to require a team of four. The analyst’s job didn’t disappear — it changed from producing analysis to directing and quality-checking AI-produced analysis. The value moved upstream. Whether that’s good or bad depends on whether you’re the analyst who adapted or one of the three who didn’t.
ATMs didn’t kill bank tellers. ATMs handled cash, so tellers became relationship managers. Banks actually hired more tellers after ATMs — but the job changed completely. Excel didn’t kill accountants. It killed the ones who only did arithmetic. The ones who understood what the numbers meant? They thrived. I suspect we’ll see the same pattern with AI, just faster and across more professions simultaneously.
Domain Expertise Plus Judgment Is Your Moat
AI is very good at pattern matching across large datasets. AI is terrible at understanding organizational politics, reading a room, knowing which rules can be bent, figuring out why the data looks weird this quarter.
“Judgment” gets dismissed as a soft skill. I think that’s a mistake. It’s accumulated pattern recognition from years of domain experience, applied to situations where the answer isn’t obvious. The nurse who knows something is off before the vitals confirm it. The project manager who can tell a timeline is slipping from the tone of the standup. The sales rep who knows this particular client needs to hear the ROI story, not the feature story.
AI can’t replicate these yet — not because it’s theoretically impossible, but because the contextual understanding required doesn’t exist in any training dataset. Your organization’s unwritten rules, your client’s personality, your industry’s informal power structures — you learned those through years of experience, not from the internet. That said, I want to be honest: AI judgment is improving. The moat is real but I wouldn’t bet my career on it being permanent. Five years from now, some of what I just described as “deeply human” might not be.
The play: go deeper in your domain while AI handles the commodity knowledge work. The sweet spot is T-shaped — deep in one area, broad enough to connect across several. Pure generalists are more vulnerable than specialists right now. But hyper-specialists in narrow, automatable niches are vulnerable too. The resilient position is deep expertise combined with the ability to apply it in different contexts. I realize that sounds like a platitude, but I’ve watched it play out in my own field — the engineers who survived the last two years of cuts are the ones who understood the business, not just the codebase.
Build a Career That Bends, Not Breaks
Your career needs to be modular, not monolithic. I think about this in terms of three types of optionality: skill portability, financial flexibility, and structural adaptability.
Skill Portability
The old model was straightforward — pick a career, get credentials, climb the ladder for 30 years. That model assumed the ladder would still be there in 30 years. The new reality probably requires expecting 3–5 major transitions. Not just job changes — role-type changes. The kind where your LinkedIn headline looks completely different.
Credential portability matters more than credential prestige now. A law degree is valuable. A law degree plus project management plus data literacy is resilient. One is a single point of failure. The other is a diversified portfolio. I’ve started thinking about careers the same way I think about investment portfolios — concentration risk is the enemy.
Skills that transfer across industries are your most durable assets: communication, stakeholder management, systems thinking, quantitative reasoning, negotiation. Skills tied to a specific tool or platform — “I’m an expert in Salesforce” or “I know SAP inside and out” — have a shorter shelf life than they used to. That doesn’t mean they’re worthless. It means they’re not enough on their own anymore.
Financial Flexibility
Unsexy but critical. Six months of expenses saved isn’t just an emergency fund. It’s career flexibility — the ability to say no to a bad transition and wait for a good one. I’ve watched people take terrible lateral moves because they had two weeks of runway. Don’t be that person.
You don’t need to overhaul your life to start. Target $500 first, then build. Automate $25–50 per paycheck into a separate high-yield savings account. Pack lunch twice a week. Cut the daily coffee run to twice a week — that’s $70–80/month. Delay the new car or buy certified pre-owned. Cancel the subscriptions you forgot you had. None of these are dramatic. All of them compound. Pennies become quarters, quarters become dollars, dollars become the three months of runway that let you make a strategic career move instead of a desperate one.
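As a quick sanity check on those numbers — all figures below are illustrative, pulled from the suggestions above; the monthly expense number is a hypothetical placeholder you should replace with your own:

```python
# Rough savings arithmetic. Figures are illustrative examples from the text;
# monthly_expenses is a hypothetical placeholder — use your own number.
per_paycheck = 50            # automated transfer, assuming biweekly pay
coffee_cut = 75              # midpoint of the $70–80/month coffee estimate
monthly_saved = per_paycheck * 2 + coffee_cut    # $175/month

first_target = 500
months_to_first = first_target / monthly_saved   # ≈ 2.9 months to the first $500

monthly_expenses = 3000      # hypothetical — substitute yours
months_to_runway = (3 * monthly_expenses) / monthly_saved

print(f"Saving ${monthly_saved}/month reaches $500 in ~{months_to_first:.1f} months")
print(f"Three months of runway takes ~{months_to_runway:.0f} months at this pace")
```

The second number is the honest part: small automatic transfers get you started fast, but reaching real runway means ramping the rate up as the habit sticks — which is exactly why starting this month matters more than starting big.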
For a deeper playbook, NerdWallet’s emergency fund guide and Ramit Sethi’s “I Will Teach You to Be Rich” framework are both practical and no-nonsense.
Structural Adaptability
Be skeptical of “just retrain” narratives. Most corporate reskilling programs are performative — they exist so the company can say they offered training, not because they expect it to work. Six-week bootcamps don’t replace ten years of domain expertise, and as we’ve seen, the bootcamp model itself is collapsing.
What actually works is incremental skill stacking over time. Side projects that expose you to adjacent domains. Cross-functional work that builds relationships outside your silo. Internal transfers that broaden your organizational context. Volunteering for the messy, ambiguous projects nobody else wants — those are the ones AI can’t handle, which is exactly why they’re valuable career moves.
You’re not trying to reinvent yourself overnight. You’re trying to make yourself incrementally harder to replace, quarter by quarter. That’s a less inspiring sentence than “transform your career,” but it’s more honest about how this actually works.
The Multiple Transitions Reality
Previous tech disruptions had a beginning and an end. Factories automated. Workers retrained or didn’t. A new equilibrium was reached. You could point to a decade and say “that’s when it happened.”
AI is different because it keeps getting better. The tasks that are safe today may not be safe in three years. “Reskilling” isn’t a one-time event anymore. It’s a permanent posture, which is exhausting to think about, but I don’t see a way around it.
A practical framework: every 18–24 months, reassess your task portfolio. Which tasks moved from “human advantage” to “AI can do this”? Build learning into your schedule — not courses for the sake of courses, but targeted skill acquisition based on where your task portfolio is actually shifting. I do this myself. It’s not fun. But it’s kept me relevant through two major waves of change in my own field.
The people who will struggle most are the ones who optimized for stability and routine. The 20-year veteran who does the same thing every day and does it well — that consistency used to be an asset. Now it’s a liability, and that’s a genuinely unfair thing to say about people who did exactly what they were told would lead to a good career. But the world changed.
The people who will do better are the ones comfortable with ambiguity. People who have changed roles before. People who see their career as a portfolio of capabilities rather than a single identity.
I’m not going to pretend continuous adaptation is fun. It’s not. It’s tiring and sometimes demoralizing. But it’s necessary.
What NOT to Do
Most career advice about AI is wrong, or at least incomplete. Here’s a quick tour of the stuff I’d ignore.
“Learn to code.” We covered this. Software developer postings down 35%, bootcamps shutting down, entry-level hiring collapsed by 50%. Learning to code for career safety in 2026 is like learning to drive a horse-drawn carriage in 1910. If you enjoy coding, by all means learn it. But don’t do it because you think it’s a safe career bet.
“Become a prompt engineer.” This one drives me a little crazy. It’s a transitional role. As AI gets better at understanding intent, prompt engineering becomes less specialized. I’d give it a short shelf life — maybe two to three years before it’s just a normal part of using a computer, like knowing how to use a search engine.
“Get an AI certification.” Most certifications teach tool usage, not judgment. The tool will change. The judgment won’t. If you’re going to spend money on education, spend it on something that’ll still be relevant when the current crop of AI tools is obsolete.
“Just be creative.” AI is increasingly creative. Creativity alone isn’t a moat. Creativity applied to a specific domain with judgment? That’s closer to a moat. But “just be creative” as career advice is about as useful as “just be smart.”
“Soft skills will save you.” Partially true, mostly oversimplified. Which soft skills? Communication, yes. Empathy in client-facing roles, yes. “Being a people person” in a back-office role where you mostly work alone? Probably not enough.
The thing all this bad advice shares: it treats AI disruption as a single event requiring a single response. Take one course, get one certification, learn one skill, and you’re safe. That’s not how this works. It’s an ongoing shift requiring ongoing adaptation, and anyone selling you a one-time fix is either confused or trying to take your money.
What Training IS Worth Your Money
Mark Cuban has said that investing in yourself is the best investment he ever made — and that if he were starting out today, he’d spend every free minute learning about AI. But here’s the nuance people miss when they quote him: Cuban isn’t saying “take a prompt engineering course.” He’s saying learn how to apply AI to real business problems. “Companies don’t understand how to implement AI right now,” he told TBPN. “Learn to customize a model, walk into a company, show the benefits.”
So what’s actually worth your money and time? I use a simple filter: will this training still be valuable if the specific AI tools change completely in 18 months? If yes, invest. If no, skip it.
Four categories pass that test, in my experience.
Go deeper in your own field. A supply chain manager who understands demand forecasting theory will thrive regardless of which AI tool runs the model. An HR professional who masters organizational design will always be needed to decide what to restructure — AI just speeds up the analysis. Whatever your field’s equivalent of “the hard stuff nobody wants to learn” is — that’s your investment. Certifications like PMP, CFA, Six Sigma, or clinical specializations hold value precisely because they represent depth that can’t be shortcut.
Learn to read data, not build models. You don’t need a data science degree. You need data literacy — the ability to look at a chart and ask “what’s missing here?” or “why does this quarter look different?” Courses in statistics fundamentals, business analytics, or even Excel-level data manipulation pay off because they make you the person who catches what the AI got wrong. Every field needs people who can interrogate outputs, not just consume them.
Build the skills that happen in rooms, not on screens. Negotiation. Conflict resolution. Presenting to a hostile audience. Managing up. These compound over a career and become more valuable as AI handles the prep work. A negotiation course from Harvard’s Program on Negotiation or even a local Toastmasters chapter isn’t glamorous, but it builds capability no AI can replicate — because it requires reading humans in real time. I’ve gotten more career mileage out of learning to present to skeptical executives than out of any technical certification I’ve earned.
Understand how AI changes your industry’s workflow — not how AI works under the hood. You don’t need to know transformer architecture. You need to know that AI is changing how insurance claims get adjudicated, how architectural drawings get reviewed, how clinical trials get designed. Industry conferences, trade publications, and professional association workshops are often better investments than generic AI courses because they’re taught by people who actually understand your context.
What to Do This Month
Everything above is strategy. Here’s what to do with it right now — not next quarter, not when you “have time.” This month. I’m being specific on purpose, because vague advice is the same as no advice.
Week 1: The Task Audit
Open a spreadsheet. List every task you do in a typical week. Be specific — not “manage projects” but “update status reports,” “run standup meetings,” “review vendor proposals,” “write project briefs.” Aim for 15–25 tasks.
Score each one on two dimensions. First: could AI do this 80% as well as me today? Yes, partially, or no. Second: does this task require context only I have — relationships, institutional knowledge, judgment about ambiguous situations? Yes or no.
Tasks that score “yes AI can do it” and “no unique context” are your red zone — that’s where your role is most exposed. Tasks that score “no AI can’t” and “yes unique context” are your moat. Everything in between is the battleground.
Then do something most people skip: for every red zone task, actually try using an AI tool to do it this week. Not hypothetically. Literally. Use ChatGPT, Claude, Copilot, whatever you have access to. Draft that report. Summarize those meeting notes. Generate that first pass of analysis. Two things will happen. You’ll see exactly how close AI already is to replacing that task — which makes your audit real instead of theoretical. And you’ll become the person in your organization who knows how to use these tools, which is itself a form of career insurance. The people who get displaced aren’t the ones using AI. They’re the ones ignoring it while someone else figures out how to automate their work.
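The two-dimension scoring above reduces to a simple decision rule, which you can sketch in a few lines. The task names and scores below are hypothetical placeholders — substitute your own audit:

```python
# Minimal sketch of the task-audit scoring rule. Task names and scores are
# hypothetical examples only — replace them with your own weekly task list.

def classify(ai_can_do: str, unique_context: bool) -> str:
    """ai_can_do: 'yes', 'partially', or 'no'; unique_context: do I have
    relationships/institutional knowledge AI lacks for this task?"""
    if ai_can_do == "yes" and not unique_context:
        return "red zone"       # most exposed
    if ai_can_do == "no" and unique_context:
        return "moat"           # double down here
    return "battleground"       # contested middle

tasks = [
    ("update status reports",   "yes",       False),
    ("run standup meetings",    "partially", True),
    ("review vendor proposals", "no",        True),
    ("draft project briefs",    "yes",       False),
]

for name, ai, ctx in tasks:
    print(f"{name}: {classify(ai, ctx)}")

red = sum(classify(a, c) == "red zone" for _, a, c in tasks)
print(f"Red zone share: {red / len(tasks):.0%}")   # prints "Red zone share: 50%"
```

The point isn’t the script — a spreadsheet does the same job. It’s that the classification is mechanical once you’ve answered the two questions honestly, and the red-zone percentage gives you the number Week 2 works from.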
Week 2: Map Your Moat
Look at your red zone tasks. How much of your week do they eat? If it’s more than 40%, you have a timeline problem — not existential, but worth taking seriously.
Now look at your moat tasks. Those are the ones to double down on. How do you spend more time there? Can you volunteer for projects that lean on those skills? Can you delegate or automate the red zone tasks yourself — before someone else decides to do it for you?
The goal isn’t to eliminate vulnerable tasks overnight. It’s to shift the ratio. If 60% of your week is red zone today, try to get it to 40% within six months by actively migrating toward your moat. That’s a realistic target. It won’t happen by accident.
Week 3: Start One AI-Adjacent Move
Pick one concrete action that moves you toward AI-adjacent positioning in your field. Not a course, not a certification — an actual thing you do at work this week.
If you’re in marketing, volunteer to lead the strategy review for a campaign the AI tools are drafting. Own the “why” and “for whom” instead of the execution. If you’re in finance, offer to present the AI-generated analysis to stakeholders — the person who interprets and communicates the model is more valuable than the person who built it. If you’re in engineering, take on an architecture or system design task instead of another feature ticket. If you’re in legal, position yourself as the person who advises on the risks the AI flagged, not the person who does the initial review.
The pattern is simple: move one step up the value chain from execution to judgment. One step. This month. You can figure out the next step next month.
Week 4: Shore Up Your Optionality
Three things, none of which require quitting your job.
First, update your professional network — and I don’t mean “connect with 50 people on LinkedIn.” Reach out to three people in adjacent roles or industries. Have a real conversation about what’s changing in their world. Cross-pollination is how you spot opportunities before they’re posted. I’ve gotten two of my last three roles through conversations like this, not through job boards.
Second, check your financial runway. How many months could you sustain a career transition without income? If the answer is less than three, that’s your most important non-career priority right now. Even small improvements — an extra month of buffer — meaningfully expand your options.
Third, identify one skill-stacking opportunity at your current job. A cross-functional project. A stretch assignment. An internal transfer conversation. Something that broadens your context without requiring you to leave. The best career insurance is built inside the job you already have.
One more thing worth considering: a side income stream. Not “start a business” — that’s a different risk profile, not a lower one. But freelance consulting in your area of expertise, a small portfolio of clients, or a modest side project that generates some revenue. It doesn’t need to replace your salary. It just needs to exist — so that if your primary income gets disrupted, you’re not starting from zero. Career diversification, same principle as an investment portfolio.
The Honest Assessment
I’ve spent this article talking about what individuals can do. Individual agency is real. But I’d be dishonest if I didn’t acknowledge where it runs out.
Not Everyone Starts From the Same Place
The strategies here — skill stacking, financial optionality, career pivots — assume a degree of flexibility that not everyone has. A 28-year-old data analyst in Austin with no kids and six months of savings has very different options than a 54-year-old paralegal in rural Ohio supporting aging parents. I’m aware that I’m writing this from a position of relative privilege — I work in tech, I have savings, I have options. Not everyone does.
Age matters — not because older workers can’t adapt (many are the most experienced and judgment-rich people in their organizations), but because the labor market discriminates against them, and the runway for ROI on a major career pivot is shorter.

Geography matters — remote work has expanded options, but plenty of industries and roles are still place-dependent. If the local economy is built around one employer or one industry, a structural shift hits differently than it does in a diversified metro.

Financial situation matters — the advice to “save six months of expenses” is sound but irrelevant if you’re already living paycheck to paycheck. Career flexibility correlates with existing privilege. That’s just the reality, and pretending otherwise doesn’t help anyone.
The Structural Gap
This is where policy enters the picture — not as an abstraction, but as the difference between a manageable transition and a catastrophic one.
The US workforce policy stack was built for trade shocks. A factory closes, workers get Trade Adjustment Assistance, they retrain for a new industry. That model doesn’t work for AI displacement because AI doesn’t close one factory — it erodes tasks across every industry simultaneously. There’s no single factory to point at, no single retraining program that covers it.
What’s needed but largely absent: portable benefits that follow workers across jobs and industries. Wage insurance that bridges the gap during transitions. Geographic mobility support for workers in single-industry regions. Reskilling programs measured by employment outcomes, not enrollment numbers.
These aren’t policy talking points. They’re the infrastructure that determines whether the strategies in this article are available to most people or just the already-privileged few.
The Uncomfortable Middle Ground
Don’t let anyone tell you this is entirely your responsibility. The structural forces reshaping work are bigger than any individual’s career choices. Advocating for better transition policy — through your vote, your professional associations, your employer — matters as much as any personal career strategy.
But waiting for policy to save you isn’t a strategy either. The political timeline for workforce legislation is measured in years. AI’s impact on your task portfolio is measured in months.
Act on what you can control. Advocate for what you can’t. Hold both at the same time. I know that’s an unsatisfying answer. It’s the honest one.
The Uncomfortable Truth
Humans are remarkably adaptable when they see the threat clearly and have agency to respond. That’s the good news.
The bad news: not everyone will have equal agency, and the window for preparation is shorter than most people think.
Do the task audit. This week. Not because I said so, but because you’ll learn something about your own career that you probably don’t want to learn — and that’s exactly why it’s worth doing. List your tasks. Identify the vulnerable ones. Start building your moat around the ones that aren’t.
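If a spreadsheet feels too informal, the audit above can be sketched in a few lines of Python. This is a minimal, illustrative scoring model — the example tasks, the 0–1 scores, and the 0.5 threshold are all my own assumptions for demonstration, not a validated framework:

```python
# A toy "task audit": list your weekly tasks, score how automatable each
# looks, and see what share of your week is exposed. All tasks and scores
# below are illustrative assumptions -- substitute your own.

tasks = [
    # (task, hours/week, how repeatable (0-1), how much human judgment (0-1))
    ("Write status reports",        4, 0.90, 0.2),
    ("Review AI-generated drafts",  6, 0.50, 0.6),
    ("Negotiate with stakeholders", 5, 0.10, 0.9),
    ("Clean and reformat data",     3, 0.95, 0.1),
]

def vulnerability(repeatable: float, judgment: float) -> float:
    """Crude heuristic: repeatable work with little judgment is most exposed."""
    return repeatable * (1.0 - judgment)

total = sum(hours for _, hours, _, _ in tasks)
exposed = sum(hours for _, hours, r, j in tasks if vulnerability(r, j) > 0.5)

# Most vulnerable tasks first -- these are the ones AI absorbs quietly.
for name, hours, r, j in sorted(tasks, key=lambda t: -vulnerability(t[2], t[3])):
    print(f"{name:<28} {hours:>2}h/wk  vulnerability={vulnerability(r, j):.2f}")

print(f"\n{exposed / total:.0%} of tracked hours look automatable")
```

The point isn’t the math — it’s that writing your tasks down and scoring them honestly forces the conversation with yourself that a job title never will.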
Key Takeaways:
- AI erodes tasks within jobs, not job titles. McKinsey estimates current AI could automate 57% of US work hours; Goldman Sachs estimates 300 million jobs globally are exposed
- Software engineering is the canary: postings down 35%, entry-level hiring down 50%+, bootcamps shutting down, 59,000 tech jobs cut in Q1 2026
- Position yourself AI-adjacent — directing AI, interpreting outputs, making judgment calls — not AI-replaceable
- Domain expertise combined with judgment is your strongest moat, but it’s not permanent
- Build career optionality through skill portability, financial flexibility, and structural adaptability
- Expect multiple career transitions, not one reskilling moment
- Start using AI tools now — both to boost your productivity and to see firsthand which of your tasks are vulnerable
- Individual agency is real but limited by structural constraints. Act on what you can control, advocate for what you can’t
References
- McKinsey Global Institute, “Agents, Robots, and Us: Skill Partnerships in the Age of AI” (2025)
- Goldman Sachs Research, “How Will AI Affect the US Labor Market?” (2026)
- Indeed Hiring Lab / TechSpot, “Software engineering job openings hit 5-year low” (2025)
- Revelio Labs / Times of India, “Software developer jobs shrink by over 70% in the US” (2025)
- Talent500, “Entry-Level Developer Jobs: Graduate hiring dropped more than 50%” (2025)
- Stanford Digital Economy Lab / Substack, “The Junior Developer Is Going Extinct” — entry-level postings down 67% (2025)
- TrueUp / Inkl, “Tech Layoffs Surge to 59,000 in 2026” (2026)
- Indian Express, “Tech layoffs January 2026: Amazon cuts 16,000 roles” (2026)
- Storyboard18, “Tech Layoffs 2026: 123,941 laid off in 2025” (2026)
- Dice, “2025 Tech Salary Report: 59% feel underpaid, only 45% received raises” (2025)
- Motion Recruitment / Kelly Services, “2026 Tech Salary Guide: Senior devs down 10%, LLM devs at $209K” (2026)
- TechiExpert, “Developer Demand Dives, Salary Stagnation Looms: salaries down 9–15% globally” (2025)
- Jeffry.in, “US SWE Economy 2025: Engineer real wages down 7%, CEO pay up 35%” (2025)
- TechCrunch, “Shopify CEO tells teams to consider using AI before growing headcount” (2025)
- Business Insider, “This Chart Shows How Bad the Job Market for Software Engineers Is” (2025)
- GitHub / Worklytics, “Copilot productivity study: 55% faster task completion” (2025)
- TechLila, “AI Productivity Tools Adoption: 78% report improved productivity, 41% of code AI-assisted” (2026)
- Forbes, “The Tech Boot Camp Shakeout: Kenzie, Rithm, Code Fellows, Momentum, Codeup shut down” (2025)
- Inside Higher Ed, “Southern New Hampshire closing Kenzie Academy — AI cited as factor” (2023)
- Inside Higher Ed, “2U ends boot camps, shifts to microcredentials” (2024)
- TechLoy, “How AI is rewriting the software development industry — bootcamp enrollments down 20% YoY” (2025)
- Class Central, “2,000+ Prompt Engineering Online Courses” (2026)
- Quasa.io / Coursera-Udemy, “AI course enrollments grew fivefold to 11 million learners” (2025)
- Scoop Market, “AI Career Coach Market: $4.2B in 2024” (2025)
- WalkMe / SAP, “Only 7.5% received extensive AI training” (2025)
- Fortune / ManpowerGroup, “56% of workers received no recent skills development” (2026)
- Business Insider, “AI Is Already Taking Human Jobs — job postings for AI-doable tasks declined 19%” (2025)
- Epic Games, “Today’s Layoffs” — official announcement, 1,000+ employees, 20% of workforce (March 2026)
- NBC News, “Meta begins laying off hundreds of employees across five divisions” (March 2026)
- Entrepreneur, “Google Offers Buyouts — voluntary exit packages amid AI-focused restructuring” (2026)
- TechStartups, “Google eliminates 35% of managers as layoffs and buyouts reshape workforce” (2025)
- NerdWallet, “Emergency Fund Calculator: How Much Should I Have?” (2026)
- Ramit Sethi, “Emergency Fund: How a few thousand dollars can save your life” (2025)
- Inc., “Mark Cuban Says Living a Successful Life Comes Down to 1 Simple Habit — Invest in yourself” (2025)
- Inc., “Mark Cuban Says Young People Should Learn This Crucial AI Skill” (2025)
- Fortune, “Mark Cuban to Gen Z: ‘AI is never the answer; AI is the tool’” (2025)
- Business Insider, “Anthropic CEO Warns AI Could Wipe Out 50% of Entry-Level Office Jobs” (2025)
- Forbes, “AI-Proof Your Career In 2026: Make AI Your Ally, Not Your Enemy” (2025)
About the Author: Daniel Stauffer is an Enterprise Architect specializing in AI systems and platform engineering. He writes about the intersection of technology, work, and policy — and spends his days building the AI systems this article warns you about.
#AI #FutureOfWork #CareerAdvice #Technology #ArtificialIntelligence
How to AI-Proof Your Career (No, Not by “Learning to Code”) was originally published in Level Up Coding on Medium.