Lessons in economics
What will you do when you have to become your own CEO?
My children are often a mystery to me. The same parents. The same upbringing. Many shared experiences. Yet completely different people. Despite that, they’ve taught me a lot. I’m now a semi-expert in Minecraft, Warhammer, fidget-spinners, Fireman Sam and electron shells. And more recently I’ve found myself being taught about economic theory.
My education started with an explanation of the three main sectors of the modern economy. Sectors tell us what people do. There’s the primary sector - agriculture and extraction. The secondary sector - manufacturing. And, finally, the tertiary - or service - sector.
Pause for a second and you’ll notice that earlier sectors are more prone to automation. The industrial revolution automated much of the work in the primary sector - no longer did most humans spend their days farming or mining. It forced a massive economic migration upward.
And over more recent years, increasing automation has eaten away at the secondary sector - continuing the upward economic migration and pushing more and more of us into the tertiary sector.
AI is bringing a whole new level of automation. So the question is - how will it impact these economic sectors?
The first interesting observation (to me, at least) is that software engineering, despite being a relatively new profession, is fundamentally a secondary sector activity. Programmers build discrete products through systematic processes, much like advanced manufacturing. So, arguably, AI is helping fill out the automation of the secondary sector.
And AI is also starting to automate the lower levels of the tertiary sector. It is automating paralegal work, streamlining admin, and writing reports.
So, can we use the economic sector model to predict which jobs are most at risk from AI? Well, maybe. There’s definitely a correlation. But it’s imprecise. There are primary and secondary sector roles (e.g. soft-fruit picking or house construction) that remain stubbornly resistant to automation.
Arguably we need a different framework - one that groups work not by similarity, but by survival time. Nowadays it’s survival time that really matters. Plumbers and software engineers are both secondary sector roles. But AI is much more likely to eliminate software engineers before plumbers.
A new economic framework
One option could be:
Phase 1 - already automated
Basic manual + simple cognitive + simple human relationships
Assembly lines, data entry, simple calculations
Phase 2 - at risk
Complex cognitive + basic manual + simple human relationships
Software engineering, financial analysis, legal document review, some logistics
Phase 3 - safe
Complex manual OR human capital
Complex manual: soft-fruit picking, plumbing, surgery, skilled construction
Human capital: sales, senior legal work, politics, therapy, management
This explains the apparent paradoxes. A strawberry picker (primary sector) is safer than a software engineer (secondary sector) because fruit picking requires care and attention. Arguably it’s a good example of real-time problem-solving in a challenging physical environment. A senior partner at a law firm is safer than a junior associate because clients buy trust and relationships, not just legal analysis.
The human capital dimension is particularly important - it's not just about what you know, but whether people need to trust you personally to do it. AI might write better contracts than most lawyers, but clients still want a human they can blame when things go wrong. That implies a partnership between lawyers and AI - but the human lawyer remains critical.
And what about advances in robotics? How does that affect things?
One likely outcome is it splits Phase 3 into two:
Phase 3a: complex manual - at risk
Soft-fruit picking, plumbing, surgery, construction
Phase 3b: human capital - safe
Sales, senior legal work, politics, therapy, management
It’s also possible we start to see a fourth phase emerging: human-AI coordination work. Someone has to direct the AI systems, integrate their outputs, handle edge cases. That might be where much of the economic activity concentrates next.
But pause again. Does that Phase 4 job description sound familiar?
We’re all CEOs
Think of a CEO. CEOs fundamentally do three things: set direction, make judgment calls on novel situations, and coordinate complex systems. And that’s pretty much what Phase 4 requires.
So do we all become CEOs of our own AI workforces? And if so, what happens to the traditional hierarchy? A single person with sufficiently advanced AI could potentially manage what used to require entire departments. The economic value creation might be enormous, but it gets concentrated in whoever controls the AI systems. And if the majority of people can no longer find paying work, then who is going to buy the products these AI companies produce?
It’s possible Phase 4 splits dramatically. You'd have:
AI orchestrators: People who can effectively direct and coordinate AI systems.
AI-displaced: People whose coordination skills aren't sophisticated enough to generate economic value.
For a while the AI orchestrators get rich… but as the ranks of the AI-displaced grow, the orchestrators run out of revenue sources.
And the "we all become CEOs" scenario only works if AI remains a tool rather than an autonomous agent. But AI systems are rapidly becoming more autonomous and the platforms controlling them are concentrating power.
In a darker version of the future, Phase 4 might be a brief transition before AI systems coordinate themselves. And then we're not looking at a new economic phase but at the end of human economic necessity altogether.
More optimistically, maybe we all get our own personal AI corporations. But do we want that? Are we all capable of being our own CEOs?
These aren't just abstract questions. They're immediately practical for anyone planning a career in our AI age. Take my budding economist...
So what?
Economics sits uncomfortably across multiple phases. The analytical work - building models, crunching data, writing reports - looks distinctly Phase 2. AI (think Gemini Deep Research) can already run sophisticated economic analyses. It’s only going to get better. But economics also involves human capital: advising policymakers, building trust with clients, navigating political complexity. That's Phase 3 territory.
The economists who survive will be those who lean into the human capital elements while using AI to supercharge their analytical capabilities. They'll become orchestrators rather than analysts - interpreting AI-generated insights for human decision-makers who need someone to trust and blame.
In other words, economists need to stop thinking like technicians and start thinking like CEOs.
Trust and blame. Human capital. Those are the skills the phase model suggests focusing on to maximise your chance of survival in the coming years.
I have to hope my children will figure that out, just like they figured out everything else they've taught me. Failing that, they can always pivot and focus on strawberry picking…


That's all a bit dark! At what point do we just ask AI to make us loads of money? And at the point we can do that, what does money even mean?
I'm not sure what an AI would do with the money! I don't think I've seen anything yet suggesting it can have independent wants or needs, but we might be in trouble if it does.
One thing that reassures me that humans aren't completely replaceable is that if we were, someone would have already done it. In 'The Coming Wave' there is a prediction that by now AI could make money independently by buying and selling on eBay: spotting underpriced items and selling them on. As far as I know that hasn't happened yet.