
Everyone's Wrong on Employment

People say we’re all going to be jobless. Or that we’ll all become plumbers and electricians because those trades are safe from the current iteration of AI (pre AI-powered robots, world models, etc.)… what? Who wants to do manual labor when we could be doing white-collar work, or no work at all? Why not hand everyone spoons instead of shovels to dig trenches? This whole area of thought is ridiculous.

Alex Karp is right that this is going to get regulated to hell if it comes to be, but I don’t think he or anyone else is actually thinking clearly:

If a company has output of 100% today, with 10 people, sure, it could fire 5 and give AI to the other 5 and maintain or grow its output.

But why not keep all 10, give everyone AI, and produce many times more than before? The company is suddenly not resource-constrained and can do far more with what it has, or better yet, hire more people at this new, insane ROI per person.

If the output of a company today can be produced by just 1 person, then every 100-person company tomorrow should have the output of a hundred companies, because each of its 100 people has an output that matches a whole pre-AI company.
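The thought experiment above can be written out as toy arithmetic. All numbers here are illustrative assumptions (one person's pre-AI output normalized to 1.0, AI as a simple per-person multiplier), not measurements of any real company:

```python
# Toy model of the "fire half vs. keep everyone" thought experiment.
# Numbers are assumptions for illustration, not data.

def team_output(headcount: int, ai_multiplier: float = 1.0) -> float:
    """Total team output, with one person's pre-AI output taken as 1.0."""
    return headcount * ai_multiplier

baseline  = team_output(10)                    # 10 people, no AI   -> 10.0
downsized = team_output(5, ai_multiplier=2.0)  # fire 5, AI doubles -> 10.0 (output merely maintained)
keep_all  = team_output(10, ai_multiplier=2.0) # keep 10, all on AI -> 20.0 (output compounds)

print(baseline, downsized, keep_all)  # 10.0 10.0 20.0
```

The point of the sketch: at any assumed multiplier, downsizing to "maintain output" throws away the multiplier's upside, while keeping headcount lets the multiplier compound across the whole team.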

And the work we offload to AI is the horrible grunt work that numbs the mind. Everyone will need to upskill, become far more capable and autonomous, and do all the things people wanted to do before but had to sacrifice for lack of time, resources, and people.

Humans will ALWAYS need to be in the loop. The world is not simple. We try to kill each other; there’s geopolitics and lobbying; we try to prove we’re better; we have lust, different personalities, and a butterfly effect that makes us all different. We have terrorists, religions, and a million other things that shape what we want to do and why, things people need to be involved with and will believe differently about. We need accountability, and we need people deciding what happens.

Someone needs to be held responsible at the end of the day.

If we let AI agents run us, that is the equivalent of humans telling the Gods of mythology what to do, which is obviously not sane and not what transpired. AI agents are built to work for us. We will always be in control. We must be; we have basic survival instincts, I’d hope. What will the world look like when work is more high-level, when human relationships matter more than human brainpower, and when we act as gatekeepers and generals rather than soldiers in our economic armies?

But to go back: what will we do when we have this insane output per person? Yes, GDP skyrockets, and for a future multiplanetary species, that level of output might make sense. But really, we’ll work on more interesting, bigger, more meaningful projects that benefit humanity, rather than optimizing the exact size of an advertisement designed to make people spend money they don’t have on things they don’t need. We’ll do useful work because that’s where the ROI remains and pushes humanity forward.

We’ll probably be treated somewhat like the Gods of mythology, with AI making our lives fantastic. Rich kids who don’t need to work today do a variety of what I think are mind-numbing things, and maybe some people will do some of that. Some will push boundaries positively because of a myriad of factors, like Elon Musk has… and some will turn to art, culture, poetry, and other pursuits. It’s going to vary, but humans who work, create, and dedicate themselves to improving humanity and the human experience will have unprecedented ROI on their efforts, because AI gives us force-multipliers.

No matter what, economic incentives do not change in the future: market forces, personal incentives, and all the other factors will still apply.

My biggest concern in the long run is that we become soft. Hard people are needed to do what needs to be done, stare into the abyss, and fight on. God forbid we ever have to fight a species from space more advanced than us, but the world has a way of throwing curveballs. What will we do when the sun dies? When EMPs and solar flares destroy data centers? None of the concerns people have about employment today will matter then; people will just work with insane ROI, and that will all sort itself out. We must see the bigger picture.

What is the point of human life? In the future, we’ll be able to start experimenting with the answer, doing things because we want to, not because we have to.

As for all these layoffs being blamed on AI, I don’t buy it. Elon showed that you could run a company stably with what, 10% of the workforce in tech? What the hell were the other 90% doing? Sure, since the firings, the platform maybe isn’t advancing as fast as it otherwise would, though it has improved along other metrics… but if each of those people were outputting at the rate AI makes possible, maybe it wouldn’t have made sense to fire them. Imagine if each of those people were creating the same value as the 10% who stayed: that’d be like having 9x more output. Imagine where X would be in that case, and all the things they could have done!
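The "9x more output" figure above is just back-of-the-envelope division. A quick sketch, where the 10%/90% split and the per-person rates are assumptions taken from the paragraph, not real staffing data:

```python
# Back-of-the-envelope for the layoffs paragraph.
# The 100-person staff, 10% retention, and uniform per-person rate
# are illustrative assumptions, not figures about any real company.

staff = 100
kept = 10            # the ~10% who stayed
rate_kept = 1.0      # per-person output of those who stayed

output_after_layoffs = kept * rate_kept    # 10.0
output_if_all_match = staff * rate_kept    # 100.0, if all 100 matched that rate

extra = output_if_all_match / output_after_layoffs - 1
print(extra)  # 9.0 -> "9x more output" on top of the post-layoff baseline
```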

Upskill, and make the ROI of hiring you and keeping you (especially compared to peers inside and outside the country) a no-brainer.

The transition period will be slower than we’d like because people are (sometimes) lazy, stupid, etc., and won’t upskill… but ultimately this is where the world will go. We should start teaching AI use and agent-leveraging in schools, and teach older people too. After all, who doesn’t want to be an armchair general, do less grunt work, have more output, and work on actually interesting things we can finally make progress on for the first time?!