This originally began as one lengthy post but I’ve broken it into multiple posts that are more digestible. This is part 7.
If you are a heavy user of LLMs, or are thinking about becoming one, I would like you to consider:
- How much are you spending per month on LLM services?
- How effective would you be if they were suddenly unavailable?
- How much of a cost increase could you realistically absorb? How much would be a hardship, and at what point would it become untenable?
Enshittification
In his recent book Enshittification (which expands on a term he coined back in 2022), Cory Doctorow proposes three stages that every online platform goes through:
1. The platform is good to users
2. The platform is bad to users, good to businesses
3. The platform is bad to users, bad to businesses, good to itself
The progression occurs as a platform reaches critical mass for user lock-in: enough people are established as users that they cannot easily walk away. Consider, for example, the friction you would feel if you tried to leave any particular online platform you rely on today.
We are somewhere between #1 and #2.
When ChatGPT first came out, there was a rapid expansion of users finding different use-cases for it. The future seemed bright and exciting. But we are also seeing LLMs cited as the driving factor behind layoffs, and the job market drying up for junior devs as entry-level jobs disappear because of “AI”, a trend AWS CEO Matt Garman has called the dumbest thing he’s ever heard.
Mark my words: as soon as the AI companies feel they have enough users and businesses dependent on their services, they will jack up prices, significantly.
Every major AI company is still operating at a loss (overall) despite raking in massive amounts of revenue. OpenAI and Oracle have signed a 5-year, $300 Billion cloud deal. For a company operating at a loss, committing to that kind of spend, and believing it can cover it (whether through revenue or further investment rounds), means it has a plan for something.
Additionally, OpenAI is reportedly planning to spend on the order of $1 Trillion over the next decade to expand its computing infrastructure. For context, the company currently claims 800 million regular users, with about 5% of them (40 million) paying and contributing to its $13 Billion of annualized revenue.
Covering the shortfall
Hypothetically speaking, let’s just say that OpenAI alone wants to satisfy its $60 Billion in annual payments to Oracle. Against roughly $13 Billion of revenue, that leaves a $47 Billion annual shortfall. Spread across ALL 800 million users (even the 95% on the free tier!), that works out to roughly $59 per user per year, or about $5 per user per month, just to break even on the Oracle deal (never mind the rest of the planned $1 Trillion build-out, or ever turning a profit).
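If you want to check that arithmetic yourself, here is a quick back-of-the-envelope sketch in Python. The figures are just the publicly reported numbers quoted above, rounded; nothing here is authoritative.

```python
# Rough back-of-the-envelope math using the publicly reported figures above.
# All numbers are approximate.

oracle_commitment = 300e9   # reported 5-year Oracle deal, USD
years = 5
annual_revenue = 13e9       # reported annualized revenue, USD
users = 800e6               # claimed regular users

annual_payment = oracle_commitment / years       # $60B per year
shortfall = annual_payment - annual_revenue      # $47B per year

per_user_per_year = shortfall / users
per_user_per_month = per_user_per_year / 12

print(f"Annual shortfall:    ${shortfall / 1e9:.0f}B")
print(f"Per user, per year:  ${per_user_per_year:.2f}")   # ~$58.75
print(f"Per user, per month: ${per_user_per_month:.2f}")  # ~$4.90
```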
Applying that ~$5 / month increase to their current plans:
| Tier | Current price (per month) | Adjusted price (per month) |
|---|---|---|
| Free | $0 | ~$5 |
| Plus | $20 | ~$25 |
| Pro | $200 | ~$205 |
Granted, there are other ways they could distribute that burden, perhaps by introducing a new tier between Free and Plus that costs $20 and pushing a substantial portion of Free users to convert (see the sketch below). But one way or another, the company needs to either extract an extra $47 Billion a year from its 800 million users, find more users and raise prices anyway, or start screwing its business customers over too.
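To put a rough number on what “a substantial portion” would mean, here is another quick sketch. The $20 tier is purely the hypothetical from the previous paragraph, and the user figures are the same reported numbers as before.

```python
# Hypothetical: how many free-tier users would need to convert to a new
# $20/month tier for that tier alone to cover the $47B annual shortfall?

shortfall = 47e9                  # annual shortfall, USD (estimated above)
new_tier_annual_price = 20 * 12   # hypothetical $20/month tier, annualized
free_users = 0.95 * 800e6         # ~760M users on the free tier

converts_needed = shortfall / new_tier_annual_price   # ~196 million people
share_of_free_tier = converts_needed / free_users     # roughly a quarter

print(f"Converts needed: {converts_needed / 1e6:.0f} million")
print(f"Share of free-tier users: {share_of_free_tier:.0%}")
```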
Enshittification points towards some combination of all of those.
Dependency
This brings me back to my original point:
If you’re currently using the Free tier, could you walk away? Could you absorb an extra ~$59 a year (roughly $5 a month)?
If you’re using Plus or Pro, could you afford a ~25% or ~2.5% (respectively) increase in fees, and that’s just to cover the Oracle deal? Could you do your job if you had to walk away?
The cynic in me thinks that the companies demanding their employees use LLMs on the job are getting users dependent on these services, fattening them up for the inevitable slaughter.
It then becomes an act of resistance to continue to be a strong developer in the absence of these generative tools.