I love this BS because the actual enterprise use case of AI is pretty much just coding, which is why GPT-5 neutered its language expressiveness so much. And even for the enterprise customers, OpenAI (and probably xAI too) is losing money.
Right now it’s 6 bucks per million tokens for personal therapy or having it write emails and essays for you. Soon it’s going to be 20 bucks for a rationed 100,000 words and a mess of scummy plans like the ones mobile data carriers sell. Then they’ll explicitly distinguish between the kinds of work LLMs are used for and charge different rates for the tech oligarchs vs. the “riff raff”.
Also, we “riff raff” will have to pay for AI whether we like it or not, because the telecom companies want to use AI models to improve data bandwidth (yes, it’s possible: you can push a connection’s data rate up at the cost of more errors, then use AI to correct for those errors; see the sketch below).
OK, admittedly that last one is just some dumb dystopia scenario I cooked up on the spot. But I can swear to God that 6G devices will consume unholy amounts of power and be stuffed with weird-ass proprietary and surveillance shit, even more than 5G.
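To show what I mean by that rate-vs-errors trade-off, here’s a toy sketch. It uses a plain repetition code with majority-vote decoding as a stand-in for any AI/learned decoder, and the error rate and repetition factor are made-up illustration numbers, not anything a real 5G/6G modem actually does:

```python
# Toy sketch of "push the data rate, eat more bit errors, then correct them".
# Repetition coding + majority vote stands in for a learned decoder here;
# the bit error rate and repetition factor are invented for illustration.
import random

def noisy_channel(bits, bit_error_rate):
    """Flip each bit independently with probability bit_error_rate."""
    return [b ^ (random.random() < bit_error_rate) for b in bits]

def repetition_encode(bits, n=3):
    """Send each bit n times (costs a factor of n in useful throughput)."""
    return [b for b in bits for _ in range(n)]

def repetition_decode(coded, n=3):
    """Majority vote over each group of n received bits."""
    return [int(sum(coded[i:i + n]) > n // 2) for i in range(0, len(coded), n)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(10_000)]
p = 0.05  # raw bit error rate of the "faster" link

# Faster link, no correction: every flipped bit lands in the user's data.
raw = noisy_channel(message, p)
raw_errors = sum(a != b for a, b in zip(message, raw))

# Same fast link, but with redundancy plus correction layered on top.
decoded = repetition_decode(noisy_channel(repetition_encode(message), p))
residual_errors = sum(a != b for a, b in zip(message, decoded))

print(f"uncoded errors:  {raw_errors} / {len(message)}")
print(f"after decoding:  {residual_errors} / {len(message)}")
```

The uncoded link mangles about 5% of the bits; the coded version burns a third of the throughput on redundancy and pushes the residual error rate down to a fraction of a percent. Swap the majority vote for a fancier learned decoder and that’s the pitch.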
You probably wouldn’t use an LLM for error correction like that, and the pricing wouldn’t be based on the type of word so much as API costs and MCP surcharges, but I think that’s a pretty accurate assessment of the situation.
Yeah, I doubt even LLM providers charge by the word. They charge by tokens and API calls.
Current pricing models are based on input and output tokens, but I expect a monthly minimum alongside the API costs soon. Programmers are probably getting the most out of LLMs right now, and eventually Anthropic will realize they’re losing money on Claude Code and start squeezing.
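For reference, here’s a back-of-the-envelope of how per-token billing works and what a monthly minimum would do to a coding-heavy month. The rates and the minimum below are made-up placeholders, not any provider’s actual price list:

```python
# Rough sketch of per-token billing plus a hypothetical monthly minimum.
# All prices are placeholders invented for this example.
INPUT_PRICE_PER_MTOK = 3.00     # $ per 1M input tokens (hypothetical)
OUTPUT_PRICE_PER_MTOK = 15.00   # $ per 1M output tokens (hypothetical)
MONTHLY_MINIMUM = 200.00        # hypothetical floor, billed even if unused

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of a single API call under per-token pricing."""
    return (input_tokens / 1_000_000 * INPUT_PRICE_PER_MTOK
            + output_tokens / 1_000_000 * OUTPUT_PRICE_PER_MTOK)

def monthly_bill(calls: list[tuple[int, int]]) -> float:
    """Sum of per-call costs, but never less than the monthly minimum."""
    usage = sum(request_cost(i, o) for i, o in calls)
    return max(usage, MONTHLY_MINIMUM)

# e.g. a coding-assistant-heavy month: big prompts in, long diffs back out
calls = [(40_000, 8_000)] * 500
print(f"usage-based total:     ${sum(request_cost(i, o) for i, o in calls):.2f}")
print(f"billed (with minimum): ${monthly_bill(calls):.2f}")
```

In that example the per-call usage only adds up to $120, but the hypothetical $200 floor gets billed anyway, which is exactly the kind of squeeze I mean.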
It will be $88 for 14 words by next year