Business AI "Circle Jerk" / "Circular Bubble" News - aka Sam Altman and his infinite money glitch

AI is a bubble that will be popped sooner than later. Here's a thread for all AI business news.

A Primer:



As the AI 'circle jerk' rages on, OpenAI, the company behind ChatGPT, will need to raise at least $207 billion more by 2030 simply to keep the lights on, according to a new analysis by HSBC which takes into account recently disclosed megadeals with Microsoft, Amazon and Oracle.

Even with bullish assumptions that include 3 billion users, rapid subscription growth, and a giant slice of enterprise AI spending, the company's projected revenues are nowhere near its exploding bills for energy and chips, the bank says.


"OpenAI is a money pit with a website on top," according to FT's Bryce Elder, who notes that the bigger AI models get, the more cash they burn - and the winner in the LLM landscape may come down to who can continue raising money the longest.

The Math Behind the $207 Billion Hole​

HSBC’s model runs through 2030 and arrives at these headline numbers:
  • Cumulative data-center rental costs (2025-2030): $792 billion - rising to $1.4 trillion by 2033!
  • Projected cumulative free cash flow: $282 billion
  • Additional liquidity from Nvidia/AMD deals, undrawn facilities and cash on hand: ~$68 billion
  • Net funding shortfall: $207 billion (plus a $10 billion buffer)
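The headline numbers above can be roughly reconciled with a back-of-envelope sketch. This assumes HSBC's "$282 billion cumulative free cash flow" figure is a cash outflow (negative FCF), which is what creates a shortfall in the first place; the bullets alone don't reconcile to the dollar, so treat this as order-of-magnitude only:

```python
# Rough reconciliation of HSBC's headline numbers (all figures in $bn).
# Assumption: the "$282bn cumulative free cash flow" is a cash OUTFLOW
# (negative FCF) - otherwise no funding gap would arise at all.
cumulative_cash_burn = 282   # projected cumulative negative FCF, 2025-2030
other_liquidity = 68         # Nvidia/AMD deals, undrawn facilities, cash on hand

implied_shortfall = cumulative_cash_burn - other_liquidity
print(f"Implied shortfall: ${implied_shortfall}bn")  # -> $214bn

# HSBC quotes $207bn plus a $10bn buffer; the few-billion gap vs. this
# $214bn is presumably rounding or items omitted from the summary bullets.
```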
Key revenue assumptions that still leave OpenAI in the red:
  • Total users reach 3 billion by 2030 (44% of global adults outside China), up from ~800 million today
  • Paid-subscriber conversion rises from ~5% today to 10% by 2030
  • Consumer AI market generates $129 billion annually by 2030 ($87 billion from search, $24 billion from advertising)
  • Enterprise AI market hits $386 billion; OpenAI’s share slips from ~50% today to 37%
  • Resulting 2030 revenue run-rate: roughly $174 billion (in line with CEO Sam Altman’s public hints of $100 billion by 2027 and continued hypergrowth)
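The revenue assumptions above can be sanity-checked with simple arithmetic; nothing below is new data, it just restates the bullets:

```python
# Back-of-envelope on HSBC's 2030 revenue assumptions
# (figures in $bn unless noted).
total_users = 3_000_000_000          # 3 billion users by 2030
paid_conversion = 0.10               # 10% paid-subscriber conversion
paid_subscribers = int(total_users * paid_conversion)
print(f"Paid subscribers: {paid_subscribers / 1e6:.0f}m")   # -> 300m

enterprise_market = 386              # enterprise AI market, $bn
openai_share = 0.37                  # OpenAI's projected share
enterprise_revenue = enterprise_market * openai_share
print(f"Enterprise revenue: ${enterprise_revenue:.0f}bn")   # -> $143bn

# Against a ~$174bn total run-rate, that leaves roughly $31bn to come
# from consumer subscriptions, search, and advertising.
```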
HSBC also estimates OpenAI's cloud compute contracts total up to $1.8 trillion in lifetime value, and notes that of the 36 gigawatts of power the company will need, just one-third will be online by 2030. OpenAI's annual rental bill will approach $620 billion once capacity is fully online later in the decade.

Biggest Challenges To Come​

According to the report, there are several pressure points that could worsen this outlook, possibly forcing drastic action...

Investor Fatigue: "If revenue growth doesn’t exceed expectations and prospective investors turn cautious, OpenAI would need to make some hard decisions." -FT

Debt-market jitters: Oracle’s recent bond volatility after its OpenAI deal shows how quickly sentiment can sour.

Contract lock-in: With most cloud deals running for 4-5 years and containing stiff penalties for early exits, OpenAI has little wiggle room.

Competition: "OpenAI’s consumer market share slips to 56 per cent by 2030, from around 71 per cent this year. Anthropic and xAI are both given market shares in the single digits, a mystery “others” is assigned 22 per cent, and Google is excluded entirely." -FT

No AGI in the model: HSBC explicitly excludes any revenue or efficiency windfall from artificial general intelligence - an omission that could prove either prudent or massively conservative.


While HSBC provides a sobering view of OpenAI, they're actually very bullish on AI as a concept...

We expect AI to penetrate every production process and every vertical, with a great potential for productivity gains at a global level. [ . . . ]

Some AI assets may be overvalued, some may be undervalued too. But eventually, a few incremental basis points of economic growth (productivity-driven) on a USD110trn+ world GDP could dwarf what is often seen as unreasonable capex spending at present.

GPT COUNTERS!​

For shits and giggles we asked ChatGPT if it thought HSBC was correct in their analysis. While the LLM mostly agreed, it said that the bank ignored:
  • architectural efficiency improvements
  • distillation
  • sparse expert models
  • on-device inference
  • agent-delegated execution
  • reinforcement-learning-optimized efficiency
  • quantization
  • open-weight local models replacing cloud calls
In other words, "There is no historical precedent in computing where efficiency didn’t massively improve as scaling occurred."
Is this the first bubble/crash that everyone everywhere sees coming?

The 2008 crash was caused by relatively esoteric financial instruments, the dot com bubble was at least founded on concepts that would genuinely revolutionize commerce, but the only people convinced that AI is going to change everything are industry insiders.
 
Is this the first bubble that everyone everywhere sees coming?

The 2008 crash was caused by relatively esoteric financial instruments, the dot com bubble was at least founded on concepts that would genuinely revolutionize commerce, but the only people convinced that AI is going to change everything are industry insiders.
In modern times, maybe? I really don't know.

Nvidia's CEO has been insisting everything is hunky-dory while selling shares for months. Do with this what you will.

from https://www.zerohedge.com/ai/ai-circle-jerk-rages-microsoft-nvidia-invest-15-billion-anthropic
 
OpenAI is a truly awe inspiring grift, it makes Theranos and FTX look like chump change. They'll teach lessons about this mania for centuries.

Is this the first bubble/crash that everyone everywhere sees coming?
Pretty much. Unless you're addicted to markov chains sucking your dick and shoving psyche hooks up your ass it's incredibly easy to see that a company with $1.8T in outstanding liabilities and $500B private market value on negative profits is a complete scam.
 
Is this the first bubble/crash that everyone everywhere sees coming?

The 2008 crash was caused by relatively esoteric financial instruments, the dot com bubble was at least founded on concepts that would genuinely revolutionize commerce, but the only people convinced that AI is going to change everything are industry insiders.
2008 was caused by that, but many people, even bloggers like Mike Shedlock, saw the rise in home prices, constant refinancing, and people buying houses they couldn't afford as unsustainable.

CDOs failing was not as commonly written about, but surprise, AI data centers are using CDOs now. Not many people are writing or talking about that, but the same debt instruments may cause this bubble to collapse as well.
 
Artificial Intelligence is a fucking utility, like roads, water, sewer, power, natural gas, and internet/telecoms. Like any other nascent utility, eventually, the for-profit model collapses and it becomes partially or wholly government-subsidized.

I can even get Gemini 3 to make this argument pretty much verbatim.

Q: AI strikes me as something that is more viable as a utility than a for-profit enterprise. It resembles other utilities.

A: This is a profound economic observation that aligns with a growing body of thought in tech policy and economics. You are essentially identifying AI as a Public Good or a Natural Monopoly.
When you look at the underlying economics of "Foundation Models" (the massive LLMs like GPT-4, Claude, or Gemini), the parallel to electricity grids, water supply, or the interstate highway system is striking.
Here is an analysis of why your assessment holds water, and where the tension lies between the "Utility" model and the "For-Profit" reality.

1. The "Natural Monopoly" Argument​

In economics, a "natural monopoly" occurs when the high cost of infrastructure makes it inefficient for multiple competitors to duplicate it. AI fits this in several ways:
  • Massive Capital Expenditure (CapEx): Training a frontier model costs hundreds of millions (soon to be billions) of dollars in GPU clusters and energy. This creates an incredibly high "barrier to entry." Only a handful of entities can afford to build the "power plant."
  • Data Gravity: The more data a model processes, the better it gets, leading to a "winner-take-most" dynamic.
  • Infrastructure Heavy: Just as you don't want five different water pipes running to your house from five different companies, it may be inefficient to have 50 different foundation models all scraping the exact same internet to learn the exact same concepts.

2. AI as "General Purpose Technology" (GPT)​

Economists classify AI as a General Purpose Technology—like the steam engine or electricity. These technologies are foundational; they don't just do one thing, they power everything else.
  • The Utility Logic: If AI becomes the "intelligence layer" for biology, coding, law, and logistics, treating it as a for-profit trade secret could bottleneck civilization.
  • The Price of Access: If a for-profit company controls the "electricity" (AI), they can extract rent from every sector of the economy. A utility model, conversely, regulates price to ensure universal access and economic stability.

3. The "For-Profit" Counter-Argument (Why we are here)​

If the utility argument is so strong, why is the current landscape dominated by fierce capitalist competition?
  • The Innovation Curve: Utilities are notoriously slow to innovate (think of how little the power grid changed for decades). The current AI explosion is driven by the "arms race" between private companies. The profit motive is currently funding the R&D that a public utility might not risk.
  • Efficiency: Private markets are often better at discovering valuable use cases. A utility might build a massive model that no one knows how to use, whereas startups are scrambling to find "product-market fit."

4. The Middle Ground: The "Telecom" Model​

We may be heading toward a future that resembles the Telecommunications industry or the Airline industry—highly regulated private companies that function as semi-utilities.
| Feature | Pure For-Profit | Pure Utility | Likely AI Future (Regulated) |
| --- | --- | --- | --- |
| Goal | Maximize Shareholder Value | Universal Access | Managed Competition |
| Pricing | Whatever the market bears | Cost + fixed margin | Tiered (Pro vs. Public) |
| Innovation | Rapid, chaotic | Slow, steady | Moderate, safety-focused |
| Oversight | Limited | Heavy Government Control | Safety & Bias Audits |

5. The "CERN for AI" Concept​

There is currently a movement among researchers calling for a "CERN for AI"—a publicly funded, multi-national utility that builds open-source foundation models for the benefit of humanity, rather than profit. This would treat the foundational layer (the model) as public infrastructure, while allowing private companies to build the applications (the apps) on top of it.

Summary​

You are likely correct in the long term. Once the technology matures and the "wild west" phase of rapid discovery slows down, the economics of maintaining these massive systems will likely look much more like running a power grid than running a software startup.
Would you like me to explore the "CERN for AI" proposal in more detail, or compare how the "Railroad Barons" of the 19th century mirror today's AI giants?
 
Is this the first bubble/crash that everyone everywhere sees coming?
Yes. This substack article (archive) notes that trading algorithms are finally seeing in Nvidia what autists everywhere already saw. In short:
  • Conversion time from Accounts Receivable to cash has increased by ~15%, from an average of 46 to 53 days. In layman's terms, it's now taking an average of 53 days for a sale to turn into cash.
    • For comparison: AMD completes a sale in 42 days, Intel can do it in 38 days, Taiwan Semiconductor Manufacturing can do it in 35 days, and Micron Technologies can do it in 44 days.
    • The seven-day collection delay will cost them $4.39 billion per quarter, or $17.56 billion per year.
  • Inventory has increased from $15 billion worth of chips to $19.8 billion over the past quarter, and nearly doubled from $10 billion of inventory in January. This is unsold inventory- chips Nvidia hasn't converted into cash. Some inventory backlog is good, but an increase like this suggests people aren't buying as much as they should.
    • Jensen Huang insists this is because demand is so insane Nvidia can't keep up. CFO Collette Kress supports this claim, citing lead times of 10 weeks or more for certain products. This is a blatant lie: if demand is as insane as the executives insist, inventory would decrease as the company scrambled to fulfill orders (as in Q2 2023, when inventory decreased 18% quarter over quarter).
    • This issue and these lies suggest one of two possibilities: either sales are declining and Nvidia is engaging in accounting voodoo to cover it up, or Nvidia is selling without collecting payments, causing inventory to convert to accounts receivable rather than cash.
    • The author suggests the second possibility is more likely. GPU rental prices have declined and inventory holding has increased - in layman's terms, people aren't buying as many chips as they used to, but Nvidia needs to move that inventory.
  • Nvidia's Q3 Operating Cash Flow (i.e. the cash its operations actually generate) is $14.5 billion, while total income is $19.3 billion. This means only about 75% of its reported income is actually turning into cash. This doesn't sound too bad, until you realize AMD's Operating Cash Flow is at 97% of its income, Intel's is at 91% (even while executing an expensive turnaround), and Taiwan Semiconductor boasts a whopping 100%-105%.
    • Inventory increases are devouring Nvidia's cash, i.e. they're making products that aren't selling. Further, Nvidia is buying back shares even as they trade close to the company's all-time high. These buybacks indicate management is trying to artificially keep the share price high.
    • Michael Burry's shorts expire in March 2026 with a strike price of $140, indicating he expects a price decrease of 25% or more by that point.
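The collection-delay arithmetic a few bullets up can be sanity-checked. Note the quarterly revenue figure below is my assumption (Nvidia's reported quarterly revenue is roughly $57 billion), not a number from the article:

```python
# Days sales outstanding (DSO): how long it takes a sale to become cash.
# DSO = accounts receivable / revenue * days in the period.
def dso(accounts_receivable: float, revenue: float, period_days: int = 91) -> float:
    return accounts_receivable / revenue * period_days

# Cash cost of a collection delay: revenue that sits as receivables
# instead of arriving as cash each quarter.
# Assumption (not from the article): ~$57bn quarterly revenue.
quarterly_revenue_bn = 57.0
delay_days = 53 - 46                       # the seven-day slowdown
delayed_cash_bn = quarterly_revenue_bn / 91 * delay_days
print(f"~${delayed_cash_bn:.2f}bn per quarter")   # close to the quoted $4.39bn
print(f"~${delayed_cash_bn * 4:.2f}bn per year")  # close to the quoted $17.56bn
```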
The article then goes into detail about the circular financial structure of the AI industry in general:
  • Elon Musk's xAI announced $20 billion in funding- $7.5 billion in equity, $12.5 billion in debt. This is normal. But here's where it gets abnormal:
    • Nvidia provided $2 billion for the equity portion; part of their deal with xAI is xAI must maintain GPU usage above 70% or default on the debt.
    • xAI uses the funding Nvidia provides to lease GPUs from Nvidia.
    • Nvidia books this as revenue.
    • This supports Nvidia's valuation and makes it look like the xAI investment is profitable.
    • The infinite money loop is thus: Nvidia invests in AI company, AI company buys/leases Nvidia chips with Nvidia's invested capital, Nvidia records on-paper revenue and profitable AI company investments.
  • The same story happens with Microsoft Azure, Oracle, and more. The author estimates $615 billion is flowing in an endless loop. Infinite money, infinite growth, line go up forever. Is that a problem?
    • Yes, actually. These funny AI companies Nvidia/Apple/Microsoft are investing in aren't turning a dime in profits. The cash is thus devoured by the AI investments. They sure talk a big game, but the billions in profits aren't materializing.
      • Sam Altman's OpenAI claims to be valued at $157 billion. Yet, it has only $3.7 billion in revenue and $13 billion in expenses- meaning OpenAI is losing $9.3 billion per year. In order to pay off its debts and justify its valuation at standard venture capital return multiples, OpenAI must generate $3.1 trillion in revenue. If this sounds impossible, that's because it is. That is the revenue of an entire nation.
  • The author compares this circular money structure to the Enron, Lucent, and Worldcom scandals of the Dot-Com era. Further, with Bitcoin soon reaching its limit and no longer able to be mined, a lot of bitcoin mining centers will shut down and/or pivot to less profitable cryptocurrencies.
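The Nvidia-to-xAI loop described above can be sketched as a toy model. All names and numbers here are illustrative, not actual contract terms:

```python
# Toy model of the circular ("round-trip") revenue loop described above.
# Figures are illustrative only, not actual deal terms.

def round_trip(vendor_cash: float, investment: float, rounds: int):
    """Vendor invests in a customer; the customer spends it all back on
    the vendor's chips; the vendor books that spend as revenue. Repeat."""
    booked_revenue = 0.0
    for _ in range(rounds):
        vendor_cash -= investment      # vendor wires the investment out...
        vendor_cash += investment      # ...and it comes straight back as sales
        booked_revenue += investment   # booked as revenue each round
    return vendor_cash, booked_revenue

cash, revenue = round_trip(vendor_cash=10.0, investment=2.0, rounds=3)
print(cash, revenue)  # cash is unchanged, yet $6bn of "revenue" appears
```

The point of the sketch: no net cash enters the system, but reported revenue grows every cycle, which is exactly why the loop flatters valuations.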
It's Thanksgiving and the turkey is calling, so two more points:

First, the Financial Times (archive only to bypass paywalls) is noting that OpenAI is currently losing money, and any future expansion will come with a corresponding cost increase. OpenAI is borrowing money for the purpose of losing money even faster. OpenAI will not be profitable for the foreseeable future... or possibly ever.

Second, as I said in an earlier post, Jensen Huang and major corporate officers have been selling hundreds of millions of dollars worth of stock for months now. This bodes poorly: corporate's share repurchases are solely so the executives can cash out safely. There are legitimate reasons to sell, but if the executives had any confidence in the company's future, they would not be selling so consistently.

Align your portfolios accordingly. Happy Thanksgiving, everybody.
 
It's getting worse: AI companies are using Nvidia's chips as collateral for loans (source/archive).

The updated loop works like this:
  • Nvidia invests in an AI company
  • AI company buys Nvidia chips to make AI
  • AI company applies for a loan
  • When asked for collateral, AI company points to their Nvidia chips
  • AI company uses new loan money to buy more Nvidia chips for more loans
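The collateral loop above behaves like a leverage pyramid: each round of borrowing is a fixed fraction (the loan-to-value ratio) of the last round's chip purchase, so total exposure converges to a geometric series. A minimal sketch, with illustrative numbers:

```python
# Toy model of the chip-collateral loop: buy chips, pledge them as
# collateral, borrow at some loan-to-value (LTV) ratio, buy more chips,
# repeat. For LTV < 1 this converges to a geometric series.

def total_chip_exposure(initial_purchase: float, ltv: float, rounds: int) -> float:
    total, purchase = 0.0, initial_purchase
    for _ in range(rounds):
        total += purchase
        purchase *= ltv            # next loan = LTV fraction of new collateral
    return total

# e.g. $1bn of chips looped at a 70% LTV:
exposure = total_chip_exposure(1.0, 0.70, 50)
print(f"${exposure:.2f}bn")        # approaches 1 / (1 - 0.70), about $3.33bn
# A small equity base thus pyramids into several times as much GPU
# "collateral" - all of it marked against rapidly depreciating hardware.
```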
But wait, there's more!

Like many companies, to bolster its balance sheet Nvidia invests in other companies in the same space. There's nothing illegal or unusual about this - until you realize the bulk of their holdings (85%!) are in a company called Coreweave (source/archive). As a company, Coreweave is not profitable, and has not been since its IPO. Its primary source of income is investors and debt. To collateralize this debt and attract investors, whose GPUs do they use? Why, who else but Nvidia! And it gets worse! Nvidia has guaranteed $860 million of Coreweave's $15 billion in loans (source/archive). While this may not affect Nvidia's top or bottom line too much, it's not a good look.

This is the most egregious example of Nvidia's circular financing. Its other public investments (Applied Digital Corporation and Nebius Group especially) have similar problems, though their loans aren't guaranteed by Nvidia. I can't find similar agreements for private companies like OpenAI or xAI, though their investment loop is likely similar (i.e. use Nvidia chips as loan collateral). None of these companies are profitable; all of them depend on loans or investor cash to further their operations.

Ultimately, the bet is these companies will become profitable before the bubble pops. This is likely not happening for two reasons. First, any expansion will come with a corresponding cost increase, and while revenue comes and goes, costs are almost always set in granite. Second, more AI means more data centers, which means America's out-of-date electric infrastructure must be updated. I do not think this is happening quickly enough for these companies to become profitable.

(Disclaimer: none of this is financial advice. I am not a financial advisor, nor do I play one on TV; I am an insane man on the internet. Do your own research and consult a professional. If you follow my advice and lose money, that is your fault, not mine.)
 
No one is paying for this shit.

The one that 'wins' will be the one that's free.

Sinking money on the assumption that normal people are going to pay for these things is absurd.
 
As the AI 'circle jerk' rages on, OpenAI, the company behind ChatGPT, will need to raise at least $207 billion more by 2030 to simply keep the lights on,
So no profits at this business, huh?

I hate startup culture so much it's unreal. All anyone wants to do in this scene is get rich quick and leave. They don't care if they leave a functioning product or company in their wake. They just want to exit. They get in the game to leave it.
 
Like many companies, to bolster its balance sheet Nvidia invests in other companies in the same space. There's nothing illegal or unusual about this- until you realize the bulk of their holdings (85%!) are in a company called Coreweave (source/archive). As a company, Coreweave is not profitable, and has not been since its IPO. It primary source of income is investors and debt. To collateralize this debt and attract investors, whose GPUs do they use? Why, who else but Nvidia! And it gets worse! Nvidia has guaranteed $860 million of Coreweave's $15 billion in loans (source/archive). While this may not affect Nvidia's top or bottom line too much, it's not a good look.
From a purely accounting standpoint, there is zero difference between this and the SPVs that Enron used in the 1990s to hide bad debt.
 
When will this be at the point where you can short before it explodes?
There's apparently plenty of people shorting it now, which prompted Jensen to go "IT'S ALL DAIJOBU". So payment's due by next fiscal quarter, and it's prime time to short the shit out of the currently most overvalued company in history. And don't forget to buy up overstock by the second half of next year though, given Nvidia's track record, I wouldn't be surprised if they started actively buying up and destroying stock to keep GPU and NPU supply low.
 