OpenAI CEO Sam Altman outlined the company's path to profitability in a recent interview on the Big Technology Podcast, as summarized by Search Engine Journal in a December 2025 article by staff writer Roger Montti.
Key details from Sam Altman's profitability comments
Altman said OpenAI's current losses are primarily driven by rapidly increasing model training costs, even as revenue grows. He argued the company would already be profitable if it were not expanding training investment so aggressively.
- OpenAI's losses are tied to continued increases in model training costs rather than weak revenue.
- Altman said that "as revenue grows and as inference becomes a larger and larger part of the fleet, it eventually subsumes the training expense."
- He summarized the plan as "spend a lot of money training, but make more and more."
- Altman said OpenAI would be profitable "way, way earlier" if it were not continually ramping up training spend.
- He said concern about OpenAI's spending would be reasonable if the company had large amounts of compute it could not monetize profitably.
- Altman noted that OpenAI is "so compute constrained" and said this limitation directly affects the company's revenue line.
- He said the company has "penciled this out a bunch of ways" and expects improvements in "flops per dollar" as compute becomes cheaper.
- Altman called compute the "lifeblood" of OpenAI's products and said the company has always operated in a compute deficit.
- When asked whether enterprise deals, ChatGPT payments, and API usage would fund compute, Altman replied, "Yeah, that is the plan."
Background on spending, revenue, and projections
In the interview, host Alex Kantrowitz asked how OpenAI's spending plans compare with reported revenue figures. He cited external reports suggesting OpenAI could lose about 120 billion dollars between now and 2028 or 2029.
Kantrowitz also referenced a reported 1.4 trillion dollar long-term compute spending commitment against roughly 20 billion dollars in revenue. Altman did not confirm those specific projections, focusing instead on how increased compute availability could drive revenue growth.
Altman said OpenAI expects inference workloads to become a larger share of its compute usage over time. Under the company's plan, revenue from inference is expected to eventually exceed and cover training costs.
He added that OpenAI anticipates continued efficiency improvements that reduce the cost of compute per floating point operation. Altman said the company is seeing growth from consumer products, enterprise adoption, and additional business lines that have not yet launched.
Altman reiterated that OpenAI has "always been in a compute deficit," which has limited the products it can ship. He said he expects compute to remain a constraint even as the company expands its available capacity.
Source citations
- Big Technology Podcast interview with Sam Altman, hosted by Alex Kantrowitz.
- Coverage by Roger Montti for Search Engine Journal, December 2025.
- Background on OpenAI from the official OpenAI website.