In a blog post, OpenAI finance chief Sarah Friar said the company will make 2026 its year of “practical adoption.”
“The priority is closing the gap between what AI now makes possible and how people, companies, and countries are using it day to day,” Friar wrote. “The opportunity is large and immediate, especially in health, science, and enterprise, where better intelligence translates directly into better outcomes.”
In the post, Friar laid out OpenAI’s strategy for monetizing services such as ChatGPT while securing the compute needed to power them, noting that the AI lab’s revenue tracks directly with the availability of its computing infrastructure.
OpenAI’s compute grew from 0.2 gigawatts (GW) in 2023 to about 1.9 GW in 2025, while the company’s annual revenue run rate grew in step, from $2 billion in 2023 to more than $20 billion last year, Friar said.
“This is never-before-seen growth at such scale,” she wrote. “And we firmly believe that more compute in these periods would have led to faster customer adoption and monetization.”
This comes as the tech industry faces increased scrutiny over its massive spending on AI data centers, which has yet to deliver significant returns for investors.
OpenAI’s deals include a recent agreement with chipmaker Nvidia. Under that agreement, Nvidia said it would commit $100 billion to support the AI startup as it builds and deploys at least 10 GW of Nvidia’s systems. According to a CNBC analysis of data from the Energy Information Administration, 10 GW is roughly equivalent to the annual power consumption of eight million U.S. households. However, in November 2025, Nvidia told investors that there was “no assurance” that its agreement with OpenAI would progress beyond an announcement to a formal contract.
“Securing world-class compute requires commitments made years in advance, and growth does not move in a perfectly smooth line,” Friar wrote, adding that such planning requires discipline.
Friar also noted that three years ago OpenAI relied on a single compute provider; it now works with a diversified ecosystem.
“We can plan, finance, and deploy capacity with confidence in a market where access to compute defines who can scale,” she wrote.
Friar also said that OpenAI’s business model should scale with its services. “As intelligence moves into scientific research, drug discovery, energy systems, and financial modeling, new economic models will emerge,” she wrote.
This comes shortly after OpenAI said it is planning to introduce advertising inside ChatGPT for a segment of U.S. users, a move aimed at offsetting the soaring costs of building and running advanced AI systems. The company said Friday that the ads will initially be tested in the free version of ChatGPT as well as the lower-priced Go plan, which OpenAI is in the process of rolling out worldwide. The ads are expected to begin appearing in the coming weeks and, according to the company, will be clearly separated from ChatGPT’s AI-generated responses.

