What’s happening in the artificial intelligence landscape reflects a shift in what functions as currency. Companies are converting one form of economic value into another, trading people for more chips and tokens, and the clearest expression of this is the simultaneous wave of mass layoffs.
In the early months of 2026 alone, major tech firms have eliminated more than 55,000 jobs, with projections suggesting cuts could exceed 250,000 for the full year, even as hyperscalers commit over $600–700 billion in AI-related capital spending.
At the same time, this isn’t just about cutting jobs on one side and spending on the other.
Companies are moving huge amounts of money around in a way that feels less like normal planning and more like a big, focused bet: that compute will define the next era of the economy.
Spending on AI infrastructure is roughly 40% to 60% directed toward chips and accelerators, with the remainder flowing into data centers, energy contracts, and networking.

In the traditional firm, there was a balance between people and tools: you hired more employees, gave them better technology, and together that drove productivity. In the AI economy, that balance is changing. Companies are starting to compare a dollar spent on a person directly with a dollar spent on compute.
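The comparison companies are making can be sketched as simple unit economics. The figures below are purely illustrative assumptions, not data from this article; the point is only the shape of the calculation, not the numbers.

```python
# Hypothetical sketch: comparing a dollar spent on labor with a dollar
# spent on compute. All figures are illustrative assumptions.

def output_per_dollar(annual_cost, annual_output_units):
    """Units of output produced per dollar of annual spend."""
    return annual_output_units / annual_cost

# Assumed numbers for one worker vs. one compute cluster at equal cost:
labor = output_per_dollar(annual_cost=150_000, annual_output_units=50_000)
compute = output_per_dollar(annual_cost=150_000, annual_output_units=400_000)

print(f"labor:   {labor:.2f} units per dollar")
print(f"compute: {compute:.2f} units per dollar")
print(f"compute advantage: {compute / labor:.1f}x")
```

Under these assumed numbers the same dollar buys several times more output as compute, which is the arithmetic driving the substitution described above.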
Unlike human labor, compute scales easily. Once the infrastructure is in place, it runs continuously without breaks, negotiation, or fatigue, and its output can keep expanding. That advantage is starting to push companies toward replacing human effort with compute rather than merely augmenting it.
This is why layoffs are happening at the same time as massive spending. Companies like Microsoft, Meta, Amazon, and Google are not slowing down; they are redirecting where the money goes.
Meta, for example, has cut about 10% of its workforce while planning to spend more than $100 billion a year on AI infrastructure, and others are doing the same by expanding data centers even as they reduce staff. The money saved from payroll is not sitting idle. It is being funneled into chips, compute clusters, and energy deals that will determine how much capacity they control in the future.
At the center of all of this is compute, which now acts as both the input and the currency. It determines how much intelligence can be produced, how fast systems respond, and how widely they can scale. Companies may still spend in dollars, but what really matters is how much compute they control. That is why access to chips has become so strategic. GPUs are no longer just hardware; they represent future capability. Companies are treating them that way by locking in supply years ahead and structuring deals to guarantee access.
Chips, in this context, start to look like hard currency. They are scarce, expensive, and increasingly determine who wins, and the control over their production is concentrated in the hands of a few key players like Nvidia, Taiwan Semiconductor Manufacturing Company (TSMC), Samsung Electronics, and to a lesser extent, Intel. Some deals even treat access to compute like a future asset, tying today’s spending to tomorrow’s capacity.
If chips are the hard currency, then tokens are the spending money of the AI economy. Every time someone uses an AI system, it costs tokens, each one reflecting a small amount of compute. Questions, answers, and even the model’s internal steps all use tokens. At scale, this means intelligence is measured and billed continuously, with most of the cost now coming from using the models rather than building them. It starts to look like a utility, where you pay for what you use in real time.
The better metaphor right now is not a factory, but a casino running at full capacity, where the size of the bets is so large that it is hard to tell the difference between smart investment and speculation. The chips are literally chips. The bets are hundreds of billions of dollars going into infrastructure. The players are a small group of corporate leaders deciding how much of the economy will run on AI. And like any casino, the real question is not just who is placing the bets, but whose money is on the table, because a lot of that capital is coming from workforce cuts, effectively turning payroll into betting money for future compute.
There is an uncomfortable imbalance in all of this. Even as companies are gathering compute at record levels, much of it is not being fully used. GPU clusters sit partly idle, and data centers are often built before demand actually arrives. Scarcity pushes companies to keep accumulating even when efficiency is low. In most markets, that would be a warning sign of overbuilding. In this market, it is still pushing investment faster.
What ultimately emerges is a change in how the economy is organized. Energy becomes the main limit, chips become the most valuable asset, compute becomes the way things are produced, and tokens become the way intelligence is bought and used. Human work does not disappear, but it shifts away from directly producing output and more toward monitoring, coordination, and handling edge cases.
The bigger question is whether this is still careful investment or something closer to a massive shared bet, where companies are acting on the belief that compute will define the next economy and that it will eventually justify the huge cost of building it even before the demand for it is fully clear.

