Broadcom launched a new networking chip on Tuesday that will help companies build artificial intelligence (AI) computing systems by stringing together hundreds of thousands of chips that crunch data.
The chip, called Thor Ultra, enables computing infrastructure operators to deploy far more chips than they otherwise could, allowing them to build and run the large models used to power AI apps such as ChatGPT.
This development is bound to deepen the company’s ongoing rivalry with Nvidia. Thor Ultra will compete with Nvidia’s networking interface chips and aim to further entrench Broadcom’s control of network communications inside data centers designed for AI applications.
This comes shortly after Broadcom entered into a partnership with OpenAI to jointly build and deploy 10 gigawatts of custom AI accelerators as part of a broader industry effort to scale AI infrastructure. While the companies have been working together for 18 months, they are now going public with plans to develop and deploy racks of OpenAI-designed chips starting in late 2026.
Broadcom CEO Hock Tan said late last year that the market the company is targeting with its various AI chips will be in the range of $60 billion to $90 billion in 2027, split between its networking chips and the data center processors it helps Alphabet’s Google and OpenAI build.
The Thor Ultra chip operates as a critical link between an AI system and the rest of the data center. The networking chips help data center operators move information around inside a facility. “In the distributed computing system, network plays an extremely important role in building these large clusters,” Ram Velaga, a Broadcom senior vice president, told Reuters. “So I’m not surprised that anybody who’s in the GPU business wants to make sure that they are participating in the networking.”
While networking chips are important to Broadcom’s plans, the company also does lucrative work helping design AI chips for cloud companies such as Google. Broadcom has worked on multiple generations of Google’s Tensor processors, which Google began designing more than a decade ago. The Tensor chips have generated billions of dollars in revenue for Broadcom, according to analyst estimates.
Broadcom executives also detailed the measures taken to build and test the new networking processors, according to a Reuters report. As part of that effort, Broadcom’s engineers doubled the bandwidth of Thor Ultra compared with the prior version and put the chips through rigorous testing and evaluation from the earliest stages of production.
To make a chip like the Thor Ultra or its flagship Tomahawk series of networking switches, the engineers build an entire system around the chip. Working with the hardware system team, the engineers discuss what kind of package the chip will use, how much power it will need and how much heat it will emit, Velaga said.