Meta unveils four new MTIA chips

Meta has introduced four new in-house chips designed to handle generative AI features and content ranking systems across its apps. The processors join the company’s MTIA lineup, short for Meta Training and Inference Accelerators, and mark another step in Meta’s push to build more of its own AI hardware while it continues spending heavily on infrastructure from suppliers such as Nvidia and AMD.

New MTIA roadmap expands Meta’s in-house AI hardware plans

The new chips were developed in partnership with Broadcom. Meta says the processors are based on the open-source RISC-V architecture, while manufacturing is being handled by Taiwan Semiconductor Manufacturing Company, better known as TSMC.

The lineup includes the MTIA 300, MTIA 400, MTIA 450, and MTIA 500. Meta says the MTIA 300 is already in production. The other three chips are scheduled to ship at different points throughout 2027.

That pace stands out even in the semiconductor industry, where developing new silicon usually takes years. For a company best known for social platforms rather than physical computing infrastructure, the release cadence is even more unusual.

YJ Song, Meta’s vice president of engineering, said AI models are changing faster than traditional chip development cycles. By the time a processor is ready for deployment, the workloads it was built for may already look very different.

Meta says that is why it is taking an iterative approach instead of waiting for longer design cycles. Each MTIA generation is built on top of the previous one, using modular chiplets and updated insights from current AI workloads.

The MTIA 300 is intended mainly for training the algorithms that rank and recommend content across apps such as Facebook and Instagram. The remaining three chips are focused on inference workloads, which means running trained AI models to generate outputs such as text and images.

Meta says the MTIA 400 delivers performance that is competitive with leading commercial products. The chip has already been tested and is expected to reach data centers soon.

The MTIA 450 will carry twice as much high-bandwidth memory as the MTIA 400 and is expected to ship in early 2027. Later in 2027, Meta plans to roll out the MTIA 500, which will include even more memory than the 450 and add new capabilities for low-precision data processing.

The new chips sit inside a broader strategy at Meta to secure as much computing power as possible for advanced AI development. The company first detailed its MTIA efforts in 2023, when it introduced the first chip under that brand.

Since then, more software companies and AI labs have started moving toward custom accelerators built for their own workloads. OpenAI has taken a similar path and has also said it is working with Broadcom on custom AI hardware.

Reports earlier this year suggested Meta had scaled back some internal efforts tied to high-end chips that would compete more directly with Nvidia. This new MTIA roadmap appears to counter that narrative, at least in part.

Even so, designing custom silicon remains expensive and technically demanding, which means Meta is still likely to rely on outside suppliers for the bulk of its AI hardware in the near term.

That balance is already visible in the company’s recent spending. Meta revealed its new MTIA chips shortly after announcing multibillion-dollar agreements with Nvidia and AMD.

The company has also signed a deal to rent chips made by Google, underlining that its custom silicon plans are growing alongside, not replacing, its dependence on external hardware partners.
