Meta Platforms recently unveiled a family of custom, in-house chips aimed at accelerating artificial intelligence (AI) workloads. The company developed its first-generation chip in 2020 under the Meta Training and Inference Accelerator (MTIA) programme, to improve the efficiency of the recommendation models that serve ads and other content in users' news feeds.
The first MTIA chip was designed exclusively for an AI process called inference, in which models trained on large amounts of data judge which content to display next in a user's feed.
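To make the idea concrete, the sketch below shows a toy version of recommendation inference: a pre-trained model scores candidate items for a user and the feed shows the highest-scoring ones first. All names, embeddings, and the dot-product scoring function are hypothetical illustrations, not Meta's actual system.

```python
# Toy recommendation inference: score candidates with pre-trained
# embeddings, then rank them. Purely illustrative; real systems use
# far larger models and candidate pools.

def score(user_embedding, item_embedding):
    """Dot-product relevance score between user and item vectors."""
    return sum(u * i for u, i in zip(user_embedding, item_embedding))

def rank_feed(user_embedding, candidates):
    """Return candidate item IDs ordered by predicted relevance."""
    scored = [(score(user_embedding, emb), item_id)
              for item_id, emb in candidates]
    return [item_id for _, item_id in sorted(scored, reverse=True)]

# Hypothetical values "learned" during training:
user = [0.9, 0.1, 0.3]
candidates = [
    ("video_a", [0.8, 0.0, 0.2]),
    ("ad_b",    [0.1, 0.9, 0.5]),
    ("post_c",  [0.7, 0.2, 0.4]),
]
print(rank_feed(user, candidates))  # → ['video_a', 'post_c', 'ad_b']
```

At serving time only this forward scoring pass runs, millions of times per second across users, which is why inference hardware efficiency matters so much for feeds and ads.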
Joel Coburn, a software engineer at Meta, explained that the company initially used graphics processing units (GPUs) for inference tasks but found them ill-suited to the job. He stated:
“Their efficiency is low for real models, despite significant software optimizations. This makes them challenging and…