Meta Builds 1700W Superchip and Custom MTIA Chips While Ditching Nvidia, AMD, Intel, and ARM for Inference


  • Meta’s 1700W superchip offers 30 PFLOPs and 512GB of HBM memory
  • MTIA 450 and 500 prioritize inference over pre-training workloads
  • Future generations of MTIA will support GenAI classification and inference workloads

Meta is enhancing its AI infrastructure with a portfolio of custom MTIA chips designed specifically for inference workloads across its applications.

The company is developing a 1700W superchip capable of 30 PFLOPs with 512GB of HBM, built on the same MTIA architecture to handle inference tasks at scale.
