AI GPUs will soon need more power than a small country as HBM memory growth spirals out of control

  • Future AI memory chips could demand more power than entire industrial zones combined
  • 6TB of memory in a GPU sounds incredible until you see the power draw
  • HBM8 stacks are impressive in theory, but daunting in practice for any energy-conscious company

The relentless push to expand AI processing power is ushering in a new era for memory technology, but it comes at a cost that raises practical and environmental concerns, experts have warned.

Research from the Korea Advanced Institute of Science and Technology (KAIST) and the Terabyte Interconnection and Package Laboratory (TERA) suggests that by 2035, AI GPU accelerators equipped with 6TB of HBM could become a reality.
