Here’s why 100TB+ SSDs will play such an important role in ultra-large language models in the near future



  • Kioxia has revealed a new project, AiSAQ, that aims to replace RAM with SSDs for AI data processing
  • Larger SSDs (read: 100TB+) could improve RAG at a lower cost than relying on memory alone
  • No timeline has been given, but Kioxia’s rivals are expected to offer similar technology

Large language models often generate plausible but factually incorrect results; in other words, they invent things. These “hallucinations” can undermine reliability in critical tasks such as medical diagnosis, legal analysis, financial reporting, and scientific research.

Retrieval-Augmented Generation (RAG) mitigates this issue by integrating external data sources, allowing LLMs to access up-to-date information during generation, reduce errors, and, by grounding results in current data, improve contextual accuracy. Effective RAG implementations, however, demand significant memory and storage resources, particularly for large-scale vector data and indexes. Traditionally, this data has been stored in DRAM, which, while fast, is expensive and limited in capacity.
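To make the retrieval step concrete, here is a minimal sketch in Python of how a RAG system looks up relevant documents by vector similarity before generation. The embeddings, documents, and the `retrieve` helper are all toy values invented for illustration; in a real deployment at the scale discussed here, the vector index would be far too large for DRAM, which is exactly the data Kioxia proposes to hold on SSD.

```python
# Toy sketch of vector retrieval in RAG (all values are hypothetical).
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# In-memory stand-in for a vector index: (embedding, document text) pairs.
# At 100TB+ scale this structure is what would live on SSD instead of DRAM.
index = [
    ([0.9, 0.1, 0.0], "Doc A: Q3 revenue rose 12% year over year."),
    ([0.1, 0.8, 0.2], "Doc B: The drug was approved in March 2025."),
    ([0.0, 0.2, 0.9], "Doc C: The court ruled on the appeal in June."),
]

def retrieve(query_vec, k=1):
    """Return the k document texts most similar to the query vector."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[0]),
                    reverse=True)
    return [text for _, text in ranked[:k]]

# The retrieved text is prepended to the LLM prompt as grounding context,
# so the model answers from current data rather than from memorized facts.
context = retrieve([0.85, 0.15, 0.05])
prompt = f"Answer using only this context:\n{context[0]}\n\nQuestion: ..."
```

The point of the sketch is the access pattern: retrieval is a similarity search over a large, mostly read-only index, a workload that tolerates SSD latency far better than the token-by-token generation step does.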
