- John Carmack has floated the idea of using fiber optic cables in place of RAM
- It's a vision of the future in which fiber could replace RAM modules in AI workloads
- While very theoretical and a long way off, there are other possible short-term solutions to reduce AI’s RAM-guzzling appetite.
John Carmack has floated the idea of effectively using fiber cables as “storage” instead of conventional RAM modules, which is a particularly intriguing vision of the future given the current memory crisis and all the havoc it’s wreaking.
Tom's Hardware noted the id Software co-founder's post.
Carmack notes: "Data rates of 256 Tb/s have been demonstrated over a distance of 200 km over single-mode fiber optics, which is equivalent to 32 GB of in-flight data, 'stored' in the fiber, with a bandwidth of 32 TB/s. Neural network inference and training [AI] can have deterministic weight reference patterns, so it's fun to consider a system with no DRAM and weights continuously streamed to an L2 cache via a recycle fiber loop."
What this means is that a length of fiber forms a loop around which the necessary data (which would normally sit in RAM) is continuously retransmitted, keeping the AI processor constantly fed. This works because AI model weights can be read sequentially; a workload needing random access would have to wait for data to come back around the loop. It could also be a greener, more energy-efficient way to complete these tasks compared to traditional RAM.
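Carmack's figures check out with simple arithmetic. A quick sketch (assuming light travels through glass fiber at roughly two-thirds the speed of light in a vacuum, about 2×10^8 m/s):

```python
# Back-of-the-envelope check of Carmack's fiber "storage" figures.
# Assumed value: light propagates through glass fiber at ~2e8 m/s
# (roughly 2/3 of c); the link rate and length are from his post.

LINK_RATE_BITS = 256e12       # 256 Tb/s demonstrated data rate
FIBER_LENGTH_M = 200e3        # 200 km of single-mode fiber
LIGHT_SPEED_FIBER = 2e8       # ~2e8 m/s propagation speed in fiber

transit_time_s = FIBER_LENGTH_M / LIGHT_SPEED_FIBER   # 0.001 s (1 ms)
bits_in_flight = LINK_RATE_BITS * transit_time_s      # 2.56e11 bits
gigabytes_in_flight = bits_in_flight / 8 / 1e9        # 32.0 GB
bandwidth_tb_per_s = LINK_RATE_BITS / 8 / 1e12        # 32.0 TB/s

print(f"{transit_time_s * 1000:.1f} ms of light in transit")
print(f"{gigabytes_in_flight:.0f} GB 'stored' in the fiber")
print(f"{bandwidth_tb_per_s:.0f} TB/s of bandwidth")
```

In other words, at any instant about a millisecond's worth of data is physically traveling through the 200 km loop, and at 256 Tb/s that millisecond holds 32 GB.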
As Carmack points out, this is the "modern equivalent of the old mercury echo tube memories", or delay line memory, where data is stored as pulses traveling through a medium, a column of mercury in the earliest designs, a coil of wire in later ones.
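The principle behind delay line memory is easy to model: bits circulate through a fixed-length delay and are re-injected at the input, so the "memory" is whatever is currently in transit. A toy sketch (hypothetical parameters, not any real hardware's timing):

```python
from collections import deque

class DelayLine:
    """Toy delay-line memory: a fixed-length queue of bits in transit."""

    def __init__(self, capacity_bits):
        # The line starts full of zeros; capacity is set by its length.
        self.line = deque([0] * capacity_bits, maxlen=capacity_bits)

    def tick(self, write_bit=None):
        # One bit-time passes: the oldest bit emerges at the output, and
        # either a new bit or the emerging bit itself (recirculation)
        # is fed back into the input end.
        out = self.line.popleft()
        self.line.append(out if write_bit is None else write_bit)
        return out

# Write the pattern 1,0,1,1 into an 8-bit line, then recirculate:
# the data only exists "in flight" and reappears once per full loop.
dl = DelayLine(8)
for b in (1, 0, 1, 1):
    dl.tick(write_bit=b)
readback = [dl.tick() for _ in range(8)]
print(readback)  # [0, 0, 0, 0, 1, 0, 1, 1]
```

Reading means waiting for a bit to come around again, which is exactly why Carmack's fiber-loop version only makes sense for sequential access patterns like streaming model weights.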
It's not a feasible idea today so much as a concept for the future, as mentioned, but Carmack argues it's a conceivable path forward, one with arguably a "better growth trajectory" than what we're currently seeing with traditional DRAM.
Analysis: fast forward
There are very obvious problems with RAM these days in terms of supply and demand, with the latter far outweighing the former thanks to the rise of AI and the huge memory requirements that come with it. (Not just for servers in data centers that answer queries to popular AI models, but also for video RAM on AI accelerator boards.)
So what Carmack is envisioning is a different way of operating AI models that uses fiber lines. In theory, this could free the rest of us from worrying about the ridiculous cost of RAM (or of a whole PC, or a graphics card, and so on down the list of the memory crisis's knock-on effects on prices).
The catch is that such a fiber proposal faces many hurdles, as Carmack acknowledges, including the sheer amount of fiber needed and the difficulty of maintaining signal strength around the loop.
However, there are other possibilities along these lines, and other people have been talking about similar concepts in recent years. Carmack mentions: “Much more practically, you should be able to combine cheap flash memory to provide almost any read bandwidth you need, as long as it’s done on a page-by-page basis and pipelined well in advance. That should be viable for inference service today if the flash and accelerator vendors could agree on a high-speed interface.”
In other words, it’s an army of cheap flash memory modules linked together, running massively in parallel, but, as Carmack points out, the key would be to agree on an interface where these chips could work directly with the AI accelerator.
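The pattern Carmack describes, streaming weight pages from cheap flash with reads pipelined well in advance, can be sketched with a simple producer-consumer loop. This is a minimal illustration only, with made-up page names standing in for flash reads; it assumes nothing about any real accelerator interface:

```python
import queue
import threading

# Sketch of pipelined, page-by-page weight streaming: a prefetch thread
# reads pages from (simulated) flash ahead of time while the compute
# loop consumes them, so read latency is hidden. This only works because
# inference touches weights in a known, deterministic order.

PAGE_COUNT = 4
pages = [f"weights_page_{i}" for i in range(PAGE_COUNT)]  # stand-in for flash

def prefetch(q):
    for p in pages:        # deterministic access pattern, known in advance
        q.put(p)           # stand-in for a flash page read
    q.put(None)            # end-of-stream marker

q = queue.Queue(maxsize=2)  # small buffer: reads overlap with compute
threading.Thread(target=prefetch, args=(q,), daemon=True).start()

processed = []
while (page := q.get()) is not None:
    processed.append(page)  # stand-in for running a layer's math

print(processed)
```

The bounded queue is the point: the compute side never waits on flash as long as the prefetcher stays a page or two ahead, which is what "pipelined well in advance" buys you.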
This is an interesting short-term proposition, but it depends on the relevant manufacturers (of GPUs and flash memory) getting their act together and coming up with a new system along these lines.
The RAM crisis is predicted to last through this year, and probably next year as well, and could drag on even longer, bringing all kinds of pain for consumers. Looking for alternative memory solutions for AI models could therefore be a valuable pursuit, one that might ensure this RAM crisis is the last episode of its kind we have to suffer.
