- Arm Lumex chip designs promise major improvements to on-device AI
- Its CPUs offer up to 5x better AI performance
- The CPU is positioned as the universal AI engine
Arm has taken the wraps off its next-generation Lumex chip designs, which are optimized to run AI workloads locally on mobile devices.
Its architecture allows for four different core designs, ranging from power-efficient cores for wearables to high-performance cores for flagship phones.
With accelerated product cycles bringing tighter timescales and a reduced margin for error, Arm says its integrated platforms combine CPU, GPU, and software stacks to speed up time to market.
Arm Lumex could be used in your next smartphone
Arm described Lumex as its "new purpose-built compute subsystem (CSS) platform designed to meet the growing demands of on-device AI experiences."
The Armv9.3 C1 CPU cluster includes built-in SME2 units for AI acceleration, promising 5x better AI performance and 3x greater efficiency compared with the previous generation.
Standard benchmarks show a 30% performance uplift, with 15% faster app performance and 15% lower energy use in daily workloads compared with the previous generation.
The four CPUs on offer are the C1-Ultra for large-model inference, C1-Premium for multitasking, C1-Pro for video playback, and C1-Nano for wearables.
The Mali G1-Ultra GPU also enables faster AI/ML inference than the Immortalis-G925, as well as gaming improvements such as 2x ray tracing performance.
Lumex also offers G1-Premium and G1-Pro GPU options, but no G1-Nano.
Interestingly, Arm positions the CPU as the universal AI engine given the lack of standardization around NPUs, even as NPUs begin to earn their place in PC chips.
Lumex launches with a complete software stack that is ready for Android 16, including SME2-enabled KleidiAI libraries and telemetry for analyzing performance and identifying bottlenecks, allowing developers to tailor Lumex to each model.
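The announcement does not include developer code, but as a rough illustration of what tailoring an app to SME2-capable hardware can look like in practice, here is a minimal C sketch of runtime SME2 detection on an Arm64 Linux/Android device. It only reads the kernel's hardware-capability bits; it assumes nothing about KleidiAI's own API, and the dispatch messages are placeholders.

```c
// Minimal sketch (not from Arm's announcement): gate an SME2-optimized code
// path at runtime by reading the AArch64 hardware-capability bits exposed by
// the Linux/Android kernel. HWCAP2_SME2 is only present in recent kernel/NDK
// headers, so the check is guarded and falls back to a non-SME2 path.
#include <stdio.h>

#if defined(__aarch64__)
#include <sys/auxv.h>   /* getauxval() */
#include <asm/hwcap.h>  /* HWCAP2_* feature bits */
#ifndef AT_HWCAP2
#define AT_HWCAP2 26    /* standard Linux auxv tag for the second hwcap word */
#endif
#endif

/* Returns 1 if the kernel reports SME2 support, 0 otherwise (or if unknown). */
static int cpu_has_sme2(void) {
#if defined(__aarch64__) && defined(HWCAP2_SME2)
    return (getauxval(AT_HWCAP2) & HWCAP2_SME2) != 0;
#else
    return 0; /* non-AArch64 build or headers predate SME2: assume unsupported */
#endif
}

int main(void) {
    if (cpu_has_sme2()) {
        printf("SME2 available: dispatch to SME2-optimized kernels\n");
    } else {
        printf("SME2 not available: fall back to NEON/SVE code paths\n");
    }
    return 0;
}
```

Frameworks that integrate KleidiAI are intended to handle this kind of selection automatically, which is how existing apps can pick up SME2 acceleration without code changes.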
"Mobile computing is entering a new era defined by how intelligence is built, scaled, and delivered," said Arm Senior Director Kinjal Dave.
Looking ahead, Arm points out that many popular Google apps are already SME2-enabled, meaning they are ready to benefit from improved on-device AI features when next-generation hardware becomes available.