Next-Generation Mobile DRAM Targets On-Device AI with Faster Speeds, 20% Lower Power Use
SK hynix announced Tuesday the development of its next-generation 16-gigabit LPDDR6 DRAM, built on its 10-nanometer-class sixth-generation 1c process. This cutting-edge memory solution is engineered to enhance on-device artificial intelligence (AI) capabilities in mobile devices.
LPDDR (low-power double data rate) memory is a standard in smartphones and tablets, known for its low-voltage operation that significantly reduces power consumption, contributing to longer battery life.
The company stated that it has achieved the world’s first certification for a 1c-based LPDDR6 product after showcasing the chip at CES 2026 in Las Vegas in January.
“We are on track to complete preparations for mass production in the first half of the year, with plans to commence product supply in the second half,” stated an SK hynix official. “This initiative will contribute to establishing a versatile memory lineup optimized for AI implementation across various applications.”
The advanced chip is primarily designed for smartphones and tablets equipped with on-device AI, allowing AI functions to run directly on the device rather than relying on external servers.
By processing data locally, on-device AI enables quicker response times and more personalized services.
According to SK hynix, the new LPDDR6 chip offers substantial improvements in both processing speed and power efficiency compared to its LPDDR5X predecessor.
By expanding memory bandwidth, the LPDDR6 significantly boosts data throughput, achieving approximately 33 percent higher data processing performance compared to the previous generation. It boasts an operating speed of at least 10.7 gigabits per second, exceeding the maximum performance capabilities of LPDDR5X.
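The article does not state the baseline it measures the roughly 33 percent gain against; assuming a previous-generation LPDDR5X per-pin rate of about 8.0 gigabits per second (an assumption, not a figure from the article), the arithmetic works out as follows:

```python
# Rough check of the quoted ~33% uplift. The LPDDR5X baseline of
# 8.0 Gb/s per pin is an assumption, not stated in the article.
lpddr5x_gbps = 8.0   # assumed previous-generation per-pin data rate
lpddr6_gbps = 10.7   # per-pin data rate quoted for the new LPDDR6 chip

uplift = (lpddr6_gbps - lpddr5x_gbps) / lpddr5x_gbps
print(f"uplift: {uplift:.1%}")  # close to the article's "approximately 33 percent"
```

A different baseline (LPDDR5X tops out near 8.5 Gb/s per pin in the JEDEC standard) would yield a smaller percentage, so the exact comparison point is unclear from the article alone.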
The chip also incorporates a sub-channel architecture and dynamic voltage and frequency scaling (DVFS) technology to optimize performance and energy efficiency.
The sub-channel design activates only the necessary data paths for specific tasks, while DVFS automatically adjusts voltage and frequency according to the current operating conditions.
Together, these technologies result in a power consumption reduction of more than 20 percent compared to the previous generation.
In demanding scenarios such as gaming, DVFS raises operating levels to maximize bandwidth. During normal usage, the system intelligently lowers frequency and voltage to conserve power.
“Consumers can anticipate not only extended battery life but also a smoother multitasking experience,” the official added.
This development aligns with SK Group’s broader strategy to strengthen its presence in the artificial intelligence semiconductor market.
During the Trans-Pacific Dialogue 2026 in February, SK Group Chairman Chey Tae-won emphasized the critical role of memory innovation in addressing the escalating demand for AI solutions.
“AI is essentially consuming massive amounts of electricity,” he stated, highlighting the increasing energy demands driven by the expansion of AI infrastructure.
Industry experts predict that power-efficient memory solutions will become increasingly vital as AI workloads are deployed across mobile and edge devices.
