Older generations of low power DRAM (LPDDR) can be perfectly adequate for many AI devices at the edge...
For all the chatter about high performance or emerging memories meeting the needs of artificial intelligence (AI), the enduring, legacy memories appear to have a role to play for AI devices at the edge, including older generations of low power DRAM (LPDDR).
Winbond Electronics Corp.’s 1Gb LPDDR3 DRAM die is one such example: AI company Kneron has selected it for its latest system-on-chip (SoC), the KL720. It’s one of several SoCs the company offers that are being used in a variety of edge devices, including battery-powered applications such as smart locks and drones that take advantage of a 512Mb LPDDR2 from Winbond. By using Winbond’s LPDDR3 in its KL720 SoC, Kneron is looking to support a new set of low-power/high-performance applications for AI and machine learning technology.
Winbond DRAM marketing manager Jacky Tseng said Winbond’s LPDDR3 DRAM can deliver a maximum bandwidth of 8.5GB/s, operate from a dual 1.2V/1.8V supply, and offer power-saving features such as a Deep Power-Down mode and a Clock Stop capability. These specifications enable customer devices like Kneron’s KL720 to process 4K, Full HD, or 3D sensor video images in real time to support AI applications such as face recognition in security cameras or gesture control in public kiosks, as well as perform natural language processing.
Even as big DRAM players advance LPDDR4 and roll out LPDDR5, many customers don’t need premium DRAM for their applications, said Tseng, and they don’t want to pay the premium pricing either. Customers such as Kneron are looking for a certain level of performance at relatively low densities. “Even the LPDDR3 is good enough to implement some AI training models,” he said. “The AI training model can be optimized to reduce the size so 1Gb is enough, but it still requires high bandwidth.”
Beyond the needs of Kneron and the applications for its KL720 SoC, Tseng said there are potential uses for devices with the density and bandwidth of Winbond’s LPDDR3 DRAM in automotive applications such as Advanced Driver Assistance Systems (ADAS), which employ cameras that must process video images in real time. There are also many opportunities in IoT endpoints that need to do basic AI inference, he said, which, like the Kneron SoC, require low densities but high bandwidth.
More powerful inference requirements at the edge using FPGA and ASICs would require more memory such as 4Gb LPDDR4, said Tseng, while more centralized AI training and inference at the heart of the cloud powered by CPUs and GPUs demand higher density and higher performance LPDDR4, LPDDR5 and even High Bandwidth Memory (HBM) and GDDR6. Ultimately, there are many “layers” of edge computing requiring different power / performance ratios, including those well-suited for Winbond’s LPDDR3 DRAM, he said.
Jim Handy, principal analyst with Objective Analysis, said this latest offering from Winbond is in line with the company’s broader product portfolio, as a lot of its devices get sold into applications that require low chip counts. “They’re huge in serial NOR flash and sell a lot of SRAM into smaller kinds of applications where DRAM would be overkill.”
He had expected that a wide variety of options would be touted for AI at the edge before the market boiled down to a single architecture, such as a single microcontroller chip or something that looked like a smaller PC with five chips, but that hasn’t happened yet. “There’s still such an incredible variety that it’s hard to say that there’s really a trend going anywhere for AI,” he said. “Emerging memories are still a wish and a prayer. They’ll be something important at the edge, but that’s not happening yet.”