2025-04-02 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/02 Report--
Source: Content compiled by Semiconductor Industry Watch (ID: icbank) from Venturebeat
This article is about 1,142 words; the recommended reading time is 4 minutes.
This article introduces a phase-change-memory-based machine learning technique from IBM's Zurich lab that achieves high energy efficiency and high accuracy at the same time.
Researchers at IBM's Zurich laboratory published a paper in Nature Communications this week in which they claim to have developed a machine-learning scheme, based on phase-change memory technology, that achieves both high energy efficiency and high accuracy. It is an in-memory computing approach using resistance-based storage devices, which avoids the drawbacks of keeping data storage and computation separate and greatly reduces power consumption in the process.
Many existing AI inference schemes physically separate the memory and processing units, so AI models are stored in off-chip memory. Data must then be shuttled between the units, which adds computational overhead, slows processing, and increases power consumption.
IBM's technique instead builds on phase-change memory (PCM), a nonvolatile memory that is faster than common flash technology. If the work proves scalable, it could pave the way for powerful hardware that runs AI on drones, robots, mobile devices, and other compute-constrained devices.
As the IBM team explains, the challenge with phase-change memory devices is that they tend to introduce computational inaccuracy: they are analog in nature, so their precision is limited by device variability and by read/write conductance noise.
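The effect of those analog imperfections on a matrix-vector multiply (the core operation a PCM crossbar performs) can be sketched as below. This is a minimal numpy model, not IBM's implementation; the `write_noise` and `read_noise` scales are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_matvec(W, x, write_noise=0.02, read_noise=0.01, rng=rng):
    """Model y = W @ x on a PCM crossbar: weights are stored as
    conductances with a programming (write) error, and every read
    adds further conductance noise. Noise scales are illustrative."""
    scale = np.abs(W).max()
    W_prog = W + rng.normal(0.0, write_noise * scale, W.shape)  # write error
    W_read = W_prog + rng.normal(0.0, read_noise * scale, W.shape)  # read noise
    return W_read @ x

W = rng.normal(size=(4, 8))
x = rng.normal(size=8)
exact = W @ x
noisy = analog_matvec(W, x)  # close to exact, but not equal
```

The point of the sketch is that the error is unavoidable per read, which is why the training-side mitigation described next matters.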
The proposed solution is to inject extra noise while training the AI models in software, improving the models' resilience. The approach proved successful: after the trained weights (the parameters used to transform the input data) were mapped onto the phase-change memory components, a ResNet model trained with injected noise reached 93.7% accuracy on the popular CIFAR-10 dataset and 71.6% top-1 accuracy on ImageNet.
Furthermore, after the model's weights were mapped onto the 723,444 phase-change memory devices of a prototype chip, accuracy stayed above 92.6% over a one-day test period. The researchers claim this is a record.
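Why accuracy decays over a day at all comes down to conductance drift: a programmed PCM cell's conductance relaxes over time. A commonly used empirical model is a power law, G(t) = G(t0) * (t/t0)^(-nu); the sketch below uses an illustrative drift exponent of nu = 0.05, which is in the typical range reported for PCM, not a value taken from this paper.

```python
import numpy as np

def drifted_conductance(g0, t, t0=1.0, nu=0.05):
    """Empirical PCM drift model G(t) = G(t0) * (t / t0) ** (-nu).
    nu is the drift exponent; 0.05 is illustrative, not from the paper."""
    return g0 * (t / t0) ** (-nu)

g0 = 1.0
one_day = 86400.0  # seconds
g_day = drifted_conductance(g0, one_day)  # conductance after one day
```

Over a day the effective weights shrink noticeably, which is what the compensation technique described next has to undo.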
To further preserve accuracy over time, the study's co-authors also developed a compensation technique that periodically corrects the activation function (the equation that determines a model's output) during inference. This improved hardware accuracy to 93.5%, they said.
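One simple form such a periodic correction can take is a global rescaling: record a reference activation statistic on a calibration input at programming time, then at inference time scale the layer output by the ratio of the reference to the current statistic. The sketch below assumes drift is uniform across devices, which makes the correction exact; in real hardware it is only approximate. This is an illustrative mechanism, not necessarily the exact procedure in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

W = rng.normal(size=(16, 32))
x_cal = rng.normal(size=32)            # fixed calibration input
ref = np.abs(W @ x_cal).sum()          # reference summed |activation|

drift = 0.8                            # assumed uniform conductance decay
W_drifted = drift * W                  # all weights shrink together

cur = np.abs(W_drifted @ x_cal).sum()  # re-measured statistic after drift
alpha = ref / cur                      # periodic correction factor
y_corrected = alpha * (W_drifted @ x_cal)
```

Because only one scalar per layer is measured and applied, the correction is cheap enough to repeat periodically during inference.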
At the same time, the team experimented with training machine learning models using simulated phase-change memory components. The report says they achieved "software-equivalent" accuracy using a mixed-precision architecture on several types of small-scale models, including multilayer perceptrons, convolutional neural networks, long short-term memory (LSTM) networks, and generative adversarial networks. They detailed the training experiments in a recent study published in Frontiers in Neuroscience.
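A common trick behind mixed-precision in-memory training is to accumulate gradient updates in a high-precision digital variable and transfer them to the coarse analog weight only when they exceed the device's programming granularity. The sketch below illustrates that accumulate-and-transfer pattern; the granularity `eps` and the gradient values are made-up numbers for illustration.

```python
eps = 0.1          # assumed device programming granularity
w_analog = 0.0     # coarse weight stored on the PCM device
chi = 0.0          # high-precision digital accumulator

grads = [0.03, 0.04, 0.05, -0.02, 0.06]  # example gradient sequence
for g in grads:
    chi -= g                     # accumulate the descent update digitally
    n_pulses = int(chi / eps)    # whole device steps accrued (truncates toward 0)
    w_analog += n_pulses * eps   # program the device in eps-sized pulses
    chi -= n_pulses * eps        # keep the un-transferred remainder
```

No information is lost: the analog weight plus the remainder always equals the full-precision weight, while the device is only reprogrammed in steps it can actually resolve.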
IBM's latest work in the field follows its earlier introduction of phase-change memory chips for AI training. Although that technology is reportedly still in the research phase, the company's researchers have demonstrated a system that stores weight data as electrical charges and delivers 100 times more computation per square millimeter than a graphics card while consuming 280 times less power.
"Fast, low-power, and reliable DNN inference engines are very attractive in an era when more and more applications, including battery-powered IoT devices and autonomous vehicles, are moving toward AI," IBM said in a statement. "The AI hardware accelerator architectures we are studying have great potential to support both DNN training and inference."
- End-
https://www.toutiao.com/i6828549234929697287/
© 2024 shulou.com SLNews company. All rights reserved.