MIT Announces 95% Reduction in Power Consumption of New Neural Network Chips

Neural networks are powerful, but they consume a lot of energy. MIT engineers have now developed a chip that cuts the power consumption of neural network computations by 95%, which could make it practical to run them on battery-powered mobile devices.

Smartphones today are becoming increasingly intelligent, offering more and more artificial-intelligence services such as digital assistants and real-time translation. But the neural networks that process the data for these services usually run in the cloud, with smartphone data shuttled back and forth to remote servers.

This is far from ideal: it demands a lot of communication bandwidth, and it means that potentially sensitive data is transmitted to and stored on servers outside the user's control. The obvious alternative, running the network on the device itself, has its own problem: executing neural networks on a graphics processor draws far too much energy for devices with limited battery power.

MIT engineers have now designed a chip that dramatically reduces the need to shuttle data back and forth between the chip's memory and its processors, cutting power consumption by 95%. A neural network consists of thousands of interconnected artificial neurons arranged in layers. Each neuron receives input from multiple neurons in the layer below it, and if the combined input crosses a certain threshold, it sends an output to multiple neurons in the layer above. The strength of each connection is governed by a weight set during training.
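The neuron behavior described above can be sketched in a few lines of code. This is an illustrative example only, not MIT's design; the function name and threshold value are our own choices.

```python
# Illustrative sketch of a single artificial neuron: it computes a
# weighted sum of its inputs and "fires" (outputs 1) if that sum
# crosses a threshold.

def neuron_output(inputs, weights, threshold=0.0):
    """Return 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    combined = sum(x * w for x, w in zip(inputs, weights))
    return 1 if combined > threshold else 0

# Two inputs with trained weights: 0.5*0.9 + 0.8*(-0.2) = 0.29 > 0, so it fires.
print(neuron_output([0.5, 0.8], [0.9, -0.2]))
```

Each such multiply-and-sum is exactly the operation that, on a conventional chip, forces repeated trips to memory.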

This means that for each neuron, a conventional chip must fetch the input for a particular connection and that connection's weight from memory, multiply them, store the result, and then repeat the process for every input. That is a great deal of data movement, and data movement costs energy. MIT's new chip takes a different approach, using analog circuits to compute over all the inputs in parallel inside memory itself. This greatly reduces the amount of data that has to be moved, and ultimately saves a lot of energy. The method requires connection weights to be binary rather than spanning a range of values, but earlier theoretical work had suggested this would not hurt accuracy much, and the researchers found that the chip's results were generally within 2% to 3% of those of a conventional non-binary neural network running on a standard computer.
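The key simplification binary weights buy can be shown in a short sketch. This is not the chip's actual circuitry, just a digital analogy of the idea: when every weight is +1 or -1, each multiply-accumulate collapses into adding or subtracting an input, an operation simple enough to perform for all inputs at once in analog memory.

```python
# Illustrative sketch (not the actual chip): with binary weights,
# the dot product needs no multiplications at all.

def binarize(weights):
    """Collapse real-valued trained weights to +1/-1 by sign."""
    return [1 if w >= 0 else -1 for w in weights]

def dot_binary(inputs, bin_weights):
    """Dot product with +1/-1 weights: just add or subtract each input."""
    return sum(x if w > 0 else -x for x, w in zip(inputs, bin_weights))

trained_weights = [0.7, -0.3, 0.1]
inputs = [1.0, 2.0, 3.0]
# 1.0 - 2.0 + 3.0 = 2.0
print(dot_binary(inputs, binarize(trained_weights)))
```

The accuracy cost of discarding weight magnitudes is the 2-3% gap the researchers measured against a full-precision network.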

This is not the first time researchers have created chips that process data in memory to reduce the power consumption of neural networks, but it is the first time the approach has been used to run the powerful convolutional neural networks behind image-based artificial-intelligence applications. Dario Gil, IBM's vice president of artificial intelligence, said in a statement that the results show impressive performance for convolution operations carried out in a memory array, and that the approach will open the possibility of using more complex convolutional neural networks for image and video classification in the Internet of Things.

MIT's team is not alone in working on this problem. The desire to put artificial intelligence into smartphones, home appliances, and all manner of Internet of Things devices is pushing Silicon Valley's giants into low-power AI chips.

Apple has integrated its Neural Engine into the iPhone X to power its facial-recognition technology and other features. Amazon is reportedly developing its own custom AI chip for the next generation of its Echo digital assistant. The big chip companies are also increasingly inclined to support advanced features such as machine learning, which forces them to make their hardware more energy efficient. Earlier this year, ARM introduced two new chips: the ARM Machine Learning processor, aimed at general AI tasks from translation to facial recognition, and the Object Detection processor, which detects objects such as human faces in images.

Qualcomm's new mobile chip, the Snapdragon 845, ships with a graphics processor and puts artificial intelligence front and center. The company has also released the Snapdragon 820E, aimed at drones, robots, and industrial equipment. Looking further ahead, IBM and Intel are developing neuromorphic chips whose architectures are inspired by the human brain and its remarkable energy efficiency. In theory, this could let IBM's TrueNorth chip and Intel's Loihi chip run powerful machine-learning workloads with a fraction of the energy required by conventional chips, but at this stage both technologies remain highly experimental.

Getting these chips to run neural networks as powerful as those behind cloud-computing services will be a huge challenge. But given the current pace of innovation, it may not be long before real artificial intelligence is at your fingertips.

