Some of the best circuits to drive AI in the future may be analog, not digital, and research teams around the world are increasingly developing new devices to support such analog AI.
The most basic computation in the deep neural networks driving the current explosion in AI is the multiply-accumulate (MAC) operation. Deep neural networks are composed of layers of artificial neurons. In a MAC operation, the output of each neuron in one layer is multiplied by the strength, or “weight,” of its connection to a neuron in the next layer, which then sums up these contributions.
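To make the arithmetic concrete, here is a minimal sketch in Python of a fully connected layer written as explicit multiply-accumulate loops. The function name `dense_layer` and the toy numbers are illustrative, not drawn from any particular framework.

```python
import numpy as np

def dense_layer(inputs: np.ndarray, weights: np.ndarray, biases: np.ndarray) -> np.ndarray:
    # inputs:  shape (n_in,)       -- activations from the previous layer
    # weights: shape (n_out, n_in) -- connection strengths ("weights")
    # biases:  shape (n_out,)
    outputs = np.empty(len(biases))
    for j in range(len(biases)):          # one output neuron at a time
        acc = biases[j]
        for i in range(len(inputs)):      # MAC: multiply, then accumulate
            acc += weights[j, i] * inputs[i]
        outputs[j] = acc
    return outputs

# Example: 3 inputs feeding 2 output neurons means 6 MAC operations
x = np.array([0.5, -1.0, 2.0])
W = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6]])
b = np.zeros(2)
print(dense_layer(x, W, b))   # equivalent to W @ x + b
```

In practice these loops are fused into matrix-vector products, but every entry of the result is still built up one multiply-accumulate at a time, which is why MAC throughput and energy dominate deep-learning hardware budgets.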
Modern computers have digital components devoted to MAC operations, but analog circuits could theoretically perform these computations using orders of magnitude less energy. This strategy, known as analog AI, compute-in-memory, or processing-in-memory, often performs multiply-accumulate operations using non-volatile memory devices such as flash, magnetoresistive RAM (MRAM), resistive RAM (RRAM), phase-change memory (PCM), and even more esoteric technologies.
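A toy simulation can show why memory devices are natural MAC engines. In a resistive crossbar, weights are encoded as cell conductances and inputs as row voltages; Ohm's law does the multiplication in each cell and Kirchhoff's current law does the accumulation along each column. The sketch below assumes that idealized encoding, and the function name `crossbar_mac` and the example values are hypothetical.

```python
import numpy as np

def crossbar_mac(voltages: np.ndarray, conductances: np.ndarray) -> np.ndarray:
    # voltages:     shape (n_rows,)        -- input activations encoded as volts
    # conductances: shape (n_rows, n_cols) -- weights stored as device conductances
    cell_currents = conductances * voltages[:, None]  # Ohm's law: I = G * V per cell
    return cell_currents.sum(axis=0)                  # Kirchhoff: column wires sum currents

V = np.array([0.2, 0.5, 0.1])          # volts on the rows
G = np.array([[1e-6, 2e-6],
              [3e-6, 4e-6],
              [5e-6, 6e-6]])           # siemens, one column per output
print(crossbar_mac(V, G))              # column currents, in amperes, are the MAC results
```

The appeal of compute-in-memory is visible in the structure of this sketch: the multiply and the accumulate happen where the weights are stored, so the energy-hungry step of shuttling weights between memory and a separate processor largely disappears. Real devices add noise, drift, and limited precision that this idealized model ignores.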