Advancing Analog Reservoir Computing through Temporal Attention and MLP Integration

Khalil Sedki and Yang Yi
Virginia Tech


Abstract

This paper presents a novel approach to image classification that integrates an analog delay feedback reservoir (DFR), a temporal attention mechanism, and a multi-layer perceptron (MLP) trained via backpropagation. The DFR system simplifies recurrent neural networks by confining training to the readout stage, offering enhanced performance and adaptability. The study details the design of an analog DFR system for low-power embedded applications, which uses a temporal encoder, a Mackey-Glass nonlinear module, and a dynamic delayed feedback loop to process sequential inputs with minimal power consumption. Implemented in standard GF 22 nm CMOS FD-SOI technology, the system achieves high energy efficiency and a compact footprint, consuming only 155 μW and occupying a design area of 0.0044 mm², and shows promise in emulating mammalian brain behavior. In addition, this paper introduces a temporal attention mechanism that operates directly on continuous analog signals, enhancing the DFR system's ability to capture relevant temporal patterns. Furthermore, our approach incorporates an MLP for post-processing the DFR output. This comprehensive approach, integrating the DFR, the temporal attention mechanism, and the MLP via backpropagation, advances the development of computationally efficient reservoir computing (RC) systems for image classification, achieving 98.48% accuracy.
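
To make the pipeline concrete, the listing below is a minimal software-level sketch of the three stages named above: a delayed-feedback reservoir with a Mackey-Glass nonlinearity that expands a 1-D input sequence into virtual-node states, a temporal attention step that pools those states over time, and the resulting feature vector that would feed an MLP readout. The function names, parameter values (eta, gamma, p, n_virtual), and the random attention scores are illustrative assumptions for exposition only; they do not reproduce the paper's analog circuit or trained model.

    # Hypothetical sketch of a delay feedback reservoir (DFR) with a Mackey-Glass
    # nonlinearity and softmax temporal attention pooling (assumed parameters).
    import numpy as np

    rng = np.random.default_rng(0)

    def mackey_glass(a, eta=0.4, p=1.0):
        """Mackey-Glass style saturating nonlinearity: f(a) = eta * a / (1 + |a|**p)."""
        return eta * a / (1.0 + np.abs(a) ** p)

    def dfr_states(u, n_virtual=50, gamma=0.5):
        """Run a 1-D input sequence u through a delayed-feedback reservoir.

        Each time step is expanded into n_virtual "virtual nodes" by a fixed
        random input mask; every node mixes the masked input with its own
        delayed (previous-step) state through the nonlinearity.
        Returns an array of shape (len(u), n_virtual).
        """
        mask = rng.uniform(-1.0, 1.0, n_virtual)      # fixed input mask
        x = np.zeros(n_virtual)                       # delayed feedback state
        states = np.empty((len(u), n_virtual))
        for k, u_k in enumerate(u):
            x = mackey_glass(x + gamma * mask * u_k)  # delayed feedback + masked input
            states[k] = x
        return states

    def temporal_attention_pool(states):
        """Softmax attention over time: weight each step by a score.

        The score vector w is random here for illustration; in training it
        would be optimized jointly with the MLP readout via backpropagation.
        """
        w = rng.normal(size=states.shape[1])
        scores = states @ w                           # one score per time step
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                          # attention weights over time
        return alpha @ states                         # weighted temporal summary

    # Example: treat one flattened image as a temporal input sequence.
    u = rng.uniform(0.0, 1.0, 28 * 28)                # stand-in for a 28x28 image
    pooled = temporal_attention_pool(dfr_states(u))
    print(pooled.shape)                               # (50,) feature vector for the MLP readout

In the actual system described in the paper, the reservoir dynamics and attention operate on continuous analog signals in hardware; only the general data flow (mask, nonlinear delayed feedback, attention-weighted pooling, MLP readout) is mirrored in this sketch.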