Hyperdimensional Computing vs. Neural Networks: Comparing Architecture and Learning Process

Dongning Ma1, Cong Hao2, Xun Jiao1
1Villanova University, 2Georgia Institute of Technology


Abstract

Hyperdimensional Computing (HDC) has attracted considerable attention as an emerging non-von Neumann computing paradigm. Inspired by the way the human brain functions, HDC leverages high-dimensional patterns to perform learning tasks. Compared to neural networks, HDC has shown advantages such as energy efficiency and smaller model size, but sub-par learning capabilities in sophisticated applications. Therefore, HDC is usually regarded as a promising alternative to neural networks for resource-constrained scenarios that often require ultra-lightweight solutions. Recently, researchers have observed that, when combined with neural network components, HDC can achieve better performance than conventional HDC models. This motivates us to explore the theoretical foundations of HDC, particularly its connections to and differences from neural networks. In this paper, we conduct a comparative study between HDC and neural networks and provide a different angle: HDC can be derived from an extremely compact neural network trained upfront. Experimental results show that such a neural network-derived HDC model can achieve up to 21% and 5% higher accuracy than conventional and learning-based HDC models, respectively. This paper aims to provide more insights and shed light on future research directions for this popular emerging learning paradigm.