Locality-sensing Fast Neural Network (LFNN): An Efficient Neural Network Acceleration Framework via Locality Sensing for Real-time Video Queries

Xiaotian Ma, Jiaqi Tang, Yu Bai
California State University, Fullerton


Abstract

As computer vision tasks have been continuously advanced by deep neural networks (DNNs), researchers from academia and industry have focused on developing powerful DNN models to process large volumes of data. With the increasing size of DNN models, their inference is computationally expensive, which limits the deployment of DNNs in real-time applications. In response, we present the Locality-sensing Fast Neural Network (LFNN), a generalized framework that accelerates video query processing via locality sensing, reducing the cost of DNN-based video evaluation with up to a three-fold saving in inference time. The LFNN framework automatically senses the similarity between two input frames via a locality measure defined over a given input video. This enables input videos to be processed with a specialized method that is far less computationally expensive than conventional DNN inference, which runs detection on every frame. By exploiting the temporal locality information across frames, we accelerate YOLOv5 by two to three times. Experimental results show that the proposed LFNN can be easily implemented on an FPGA board with negligible extra hardware cost.

Index Terms—Locality Sensing, Neural Network, FPGA
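To make the core idea concrete, the following is a minimal sketch of locality-based frame skipping, under stated assumptions: the locality measure is approximated here by a mean absolute pixel difference, and `detect` stands in for an expensive detector forward pass (e.g., YOLOv5). Both the measure and the threshold are illustrative placeholders, not the authors' actual LFNN design.

import numpy as np

def frame_locality(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    # Mean absolute pixel difference as a stand-in locality measure
    # (illustrative; the paper's actual locality definition may differ).
    return float(np.mean(np.abs(curr_frame.astype(np.float32)
                                - prev_frame.astype(np.float32))))

def run_query(frames, detect, threshold=2.0):
    # frames    : iterable of HxWxC uint8 arrays
    # detect    : callable mapping a frame to detection results
    #             (hypothetical; e.g., a YOLOv5 forward pass)
    # threshold : locality value below which inference is skipped
    results, ref_frame, cached = [], None, None
    for frame in frames:
        if ref_frame is None or frame_locality(ref_frame, frame) > threshold:
            cached = detect(frame)   # expensive DNN inference on a novel frame
            ref_frame = frame        # update the reference frame
        results.append(cached)       # similar frame: reuse the cached result
    return results

Under this scheme, frames that are highly similar to the last detected frame reuse its cached detections, so the detector runs on only a fraction of the video; the achievable speed-up depends on how much temporal locality the input video exhibits.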