Algorithms and circuits for low-power machine learning IC
Thosani, Tejas Hemant
Date of Issue: 2017
School of Electrical and Electronic Engineering
The world of artificial neural networks is a fascinating field inspired by the biological model of learning. Multi-layered feed-forward networks require significant human intervention for tuning and show slow processing speeds. An alternative model, a single-hidden-layer feedforward neural network with randomized input weights and hidden-layer biases, has been proposed to improve efficiency and processing time by almost a thousandfold. We look at extreme learning machines, proposed by Prof. Guang-Bin Huang, whose theory suggests that the input weights and the hidden-layer biases can be randomly assigned if the activation functions are infinitely differentiable. We test different datasets to generate models using noisy parameters for regression, medical classification applications such as diabetes diagnosis, and speech recognition on cochlear-implant-extracted sound data. We study techniques to generalize data and optimize the hidden layer and output of the machine by tuning parameters based on our needs. We also look at circuit implementations of sub-blocks of the neural network concerning the activation thresholding functions, after optimizing the same for our datasets of interest. Future research into implementing the entire neural network in hardware, and the implications of the non-idealities arising from it, is discussed.
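The extreme learning machine described above can be sketched in a few lines: the input weights and hidden biases are drawn at random and never trained, and only the output weights are solved in closed form with a pseudo-inverse. The function names, hidden-layer size, and toy regression target below are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def elm_train(X, T, n_hidden=40, seed=0):
    """Fit an ELM: random hidden layer, closed-form output weights (sketch)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (untrained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (untrained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid: infinitely differentiable
    beta = np.linalg.pinv(H) @ T                     # output weights via Moore-Penrose pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy regression example: learn y = sin(x) from 200 samples.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X).ravel()
W, b, beta = elm_train(X, T)
pred = elm_predict(X, W, b, beta)
mse = np.mean((pred - T) ** 2)
```

Because no gradient descent is involved, training reduces to one matrix solve, which is the source of the large speed-up over back-propagation mentioned above.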
DRNTU::Engineering::Electrical and electronic engineering
Final Year Project (FYP)
Nanyang Technological University