Alexandria Digital Research Library

Hardware aware and architecture friendly training of memristive crossbar circuits as neural network pattern classifiers

Author:
Zamanidoost, Elham
Degree Grantor:
University of California, Santa Barbara. Electrical & Computer Engineering
Degree Supervisor:
Dmitri Strukov
Place of Publication:
[Santa Barbara, Calif.]
Publisher:
University of California, Santa Barbara
Creation Date:
2016
Issued Date:
2016
Topics:
Computer engineering and Electrical engineering
Genres:
Online resources and Dissertations, Academic
Dissertation:
Ph.D.--University of California, Santa Barbara, 2016
Description:

The field of artificial neural networks is experiencing a resurgence of interest due to increased demand for intelligent systems that can process data with minimal human intervention. In the information era, classifying, recognizing, and clustering data are everyday tasks that must be performed accurately and efficiently. High-performance computing systems, although accurate, are not very efficient at implementing neural networks and are limited by the speed and power challenges of the von Neumann architecture. The development of specialized hardware presents itself as a solution to this challenge and opens a field of research that has been explored using conventional as well as emerging technologies. Among the new technologies that are candidates for neural network implementation, memristor crossbars have been in the spotlight in recent years due to their numerous attractive characteristics. A memristor, as a two-terminal nanoscale device, is highly scalable and can be fabricated in three-dimensional arrays to achieve the high density of connections required for powerful neural networks, such as convolutional nets. Also, its nonlinear I-V characteristics, along with its non-volatility, make it well suited for synapse implementation.

Therefore, in this thesis our goal is to consider memristive crossbar circuits as the synaptic layers of a feedforward neural network and to study and develop efficient training algorithms for such an implementation. Our focus is mainly on transistor-less crossbar implementations of the weight layers, which yield the highest density of connections per unit area.
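
As a rough illustration of the crossbar-as-weight-layer idea (a minimal sketch, not taken from the thesis), the analog vector-matrix multiply can be modeled in a few lines of Python: input voltages drive the rows, each device contributes a current G_ij * V_i by Ohm's law, and column currents sum by Kirchhoff's current law. The differential-pair encoding of signed weights, the conductance range, the current-to-voltage gain, and the tanh activation below are all assumptions for illustration.

import numpy as np

def crossbar_layer(v_in, g_pos, g_neg):
    # Column current = sum over rows of G_ij * V_i (Ohm's law plus
    # Kirchhoff's current law); a differential pair of crossbars
    # (G+ and G-) encodes signed weights.
    i_out = v_in @ g_pos - v_in @ g_neg
    # Hypothetical neuron: current-to-voltage gain followed by tanh.
    return np.tanh(i_out * 1e4)

rng = np.random.default_rng(0)
g_p = rng.uniform(10e-6, 100e-6, size=(4, 3))  # assumed 10-100 uS conductance range
g_n = rng.uniform(10e-6, 100e-6, size=(4, 3))
print(crossbar_layer(np.array([0.1, -0.1, 0.2, 0.0]), g_p, g_n))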

We have explored ex-situ, in-situ, and hybrid training methods for multilayer neural networks in the presence of an accurate memristor dynamic model, to show the challenges of using such devices as synapses and to offer solutions for efficiently tuning all (or several) devices in an array in parallel. To show the resilience of our training approach, we have considered realistic phenomena associated with emerging technologies, such as device-to-device switching variation, stuck-at-open (and stuck-at-closed) defects within the memristive crossbar, and the limitations of tuning devices in large arrays. Our training algorithms are tested against standard, well-known benchmarks to compare the effectiveness and performance of our trained networks against other state-of-the-art methods, as well as software-implemented training.
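
To make the non-idealities concrete, the following minimal sketch (assumptions throughout, not the thesis's exact algorithm) simulates in-situ training of a single-layer perceptron whose weights live in a crossbar: each update is a fixed-amplitude conductance pulse whose sign follows the error gradient, scaled by per-device switching variation, while stuck devices ignore updates. The 5% defect rate, 30% variation, learning rate, and sign-based update rule are illustrative choices only.

import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, lr = 8, 3, 0.05

w = rng.uniform(-0.1, 0.1, size=(n_in, n_out))       # effective synaptic weights
stuck = rng.random((n_in, n_out)) < 0.05              # ~5% stuck-at devices (assumed rate)
gain = rng.normal(1.0, 0.3, size=(n_in, n_out))       # device-to-device update variation

def train_step(x, target):
    global w
    y = np.tanh(x @ w)
    err = target - y
    grad = np.outer(x, err * (1.0 - y ** 2))
    # Fixed-amplitude pulse whose sign follows the gradient, scaled by the
    # per-device variation; stuck devices ignore the update entirely.
    dw = lr * np.sign(grad) * gain
    w = np.where(stuck, w, w + dw)
    return float(np.mean(err ** 2))

# Toy usage: fit a fixed random target mapping despite the non-idealities.
w_target = rng.uniform(-1.0, 1.0, size=(n_in, n_out))
for step in range(200):
    x = rng.uniform(-1.0, 1.0, size=n_in)
    loss = train_step(x, np.tanh(x @ w_target))
print("final sample loss:", loss)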

Furthermore, we have experimentally demonstrated ex-situ and in-situ training of a memristive-crossbar-implemented perceptron classifier as a proof of concept and a small-scale demonstration of our proposed training algorithm.

Physical Description:
1 online resource (101 pages)
Format:
Text
Collection(s):
UCSB electronic theses and dissertations
ARK:
ark:/48907/f3v69jrh
ISBN:
9781369576559
Catalog System Number:
990047512380203776
Rights:
In Copyright
Copyright Holder:
Elham Zamanidoost
Access:
This item is restricted to on-campus access only. Please check our FAQs or contact UCSB Library staff if you need additional assistance.