Spiking Neural Network Architecture Design and Performance Exploration Towards the Design of a Scalable Neuro-Inspired System for Complex Cognition Applications

ABSTRACT

Research into artificial neural networks (ANNs) is inspired by the dynamic, massively parallel way in which information is processed by biological neurons. Conventional ANN research has found a wide range of applications, including automation, but problems of timing, power consumption, and massive parallelism remain. Spiking neural networks (SNNs), the third generation of neural networks, have drawn attention from a growing number of researchers because their explicit treatment of timing brings them closer to the behaviour of biological spiking neurons. Spike timing plays an important role in every spiking neuron and makes SNNs computationally more plausible than conventional ANNs. The distinct timing and spike-firing behaviour of real biological neurons can be modelled artificially using neurodynamics and spiking neuron models. The spike-timing-dependent plasticity (STDP) learning rule also incorporates timing: it describes general plasticity rules that depend on the actual timing of pre- and postsynaptic spikes, which makes it well suited to training SNNs. This work presents a software implementation of an SNN based on the Leaky Integrate-and-Fire (LIF) neuron model and the STDP learning algorithm. We also present a novel hardware design and architecture of a lightweight neuro-processing core (NPC) to be implemented in a packet-switched neuro-inspired system named NASH. The NASH architecture uses the LIF neuron model and a reduced flit format that addresses the problems of timing and high power consumption. Software evaluation shows that our network achieves 94% accuracy on the MNIST dataset of handwritten digits.

 

 

TABLE OF CONTENTS

 

CERTIFICATION
ABSTRACT
DEDICATION
ACKNOWLEDGEMENT
LIST OF FIGURES
LIST OF TABLES
LIST OF ABBREVIATIONS
CHAPTER ONE INTRODUCTION
1.1 Neuro-Inspired Systems and Spiking Neurons
1.2 Research Background and Motivation
1.3 Statement of Problem
1.4 Research Aim and Objectives
1.5 Research Methodology
1.6 Research Contributions
CHAPTER TWO LITERATURE REVIEW
2.1 Spiking Neural Networks (SNNs)
2.2 Related Works
2.2.1 The IBM TrueNorth
2.2.2 The ROLLS Neuromorphic Processor
2.2.3 The NeuroGrid
2.2.4 The SpiNNaker
2.2.5 The 3D OASIS NoC
2.3 Learning Mechanisms
2.3.1 Offline Learning Algorithms
2.3.2 Online Learning Algorithms
2.4 Neuron Spiking Models
2.4.1 Hodgkin and Huxley Neuron Model
2.4.2 FitzHugh-Nagumo Neuron Model
2.4.3 Integrate-and-Fire (IF) and Leaky IF (LIF) Neuron Models
2.5 Software Simulators and Examples
2.5.1 Brian Simulator
2.5.2 NEURON Simulator
2.5.3 GENESIS Simulator
CHAPTER THREE METHODOLOGY AND IMPLEMENTATION
3.1 Introduction to STDP
3.2 STDP Model and Algorithm
3.3 Methods
3.3.1 Computational Model
3.3.2 Spiking Neuron and Synapse Model
3.3.3 Architecture and Network Learning
3.3.4 Data Encoding
3.3.5 Network Training and Classification
3.4 Chapter Summary
CHAPTER FOUR ON THE DESIGN OF SCALABLE SNN BASED ON NOC ARCHITECTURE
4.1 Introduction
4.2 Network on Chip (NoC)
4.3 OASIS Network on Chip (OASIS-NoC)
4.3.1 Switching and Packet Format
4.3.2 Switch Architecture and Buffer
4.3.3 Flow Control and Routing Mechanism
4.4 High-Level Neuro-Inspired Architectures in Hardware
4.4.1 NASH Components Description
4.5 Advantages of High-Level NASH Design over Traditional Bus Systems
4.6 Applications of the NASH On-Chip System
4.7 Chapter Summary
CHAPTER FIVE RESULTS, ANALYSIS, AND EVALUATION
5.1 Introduction
5.2 Result Analysis on Accuracy
5.3 Discussion
5.4 Result Analysis on Power Consumption
CHAPTER SIX CONCLUSION AND FUTURE WORK
6.1 Introduction
6.2 Conclusion
6.3 Technical Challenges
6.4 Future Work
REFERENCES

 

CHAPTER ONE

INTRODUCTION
1.1 Neuro-Inspired Systems and Spiking Neurons
Artificial neural networks (ANNs) form an attractive and highly competitive research area in artificial intelligence, inspired by the remarkable performance of the interconnected biological brain. Robert Hecht-Nielsen, one of the pioneers of neurocomputing, defined a neural network as 'a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs'. The biological neural networks of the brain gave birth to ANNs, with scientists digging deep into how best to mimic brain functionality using silicon chips.
The premise is that the connection architecture of the biological brain can be mimicked with silicon and wires in place of living neurons and dendrites. The human brain contains on the order of 86 billion neurons, each connecting to thousands of other cells through axons (von Bartheld, Bahney & Herculano-Houzel, 2016). Dendrites accept inputs from the sensory organs and the external environment and generate electrical impulses that travel rapidly through the neural network, passing messages from neuron to neuron.
Deep learning, an active research area within machine learning, is concerned with developing algorithms inspired by the structure and function of the brain. ANNs have been developed to solve a wide range of computational problems, but earlier research largely ignored timing, which is the hallmark of spiking neural networks (SNNs). Most current ANN models are built on extremely simplified brain dynamics (Ghosh-Dastidar & Adeli, 2009), yet they have become popular computational tools for complex classification, function estimation, and pattern recognition problems.
Spiking neural networks incorporate spike-timing-dependent plasticity (STDP), a mechanism that adjusts the strength of a connection (synapse) between neurons in the brain based on the relative timing of a neuron's output and input action potentials. Over the past decade, SNNs comprising spiking neurons have been developed; because of their inherently dynamic representation, information transmission in these neurons mimics that in natural neurons. The massive parallelism of the brain has focused researchers' attention on many competitive areas of neuro-inspired systems, and many algorithms have been developed to let machines and systems, including IoT devices, leverage brain-like performance. Neurocomputing has broad and robust applications in science and technology. Neuro-inspired systems have proved particularly inspiring in biomedicine and neuroscience, because the collaboration between biological and electronic circuits has led to ultra-low-power, noise-robust chips that could serve the deaf, blind, and paralysed, and that could also lead to advanced ear-inspired radio receivers (Sarpeshkar, 2012).
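To make the timing dependence of STDP concrete, the following Python sketch shows a pair-based STDP weight update with exponential time windows. It is a minimal illustration only; the amplitude and time-constant values are assumptions for demonstration and are not the parameters used in this work (the STDP model actually used is described in Chapter 3).

```python
import numpy as np

# Illustrative pair-based STDP update with exponential windows.
# A_PLUS, A_MINUS, TAU_PLUS, TAU_MINUS are assumed demonstration values.
A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants in ms

def stdp_delta_w(t_pre, t_post):
    """Weight change for one pre/post spike pair, given spike times in ms."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post -> potentiation (strengthen synapse)
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    elif dt < 0:  # post fires before pre -> depression (weaken synapse)
        return -A_MINUS * np.exp(dt / TAU_MINUS)
    return 0.0

# Example: presynaptic spike at 10 ms followed by a postsynaptic spike at 15 ms.
print(stdp_delta_w(10.0, 15.0))   # small positive change (potentiation)
print(stdp_delta_w(15.0, 10.0))   # small negative change (depression)
```

The sign of the change depends only on which spike came first, and its magnitude decays with the time gap, which is exactly the timing sensitivity that conventional ANN learning rules lack.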
1.2 Research Background and Motivation
The background of this research centres on spiking neural networks (SNNs) modelled with the Leaky Integrate-and-Fire (LIF) spiking neuron model and trained with the STDP learning algorithm. We are motivated by the efficient, parallel processing of biological neurons. The biological brain implements massively parallel computation using a complex architecture that differs from the current von Neumann machine. The brain is a low-power, fault-tolerant, high-performance machine: it consumes only about 20 W, and its circuits continue to operate as the organism requires even when perturbed. The interconnection of the brain's neurons motivates our work towards future on-chip systems.
1.3 Statement of Problem
Timing is a significant issue in implementing neuro-inspired systems, yet it is not considered in conventional neural networks, which encode information with static input coding (for example, binary bit patterns such as 0011 and 0010).
In an SNN, by contrast, time-related parameters can be used alongside the pattern coding to represent information, which increases the information-processing capacity of the network. Implementing an SNN with spiking neuron models addresses this timing problem, hence the need for this research.
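As an illustration of how timing itself can carry information, the hypothetical Python sketch below encodes an 8-bit pixel intensity as a time-to-first-spike, so the value is expressed by when a spike occurs rather than by a static bit pattern. The encoding window and mapping are assumptions chosen for demonstration, not the scheme used in this work (data encoding is covered in Section 3.3.4).

```python
import numpy as np

# Hypothetical latency (time-to-first-spike) encoding of an 8-bit pixel value.
# A conventional ANN would feed the raw value (or its binary pattern) directly;
# an SNN can instead carry the same value in *when* a spike occurs.
T_WINDOW = 100.0  # encoding window in ms (assumed value)

def latency_encode(pixel, t_window=T_WINDOW):
    """Map pixel intensity 0..255 to a spike time: brighter pixels spike earlier."""
    if pixel == 0:
        return None                       # no spike for a fully dark pixel
    return t_window * (1.0 - pixel / 255.0)

pixels = np.array([0, 64, 128, 255])
spike_times = [latency_encode(p) for p in pixels]
print(spike_times)  # [None, ~74.9, ~49.8, 0.0] -> the timing carries the intensity
```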
1.4 Research Aim and Objectives
This research aims to explore the theoretical framework behind spiking neuron models and to investigate the architecture of the OASIS Network on Chip (OASIS-NoC). In pursuing this aim, the research is guided by the following objectives:
• To implement a software-based SNN using the Leaky Integrate-and-Fire neuron model;
• To train the network with the STDP learning algorithm;
• To test the SNN on digit recognition with the MNIST dataset of handwritten digits;
• To propose a novel, scalable, high-level Neuro-Inspired Architecture in Hardware (NASH) for a future OASIS Network on Chip (OASIS-NoC).
1.5 Research Methodology
The methodology of this research is based on studying the related literature to establish the general concepts. From these studies, we propose a design for a novel high-level architecture, NASH, for future on-chip systems, and we describe its main components.
We implemented a software-based spiking neural network (SNN) using a Leaky Integrate-and-Fire (LIF) neuron model. The SNN is trained with the STDP learning algorithm for digit recognition on the MNIST dataset of handwritten digits.
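For reference, the following Python sketch shows a minimal forward-Euler simulation of a single LIF neuron of the kind this implementation is built around. The membrane parameters, time step, and input current are illustrative assumptions, not the values used in the actual implementation (which is detailed in Chapter 3).

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron, forward-Euler integration.
# All constants below are assumed demonstration values.
TAU_M   = 10.0   # membrane time constant (ms)
V_REST  = -65.0  # resting potential (mV)
V_RESET = -65.0  # reset potential after a spike (mV)
V_TH    = -50.0  # firing threshold (mV)
DT      = 0.1    # simulation time step (ms)

def simulate_lif(input_current, n_steps):
    """Integrate a constant input drive; return the membrane trace and spike times."""
    v = V_REST
    trace, spikes = [], []
    for step in range(n_steps):
        # leaky integration: decay toward rest plus the injected drive
        dv = (-(v - V_REST) + input_current) / TAU_M
        v += dv * DT
        if v >= V_TH:             # threshold crossing -> emit a spike and reset
            spikes.append(step * DT)
            v = V_RESET
        trace.append(v)
    return np.array(trace), spikes

trace, spikes = simulate_lif(input_current=20.0, n_steps=2000)
print(f"{len(spikes)} spikes in {2000 * DT:.0f} ms")
```

Stronger input drives the membrane potential to threshold sooner, so the neuron fires earlier and more often; combined with STDP, this timing behaviour is what the network exploits during training.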
The rest of the research work is organized in chapters as follows: Chapter 2 is the literature review on some related works, discussing SNNs, learning mechanisms, neuron spiking models and software simulators; Chapter 3 covers methodology and implementation, discussing STDP, STDP algorithm, and methods; Chapter 4 is on the design of scalable SNN based on NoC Architecture, discussing NoC, OASIS-NoC and proposing a novel NASH as the main contribution of this research; Chapter 5 contains results, analysis, and evaluation; and finally Chapter 6 discusses the research conclusion, challenges, and future work.
1.6 Research Contributions
The interconnection of many cores on a single chip remains a bottleneck in system design, because power consumption, scalability, and throughput must all be addressed appropriately. Network on Chip (NoC) is a promising solution for efficiently interconnecting many cores on a single chip (Ahmed & Abdallah, 2012). This research leverages the NoC architecture to propose a novel, high-level, scalable Neuro-Inspired Architecture in Hardware for complex cognition applications. Hence, we make the following contributions.
1. Study and implementation of a software-based Spiking Neural Network (SNN) using a Leaky Integrate-and-Fire (LIF) neuron model with a spike timing dependent plasticity (STDP) learning rule.
2. Design of a novel architecture and circuit development towards the implementation of a spiking neuro-inspired architecture, including the hardware design and evaluation of a LIF core for the neuro-inspired spiking architecture.
