[kythuat]
Parallel and distributed implementation of a multilayer perceptron neural network on a wireless sensor network


[/kythuat]
[tomtat]
Parallel and distributed implementation of a multilayer perceptron neural network on a wireless sensor network
Table of Contents
Abstract
Acknowledgements
Table of Contents
List of Tables
List of Figures
1 Introduction
2 Background
2.1 Artificial Neural Networks
2.1.1 Neuron Computational Model
2.1.2 Multilayer Perceptron Neural Network
2.1.3 Learning for ANNs
2.2 Parallel and Distributed Processing for ANNs
2.2.1 Supercomputer-based Systems
2.2.2 GPU-based Systems
2.2.3 Circuit-based Systems
2.2.4 WSN-based Systems
2.3 Scalability of MLP-BP
2.4 Wireless Sensor Networks
2.4.1 Single Node (Mote) Architecture
2.4.2 Network Protocols
2.5 WSN Simulators
2.5.1 Bit Level Simulators
2.5.2 Packet Level Simulators
2.5.3 Algorithm Level Simulators
2.5.4 Proposed Approach of Simulation for WSN-MLP Design
3 Probabilistic Modeling of Delay and Drop Phenomena for Packets Carrying Neuron Outputs in WSNs
3.1 Neuron Outputs and Wireless Communication Delay
3.2 Modeling the Probability Distribution for Packet Drop and Delay Phenomena
3.2.1 Literature Survey
3.2.2 Data Set for Building the Drop Model
3.2.3 Empirical Model as an Equation for Packet Delivery Ratio vs. Node Count
3.2.4 The Number of Transmission Hops
3.3 Neuron Outputs and Wireless Communication Delay
3.3.1 Delay and Delay Variance
3.3.2 Delay Generation using Truncated Gaussian Distribution
3.4 Modeling the Neuron Output Delay (NOD)
3.4.1 Distance Calculation
3.4.2 Model of the Delay for Transmission of Neuron Outputs
4 Simulation Study: Preliminaries
4.1 Data Sets
4.1.1 Iris Data Set
4.1.2 Wine Data Set
4.1.3 Ionosphere Data Set
4.1.4 Dermatology Data Set
4.1.5 Handwritten Numerals Data Set
4.1.6 Isolet Data Set
4.1.7 Gisette Data Set
4.2 Data Preprocessing
4.2.1 Data Normalization
4.2.2 Balance of Classes
4.2.3 Data Set Partitioning for Training and Testing
4.3 MLP Neural Network Parameter Settings
4.3.1 Training Algorithm
4.3.1.1 Back-Propagation with Adaptive Learning Rate
4.3.1.2 Resilient Back-Propagation
4.3.1.3 Conjugate Gradient Back-Propagation
4.3.1.4 Levenberg-Marquardt Algorithm
4.3.1.5 Back-Propagation with Momentum
4.3.2 Learning Rate, Momentum and Hidden Layer Neuron Count
5 Simulation Study
5.1 The Simulator
5.2 Parameter Value Settings
5.3 Simulation Results
5.3.1 Iris Data Set
5.3.2 Wine Data Set
5.3.3 Ionosphere Data Set
5.3.4 Dermatology Data Set
5.3.5 Handwritten Numerals Data Set
5.3.6 Isolet Data Set
5.3.7 Gisette Data Set
5.3.8 Summary and Discussion
5.4 Performance Comparison with Studies Reported in Literature
5.5 Time and Message Complexity
5.5.1 Time Complexity of WSN-MLP
5.5.2 Message Complexity of WSN-MLP
5.6 Weights of Neurons in Output Layer
6 Conclusions
6.1 Research Study Conclusions
6.2 Recommendations for Future Study
References
A. Data from Literature Survey for Drop and Delay
B. Time and Message Complexity
C. C++ Code for WSN-MLP Simulator
[/tomtat]
