Lecture Notes For All: Introduction to Neural Networks


Tuesday, May 17, 2011

Introduction to Neural Networks

Topics covered:

Artificial Neural Network Theory, models and architectures, neurobiological basis, learning theory, applications and hardware implementation issues.
Textbook:
Applications of Neural Networks in Electromagnetics (first 6 chapters), by Christos Christodoulou and Michael Georgiopoulos, Artech House, 2001 (recommended).

Useful References:
Introduction to Neural Networks, by J. M. Zurada, West Publishing Company, 1992.
Neural Networks: A Comprehensive Foundation, by Simon Haykin, Prentice Hall, second edition, 2001.
Neural Network Design, by Hagan, Demuth and Beale, PWS Company, 1996.
Perceptrons (Expanded Edition), by Minsky and Papert, 1988.
Pattern Classification, by Duda, Hart, and Stork, Wiley-Interscience, second edition, 2001.

Instructor:
Dr. Michael Georgiopoulos, Engineering Building 1, Room 407, Tel: (407) 823-5338, E-mail: michaelg@mail.ucf.edu





Objectives:
The objective of this course is to examine the fundamental concepts of neural network computing from both the theoretical and the applications point of view. A variety of neural network architectures, together with their associated learning algorithms, will be examined thoroughly. Furthermore, successful applications of neural networks will be discussed, and comparisons of neural network architectures with existing approaches will be conducted whenever data are available.

Prerequisites by Topic:
From the catalog data, it appears that the course has the Pattern Recognition class (EEL 5825) as a prerequisite. Although a background in pattern recognition will help you absorb the material faster, EEL 5825 is not a hard prerequisite; in other words, the material presented in this class is self-contained. The mathematical prerequisites are the standard calculus courses, material normally covered in the first two years of an engineering, computer science, or physics curriculum. Finally, to get the full benefit of the material, some programming experience in a high-level language, such as C, C++, Fortran, or MATLAB, is needed.

Course Outline:

Introduction to Neural Networks:
  • Preliminaries
  • Benefits of Neural Networks
  • Types of Activation Functions (see the sketch after this list)
  • Multi-Layer Feed-Forward Networks
  • Learning Procedures (Supervised, Unsupervised, Hybrid Learning)
  • Learning Tasks (Association, Pattern Classification, Clustering, Prediction)
  • Knowledge Representation
  • Brief History of Neural Networks
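
To make the activation-function item above concrete, here is a minimal Python sketch (not part of the original notes; the function names and the slope parameter are illustrative) of a few standard choices: the hard-limiter, a piecewise-linear unit, and the logistic and hyperbolic-tangent sigmoids.

```python
import numpy as np

# Illustrative implementations of activation function types commonly
# covered in introductory neural network courses.

def threshold(v):
    """Hard-limiter (McCulloch-Pitts unit): outputs 1 if v >= 0, else 0."""
    return np.where(v >= 0.0, 1.0, 0.0)

def piecewise_linear(v):
    """Linear on [-0.5, 0.5], saturating at 0 and 1 outside that range."""
    return np.clip(v + 0.5, 0.0, 1.0)

def logistic(v, a=1.0):
    """Logistic sigmoid with slope parameter a; output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a * v))

def tanh_sigmoid(v):
    """Hyperbolic-tangent sigmoid; output in (-1, 1)."""
    return np.tanh(v)

if __name__ == "__main__":
    v = np.linspace(-3.0, 3.0, 7)
    for f in (threshold, piecewise_linear, logistic, tanh_sigmoid):
        print(f.__name__, f(v))
```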

Single-Layer and Multi-Layer Perceptrons
  • The Single-Layer Perceptron
  • Perceptron Learning Algorithm (a minimal sketch follows this list)
  • A Geometrical Interpretation of the Perceptron Learning Algorithm
  • Adaline Network
  • Multi-Layer Perceptron
  • The Back-Propagation Algorithm
  • Issues with the Back-Propagation Algorithm
  • Variations of the Back-Propagation Algorithm
  • Applications
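
As a rough illustration of the perceptron learning algorithm listed above, the following Python sketch assumes bipolar labels (-1/+1) and a hard-limiter activation; the function names and the small AND example are illustrative, not taken from the course material.

```python
import numpy as np

def perceptron_train(X, y, eta=0.1, epochs=100):
    """Rosenblatt's perceptron learning rule for a single hard-limiter unit.

    X: (n_samples, n_features) inputs; y: labels in {-1, +1}.
    Returns weights w and bias b of the separating hyperplane.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x_i, y_i in zip(X, y):
            y_hat = 1.0 if np.dot(w, x_i) + b >= 0.0 else -1.0
            if y_hat != y_i:
                # Update only on a misclassification: move the hyperplane
                # toward the misclassified point.
                w += eta * y_i * x_i
                b += eta * y_i
                errors += 1
        if errors == 0:   # converged (guaranteed only for separable data)
            break
    return w, b

if __name__ == "__main__":
    # Toy linearly separable problem: logical AND with bipolar labels.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, -1, -1, 1], dtype=float)
    w, b = perceptron_train(X, y)
    print("weights:", w, "bias:", b)
```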

Radial Basis Functions – Kohonen Networks
  • Preliminaries of Radial Basis Function Networks
  • Learning Strategies with Radial Basis Function Networks
      ◦ Fixed centers selected at random (see the sketch after this list)
      ◦ Self-organized selection of centers
      ◦ Supervised selection of centers
      ◦ Supervised selection of centers and variances
  • A Radial Basis Function Neural Network Algorithm
  • Issues with Radial Basis Function Learning
  • The General Regression Neural Network (GRNN)
  • Applications
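
The sketch below illustrates the "fixed centers selected at random" strategy from the list above: Gaussian basis functions are centered on randomly chosen training points and the linear output weights are found by least squares. It is a minimal Python sketch with assumed names and parameters (number of centers, a common width sigma), not the course's reference implementation.

```python
import numpy as np

def gaussian_design_matrix(X, centers, sigma):
    """phi_j(x) = exp(-||x - c_j||^2 / (2 sigma^2)) for every x and center."""
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def rbf_train(X, y, n_centers=10, sigma=1.0, seed=0):
    """Fixed centers chosen at random from the data; output weights by
    linear least squares (pseudo-inverse of the design matrix)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(n_centers, len(X)), replace=False)
    centers = X[idx]
    Phi = gaussian_design_matrix(X, centers, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, w

def rbf_predict(X, centers, w, sigma=1.0):
    return gaussian_design_matrix(X, centers, sigma) @ w

if __name__ == "__main__":
    # Fit a noisy sine curve with a handful of randomly placed centers.
    rng = np.random.default_rng(1)
    X = rng.uniform(-3.0, 3.0, size=(80, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
    centers, w = rbf_train(X, y, n_centers=10, sigma=1.0)
    print(rbf_predict(X[:5], centers, w))
```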

Adaptive Resonance Theory Neural Networks
  • The Fuzzy ARTMAP Neural Network (a simplified sketch of its ART clustering module follows this list)
  • The Fuzzy ARTMAP Architecture
  • Operating Phases of the Fuzzy ARTMAP Architecture
  • Geometrical Interpretation of the Fuzzy ARTMAP Learning
  • Properties of Learning in Fuzzy ARTMAP
  • Applications
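
As a rough sketch of the ideas listed above, the Python code below implements a simplified version of the unsupervised Fuzzy ART module that sits inside Fuzzy ARTMAP: complement coding, the choice function, the vigilance (match) test, and fast learning. Fuzzy ARTMAP itself adds a map field linking categories to class labels and match tracking, which are omitted here; all names and parameter values are illustrative.

```python
import numpy as np

def fuzzy_art(patterns, rho=0.75, alpha=0.001, beta=1.0):
    """Simplified Fuzzy ART clustering (the unsupervised module inside
    Fuzzy ARTMAP).  Inputs are assumed to lie in [0, 1]^d; complement
    coding makes each category a hyper-rectangle in input space.
    Returns the category weights and the category index of each pattern.
    """
    d = patterns.shape[1]
    weights = []            # one 2d-dimensional weight vector per category
    assignments = []
    for a in patterns:
        I = np.concatenate([a, 1.0 - a])      # complement coding, |I| = d
        if not weights:
            weights.append(I.copy())
            assignments.append(0)
            continue
        W = np.array(weights)
        inter = np.minimum(I, W)              # fuzzy AND (component-wise min)
        T = inter.sum(axis=1) / (alpha + W.sum(axis=1))   # choice function
        chosen = None
        for j in np.argsort(-T):              # search in order of choice value
            if inter[j].sum() / d >= rho:     # vigilance (match) test
                chosen = int(j)
                break
        if chosen is None:
            weights.append(I.copy())          # no category matches: create one
            chosen = len(weights) - 1
        else:
            # Fast learning (beta = 1): shrink the weights to the fuzzy AND.
            weights[chosen] = beta * inter[chosen] + (1.0 - beta) * weights[chosen]
        assignments.append(chosen)
    return np.array(weights), assignments

if __name__ == "__main__":
    pts = np.array([[0.10, 0.10], [0.15, 0.12], [0.90, 0.85], [0.88, 0.90]])
    W, cats = fuzzy_art(pts, rho=0.8)
    print(cats)   # the two clusters should land in different categories
```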

Recurrent Neural Networks (or other topic)
  • Preliminaries of Associative Memories
  • The Hopfield Model (a minimal sketch follows this list)
  • Optimization Problems Using the Hopfield Model
  • The RTRL (Real-Time Recurrent Learning) Neural Network
  • The Elman Neural Network
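
To accompany the Hopfield items above, here is a minimal Python sketch of the discrete Hopfield model: bipolar patterns are stored with the Hebbian outer-product rule and recalled by asynchronous updates. Function names and the toy example are illustrative only.

```python
import numpy as np

def hopfield_store(patterns):
    """Store bipolar (+1/-1) patterns with the Hebbian outer-product rule;
    self-connections (the diagonal) are set to zero."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)
    return W / n

def hopfield_recall(W, state, sweeps=10, seed=0):
    """Asynchronous recall: repeatedly update one randomly chosen unit."""
    rng = np.random.default_rng(seed)
    s = state.astype(float).copy()
    n = len(s)
    for _ in range(sweeps * n):
        i = rng.integers(n)
        s[i] = 1.0 if W[i] @ s >= 0.0 else -1.0
    return s

if __name__ == "__main__":
    # Store one pattern and recall it from a corrupted copy.
    p = np.array([1, -1, 1, -1, 1, -1, 1, -1], dtype=float)
    W = hopfield_store(p[None, :])
    noisy = p.copy()
    noisy[0] *= -1
    noisy[3] *= -1
    print(hopfield_recall(W, noisy))
```
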
Notes:
