Multi-Layer Perceptrons in Neural Networks

Introduction to Neural Networks and Fuzzy Logic, Lecture 11
Dr.-Ing. Erwin Sitompul, President University (http://zitompul.wordpress.com)
MLP Architecture

[Figure: MLP architecture. Inputs x1, x2, x3 enter the input layer, pass through one or more hidden layers, and produce outputs y1 and y2 at the output layer; w_{ji}, w_{kj}, w_{lk} label the weights between layers.]
• Possesses sigmoid activation functions in the neurons to enable modeling of nonlinearity.
• Contains one or more "hidden layers".
• Trained using the "Backpropagation" algorithm.
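For concreteness, here is a minimal MATLAB sketch of the unipolar sigmoid activation and one neuron's forward computation. The slope parameter a is an assumption, taken from the form f(x) = 1/(1 + e^(-ax)) implied by the homework assignments at the end of this lecture; the input and weight values are made up.

    % Unipolar sigmoid with slope parameter a: squashes any real input
    % into (0, 1) and is differentiable, which is what makes gradient-based
    % backpropagation (and hence nonlinear modeling) possible.
    sigmoid  = @(x, a) 1 ./ (1 + exp(-a .* x));
    dsigmoid = @(y, a) a .* y .* (1 - y);   % derivative, written in terms of the output y

    % Forward computation of one neuron: weighted sum, then activation.
    x = [0.5; -1.0];          % example inputs (made up)
    w = [0.3; 0.7];           % example weights (made up)
    y = sigmoid(w' * x, 1);   % neuron output with a = 1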
MLP Design Considerations
• What activation functions should be used?
• How many inputs does the network need?
• How many hidden layers does the network need?
• How many hidden neurons per hidden layer?
• How many outputs should the network have?
 
 
There is no standard methodology for determining these values. Although there are some heuristic guidelines, the final values are determined by a trial-and-error procedure.
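Since the slide prescribes trial and error, here is a hedged MATLAB sketch of what that search can look like: loop over candidate hidden-layer sizes, train each candidate, and keep the one with the lowest validation error. The function train_mlp and the data variables are hypothetical, stand-ins for whatever training routine and data set are actually used.

    % Hypothetical trial-and-error search over the number of hidden neurons.
    % train_mlp is a placeholder assumed to return a trained network and
    % its error on a held-out validation set.
    bestErr = inf;
    for nHidden = 1:10
        [net, valErr] = train_mlp(Xtrain, dtrain, Xval, dval, nHidden);  % hypothetical
        if valErr < bestErr
            bestErr  = valErr;
            bestNet  = net;
            bestSize = nHidden;
        end
    end
    fprintf('Best hidden-layer size: %d (validation error %.4f)\n', bestSize, bestErr);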
Advantages of MLP
 
• An MLP with one hidden layer is a universal approximator: it can approximate any function to within any preset accuracy, provided that the weights and biases are appropriately assigned through the use of an adequate learning algorithm.
• An MLP can be applied directly in the identification and control of dynamic systems with a nonlinear relationship between input and output.
• An MLP delivers the best compromise between the number of parameters, structural complexity, and calculation cost.
Learning Algorithm of MLP
 
[Figure: signal-flow diagram of the Backpropagation Learning Algorithm. The function signal propagates forward through the network (forward propagation), while the error signal propagates backward (backward propagation).]

Computations at each neuron j:
• Neuron output y_j, during forward propagation.
• Vector of error gradients \partial E / \partial w_{ji}, during backward propagation.
MLP Training
• Forward pass: fix the weights w_ji(n) and compute the neuron outputs y_j(n).
• Backward pass: calculate the local gradients δ_j(n) and update the weights to w_ji(n+1).
Learning Algorithm of MLP

Goal: adjust the weights \mathbf{w} so that, for each training input, the network output y_k^l(i) matches the desired output d(i).

Cost function / performance index:

    E(\mathbf{w}) = \frac{1}{2} \sum_{i=1}^{p} \bigl( d(i) - y_k^l(i) \bigr)^2

Minimize E(\mathbf{w}) with respect to the weights, using the weight modification rule:

    \Delta \mathbf{w} = -\eta \, \frac{\partial E}{\partial \mathbf{w}}
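As a minimal MATLAB sketch of this rule, assuming the gradient has already been computed by backpropagation and collected into an array dE_dw of the same shape as the weight array w (both names are illustrative):

    eta = 0.1;               % learning rate (illustrative value; tuned by trial and error)
    w   = w - eta * dE_dw;   % Δw = −η ∂E/∂w, applied elementwise to every weight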
Backpropagation Learning Algorithm:
• Learning on the output neuron
• Learning on the hidden neurons
Notations:

    y_k^l(i) = f(\mathrm{net}_k^l(i)), \qquad \mathrm{net}_k^l(i) = \sum_{j=1}^{m} w_{kj}^l \, y_j^{l-1}(i)

• y_k^l(i): the output of the k-th neuron of the l-th layer, at the i-th time instant.
• y_j^{l-1}(i): the output of the j-th neuron of the (l−1)-th layer, at the i-th time instant.
Back Propagation Learning Algorithm

Learning on the output neuron:

    \frac{\partial E}{\partial w_{kj}^l}
      = \sum_{i=1}^{p} \frac{\partial E}{\partial y_k^l(i)}
        \frac{\partial y_k^l(i)}{\partial \mathrm{net}_k^l(i)}
        \frac{\partial \mathrm{net}_k^l(i)}{\partial w_{kj}^l}
      = -\sum_{i=1}^{p} \bigl( d(i) - y_k^l(i) \bigr) \, f'(\mathrm{net}_k^l(i)) \, y_j^{l-1}(i)
      = -\sum_{i=1}^{p} \delta_k^l(i) \, y_j^{l-1}(i)

where the local gradient is \delta_k^l(i) = \bigl( d(i) - y_k^l(i) \bigr) f'(\mathrm{net}_k^l(i)). The factor f'(\mathrm{net}_k^l(i)) depends on the activation function.
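As a worked instance, assuming the unipolar sigmoid f(x) = 1/(1 + e^{-ax}) (the slope parameter a that appears in the homework assignments), the derivative takes the convenient closed form

    f'(x) = a \, f(x) \bigl( 1 - f(x) \bigr)

so the output-neuron local gradient becomes

    \delta_k^l(i) = \bigl( d(i) - y_k^l(i) \bigr) \, a \, y_k^l(i) \bigl( 1 - y_k^l(i) \bigr)

which can be evaluated directly from quantities already available after the forward pass.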
Learning on the hidden neuron:

    \frac{\partial E}{\partial w_{ji}^{l-1}}
      = -\sum_{i=1}^{p} \Bigl[ \sum_{k} \delta_k^l(i) \, w_{kj}^l \Bigr]
        \, f'(\mathrm{net}_j^{l-1}(i)) \, y_i^{l-2}(i)

where

    y_j^{l-1}(i) = f(\mathrm{net}_j^{l-1}(i)), \qquad \mathrm{net}_j^{l-1}(i) = \sum_{i=1}^{n} w_{ji}^{l-1} \, y_i^{l-2}(i)

Both f'(\mathrm{net}_k^l(i)) and f'(\mathrm{net}_j^{l-1}(i)) depend on the activation function.
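Again assuming the unipolar sigmoid with slope a, the hidden-neuron local gradient has the same convenient form:

    \delta_j^{l-1}(i) = \Bigl[ \sum_{k} \delta_k^l(i) \, w_{kj}^l \Bigr] \, a \, y_j^{l-1}(i) \bigl( 1 - y_j^{l-1}(i) \bigr)

that is, the output-layer gradients are propagated backward through the weights w_{kj}^l and scaled by the local sigmoid derivative.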
The complete cycle:

Forward propagation:
• Set the weights.
• Calculate the outputs layer by layer: y_j^{l-1}(i) = f(\mathrm{net}_j^{l-1}(i)), then y_k^l(i) = f(\mathrm{net}_k^l(i)).

Backward propagation:
• Calculate the error.
• Calculate the gradient vector \partial E / \partial \mathbf{w}.
• Update the weights.
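Putting the whole loop together, here is a minimal end-to-end MATLAB sketch under stated assumptions: a 2-2-1 network with unipolar sigmoid activations (a = 1), no biases, per-sample (online) updates, and a made-up two-sample training set. It illustrates the cycle above; it is not a prescribed implementation, and not the homework solution below, although the topology happens to match Homework 11.

    % Minimal backpropagation sketch: 2 inputs, 2 hidden neurons, 1 output,
    % unipolar sigmoid everywhere (a = 1), no biases; all data are made up.
    X   = [0.1 0.9; 0.8 0.2];       % each column is one input pattern
    d   = [0.2 0.8];                % desired output for each pattern
    W1  = rand(2,2) - 0.5;          % hidden-layer weights, small random init
    W2  = rand(1,2) - 0.5;          % output-layer weights
    eta = 0.5;                      % fixed learning rate (illustrative)
    f   = @(x) 1 ./ (1 + exp(-x));  % unipolar sigmoid, a = 1
    SSE = zeros(1, 1000);           % cost per epoch, to verify decreasing errors

    for epoch = 1:1000
        E = 0;
        for i = 1:size(X, 2)
            % Forward propagation
            y1 = f(W1 * X(:,i));    % hidden outputs (2x1)
            y2 = f(W2 * y1);        % network output (scalar)
            % Backward propagation
            e      = d(i) - y2;                        % output error
            delta2 = e * y2 * (1 - y2);                % output local gradient
            delta1 = (W2' * delta2) .* y1 .* (1 - y1); % hidden local gradients
            % Weight updates: Δw = −η ∂E/∂w (here ∂E/∂w = −δ·y, so Δw = +η δ y)
            W2 = W2 + eta * delta2 * y1';
            W1 = W1 + eta * delta1 * X(:,i)';
            E  = E + 0.5 * e^2;
        end
        SSE(epoch) = E;
    end
    plot(SSE); xlabel('Epoch'); ylabel('Sum of squared errors');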
Influential Factors in Learning
• Initial weights and biases
• Cost function / performance index
• Training data and generalization
• Network structure: number of layers, number of neurons, interconnections
• Learning method: weight modification rule, variable or fixed learning rate (η)
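As a hedged sketch of the last point, one simple variable-learning-rate scheme in MATLAB; the decay schedule is an arbitrary illustrative choice, not one prescribed by the lecture:

    eta0 = 0.5;                        % initial learning rate (illustrative)
    for epoch = 1:1000
        eta = eta0 / (1 + epoch/200);  % decay the rate as training progresses
        % ... run one training epoch using this eta ...
    end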
 
Homework 11
Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 2 neurons, and 1 output layer of 1 neuron, and no bias at all (all a = 1). Be sure to obtain decreasing errors.
Note: Submit the hardcopy and softcopy of the m-file.
Hint: The number of parameters to be trained is six.
 
Homework 11A (Odd Student-ID)
Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 3 neurons, and 1 output layer of 1 neuron, with bias at all neurons (all a = 1.2). Be sure to obtain decreasing errors (convergence).
Note: Submit the hardcopy and softcopy of the m-file.
Hint: The number of parameters to be trained is eleven.
Deadline: See Google Classroom
 
Homework 11B (Even Student-ID)
Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 3 neurons, and 1 output layer of 1 neuron, with bias at all neurons (all a = 0.8). Be sure to obtain decreasing errors (convergence).
Note: Submit the hardcopy and softcopy of the m-file.
Hint: The number of parameters to be trained is twelve.
Deadline: See Google Classroom

