Understanding Neural Networks and Neuron Models

Explore the fascinating world of neural networks, from biological neural activity to artificial neural networks and neuron models. Learn about the structure, connectivity, and functions of neurons, as well as the basics of perceptrons and simple architectures. Discover how neural networks can be used to build a two-bit adder for binary addition.

  • Neural Networks
  • Neuron Models
  • Artificial Intelligence
  • Machine Learning
  • Perceptron


Presentation Transcript


  1. Neural Networks

  2. Biological neural activity: Each neuron has a cell body, an axon, and many dendrites, and can be in one of two states, firing or rest. A neuron fires if the total incoming stimulus exceeds a threshold. A synapse is the thin gap between the axon of one neuron and a dendrite of another; signals are exchanged across it, and its synaptic strength/efficiency determines how strongly a signal is passed on.

  3. Artificial neural network: A set of nodes (units, neurons, processing elements), each with inputs and an output, and each performing a simple computation given by its node function. Nodes are joined by weighted connections, and this connectivity gives the structure/architecture of the net. What can be computed by a NN is determined primarily by the connections and their weights. An ANN is a simplified version of the networks of neurons in animal nerve systems.

  4. ANN neuron models: Each node has one or more inputs from other nodes and one output to other nodes. Input/output values can be binary {0, 1}, bipolar {-1, 1}, or continuous (bounded or not). All inputs to a node come in at the same time and remain activated until the output is produced. A weight is associated with each link. In the general neuron model, the node output is f(net), where f is the node function and the most popular way to combine the inputs is the weighted input summation net = Σ_{i=1..n} w_i x_i.
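
A minimal sketch of this general neuron model in Python (the function names and the example weights are illustrative, not from the slides):

```python
from typing import Callable, Sequence

def weighted_sum(weights: Sequence[float], inputs: Sequence[float]) -> float:
    """Weighted input summation: net = sum_i w_i * x_i."""
    return sum(w * x for w, x in zip(weights, inputs))

def neuron_output(weights: Sequence[float], inputs: Sequence[float],
                  f: Callable[[float], float]) -> float:
    """General neuron model: the node outputs f(net)."""
    return f(weighted_sum(weights, inputs))

# Example with a binary {0, 1} output and a threshold-at-zero node function:
step = lambda net: 1.0 if net >= 0 else 0.0
print(neuron_output([0.5, -0.3, 0.8], [1, 1, 0], step))  # net = 0.2 -> 1.0
```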

  5. Node function: the step function and the ramp function.
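
A sketch of these two node functions, assuming their common definitions: a step that jumps at a threshold c, and a ramp that is linear between two breakpoints (the default parameter values are illustrative):

```python
def step(net: float, c: float = 0.0, low: float = 0.0, high: float = 1.0) -> float:
    """Step function: output jumps from `low` to `high` at the threshold c."""
    return high if net >= c else low

def ramp(net: float, a: float = -1.0, b: float = 1.0,
         low: float = 0.0, high: float = 1.0) -> float:
    """Ramp function: `low` below a, `high` above b, linear in between."""
    if net <= a:
        return low
    if net >= b:
        return high
    return low + (high - low) * (net - a) / (b - a)

print(step(0.3), ramp(0.0))  # 1.0, 0.5
```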

  6. Node function: the sigmoid function. It is S-shaped, continuous, and everywhere differentiable, rotationally symmetric about some point (net = c), and asymptotically approaches its saturation points a and b. Examples: when y = 0 and z = 0, a = 0, b = 1, c = 0; when y = 0 and z = -0.5, a = -0.5, b = 0.5, c = 0. A larger x gives a steeper curve.
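
The slide's examples are consistent with a parameterized sigmoid of the form f(net) = z + 1 / (1 + e^(-x*net + y)), where the lower saturation point is a = z, the upper is b = z + 1, and the symmetry point is c = y / x; the sketch below assumes that form:

```python
import math

def sigmoid(net: float, x: float = 1.0, y: float = 0.0, z: float = 0.0) -> float:
    """S-shaped, continuous, everywhere differentiable node function.

    Saturates at a = z as net -> -inf and at b = z + 1 as net -> +inf,
    is symmetric about net = c = y / x, and a larger x gives a steeper curve.
    """
    return z + 1.0 / (1.0 + math.exp(-x * net + y))

print(sigmoid(0.0))                    # 0.5: a = 0, b = 1, c = 0
print(sigmoid(0.0, x=1, y=0, z=-0.5))  # 0.0: a = -0.5, b = 0.5, c = 0
```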

  7. Perceptron: a single-layer neural network.
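
A minimal sketch of a perceptron's computation, assuming a step (threshold) output on a weighted sum; the weights and bias below, which make it compute logical AND, are illustrative:

```python
def perceptron(weights, bias, inputs):
    """Single-layer perceptron: threshold the weighted sum of the inputs."""
    net = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= 0 else 0

# Example: a perceptron computing AND of two binary inputs.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, perceptron([1.0, 1.0], -1.5, [x1, x2]))
```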

  8. Simple architectures. [Figure: (a) a two-layer net in which input nodes 1 and 2 feed nodes 3 and 4 through weights w1,3, w1,4, w2,3, w2,4; (b) the same net extended with nodes 5 and 6, fed by nodes 3 and 4 through weights w3,5, w3,6, w4,5, w4,6.]

  9. Can we make a two-bit adder? The inputs are bits x1 and x2; the outputs are the carry bit (y1) and the sum bit (y2). Two NNs, really, one per output bit. Truth table:
     x1:         0 0 1 1
     x2:         0 1 0 1
     y1 (carry): 0 0 0 1
     y2 (sum):   0 1 1 0
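
The two outputs are simple logic functions of the inputs: the carry bit is AND and the sum bit is XOR. A sketch under those definitions (the threshold unit and its weights for the carry bit are illustrative; as a later slide shows, the sum bit cannot be produced by a single perceptron):

```python
def threshold_unit(weights, bias, inputs):
    net = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        y1 = threshold_unit([1.0, 1.0], -1.5, [x1, x2])  # carry = x1 AND x2
        y2 = x1 ^ x2                                     # sum = x1 XOR x2
        print(f"x1={x1} x2={x2} -> carry={y1} sum={y2}")
```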

  10. Perceptron training rule: adjust each weight slightly to reduce the error between the perceptron output o and the target value t, then repeat. With learning rate η, each weight is updated as w_i ← w_i + η (t − o) x_i.
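
A minimal sketch of this training rule, assuming a step-output perceptron, a small fixed learning rate, and the carry-bit (AND) data from the previous slide as the example task (all names and constants are illustrative):

```python
def train_perceptron(data, n_inputs, lr=0.1, epochs=20):
    """data is a list of (inputs, target) pairs; returns (weights, bias)."""
    w, b = [0.0] * n_inputs, 0.0
    for _ in range(epochs):
        for x, t in data:
            o = 1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
            # Perceptron training rule: w_i <- w_i + lr * (t - o) * x_i
            w = [wi + lr * (t - o) * xi for wi, xi in zip(w, x)]
            b += lr * (t - o)
    return w, b

carry_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
print(train_perceptron(carry_data, 2))  # weights that realize the carry (AND) bit
```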

  11. Not with a perceptron: the training examples are not linearly separable for one of the two outputs, since sum = 1 iff x1 XOR x2.
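
As a quick check of this claim, running the same training rule on the sum (XOR) data never gets all four cases right, no matter how long it trains (a sketch; the learning rate and epoch count are illustrative):

```python
def step_out(w, b, x):
    return 1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0

xor_data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(1000):
    for x, t in xor_data:
        o = step_out(w, b, x)
        w = [wi + lr * (t - o) * xi for wi, xi in zip(w, x)]
        b += lr * (t - o)

correct = sum(step_out(w, b, x) == t for x, t in xor_data)
print(f"{correct}/4 correct")  # at most 3/4: XOR is not linearly separable
```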

  12. Works well on some problems. [Figure: learning curves plotting the proportion correct on the test set against training set size (0 to 100), comparing a perceptron with a decision tree on two tasks: "Are the majority of inputs 1?" and the restaurant WillWait? example.]

  13. Sigmoid unit: a node whose node function is the sigmoid applied to the weighted input sum.

  14. Multilayer networks. [Figure: a network with an input layer, a hidden layer, and an output layer.]
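
A minimal sketch of a forward pass through such a network, assuming sigmoid units in both the hidden and output layers (layer sizes, weights, and biases below are illustrative):

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def layer(weights, biases, inputs):
    """weights[j][i] connects input i to unit j; returns one output per unit."""
    return [sigmoid(b + sum(w * x for w, x in zip(ws, inputs)))
            for ws, b in zip(weights, biases)]

def forward(x, w_hidden, b_hidden, w_out, b_out):
    hidden = layer(w_hidden, b_hidden, x)  # input layer -> hidden layer
    return layer(w_out, b_out, hidden)     # hidden layer -> output layer

# 2 inputs -> 2 hidden units -> 1 output unit
print(forward([1.0, 0.0],
              [[0.5, -0.4], [0.3, 0.8]], [0.1, -0.2],
              [[1.0, -1.0]], [0.0]))
```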

  15. Backpropagation algorithm, forward direction: propagate the inputs forward through the network to calculate the outputs and the error.

  16. Backpropagation algorithm, backward direction: backpropagate from the output toward the input, recursively computing an error term δ_j for each unit and adjusting the weights by Δw_ji = η δ_j x_ji.
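
A minimal sketch of one forward and one backward pass for a tiny 2-2-1 sigmoid network trained on squared error; the weights, learning rate, and the single training example are illustrative:

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

# Illustrative weights: w_h[j][i] connects input i to hidden unit j; w_o feeds the output.
w_h, b_h = [[0.5, -0.4], [0.3, 0.8]], [0.1, -0.2]
w_o, b_o = [1.0, -1.0], 0.0
lr = 0.5
x, t = [1.0, 0.0], 1.0  # one training example: input and target

# Forward direction: calculate the network output.
h = [sigmoid(b + sum(w * xi for w, xi in zip(ws, x))) for ws, b in zip(w_h, b_h)]
o = sigmoid(b_o + sum(w * hj for w, hj in zip(w_o, h)))

# Backward direction: error terms (deltas) for the output unit, then each hidden unit.
delta_o = o * (1 - o) * (t - o)
delta_h = [hj * (1 - hj) * w_o[j] * delta_o for j, hj in enumerate(h)]

# Adjust each weight by lr * delta of the downstream unit * input along that link.
w_o = [w + lr * delta_o * hj for w, hj in zip(w_o, h)]
b_o += lr * delta_o
w_h = [[w + lr * delta_h[j] * xi for w, xi in zip(w_h[j], x)] for j in range(2)]
b_h = [b + lr * delta_h[j] for j, b in enumerate(b_h)]

print("output:", o, "output delta:", delta_o, "hidden deltas:", delta_h)
```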

  17. Network architecture: feedforward net. A connection is allowed from a node in layer i only to nodes in layer i + 1. This is the most widely used architecture. Conceptually, nodes at higher levels successively abstract features from the preceding layers.

  18. Recurrent neural networks: good for learning sequences of data, e.g., text. Lots of variations today: convolutional NNs, LSTMs, ...
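
A minimal sketch of the recurrence that makes these networks suited to sequences: the hidden state is fed back as an extra input at each step (the scalar tanh unit and the weight values are illustrative, not from the slides):

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    """One recurrent step: new hidden state from the current input and the previous state."""
    return math.tanh(w_x * x_t + w_h * h_prev + b)

# Process a short sequence one element at a time, carrying the hidden state along.
h = 0.0
for x_t in [1.0, 0.0, 1.0, 1.0]:
    h = rnn_step(x_t, h, w_x=0.8, w_h=0.5, b=0.0)
    print(round(h, 3))
```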

  19. Neural network playground: http://playground.tensorflow.org/
