CONCEPT OF ARTIFICIAL NEURAL NETWORK
SUBHAM PREETAM
Nayagarh, 752069, Odisha · WhatsApp: 9090949732
Mail: sspritamrath93@gmail.com
LinkedIn: www.linkedin.com/in/subham-preetam-97b2b0148
ResearchGate: https://www.researchgate.net/profile/Subham_Preetam2
CONTENTS
• INTRODUCTION
• BIOLOGICAL NEURON MODEL
• ARTIFICIAL NEURON MODEL
• ARTIFICIAL NEURAL NETWORK
• NEURAL NETWORK ARCHITECTURE
• LEARNING
• BACKPROPAGATION ALGORITHM
• APPLICATIONS
• ADVANTAGES
Dear all,
Here we discuss the basic benefits of neural network systems. The material presented here has been collected from various authors. As a rapidly advancing technological field, artificial neural networks provide a basic platform for cloud technology, artificial intelligence and machine learning.
Thanks and regards,
Subham Preetam
INTRODUCTION
• "Neural" is an adjective for neuron, and "network" denotes a graph-like structure.
• Artificial Neural Networks are also referred to as "neural nets", "artificial neural systems", "parallel distributed processing systems", and "connectionist systems".
• For a computing system to be called by these pretty names, it is necessary for the system to have a labeled directed graph structure where the nodes perform some simple computations.
• A "directed graph" consists of a set of "nodes" (vertices) and a set of "connections" (edges/links/arcs) connecting pairs of nodes.
• A graph is said to be a "labeled graph" if each connection is associated with a label identifying some property of the connection.
Fig 1: AND gate graph
This graph cannot be considered a neural network, since the connections between the nodes are fixed and play no role other than carrying the inputs to the node that computes their conjunction.
Fig 2: AND gate network
A graph structure whose connection weights are modifiable using a learning algorithm qualifies the computing system to be called an artificial neural network; a weighted AND node of this kind is sketched in code below.
[Figure labels: Fig 1 — inputs x1, x2 ∈ {0,1} feeding a multiplier node, output o = x1 AND x2; Fig 2 — inputs x1, x2 with weights w1, w2 forming products (x1 w1) and (x2 w2), output o = x1 AND x2]
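A minimal sketch (my own illustration, not taken from the slides): the AND network of Fig 2 realised as a single node whose weights w1, w2 and threshold are chosen by hand here; a learning algorithm would instead adjust the weights.

# Illustrative sketch: an AND gate as one weighted node with a hand-chosen threshold.
def and_node(x1, x2, w1=1.0, w2=1.0, threshold=1.5):
    """Return 1 if the weighted sum of the binary inputs reaches the threshold."""
    weighted_sum = x1 * w1 + x2 * w2
    return 1 if weighted_sum >= threshold else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, and_node(a, b))   # prints the AND truth table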
• The field of neural networks was pioneered by Bernard Widrow of Stanford University in the 1950s.
CONTD…
BIOLOGICAL NEURON MODEL
Four parts of a typical nerve cell:
• DENDRITES: accept the inputs.
• SOMA: processes the inputs.
• AXON: turns the processed inputs into outputs.
• SYNAPSES: the electrochemical contacts between neurons.
ARTIFICIAL NEURON MODEL
• Inputs to the network are represented by the mathematical symbol xn.
• Each of these inputs is multiplied by a connection weight wn:
  sum = w1 x1 + … + wn xn
• These products are summed and fed through the transfer function f( ) to generate a result, which is then output (a small code sketch follows the figure below).
[Figure: inputs x1, x2, …, xn with weights w1, w2, …, wn feeding a node that computes f(w1 x1 + … + wn xn)]
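A minimal sketch (my own illustration, not from the slides), assuming a sigmoid transfer function as a representative choice:

import math

def neuron(inputs, weights, transfer=lambda s: 1.0 / (1.0 + math.exp(-s))):
    """One artificial neuron: weighted sum of inputs passed through a transfer function."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return transfer(weighted_sum)

print(neuron([0.5, 0.2, 0.1], [0.4, -0.6, 0.9]))   # a single output value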
TERMINOLOGY
Biological Terminology        Artificial Neural Network Terminology
Neuron                        Node / Unit / Cell / Neurode
Synapse                       Connection / Edge / Link
Synaptic efficiency           Connection strength / Weight
Firing frequency              Node output
ARTIFICIAL NEURAL NETWORK
• Artificial Neural Networks (ANNs) are programs designed to solve problems by trying to mimic the structure and the function of our nervous system.
• Neural networks are based on simulated neurons, which are joined together in a variety of ways to form networks.
• A neural network resembles the human brain in the following two ways:
  * A neural network acquires knowledge through learning.
  * A neural network's knowledge is stored within the interconnection strengths, known as synaptic weights.
ARTIFICIAL NEURAL NETWORK MODEL
[Fig 1: artificial neural network model — an input layer, hidden layers, and an output layer joined by connections (called weights) between neurons; a companion diagram shows the network's actual output being compared with the desired output so that the weights can be adjusted]
CONTD…
NEURAL NETWORK ARCHITECTURES
Fully connected network: a neural network in which every node is connected to every other node; these connections may be either excitatory (positive weights), inhibitory (negative weights), or irrelevant (almost zero weights).
[Fig: fully connected network]

Layered network: a network in which the nodes are partitioned into subsets called layers, with no connections from layer j to layer k if j > k.
[Fig: layered network — layer 0 (input layer) through layer 3 (output layer), with layers 1 and 2 as hidden layers of input, hidden, and output nodes]

Acyclic network: a subclass of the layered networks in which there are no intra-layer connections. In other words, a connection may exist between any node in layer i and any node in layer j for i < j, but no connection is allowed for i = j.

Feedforward network: a subclass of the acyclic networks in which a connection is allowed from a node in layer i only to nodes in layer i + 1; a feedforward structure of this kind is sketched in code below.
[Figs: acyclic network and feedforward network, each drawn with layer 0 (input layer), hidden layers 1 and 2, and layer 3 (output layer)]
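A minimal sketch (my own illustration, not from the slides), assuming sigmoid units and arbitrary layer sizes: a feedforward network represented as one weight matrix per consecutive pair of layers, so connections only run from layer i to layer i + 1.

import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [3, 4, 4, 2]                 # layer 0 (input) .. layer 3 (output)
weights = [rng.standard_normal((m, n))     # one matrix per consecutive layer pair
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    """Propagate an input vector layer by layer through the network."""
    activation = np.asarray(x, dtype=float)
    for w in weights:
        activation = 1.0 / (1.0 + np.exp(-(activation @ w)))   # sigmoid units
    return activation

print(forward([0.1, 0.5, -0.3]))           # two output values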
CONTD…
Many problems are best solved using neural networks whose architecture consists of several modules, with sparse interconnections between them. Modules can be organized in several different ways, such as hierarchical organization, successive refinement, and input modularity.
[Fig: modular neural network]
CONTD…
LEARNING
• Neurons in an animal's brain are "hard-wired", yet it is equally obvious that animals, especially higher-order animals, learn as they grow.
• How does this learning occur?
• What are possible mathematical models of learning?
• In artificial neural networks, learning refers to the method of modifying the weights of connections between the nodes of a specified network.
• The learning ability of a neural network is determined by its architecture and by the algorithmic method chosen for training.
UNSUPERVISED LEARNING
• This is learning by doing.
• In this approach no sample outputs are provided to the network against which it can measure its predictive performance for a given vector of inputs.
• One common form of unsupervised learning is clustering, where we try to categorize data into different clusters by their similarity.

SUPERVISED LEARNING
• A teacher is available to indicate whether a system is performing correctly, or to indicate the amount of error in system performance. Here the teacher is a set of training data.
• The training data consist of pairs of input and desired output values that are traditionally represented in data vectors.
• Supervised learning can also be referred to as classification, where we have a wide range of classifiers (multilayer perceptron, k-nearest neighbour, etc.); a small supervised-learning sketch follows below.
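A minimal sketch (my own illustration, not from the slides): supervised learning on labelled (input, desired output) pairs, using a single perceptron-style node that learns the deck's earlier AND example by nudging its weights toward the desired outputs.

# Training data: pairs of inputs and the desired (teacher-supplied) output.
training_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]   # AND
w1 = w2 = bias = 0.0
learning_rate = 0.1

for _ in range(20):                          # a few passes over the training set
    for (x1, x2), desired in training_data:
        actual = 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0
        error = desired - actual             # teacher signal: desired minus actual
        w1 += learning_rate * error * x1     # modify the connection weights
        w2 += learning_rate * error * x2
        bias += learning_rate * error

print(w1, w2, bias)                          # learned weights implementing AND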
CONTD…
THE BACKPROPAGATION ALGORITHM
• The backpropagation algorithm (Rumelhart and McClelland, 1986) is used in layered feedforward Artificial Neural Networks.
• Backpropagation is a multi-layer feedforward, supervised learning network based on the gradient descent learning rule.
• We provide the algorithm with examples of the inputs and outputs we want the network to compute, and the error (the difference between actual and expected results) is then calculated.
• The idea of the backpropagation algorithm is to reduce this error until the Artificial Neural Network learns the training data.
• The activation of the artificial neurons in ANNs implementing the backpropagation algorithm is a weighted sum (the sum of the inputs xi multiplied by their respective weights wji).
• The most common output function is the sigmoidal function.
• Since the error is the difference between the actual and the desired output, the error depends on the weights, and we need to adjust the weights in order to minimize the error. We can define the error function for the output of each neuron; standard forms of these three quantities are given below.
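The slide's own equations appear to have been embedded as images and are missing here; the standard textbook forms they refer to are:

A_j(\bar{x},\bar{w}) = \sum_i x_i \, w_{ji}                          % weighted-sum activation
O_j(\bar{x},\bar{w}) = \frac{1}{1 + e^{-A_j(\bar{x},\bar{w})}}       % sigmoidal output function
E_j(\bar{x},\bar{w},d) = \bigl(O_j(\bar{x},\bar{w}) - d_j\bigr)^{2}  % error of output neuron j against desired output d_j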
[Fig: basic block of a backpropagation neural network — inputs x pass through a layer with weights v and a layer with weights w to produce the output]
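A minimal sketch (my own illustration, not the slides' code), assuming a tiny 2-2-1 network with sigmoid units and bias terms and using the deck's AND example as training data: each gradient-descent step computes the error terms for the output layer and backpropagates them to the previous layer.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
v, bv = rng.standard_normal((2, 2)), np.zeros(2)   # input -> hidden weights and biases
w, bw = rng.standard_normal((2, 1)), np.zeros(1)   # hidden -> output weights and biases
eta = 0.5                                          # learning rate

def train_step(u, d):
    """One gradient-descent update on a single (input, desired output) pair."""
    x = sigmoid(u @ v + bv)                   # hidden-layer outputs
    o = sigmoid(x @ w + bw)                   # network output
    delta_o = (o - d) * o * (1 - o)           # output-layer error term
    delta_h = (delta_o @ w.T) * x * (1 - x)   # error term backpropagated to the hidden layer
    w -= eta * np.outer(x, delta_o)           # adjust output-layer weights
    bw -= eta * delta_o
    v -= eta * np.outer(u, delta_h)           # adjust previous-layer weights
    bv -= eta * delta_h
    return float(((o - d) ** 2).sum())        # squared error before the update

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]   # AND truth table
for epoch in range(2000):
    for u, d in data:
        err = train_step(np.array(u, float), np.array([d], float))
print("squared error on the last example:", err)   # typically falls close to zero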
CONTD…
• If we want to adjust vik, the weights of a previous layer, we first need to calculate how the error depends not on the weight but on the input from the previous layer, i.e. by replacing w with x in the equations above. The resulting weight adjustments, in their standard form, are given below.
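The "where" and "and" sub-equations on this slide were also images; the standard gradient-descent updates they correspond to are (with eta the learning rate, and u_k denoting the inputs to the previous layer, a symbol introduced here for illustration):

\Delta w_{ji} = -\eta \,\frac{\partial E}{\partial w_{ji}} = -\eta\, \delta_j\, x_i,
    \quad \text{where } \delta_j = (O_j - d_j)\, O_j (1 - O_j)
\quad \text{and, for the previous layer,} \quad
\Delta v_{ik} = -\eta \Bigl(\sum_j \delta_j\, w_{ji}\Bigr)\, x_i (1 - x_i)\, u_k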
ADVANTAGES
• It involves human-like thinking.
• They handle noisy or missing data.
• They can work with a large number of variables or parameters.
• They provide general solutions with good predictive accuracy.
• The system has the property of continuous learning.
• They deal with the non-linearity of the world in which we live.
THANK
YOU
