NCCT Centre for Advanced Technology ------------------------------------------------------------------------------------------------------------------------------------------------------------------------    SOFTWARE DEVELOPMENT * EMBEDDED SYSTEMS #109, 2nd Floor, Bombay Flats, Nungambakkam High Road,  Nungambakkam, Chennai - 600 034.  Phone - 044 - 2823 5816, 98412 32310 E-Mail: ncct@eth.net, esskayn@eth.net, URL: ncctchennai.com   Dedicated to Commitments, Committed to Technologies
NEURAL NETWORKS & THEIR APPLICATIONS NCCT Where Technology and Solutions Meet
INTRODUCTION The purpose of this session is to give a technical presentation on  NEURAL NETWORKS & THEIR APPLICATIONS   NCCT
About NCCT NCCT is a leading IT organization backed by strong R & D, concentrating on Software Development & Electronics product development.  The major activities of NCCT include System Software Design and Development, Networking and Communication, Enterprise Computing, Application Software Development, Web Technologies Development   NCCT
WHAT WILL WE DISCUSS Machine learning and human brain Introduction to Neural Networks  Computer neurons Architecture of Neural Networks Need for Neural Networks Uses of Neural Networks Algorithms Applications   NCCT
MACHINE LEARNING Machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example and learn by analogy  Learning capabilities can improve the performance of an intelligent system over time  The most popular approaches to machine learning are  Artificial Neural Networks  and  Genetic Algorithms   This session is dedicated to  NEURAL NETWORKS
LEARNING  SUPERVISED LEARNING Recognizing hand-written digits, pattern recognition, regression. Labeled examples (input, desired output). Neural Network models: perceptron, feed-forward, radial basis function, support vector machine.  UNSUPERVISED LEARNING Finding similar groups of documents on the web, content-addressable memory, clustering. Unlabeled examples (different realizations of the input alone). Neural Network models: self-organizing maps, Hopfield networks.   NCCT
BRAIN AND MACHINE THE BRAIN Pattern Recognition Association Complexity Noise Tolerance THE MACHINE Calculation Precision Logic
The Contrast in Architecture The Von Neumann architecture uses a single processing unit; Tens of millions of operations per second Absolute arithmetic precision The brain uses many slow unreliable processors acting in parallel
Features of the Brain Ten billion neurons Average several thousand connections  Hundreds of operations per second Reliability low Die off frequently (never replaced) Compensates for problems by massive parallelism
The Biological Inspiration The brain has been extensively studied by scientists, but its vast complexity prevents all but rudimentary understanding; even the behaviour of an individual neuron is extremely complex. Single “percepts” are distributed among many neurons.  Localized parts of the brain are responsible for certain well-defined functions (e.g., vision, motion). Which features are integral to the brain's performance? Which are incidentals imposed by the facts of biology?
WHAT ARE NEURAL NETWORKS A  NEURAL NETWORK  can be defined as a model of reasoning based on the human brain.  The brain consists of a densely interconnected set of nerve cells, or basic information - processing units, called neurons.  The human brain incorporates nearly 10 billion neurons and 60 trillion connections, synapses, between them.  By using multiple neurons simultaneously, the brain can perform its functions much faster than the fastest computers in existence today
A  NEURON  consists of a cell body, soma, a number of fibres called dendrites, and a single long fibre called the axon. Each neuron has a very simple structure, but an army of such elements constitutes tremendous processing power. The neurons are connected by weighted links passing signals from one neuron to another. Neural Networks are a  type of  artificial intelligence  that attempts to imitate the way a human brain works.  WHAT ARE NEURAL NETWORKS
Rather than using a  digital  model, in which all computations manipulate zeros and ones, a neural network works by creating connections between processing elements, the  computer  equivalent of neurons.  The organization and weights of the connections determine the  output Information is stored and processed in a neural network simultaneously throughout the whole network, rather than at specific locations In other words, in neural networks, both data and its processing are global rather than local WHAT ARE NEURAL NETWORKS
BIOLOGICAL NEURAL NETWORK   NCCT
The Neuron as a Simple Computing Element DIAGRAM OF A NEURON
Analogy between Biological and  Artificial Neural Networks
ARCHITECTURE OF A TYPICAL ARTIFICIAL NEURAL NETWORK
USES OF NEURAL NETWORKS Neural networks are used for both regression and classification. Regression covers function approximation and time-series prediction. In classification, the objective is to assign the input patterns to one of several categories or classes, usually represented by outputs restricted to lie in the range from 0 to 1.
WHY NEURAL NETWORKS ? Neural-network function approximation has been shown to give better results than classical regression techniques in many settings, and can work very well for non-linear systems.   NCCT
SIMPLE EXPLANATION  HOW NEURAL NETWORK WORKS Neural Networks use a set of processing elements (or nodes) loosely analogous to neurons in the brain. These nodes are interconnected in a network that can then identify patterns in data as it is exposed to the data; in a sense, the network learns from experience just as people do. This distinguishes neural networks from traditional computing programs that simply follow instructions in a fixed sequential order.   NCCT
SIMPLE EXPLANATION  HOW NEURAL NETWORK WORKS The bottom layer represents the input layer, in this case with 5 inputs labelled X1 through X5. In the middle is something called the hidden layer, with a variable number of nodes; it is the hidden layer that performs much of the work of the network. The output layer in this case has two nodes, Z1 and Z2, representing output values we are trying to determine from the inputs. For example, we may be trying to predict sales (output) based on past sales, price and season (inputs). The structure of a neural network looks something like this image
SIMPLE EXPLANATION  HIDDEN LAYER   Each node in the hidden layer is fully connected to the inputs. That means what is learned in a hidden node is based on all the inputs taken together This hidden layer is where the network learns interdependencies in the model The following diagram provides some detail into what goes on inside a hidden node More on the Hidden Layer
SIMPLE EXPLANATION  HIDDEN LAYER   Simply speaking, a weighted sum is performed: X1 times W1 plus X2 times W2, on through X5 and W5. This weighted sum is performed for each hidden node and each output node, and is how interactions are represented in the network. Each summation is then transformed using a nonlinear function before the value is passed on to the next layer. More on the Hidden Layer
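The weighted-sum-then-squash step described above can be sketched in a few lines of Python. The function name `hidden_node`, the choice of a logistic sigmoid as the nonlinearity, and the sample values are illustrative assumptions, not part of the slides:

```python
import math

def hidden_node(inputs, weights, bias=0.0):
    # Weighted sum: X1*W1 + X2*W2 + ... + X5*W5 (plus an optional bias)
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Nonlinear transform (here a logistic sigmoid) before passing the value on
    return 1.0 / (1.0 + math.exp(-total))

# Five inputs X1..X5 feeding one hidden node
x = [0.5, 1.0, 0.0, 0.25, 1.0]
w = [0.2, -0.4, 0.1, 0.3, 0.5]
y = hidden_node(x, w)  # a value squashed into (0, 1)
```

The same computation is repeated for every hidden node (each with its own weight vector) and again for every output node.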
HEBBIAN LEARNING TWO NEURONS REPRESENT TWO CONCEPTS Synaptic strength between them indicates the strength of association of the concepts; HEBBIAN LEARNING Connections are strengthened whenever two concepts occur together; PAVLOVIAN CONDITIONING An animal is trained to associate two events, e.g. dinner is served after the bell rings
CAN A SINGLE NEURON LEARN A TASK? In 1958, Frank Rosenblatt  introduced a training algorithm that provided the first procedure for training a simple ANN: a perceptron The perceptron is the simplest form of a neural network.  It consists of a single neuron with  adjustable  synaptic weights and a  hard limiter
THE PERCEPTRON The operation of Rosenblatt’s perceptron is based on the McCulloch and Pitts neuron model. The model consists of a linear combiner followed by a hard limiter. The weighted sum of the inputs is applied to the hard limiter, which produces an output equal to +1 if its input is positive and −1 if it is negative.
SINGLE-LAYER TWO-INPUT PERCEPTRON
How does the perceptron learn its classification tasks? This is done by making small adjustments in the weights to reduce the difference between the actual and desired outputs of the perceptron. The initial weights are randomly assigned, usually in the range [−0.5, 0.5], and then updated to obtain an output consistent with the training examples.
How does the perceptron learn its classification tasks? If at iteration p, the actual output is Y(p) and the desired output is Yd(p), then the error is given by

e(p) = Yd(p) − Y(p),   where p = 1, 2, 3, . . .

Iteration p here refers to the pth training example presented to the perceptron. If the error, e(p), is positive, we need to increase the perceptron output Y(p); if it is negative, we need to decrease Y(p).
THE PERCEPTRON LEARNING RULE

wi(p + 1) = wi(p) + α · xi(p) · e(p),   where p = 1, 2, 3, . . .

α is the learning rate, a positive constant less than unity. The perceptron learning rule was first proposed by Rosenblatt in 1960. Using this rule we can derive the perceptron training algorithm for classification tasks.
PERCEPTRON’S TRAINING ALGORITHM

STEP 1: INITIALISATION Set initial weights w1, w2, …, wn and threshold θ to random numbers in the range [−0.5, 0.5].

STEP 2: ACTIVATION Activate the perceptron by applying inputs x1(p), x2(p), …, xn(p) and desired output Yd(p). Calculate the actual output at iteration p = 1:

Y(p) = step[ Σ (i = 1 to n) xi(p) · wi(p) − θ ]

where n is the number of the perceptron inputs, and step is a step activation function.
PERCEPTRON’S TRAINING ALGORITHM

STEP 3: WEIGHT TRAINING Update the weights of the perceptron:

wi(p + 1) = wi(p) + Δwi(p)

where Δwi(p) is the weight correction at iteration p, computed by the delta rule:

Δwi(p) = α · xi(p) · e(p)

STEP 4: ITERATION Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
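Steps 1 to 4 above can be sketched as a small Python routine. Training on the logical AND function, the learning rate of 0.1 and keeping the threshold fixed at 0.2 (rather than trainable) are illustrative assumptions:

```python
import random

def step(x, theta):
    # Step activation: fires 1 when the net input reaches the threshold
    return 1 if x >= theta else 0

def train_perceptron(samples, alpha=0.1, theta=0.2, epochs=100, seed=0):
    rng = random.Random(seed)
    n = len(samples[0][0])
    # Step 1: initialise weights randomly in [-0.5, 0.5]
    w = [rng.uniform(-0.5, 0.5) for _ in range(n)]
    for _ in range(epochs):
        converged = True
        for x, yd in samples:
            # Step 2: activation -- compute the actual output for this example
            y = step(sum(xi * wi for xi, wi in zip(x, w)), theta)
            # Step 3: weight training by the delta rule, dw_i = alpha * x_i * e
            e = yd - y
            if e != 0:
                converged = False
                w = [wi + alpha * xi * e for wi, xi in zip(w, x)]
        # Step 4: repeat until a full pass makes no corrections
        if converged:
            break
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_perceptron(AND)
```

After training, the perceptron classifies all four AND patterns correctly; since AND is linearly separable, Rosenblatt's convergence theorem guarantees the loop terminates.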
NEURON COMPUTATION The neuron computes the weighted sum of the input signals and compares the result with a threshold value, θ. If the net input is less than the threshold, the neuron output is −1. But if the net input is greater than or equal to the threshold, the neuron becomes activated and its output attains a value of +1. The neuron uses the following transfer or activation function:

X = Σ xi · wi,   Y = +1 if X ≥ θ, −1 if X < θ

This type of activation function is called a sign function.
ACTIVATION FUNCTIONS
EXAMPLE A neuron uses a step function as its activation function, with threshold θ = 0.2 and weights W1 = 0.1, W2 = 0.4. What is the output Y for the following values of x1 and x2?

x1   x2   Y
0    0    ?
0    1    ?
1    0    ?
1    1    ?
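A quick way to work out the answer, assuming the step function outputs 1 when the weighted sum reaches the threshold and 0 otherwise:

```python
theta, w1, w2 = 0.2, 0.1, 0.4

def step(x):
    # Step activation for this example: 1 at or above the threshold, else 0
    return 1 if x >= theta else 0

# Evaluate the neuron for every (x1, x2) combination in the table
outputs = {(x1, x2): step(w1 * x1 + w2 * x2)
           for x1 in (0, 1) for x2 in (0, 1)}
# (0,0) -> 0, (1,0) -> 0, (0,1) -> 1, (1,1) -> 1
```

Only the inputs with x2 = 1 clear the threshold, since 0.1 alone is below 0.2 but 0.4 is above it.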
NETWORK STRUCTURE The output signal is transmitted through the neuron’s outgoing connection.  The outgoing connection splits into a number of branches that transmit the same signal.  The outgoing branches terminate at the incoming connections of other neurons in the network.   NCCT
NETWORK ARCHITECTURES   THREE DIFFERENT CLASSES OF NETWORK ARCHITECTURES Single-layer Feed-forward Multi-layer Feed-forward Recurrent The   ARCHITECTURE  of a neural network is linked with the learning algorithm used to train it. Neurons are organized in layers; feed-forward networks are acyclic, while recurrent networks contain feedback (cyclic) connections
NETWORK ARCHITECTURES SINGLE LAYER FEED FORWARD  Input layer of source nodes Output layer of neurons   NCCT
NETWORK ARCHITECTURES MULTI LAYER FEED-FORWARD   INPUT LAYER OUTPUT LAYER HIDDEN LAYER 3-4-2 NETWORK   NCCT
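The 3-4-2 layout above can be sketched as two weight matrices, one per layer of connections. The sigmoid activation, the random weights, and the sample input are illustrative assumptions:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights):
    # One output per neuron: squashed weighted sum over all incoming values
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws))) for ws in weights]

rng = random.Random(0)
# 3-4-2 network: 4 hidden neurons with 3 weights each,
# then 2 output neurons with 4 weights each
w_hidden = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
w_output = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(2)]

hidden = layer([0.5, -0.2, 0.8], w_hidden)   # 4 hidden activations
output = layer(hidden, w_output)             # 2 network outputs
```

The layer sizes fall directly out of the matrix shapes: 3 inputs fan into 4 hidden neurons, which fan into 2 outputs.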
Recurrent Network with hidden neuron(s): unit delay operator z -1  implies dynamic system RECURRENT NETWORK INPUT HIDDEN OUTPUT z -1 z -1 z -1
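The unit-delay operator z⁻¹ simply feeds last step's hidden value back in as an extra input. A minimal sketch of one recurrent unit (the tanh activation and the weight values are assumptions for illustration):

```python
import math

def recurrent_step(x, h_prev, w_in=0.7, w_rec=0.5):
    # z^-1: h_prev is the hidden value delayed by one time step
    return math.tanh(w_in * x + w_rec * h_prev)

h = 0.0       # initial hidden state
trace = []
for x in [1.0, 0.0, 0.0, 0.0]:  # a single pulse followed by silence
    h = recurrent_step(x, h)
    trace.append(h)
# The input at t=0 keeps echoing (and decaying) through later steps,
# which is what makes the network a dynamic system
```

Even after the input goes to zero, the state stays nonzero for several steps: the feedback loop gives the network a memory of past inputs.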
NEURAL NETWORK ARCHITECTURES
NEURAL NETWORK APPLICATIONS Biomedical Applications Business Forecasting Applications Demand Analysis and Forecasting Marketing Applications Financial Applications Space Research Psychiatric Diagnosis   NCCT
FACE RECOGNITION 90% accuracy at learning head pose and recognizing 1 of 20 faces
HANDWRITTEN DIGIT RECOGNITION
Projects @ NCCT Redefining the Learning Specialization, Design, Development and Implementation with Projects Experience the learning with the latest new tools and technologies…
Projects @ NCCT Project Specialization Concept NCCT , in consultation with Export-Software Division, offers Live  Electronics related Projects, to experience the learning with the latest new tools and technologies NCCT  believes in specialized Hardware Design, development training and implementation with an emphasis on development principles and standards NCCT  plays a dual  positive  role by satisfying your academic requirements as well as giving the necessary training in electronics and embedded product development   NCCT
Projects @ NCCT WE ARE OFFERING PROJECTS FOR THE FOLLOWING DISCIPLINES   COMPUTER SCIENCE AND ENGINEERING INFORMATION TECHNOLOGY ELECTRONICS AND COMMUNICATION ENGINEERING ELECTRICAL AND ELECTRONICS ENGINEERING  ELECTRONICS AND INSTRUMENTATION MECHANICAL AND MECHATRONICS
Projects @ NCCT PROJECTS IN THE AREAS OF System Software Development Application Software Development, Porting Networking & Communication related Data Mining, Neural Networks, Fuzzy Logic, AI based Bio Medical related Web & Internet related Embedded Systems - Microcontrollers, VLSI, DSP, RTOS WAP, Web enabled Internet Applications UNIX / LINUX based Projects
Projects @ NCCT SAMPLE PROJECTS @ NCCT ANN TECHNOLOGY   CHARACTER AND PATTERN RECOGNITION USING NEURAL NETWORKS   NCCT
Projects @ NCCT BRIEF IDEA TO DETERMINE HANDWRITTEN CHARACTERS USING ARTIFICIAL NEURAL NETWORKS FEATURES USING ANN TECHNOLOGY ACCURACY EASY TO IMPLEMENT FOOLPROOF MECHANISM   NCCT
Projects @ NCCT SAMPLE PROJECTS @ NCCT NEURAL NETWORK BASED MEDICAL SYSTEMS NEURAL NETWORK BASED DIAGNOSTIC SYSTEM
Projects @ NCCT BRIEF IDEA FORECASTING FETAL HEART BEATS USING NEURAL NETWORKS COMBINES INPUT WINDOWS, HIDDEN LAYERS, FEEDBACK AND A SELF-RECURRENT UNIT FEATURES ADDITIONAL SELF-RECURRENT INPUT COMBINES SEVERAL TECHNIQUES FOR PROCESSING TEMPORAL ASPECTS OF THE INPUT SEQUENCE
Placements @ NCCT NCCT has a dedicated placement wing, which enrolls all candidates in its placement bank and keeps in constant touch with IT-related industries in India and abroad that need computer-trained, quality manpower Each candidate goes through a complete pre-placement session before placement is made by NCCT  The placement division also helps students get projects and organizes guest lectures, group discussions, soft-skills training, mock interviews, personality development, technical discussions, student meetings, etc.  For every student we send the IT organizations the following documents *  Curriculum highlighting the skills *  A brief write-up of the software knowledge acquired at NCCT and the syllabus taught at NCCT *  Projects and specialization work done at NCCT *  Additional skills learnt
  NCCT THE FOLLOWING SKILL SET IS SECURED
NCCT Quality is Our Responsibility Dedicated to Commitments and Committed to Technology
