Genetic Algorithms
Prof. Dr. K. Adisesha
Contents
➢ Genetic Algorithms
➢ Working of GA
➢ Applications of GA
➢ GA in Classification
➢ Optimization problems
Introduction
Genetic Algorithms:
Genetic algorithms are iterative optimization techniques that simulate natural
selection to find optimal solutions.
➢ Directed search algorithms based on the mechanics of biological evolution
➢ Developed by John Holland, University of Michigan (1970’s)
➢ To understand the adaptive processes of natural systems
➢ Provide efficient, effective techniques for optimization and machine learning
applications
➢ Widely used today in business, scientific, and engineering circles
Genetic Algorithms:
Genetic algorithms have been successfully applied to various optimization problems,
including parameter tuning, scheduling, routing, and machine learning.
➢ Genetic algorithms (GAs) are a type of computational optimization technique inspired
by the principles of natural selection and genetics.
➢ Optimization is the process of making something
better. In any process, we have a set of inputs and a
set of outputs.
➢ Optimization refers to finding the values of inputs in
such a way that we get the “best” output values.
Working of Genetic Algorithm:
To understand how the simple genetic algorithm works, it helps to know a few basic terms, described below.
➢ Genetic operators: Operators used to change the genetic composition of the next generation (selection, crossover, and mutation).
➢ Chromosome/Individual: A candidate solution, represented as a collection of genes, typically a string in which each bit acts as a gene.
➢ Population: Each chromosome represents an individual, and a collection of chromosomes/individuals makes up the population.
➢ Fitness function: A function that scores how good each individual is as a solution; these scores guide the search toward better outputs (a minimal sketch follows).
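To make these terms concrete, here is a minimal Python sketch (not taken from the slides); the bit-string encoding and the count-of-ones fitness are illustrative assumptions.

```python
import random

CHROMOSOME_LENGTH = 8   # number of genes (bits) per individual
POPULATION_SIZE = 6     # number of individuals in the population

def random_chromosome(length=CHROMOSOME_LENGTH):
    """A chromosome/individual: a string of genes, here one bit per gene."""
    return [random.randint(0, 1) for _ in range(length)]

def fitness(chromosome):
    """Fitness function: scores how good an individual is.
    Here (an illustrative choice) the score is simply the number of 1-bits."""
    return sum(chromosome)

# Population: a collection of chromosomes/individuals.
population = [random_chromosome() for _ in range(POPULATION_SIZE)]
for individual in population:
    print(individual, "fitness =", fitness(individual))
```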
Working of Genetic Algorithm:
The flow chart of the algorithm covers the phases below; a minimal code sketch of this loop follows the list.
❖ Initialization
❖ Fitness assignment
❖ Selection
❖ Reproduction / Crossover
❖ Mutation
❖ Termination
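A minimal sketch of this flow in Python, continuing the bit-string encoding assumed above; the operator details and parameter values are illustrative, not taken from the slides.

```python
import random

def run_ga(pop_size=20, n_genes=16, generations=50, mutation_rate=0.01):
    # Initialization: random bit-string individuals
    population = [[random.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for gen in range(generations):
        # Fitness assignment (count of 1-bits, an illustrative choice)
        ranked = sorted(population, key=sum, reverse=True)
        # Selection: keep the fitter half as parents
        parents = ranked[: pop_size // 2]
        # Reproduction / Crossover: single-point crossover between random parents
        children = []
        while len(children) < pop_size:
            p1, p2 = random.sample(parents, 2)
            point = random.randint(1, n_genes - 1)
            children.append(p1[:point] + p2[point:])
        # Mutation: flip random genes with a small probability
        for child in children:
            for i in range(n_genes):
                if random.random() < mutation_rate:
                    child[i] = 1 - child[i]
        population = children
        # Termination: stop early if an optimal individual appears
        if max(sum(ind) for ind in population) == n_genes:
            break
    return max(population, key=sum)

print(run_ga())
```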
Working of Genetic Algorithm:
The main steps of the algorithm are described in more detail below.
➢ Initial population: It includes a set of individuals where each individual is a solution to
the concerned problem. We characterize each individual by the set of parameters that
we refer to as genes.
➢ Calculate Fitness: A fitness function is implemented to compute the fitness of each
individual in the population. The function provides a fitness score to each individual in
the population.
➢ Selection: The selection process picks the individuals with the highest fitness scores, and these are allowed to pass on their genes to the next generation.
Working of Genetic Algorithm:
Common Selection Methods (a code sketch follows the list):
➢ Roulette Wheel Selection: This method assigns a probability of selection to each individual,
proportional to its fitness. The algorithm then "spins" a roulette wheel, where the size of each
slice corresponds to an individual's probability, and selects individuals based on where the
wheel lands.
➢ Tournament Selection: This method involves randomly selecting a small group of individuals
and choosing the fittest one from that group to participate in the next generation. This can be
adjusted to control the selection pressure.
➢ Rank Selection: Instead of using fitness values directly, rank selection sorts the population
based on fitness and assigns selection probabilities based on their rank. Higher-ranked
individuals have a higher probability of being selected.
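The two most commonly used of these methods might look as follows in Python; the fitness values and tournament size are illustrative assumptions.

```python
import random

def roulette_wheel_selection(population, fitnesses):
    """Select one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)          # where the "wheel" lands
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

def tournament_selection(population, fitnesses, k=3):
    """Randomly pick k individuals and return the fittest of that group."""
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]

pop = ["A", "B", "C", "D", "E"]
fit = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative fitness scores
print(roulette_wheel_selection(pop, fit))
print(tournament_selection(pop, fit))
```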
Working of Genetic Algorithm:
The remaining phases of the algorithm are described below.
➢ Crossover: A core phase of the genetic algorithm. The algorithm chooses a crossover point within the genes of the parents chosen for mating and exchanges the gene segments around it.
➢ Mutation: The mutation phase inserts random genes into the generated offspring to
maintain the population’s diversity. It is done by flipping random genes in new
offspring.
➢ Termination: The algorithm stops iterating when the offspring it produces are no longer significantly different from the previous generation, i.e. the population has converged. At this stage it is said to have produced a set of solutions for the problem (a simple convergence check is sketched below).
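One simple way to implement this stopping rule, shown here as an assumption rather than the deck's exact criterion, is to stop when the best fitness has not improved for a number of generations:

```python
def has_converged(best_fitness_history, patience=10):
    """Stop when the best fitness has not improved over the last `patience` generations."""
    if len(best_fitness_history) <= patience:
        return False
    recent = best_fitness_history[-patience:]
    return max(recent) <= best_fitness_history[-patience - 1]

# Example: the best fitness stopped improving after generation 3
history = [1, 4, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7]
print(has_converged(history, patience=10))   # True
```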
Working of Genetic Algorithm:
The common crossover operators and the point mutation operator are listed below; a code sketch follows.
➢ Crossover: The algorithm chooses a
crossover point within the parents’ genes
chosen for mating.
❖ Single point Crossover
❖ Two-Point Crossover
❖ Uniform Crossover
➢ Point mutation: Done by flipping random
genes in new offspring.
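A sketch of these three crossover operators and point mutation on bit-string parents; the representation is an assumption carried over from the earlier sketches.

```python
import random

def single_point_crossover(p1, p2):
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:]

def two_point_crossover(p1, p2):
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return p1[:a] + p2[a:b] + p1[b:]

def uniform_crossover(p1, p2):
    # Each gene is taken from either parent with equal probability
    return [random.choice(pair) for pair in zip(p1, p2)]

def point_mutation(chromosome, rate=0.05):
    # Flip random genes in the offspring to maintain diversity
    return [1 - g if random.random() < rate else g for g in chromosome]

p1, p2 = [0] * 8, [1] * 8
child = single_point_crossover(p1, p2)
print(child, point_mutation(child))
```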
Genetic algorithm with Example:
Suppose a company wants to increase its profit, and it comes up with the idea of sending promotional mail along with coupons. The promotional mail is intended to increase the company's sales and profit, but it can also backfire and cause a loss of revenue.
➢ Several factors complicate the decision of how many coupons to add to the mail in order to maximize profit (a hypothetical fitness function for this trade-off is sketched after the list):
➢ Adding more coupons in the mail would increase the postal
cost, ultimately reducing the profit.
➢ Sending fewer coupons will also reduce the opportunity to
gain profit and lead to a potential loss in revenue.
➢ Adding too many coupons may reduce the customer’s
interest in using any coupon at all.
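One hypothetical way to cast this as a GA problem: let each individual encode the number of coupons per mail, and let fitness estimate the resulting profit. The encoding, cost figures, and saturation penalty below are illustrative assumptions, not data from the example.

```python
def coupon_fitness(n_coupons,
                   revenue_per_coupon=2.0,    # hypothetical expected revenue per coupon
                   postage_per_coupon=0.5,    # hypothetical extra postal cost per coupon
                   saturation=0.15):          # hypothetical drop in interest per extra coupon
    """Estimated profit of mailing n_coupons coupons per customer."""
    redemption_rate = max(0.0, 1.0 - saturation * n_coupons)  # too many coupons reduce interest
    revenue = n_coupons * revenue_per_coupon * redemption_rate
    cost = n_coupons * postage_per_coupon
    return revenue - cost

# A GA would search over n_coupons; here we simply scan to show the trade-off
for n in range(0, 8):
    print(n, round(coupon_fitness(n), 2))
```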
Applications of GA:
Listed below are some of the areas in which Genetic Algorithms are frequently used:
➢ Optimization − Genetic Algorithms are most commonly used in optimization problems wherein
we have to maximize or minimize a given objective function value under a given set of
constraints.
➢ Economics − GAs are also used to characterize various economic models like the cobweb
model, game theory equilibrium resolution, asset pricing, etc.
➢ Neural Networks − GAs are also used to train neural networks, particularly recurrent neural
networks.
➢ Parallelization − GAs also have very good parallel capabilities, and prove to be very effective
means in solving certain problems, and also provide a good area for research.
➢ Image Processing − GAs are also used for various digital image processing (DIP) tasks, such as dense pixel matching.
Applications of GA:
Some more areas in which Genetic Algorithms are frequently used:
➢ Vehicle routing problems − With multiple soft time windows, multiple depots and a
heterogeneous fleet.
➢ Scheduling applications − GAs are used to solve various scheduling problems as well, particularly the timetabling problem.
➢ Robot Trajectory Generation − GAs have been used to plan the path a robot arm takes in moving from one point to another.
➢ DNA Analysis − GAs have been used to determine the structure of DNA using spectrometric
data about the sample.
➢ Traveling salesman problem and its applications − GAs have been used to solve the TSP, which
is a well-known combinatorial problem using novel crossover and packing strategies.
Advantages of GA:
GAs have various advantages which have made them immensely popular algorithms:
➢ Does not require any derivative information (which may not be available for many real-
world problems).
➢ Is faster and more efficient compared to traditional methods.
➢ Has very good parallel capabilities.
➢ Optimizes both continuous and discrete functions and also multi-objective problems.
➢ Provides a list of “good” solutions and not just a single solution.
➢ Always gets an answer to the problem, and the answer gets better over time.
➢ Useful when the search space is very large and there are a large number of parameters
involved.
ID3 tree:
The ID3 (Iterative Dichotomiser 3) algorithm is a machine learning technique for building decision trees. It is used to classify data and make predictions.
➢ How it works (an information-gain sketch follows the list):
❖ ID3 uses a top-down approach to
search through training data.
❖ It tests each attribute at every node.
❖ It uses information gain to select
which attribute to test next.
❖ It splits the data based on the most
informative features.
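A minimal sketch of the information-gain calculation ID3 uses to choose the attribute to test next; the tiny weather-style dataset is an illustrative assumption.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in entropy obtained by splitting on one attribute."""
    base = entropy(labels)
    remainder = 0.0
    for value in set(r[attr_index] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[attr_index] == value]
        remainder += (len(subset) / len(labels)) * entropy(subset)
    return base - remainder

# Illustrative data: (Outlook, Windy) -> Play?
rows = [("Sunny", "No"), ("Sunny", "Yes"), ("Rain", "No"), ("Rain", "Yes")]
labels = ["No", "No", "Yes", "No"]
for i, name in enumerate(["Outlook", "Windy"]):
    print(name, round(information_gain(rows, labels, i), 3))
```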
ID3 tree:
Enjoy Sport Concept Learning Task.
Genetic Algorithm (GA) in Decision trees:
Combining a Genetic Algorithm (GA) with decision trees means using the GA's optimization capabilities to find better attribute subsets or tree structures, potentially leading to improved performance and generalization (a feature-subset sketch follows the list).
➢ Attribute Subsets: GAs can be used to identify the best combinations of attributes from a
dataset to build a decision tree.
➢ Decision Tree Induction: GAs can be used as an alternative to the traditional stepwise
search strategy used by decision tree induction algorithms.
➢ Ensemble Methods: GAs can be used to design ensembles of decision trees, such as
random forests or gradient boosting, by optimizing the individual tree structures.
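As one concrete illustration of the first point, a GA individual can be a bit mask over the attributes, with fitness equal to the cross-validated accuracy of a decision tree built on the selected attributes. The sketch below assumes scikit-learn is available and uses a deliberately tiny GA; it is an illustration, not the deck's implementation.

```python
import random
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    """Cross-validated accuracy of a decision tree on the selected attributes."""
    if not any(mask):
        return 0.0
    cols = [i for i, bit in enumerate(mask) if bit]
    return cross_val_score(DecisionTreeClassifier(random_state=0), X[:, cols], y, cv=3).mean()

# A very small GA over attribute subsets (bit masks)
population = [[random.randint(0, 1) for _ in range(n_features)] for _ in range(8)]
for _ in range(10):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                       # selection
    children = []
    while len(children) < 8:
        p1, p2 = random.sample(parents, 2)
        point = random.randint(1, n_features - 1)  # single-point crossover
        child = p1[:point] + p2[point:]
        if random.random() < 0.2:                  # mutation
            i = random.randrange(n_features)
            child[i] = 1 - child[i]
        children.append(child)
    population = children

best = max(population, key=fitness)
print("best attribute subset:", best, "accuracy:", round(fitness(best), 3))
```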
Genetic Algorithm (GA) in Clustering:
Clustering is an unsupervised learning problem where the task is to explore the data to find the best label for each data instance.
➢ The genetic algorithm is used to cluster data, starting from random clusters and
running until the optimal clusters are found.
➢ We'll start by briefly revising the K-means clustering
algorithm to point out its weak points, which are later solved
by the genetic algorithm.
➢ Clustering is used in various fields like bioinformatics, image
processing, and machine learning, and can be combined with
genetic algorithms to solve complex problems.
Genetic Algorithm (GA) in Clustering:
In the context of GAs, "clustering" refers to the use of clustering techniques to organize or process the data that the GA is working with. This can be done in a few ways:
❖ Input Data: Clustering can be used as a preprocessing
step to group similar data points before they are fed into
the GA.
❖ Fitness Function: The GA's fitness function, which
evaluates the quality of solutions, can be designed to take
into account the clustering of data points.
➢ For example, a fitness function might aim to minimize the sum of distances within clusters, or to maximize the separation between clusters (sketched below).
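A sketch of such a fitness function: an individual encodes a cluster assignment for each data point, and fitness is the negated sum of within-cluster distances, so tighter clusters score higher. The 2-D points and the choice of two clusters are illustrative assumptions.

```python
import math

points = [(1.0, 1.0), (1.5, 2.0), (8.0, 8.0), (9.0, 8.5), (1.2, 0.8), (8.5, 9.0)]

def fitness(assignment, k=2):
    """Negative sum of distances from each point to its cluster centroid."""
    total = 0.0
    for c in range(k):
        members = [p for p, a in zip(points, assignment) if a == c]
        if not members:
            continue
        cx = sum(p[0] for p in members) / len(members)
        cy = sum(p[1] for p in members) / len(members)
        total += sum(math.dist(p, (cx, cy)) for p in members)
    return -total

good = [0, 0, 1, 1, 0, 1]   # points grouped as they visually cluster
bad = [0, 1, 0, 1, 0, 1]    # points mixed across clusters
print(fitness(good), fitness(bad))   # the good assignment scores higher
```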
Optimization problems in Genetic Algorithm:
In genetic algorithms, bi-objective optimization involves finding a set of solutions in which no solution dominates another in both objectives (the Pareto front), addressing trade-offs between conflicting goals.
➢ Genetic algorithms can be adapted to solve bi-objective optimization problems by:
❖ Fitness Evaluation: Defining a fitness function that considers both objectives, often using a
weighted sum or other techniques to combine them.
❖ Population Maintenance: Maintaining a population of solutions, where each solution
represents a potential trade-off between the two objectives.
❖ Selection, Crossover, and Mutation: Applying genetic operators (selection, crossover, and
mutation) to evolve the population towards the Pareto front.
❖ Non-dominated Sorting: Implementing strategies to identify and maintain non-dominated solutions (Pareto-optimal solutions) in the population (a dominance check is sketched below).
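A minimal sketch of the dominance check and non-dominated filtering for two objectives that are both minimized (for example, cost and travel time); the sample solutions are illustrative.

```python
def dominates(a, b):
    """True if solution a is at least as good as b in both objectives
    and strictly better in at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Return the Pareto-optimal (non-dominated) solutions."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# Illustrative (cost, time) pairs
solutions = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]
print(non_dominated(solutions))   # [(10, 5), (12, 4), (8, 6)]
```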
Optimization problems in Genetic Algorithm:
➢ Trade-off Complexity: Finding the right balance between objectives can be complex,
especially when there are many potential trade-offs.
➢ Examples of Bi-objective Problems:
❖ Minimizing cost and maximizing quality.
❖ Maximizing performance and minimizing fuel consumption.
❖ Minimizing travel time and minimizing cost.
Gradient Descent/Ascent:
Genetic Algorithms (GAs) and Gradient Descent/Ascent are both optimization techniques, but they approach the problem differently.
➢ Gradient Descent/Ascent uses a deterministic, iterative approach based on the function's
gradient.
❖ Mechanism: Iteratively adjusts parameters in the direction of the steepest descent (for
minimization) or ascent (for maximization) of a function.
❖ Suitability: Best suited for smooth, differentiable functions where the gradient provides a
clear direction for optimization.
❖ Strengths: Efficient for finding local optima, especially in well-behaved landscapes.
❖ Weaknesses: Can get stuck in local optima, sensitive to learning rate and initial conditions.
➢ Example: Training neural networks, minimizing cost functions in machine learning (a minimal sketch follows).
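For contrast with the GA sketches above, a minimal gradient-descent sketch minimizing a simple one-variable function; the function and learning rate are illustrative assumptions.

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Iteratively move x in the direction of steepest descent."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))   # converges near x = 3
```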
Queries
Prof. Dr. K. Adisesha
