WojtAcht/agh-stochastic-ml


Stochastic Methods in Machine Learning

This repository contains laboratory materials for the "Stochastic Methods in Machine Learning" course at AGH University of Science and Technology in Krakow.

Course Description

This course explores various problems at the intersection of optimization and machine learning.

Labs Overview

| Lab | Title | Description |
|-----|-------|-------------|
| 1 | Gradient Descent | Basic (`lab1`): implements the gradient descent algorithm to train a linear regression model from scratch. Advanced (`lab1_new`): reproduces key experiments from "Understanding Deep Learning Requires Rethinking Generalization", investigating memorization of random labels, the effect of model capacity, and weight decay regularization on MNIST. |
| 2 | Gradient Descent Extensions | Covers gradient descent extensions including Momentum, AdaGrad, and Adam. Students test these optimizers on standard benchmark functions: Sphere, Rosenbrock, and Rastrigin. |
| 3 | Adversarial Examples | Investigates the vulnerability of neural networks to adversarial attacks, implementing the Fast Gradient Sign Method (FGSM) to generate perturbations that cause misclassification. |
| 4 | Model-Based Offline Optimization | Optimizes black-box functions from pre-collected datasets without additional function evaluations: trains neural network surrogate models of benchmark functions and applies gradient-based optimization to the surrogates to find optimal solutions. |
| 5 | Hyperparameter Optimization | Uses the Optuna framework to tune CatBoost hyperparameters on the Covertype dataset, demonstrating practical HPO for maximizing classification performance in a multiclass problem. |
| 6 | Bayesian Optimization | Covers acquisition functions (e.g., Expected Improvement, UCB), focusing on how they manage the exploration-exploitation trade-off. |
| 7 | Normal Distribution | Explores the fundamental properties and significance of the normal distribution. |
| 8 | CMA-ES | Applies the pycma library to evolve policy parameters and solve the OpenAI Gym CartPole balancing task. |
| 9 | Neuroevolution | Demonstrates evolutionary training of neural networks by applying CMA-ES to directly optimize network weights for a supervised learning problem. |
| 10 | Differential Evolution | Implements Differential Evolution from scratch. |
| 11 | LLM × EA | Evolution of Heuristics. |
| 12 | Cuckoo Search | Critically analyzes Cuckoo Search as an example of seemingly new optimization algorithms that may not offer genuinely novel ideas. |
| 13 | Multiobjective Optimization | Demonstrates multiobjective optimization techniques to find optimal asset allocations for investment portfolios. |
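To give a flavor of what Lab 1 asks for, here is a minimal sketch of gradient descent on a linear regression objective. This is an illustration only, not the lab's actual code; the function name and hyperparameters are our own choices:

```python
import numpy as np

def gradient_descent_lr(X, y, lr=0.1, n_steps=1000):
    """Fit y ~ X @ w + b by minimizing MSE with plain gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_steps):
        residual = X @ w + b - y            # shape (n,)
        grad_w = 2.0 / n * (X.T @ residual)  # d(MSE)/dw
        grad_b = 2.0 / n * residual.sum()    # d(MSE)/db
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Sanity check: recover a known line y = 3x + 1 from noiseless data.
X = np.linspace(0, 1, 50).reshape(-1, 1)
y = 3 * X[:, 0] + 1
w, b = gradient_descent_lr(X, y)
```

With a small enough learning rate this converges to the least-squares solution; the lab then contrasts this hand-rolled loop with library optimizers.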
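The FGSM attack from Lab 3 is a one-line update: perturb each input coordinate by a budget `eps` in the direction of the sign of the loss gradient with respect to the input. The sketch below uses a toy linear scorer so the input gradient is available in closed form (the names and the toy setup are assumptions, not the lab's code):

```python
import numpy as np

def fgsm_perturb(x, input_grad, eps):
    """FGSM step: move each coordinate by eps in the gradient's sign direction."""
    return x + eps * np.sign(input_grad)

# Toy model: linear scorer w @ x with logistic loss L = log(1 + exp(-y * w @ x)),
# whose gradient w.r.t. the input is dL/dx = -y * sigmoid(-y * w @ x) * w.
w = np.array([1.0, -2.0, 3.0])
x = np.array([0.5, 0.5, 0.5])   # score w @ x = 1.0 > 0: classified as +1
y = 1.0
score = w @ x
input_grad = -y * (1.0 / (1.0 + np.exp(y * score))) * w
x_adv = fgsm_perturb(x, input_grad, eps=0.25)
# The attack lowers the score by eps * sum(|w|) = 1.5, flipping the prediction.
```

For a neural network the same formula applies, with the input gradient computed by backpropagation instead of by hand.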
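Lab 10's from-scratch implementation of Differential Evolution can be sketched along these lines (a minimal DE/rand/1/bin variant; population size, `F`, and `CR` here are illustrative defaults, not the lab's required values):

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                           n_gen=200, seed=0):
    """Minimal DE/rand/1/bin for minimizing f over a box given by bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([f(x) for x in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            # Mutation: pick three distinct individuals, all different from i.
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)
            # Binomial crossover, forcing at least one mutant coordinate.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial if it is at least as good.
            f_trial = f(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Smoke test on the Sphere function (global minimum 0 at the origin).
sphere = lambda x: float(np.sum(x ** 2))
bounds = np.array([[-5.0, 5.0]] * 3)
x_best, f_best = differential_evolution(sphere, bounds)
```

The same driver works unchanged on the Rosenbrock and Rastrigin benchmarks from Lab 2, which makes it easy to compare DE against the gradient-based optimizers.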

Issues and Contributions

If you find any mistakes or have suggestions for improvements:

  1. Create an Issue: Open a new issue in the repository describing the problem or suggestion in detail.
  2. Submit a Pull Request: If you have a fix or improvement, feel free to fork the repository and submit a pull request with your changes.

Your contributions help improve the quality of these materials for all students.
