Deep Learning Crash Course

by Giovanni Volpe, Benjamin Midtvedt, Jesús Pineda, Henrik Klein Moberg, Harshith Bachimanchi, Joana B. Pereira, Carlo Manzo
No Starch Press, San Francisco (CA), 2026
ISBN-13: 9781718503922
https://nostarch.com/deep-learning-crash-course


  1. Dense Neural Networks for Classification

  2. Dense Neural Networks for Regression

  3. Convolutional Neural Networks for Image Analysis

  4. Encoders–Decoders for Latent Space Manipulation

  5. U-Nets for Image Transformation

  6. Self-Supervised Learning to Exploit Symmetries
    Explains how to use unlabeled data and the symmetries of a problem to improve model performance, with an application in particle localization.

  • Code 6-1: Localizing Particles Using LodeSTAR
    Demonstrates how to train a self-supervised neural network to determine the sub-pixel position of a particle within a microscope image. The network outputs two channels for displacement and one channel for detection weight (a probability map). This example starts with small, single-particle images and shows how LodeSTAR’s architecture avoids bias by design. You’ll see how the model accurately predicts the x–y position without direct labels, using only translations (and optionally flips) during training.
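    The aggregation step described above (per-pixel displacement votes averaged under a normalized weight map) can be sketched in a few lines of NumPy. `lodestar_position` is a hypothetical helper for illustration, not the library’s actual API:

    ```python
    import numpy as np

    def lodestar_position(dx, dy, weights):
        """Aggregate per-pixel predictions into one sub-pixel position.

        Each pixel (i, j) votes for the position (j + dx[i, j], i + dy[i, j]);
        the votes are averaged using the softmax-normalized weight map.
        Illustrative sketch only, not the library's actual API.
        """
        h, w = weights.shape
        p = np.exp(weights - weights.max())  # stable softmax over all pixels
        p /= p.sum()
        ys, xs = np.mgrid[0:h, 0:w]          # pixel coordinate grids
        x = np.sum(p * (xs + dx))            # weighted mean of x votes
        y = np.sum(p * (ys + dy))            # weighted mean of y votes
        return x, y
    ```

    A pixel with a dominant weight essentially decides the position on its own, while its displacement channels refine the estimate below pixel resolution.
    
    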

  • Code 6-A: Localizing Multiple Cells Using LodeSTAR
    Applies the LodeSTAR approach to detect multiple mouse stem cells in a brightfield microscopy dataset. Trained solely on a single crop containing one cell, the network generalizes to large frames with many cells. The script showcases how LodeSTAR calculates probability and displacement maps for each pixel, clusters them into detections, and evaluates performance against the ground-truth centroids provided by the Cell Tracking Challenge annotations.
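    The clustering step described above can be approximated with a simple threshold-and-group pass over the weight map. This is a minimal sketch assuming SciPy is available; `detect_particles` is a hypothetical stand-in for the library’s own clustering, and the fixed threshold is an assumption:

    ```python
    import numpy as np
    from scipy import ndimage

    def detect_particles(weights, threshold=0.5):
        """Cluster a per-pixel detection-weight map into discrete detections.

        Simplified stand-in for LodeSTAR's clustering step: threshold the
        weight map, group contiguous above-threshold pixels into connected
        components, and return each component's weight-weighted centroid
        as a (row, col) pair.
        """
        mask = weights > threshold
        labels, n = ndimage.label(mask)  # one integer label per component
        return ndimage.center_of_mass(weights, labels, list(range(1, n + 1)))
    ```

    In practice the displacement channels would shift each pixel’s vote before clustering, which tightens the clusters and improves the sub-pixel accuracy of each centroid.
    
    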

  7. Recurrent Neural Networks for Timeseries Analysis

  8. Attention and Transformers for Sequence Processing

  9. Generative Adversarial Networks for Image Synthesis

  10. Diffusion Models for Data Representation and Exploration

  11. Graph Neural Networks for Relational Data Analysis

  12. Active Learning for Continuous Learning

  13. Reinforcement Learning for Strategy Optimization

  14. Reservoir Computing for Predicting Chaos