Learning Paths
Structured, sequential learning journeys through machine learning. Each path provides a curated curriculum with clear objectives, prerequisites, and hands-on practice.
Total time investment: 120-160 hours for the complete curriculum
How Learning Paths Work
Each path includes:
- Clear learning objectives - What you’ll master
- Prerequisites - Required background knowledge
- Week-by-week structure - Organized timeline with estimated hours
- Content references - Links to concepts, papers, and examples
- Hands-on practice - Exercises and projects
- Assessment criteria - Self-checks to confirm understanding
Philosophy: Build knowledge progressively through structured practice, not random exploration.
Foundation Paths
Core deep learning fundamentals. Start here if you’re new to deep learning.
Deep Learning Foundations
Duration: 5 weeks | 45-60 hours | Beginner
Complete introduction to deep learning covering neural networks, CNNs, attention, and language models. This is the recommended entry point.
Covers: Modules 1-4 (Neural Networks, CNNs, Transformers, GPT)
Neural Network Foundations
Duration: 2 weeks | 15-20 hours | Beginner
Master the fundamentals of neural networks, backpropagation, and optimization.
Topics:
- Perceptrons and MLPs
- Backpropagation and gradient descent
- Optimization algorithms (SGD, Adam)
- Regularization (dropout, weight decay)
- Practical training techniques
Hands-on: Build an MNIST classifier from scratch in NumPy; a minimal training loop is sketched below.
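As a rough illustration of what the from-scratch exercise involves, here is a minimal NumPy sketch of a two-layer MLP trained with softmax cross-entropy and plain SGD. Synthetic random data stands in for MNIST, and the layer sizes, learning rate, and step count are arbitrary choices rather than anything prescribed by the path.

```python
import numpy as np

# Two-layer MLP with softmax cross-entropy, trained by plain SGD.
# Synthetic random data stands in for MNIST (784 inputs, 10 classes).
rng = np.random.default_rng(0)
X = rng.standard_normal((512, 784)).astype(np.float32)
y = rng.integers(0, 10, size=512)

W1 = rng.standard_normal((784, 128)).astype(np.float32) * 0.01
b1 = np.zeros(128, dtype=np.float32)
W2 = rng.standard_normal((128, 10)).astype(np.float32) * 0.01
b2 = np.zeros(10, dtype=np.float32)
lr = 0.1

for step in range(100):
    # Forward pass: linear -> ReLU -> linear -> softmax
    h = np.maximum(X @ W1 + b1, 0.0)
    logits = h @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(len(y)), y]).mean()

    # Backward pass: gradients of the loss w.r.t. every parameter
    dlogits = (probs - np.eye(10, dtype=np.float32)[y]) / len(y)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T
    dh[h <= 0] = 0.0                                      # ReLU gradient
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    # SGD update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    if step % 20 == 0:
        print(step, float(loss))
```

The full exercise builds this same loop out with real MNIST data, mini-batching, and held-out evaluation.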
Computer Vision with CNNs
Duration: 2 weeks | 12-18 hours | Beginner to Intermediate
Learn convolutional networks for computer vision.
Topics:
- Convolution and pooling operations (sketched below)
- CNN architectures (AlexNet, VGG, ResNet)
- Transfer learning strategies
- Medical imaging applications
Papers: AlexNet (2012), VGG (2014), ResNet (2015)
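To make the first topic concrete, here is a naive NumPy sketch of a single-channel 2D convolution and non-overlapping max pooling. Real CNN layers add input/output channels, strides, padding, and learned kernels; the hand-written edge kernel below is purely illustrative.

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 'valid' 2D convolution (cross-correlation, as most DL libraries implement it)."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

def max_pool2d(feature_map, size=2):
    """Non-overlapping max pooling with a square window."""
    H, W = feature_map.shape
    H, W = H - H % size, W - W % size                  # drop any ragged edge
    blocks = feature_map[:H, :W].reshape(H // size, size, W // size, size)
    return blocks.max(axis=(1, 3))

image = np.random.rand(28, 28)                          # stand-in for a grayscale image
vertical_edge = np.array([[1.0, 0.0, -1.0]] * 3)        # crude hand-crafted edge detector
features = max_pool2d(conv2d(image, vertical_edge))
print(features.shape)                                   # (13, 13)
```

In a trained CNN the kernels are learned rather than hand-crafted, and architectures such as ResNet stack many of these layers with skip connections.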
Attention and Transformers
Duration: 2 weeks | 12-16 hours | Intermediate
Master attention mechanisms and transformer architecture.
Topics:
- RNN limitations and the motivation for attention
- Scaled dot-product attention (sketched below)
- Multi-head attention
- Transformer architecture (Vaswani et al., 2017)
Paper: Attention Is All You Need (one of the most influential papers in modern AI)
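As a quick reference for the formula at the center of this path, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V, here is a minimal NumPy sketch of scaled dot-product attention plus a toy multi-head wrapper. The random projection matrices and dimensions stand in for learned weights and are not tied to any particular model size.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, batched over leading axes."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
    return softmax(scores) @ V

rng = np.random.default_rng(0)
seq_len, d_model, heads = 6, 32, 4
x = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4))

def split_heads(t):
    # (seq_len, d_model) -> (heads, seq_len, d_model // heads)
    return t.reshape(seq_len, heads, -1).swapaxes(0, 1)

# Multi-head attention: project, attend per head, concatenate, project back
out = scaled_dot_product_attention(split_heads(x @ Wq), split_heads(x @ Wk), split_heads(x @ Wv))
out = out.swapaxes(0, 1).reshape(seq_len, d_model) @ Wo
print(out.shape)   # (6, 32)
```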
Language Models with NanoGPT
Duration: 2 weeks | 15-25 hours | Intermediate
Build GPT from scratch following Andrej Karpathy’s approach.
Topics:
- GPT architecture (decoder-only transformer)
- Causal attention and autoregressive generation
- Tokenization (BPE)
- Language model training techniques
- Text generation strategies
Hands-on: Implement GPT-2 (124M parameters) and train it on custom data; a toy causal-attention and sampling loop is sketched below.
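As a rough sketch of the two ideas at the heart of GPT, causal masking and autoregressive generation, here is a toy NumPy example. The single attention "layer" and random embeddings stand in for a trained transformer, so the sampled token ids are meaningless; the point is the masking and the generate-one-token-at-a-time loop.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_attention(Q, K, V):
    """Scaled dot-product attention where position i can only attend to positions <= i."""
    T, d_k = Q.shape
    scores = Q @ K.T / np.sqrt(d_k)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)   # True above the diagonal = future positions
    scores = np.where(mask, -np.inf, scores)           # block attention to the future
    return softmax(scores) @ V

rng = np.random.default_rng(0)
vocab, d = 50, 16
embed = rng.standard_normal((vocab, d)) * 0.1          # token embedding table (untrained)
unembed = rng.standard_normal((d, vocab)) * 0.1        # projection back to vocabulary logits
tokens = [1]                                           # arbitrary start token id

# Autoregressive generation: feed the growing prefix, sample the next token, repeat.
for _ in range(10):
    x = embed[tokens]                                  # (T, d)
    h = causal_attention(x, x, x)                      # one attention layer as the whole "model"
    logits = h[-1] @ unembed                           # predict the next token from the last position
    probs = softmax(logits / 0.8)                      # temperature-scaled sampling
    tokens.append(int(rng.choice(vocab, p=probs)))
print(tokens)
```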
Advanced Paths
Cutting-edge techniques and modern AI systems.
Advanced Topics Overview
Duration: 3 weeks | 26-38 hours | Advanced
Survey of advanced deep learning topics.
Covers: Multimodal learning, generative models, advanced training
Multimodal Vision-Language Models
Duration: 2 weeks | 12-18 hours | Advanced
Master vision-language models such as CLIP and modern VLMs, along with the Vision Transformer (ViT) backbone they build on.
Topics:
- Multimodal fusion strategies
- Contrastive learning (InfoNCE, CLIP; sketched below)
- Vision Transformers (ViT)
- Zero-shot transfer
- Advanced VLMs (Flamingo, BLIP-2, LLaVA)
Papers: CLIP (2021), ViT (2021)
Applications: Visual search, VQA, accessibility tools
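A minimal NumPy sketch of the symmetric contrastive (InfoNCE-style) objective CLIP trains with, assuming image and text embeddings have already been produced by their respective encoders; the batch size and temperature below are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def clip_contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE over a batch: the matched image/text pairs are the positives."""
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)   # cosine similarity via
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)   # L2-normalized embeddings
    logits = img @ txt.T / temperature                               # (N, N) similarity matrix
    targets = np.arange(len(img))                                    # i-th image matches i-th caption
    loss_img_to_txt = -np.log(softmax(logits, axis=1)[targets, targets]).mean()
    loss_txt_to_img = -np.log(softmax(logits, axis=0)[targets, targets]).mean()
    return (loss_img_to_txt + loss_txt_to_img) / 2

rng = np.random.default_rng(0)
print(clip_contrastive_loss(rng.standard_normal((8, 64)), rng.standard_normal((8, 64))))
```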
Generative Diffusion Models
Duration: 2 weeks | 12-18 hours | Advanced
Learn diffusion models for high-quality generation.
Topics:
- Generative models comparison (GANs, VAEs, Diffusion)
- Diffusion fundamentals (forward/reverse process; the forward process is sketched below)
- DDPM training and DDIM sampling
- Classifier-free guidance for text-to-image
- Healthcare applications (synthetic medical data)
Papers: DDPM (2020), DDIM (2021), DALL-E 2 (2022)
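A small NumPy sketch of the closed-form forward (noising) process used in DDPM training, with the linear β schedule from the original paper. The "image" here is random data, and the denoising network ε_θ that would be trained to predict the noise is only referenced in a comment.

```python
import numpy as np

# Forward diffusion: q(x_t | x_0) = N(sqrt(alpha_bar_t) * x_0, (1 - alpha_bar_t) * I),
# so a noisy sample is x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps.
T = 1000
betas = np.linspace(1e-4, 0.02, T)            # linear noise schedule from the DDPM paper
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)               # alpha_bar_t = product of alpha_s for s <= t

def add_noise(x0, t, rng):
    """Jump straight to timestep t of the forward process in closed form."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return xt, eps                            # eps is the regression target for the denoiser

rng = np.random.default_rng(0)
x0 = rng.standard_normal((32, 32))            # stand-in for a normalized image
xt, eps = add_noise(x0, t=500, rng=rng)
# DDPM training samples a random t per example and minimizes ||eps - eps_theta(x_t, t)||^2,
# where eps_theta is the (hypothetical, not implemented here) denoising network.
```

Sampling then runs the learned reverse process step by step; DDIM reuses the same trained network with a faster, non-Markovian sampler.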
Advanced Training Topics
Duration: 1 week | 8-12 hours | Advanced
Modern training techniques and dynamics.
Topics:
- Self-supervised learning (contrastive and masked objectives)
- Masked prediction (BERT, MAE; sketched below)
- Training dynamics (double descent, overparameterization)
- Scaling laws and compute-optimal training
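To make the masked-prediction objective concrete, here is a BERT-style masking sketch in NumPy: roughly 15% of positions become prediction targets, of which 80% are replaced by a [MASK] id, 10% by a random token, and 10% are left unchanged. The mask id, vocabulary size, and the -100 ignore index are common conventions used here for illustration.

```python
import numpy as np

def mask_tokens(token_ids, mask_id, vocab_size, mask_prob=0.15, rng=None):
    """BERT-style masking: choose ~15% of positions as targets; corrupt most of them."""
    rng = rng if rng is not None else np.random.default_rng()
    inputs = token_ids.copy()
    labels = np.full_like(token_ids, -100)            # -100 = position ignored by the loss
    selected = rng.random(token_ids.shape) < mask_prob
    labels[selected] = token_ids[selected]            # only selected positions contribute to the loss

    roll = rng.random(token_ids.shape)
    inputs[selected & (roll < 0.8)] = mask_id                      # 80%: replace with [MASK]
    random_pos = selected & (roll >= 0.8) & (roll < 0.9)           # 10%: replace with a random token
    inputs[random_pos] = rng.integers(0, vocab_size, random_pos.sum())
    return inputs, labels                                          # remaining 10%: left unchanged

rng = np.random.default_rng(0)
tokens = rng.integers(5, 1000, size=20)               # toy token ids (0-4 reserved for specials)
inputs, labels = mask_tokens(tokens, mask_id=4, vocab_size=1000, rng=rng)
print(inputs)
print(labels)
```

MAE applies the same idea to image patches, masking a much larger fraction (around 75%) and reconstructing pixels instead of token ids.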
Specialized Paths
Domain-specific applications and skills.
Healthcare AI & Electronic Health Records
Duration: 2 weeks | 16-24 hours | Intermediate to Advanced
Apply deep learning to healthcare, focusing on EHR analysis.
Topics:
- EHR structure and medical coding (ICD-10, ATC, CPT)
- Tokenization for medical events (sketched below)
- Healthcare foundation models (ETHOS, BEHRT, GatorTron)
- Clinical decision support
- Interpretability and fairness in medical AI
Datasets: MIMIC-III, MIMIC-IV, EmergAI
Thesis connection: Directly supports multimodal EHR research
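A hypothetical sketch of event tokenization for EHR sequences, in the spirit of timeline-style tokenizers such as ETHOS: each coded event becomes a token, and the gap between visits becomes a coarse time-interval token. The special tokens, bucket boundaries, and example codes below are illustrative and not taken from any specific model or dataset.

```python
from datetime import datetime

# Hypothetical time-gap buckets: (upper bound in days, token). Longer gaps get <long>.
TIME_BUCKETS = [(1, "<1d>"), (7, "<1w>"), (30, "<1m>"), (365, "<1y>")]

def gap_token(days):
    for limit, tok in TIME_BUCKETS:
        if days <= limit:
            return tok
    return "<long>"

def tokenize_events(events):
    """events: list of (timestamp, code) pairs, e.g. ICD-10 or ATC codes."""
    tokens, prev = ["<bos>"], None
    for ts, code in sorted(events):
        if prev is not None:
            tokens.append(gap_token((ts - prev).days))
        tokens.append(code)
        prev = ts
    return tokens

events = [
    (datetime(2023, 1, 5), "ICD10:I21.9"),    # acute myocardial infarction
    (datetime(2023, 1, 5), "ATC:B01AC06"),    # aspirin prescription
    (datetime(2023, 1, 26), "ICD10:I50.9"),   # heart failure, three weeks later
]
print(tokenize_events(events))
# ['<bos>', 'ATC:B01AC06', '<1d>', 'ICD10:I21.9', '<1m>', 'ICD10:I50.9']
```

Real healthcare tokenizers additionally handle numeric lab values, demographics, and code hierarchies; those design choices are covered alongside models like BEHRT and GatorTron in this path.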
Research Methodology & Academic Writing
Duration: 1 week | 6-10 hours | Intermediate
Essential skills for conducting and publishing ML research.
Topics:
- Reading research papers (three-pass method)
- Formulating research questions (PICOT framework)
- Experimental design (baselines, ablations, statistics)
- Structuring papers (IMRaD format)
- Publication strategy
Outcome: Ready to conduct rigorous ML research and write papers
Recommended Learning Sequences
For Beginners
Start from scratch and build to advanced topics:
- Deep Learning Foundations (5 weeks)
  - Or take the individual modules: NN → CNNs → Transformers → GPT
- Advanced Topics (3 weeks)
  - VLMs → Diffusion → Advanced Training
- Specialize: Healthcare AI or another domain of your choice
Total: 8-10 weeks for a solid foundation plus specialization
For Researchers
Focus on modern techniques and research skills:
- Attention and Transformers (2 weeks)
- Vision-Language Models (2 weeks)
- Research Methodology (1 week)
- Healthcare AI (2 weeks) - if applicable
Total: 5-7 weeks to become research-ready
For Healthcare AI
Specialized curriculum for medical AI:
- Neural Network Foundations (2 weeks)
- Computer Vision (2 weeks)
- Transformers (2 weeks)
- Healthcare AI & EHR (2 weeks)
- Multimodal VLMs (2 weeks) - for multimodal clinical AI
Total: 10 weeks for healthcare AI specialization
Progress Tracking
Recommended approach:
- Start with the prerequisites - Ensure you have the required background
- Follow the week-by-week structure - Don't skip ahead
- Complete hands-on exercises - Theory alone isn’t enough
- Self-assess - Check understanding before moving on
- Build projects - Apply knowledge to real problems
Completion criteria: Can explain concepts clearly, implement algorithms from scratch, and apply them to new problems.
Explore Library Content
While paths provide structure, the Library offers reference material:
- Concepts → Individual concept explanations
- Papers → Research paper analyses
- Examples → Implementation guides
- Blog → Applications and insights
Difference: Library = reference, Paths = curriculum