# Course Repository CM0091: Artificial Intelligence at Universidad EAFIT
| INSTRUCTOR | Juan David Martínez Vargas ([email protected]) |
|---|---|
| LECTURES | Tuesday 7:30 – 9:00 (33-203); Thursday 7:30 – 9:00 (33-202) |
| MATERIAL | repo |
Sketchnote by Tomomi Imura
- Neural Networks and Deep Learning, which are at the core of modern AI. We will illustrate the concepts behind these topics using code in two of the most popular frameworks: TensorFlow and PyTorch.
- Neural architectures for working with images and text. We will cover recent models, though coverage may stop short of the latest state of the art.
- State-of-the-art Generative AI applications.
| Event | Topic | Material | Starting Date | Final Date |
|---|---|---|---|---|
| Assignment 1 (20%) | Fully Connected Nets and Backpropagation | | Week 05 | Week 08 |
| Assignment 2 (20%) | Application of Computer Vision | | Week 08 | Week 10 |
| Assignment 3 (20%) | Application of Transformers and NLP | | Week 10 | Week 12 |
| Assignment 4 (20%) | Application of GenAI | | Week 14 | Week 16 |
| Final Project (20%) | AI Applications | | Week 12 | Week 18 |
- Lecture01.pdf — Introduction to AI, DL and ML
- Lecture01b.pdf — Linear Algebra for DL
- Homework:
- Review the PyTorch in one hour blog
- Review the PyTorch basics notebook
- Review the broadcasting notebook
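As a warm-up for the broadcasting notebook, a minimal sketch of the rule it covers: shapes are compared from the trailing dimension leftwards, and each pair of sizes must either match or be 1 (the data and shapes below are illustrative).

```python
import torch

# Broadcasting aligns trailing dimensions: sizes must match or be 1.
a = torch.arange(6.0).reshape(2, 3)   # shape (2, 3)
b = torch.tensor([10.0, 20.0, 30.0])  # shape (3,) -> stretched to (2, 3)
print((a + b).shape)                  # torch.Size([2, 3])

# A column vector (2, 1) times a row vector (1, 3) yields a (2, 3) grid.
col = torch.tensor([[1.0], [2.0]])
row = torch.tensor([[10.0, 20.0, 30.0]])
print(col * row)
```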
- Lecture02.pdf — Linear Regression from a Deep Learning Perspective
- Lecture02b.pdf — Logistic and Softmax Regression from a Deep Learning Perspective
- BiasVariance.pdf — Bias–Variance Trade-off and Decomposition
- Notebooks:
- L02_linear-regr-scratch.ipynb — Linear regression implemented from scratch
- L02b_softmax_regression_scratch.ipynb — Logistic and softmax regression from scratch
- L02_fitting.ipynb — Model fitting and training dynamics
- L02_regularization.ipynb — Regularization techniques
- L02_hyperparameter_tunning.ipynb — Hyperparameter tuning basics
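The from-scratch regression notebooks train models without `nn.Module` or a built-in optimizer. A condensed sketch of that idea for linear regression (the synthetic data and hyperparameters here are illustrative, not the notebook's):

```python
import torch

torch.manual_seed(0)

# Synthetic data: y = 2x - 1 plus a little noise.
X = torch.randn(100, 1)
y = 2.0 * X - 1.0 + 0.01 * torch.randn(100, 1)

# Parameters updated by hand with gradient descent.
w = torch.zeros(1, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.1

for _ in range(200):
    loss = ((X @ w + b - y) ** 2).mean()   # mean squared error
    loss.backward()
    with torch.no_grad():                  # update outside the autograd graph
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())  # close to 2.0 and -1.0
```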
- Homework:
- Review the bias–variance trade-off and relate it to underfitting and overfitting
- Run and modify the linear and softmax regression notebooks
- Experiment with regularization strength and observe its effect on generalization
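To see the homework's point about regularization strength, a toy comparison (the setup is illustrative): the same linear model trained with and without L2 weight decay, where stronger decay shrinks the learned weights and limits how closely the model can chase the training data.

```python
import torch

def train(weight_decay):
    torch.manual_seed(0)                        # same data and init for both runs
    X = torch.randn(64, 20)
    y = torch.randn(64, 1)
    model = torch.nn.Linear(20, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=weight_decay)
    for _ in range(300):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()
    return model.weight.norm().item()

# Stronger L2 penalty -> smaller weight norm.
print(train(0.0), train(1.0))
```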
- Lecture03.pdf — Feed-Forward Neural Networks (FFNNs)
- Lecture03b.pdf — Optimization for Machine Learning (SGD, Momentum, RMSProp, Adam, AdamW)
- Lecture03c.pdf — Backpropagation and Regularization in Neural Networks
- Notebooks:
- L03_FFNNs.ipynb — Feed-forward neural networks from scratch
- L03_mlp_pytorch_softmax_crossentr.ipynb — MLPs in PyTorch with softmax and cross-entropy loss
- L03_sgd_scheduler_momentum.ipynb — Optimization strategies: SGD, momentum, and learning-rate scheduling
- L03-autograd_tutorial.ipynb — PyTorch autograd and gradient computation
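A minimal example in the spirit of the autograd tutorial notebook: PyTorch records the operations applied to `x` and `backward()` fills in the derivative.

```python
import torch

# Autograd builds a graph of operations and backpropagates through it.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2.0 * x          # dy/dx = 2x + 2
y.backward()
print(x.grad)                 # tensor(8.) at x = 3
```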
- Homework:
- Explain the role of backpropagation in training neural networks
- Compare different optimizers (SGD vs Adam) in terms of convergence behavior
- Modify the MLP architecture (depth, width) and observe training dynamics
- Experiment with learning rates and schedulers and analyze their effect on performance
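For the SGD-vs-Adam homework item, a toy comparison (the architecture and data are illustrative): the same model, seed, and nominal learning rate trained with each optimizer, so any difference in the final loss comes from the update rule alone.

```python
import torch

def final_loss(opt_name):
    torch.manual_seed(0)                        # identical data and init per run
    X = torch.randn(128, 10)
    y = (X.sum(dim=1, keepdim=True) > 0).float()
    model = torch.nn.Sequential(
        torch.nn.Linear(10, 16), torch.nn.ReLU(), torch.nn.Linear(16, 1))
    opt = (torch.optim.SGD(model.parameters(), lr=0.01) if opt_name == "sgd"
           else torch.optim.Adam(model.parameters(), lr=0.01))
    for _ in range(100):
        opt.zero_grad()
        loss = torch.nn.functional.binary_cross_entropy_with_logits(model(X), y)
        loss.backward()
        opt.step()
    return loss.item()

# On this toy problem Adam typically drops faster at the same learning rate.
print("SGD:", final_loss("sgd"), "Adam:", final_loss("adam"))
```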
- Lecture04.pdf — Training Neural Networks with PyTorch (Step-by-step)
- Notebooks:
- L04_mnist.ipynb — Step-by-step NN training in PyTorch (MNIST)
- L04_asl.ipynb — Homework: apply the training pipeline to the ASL dataset
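The step-by-step pipeline of the MNIST notebook boils down to the loop below. Random tensors stand in for the real dataset here so the sketch runs offline; the shapes match MNIST's 28x28 grayscale images and 10 classes.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Synthetic stand-in for MNIST: (batch, channels, H, W) images, 10 classes.
images = torch.randn(256, 1, 28, 28)
labels = torch.randint(0, 10, (256,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(28 * 28, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

for epoch in range(2):            # the usual loop: forward, loss, backward, step
    for xb, yb in loader:
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```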
- Lecture05.pdf — Convolutional Neural Networks (CNNs) Basics
- conv2D.pdf — Convolutions and Backpropagation (Mathematical Foundations)
- Notebooks:
- L05_Convolucion.ipynb — Understanding the convolution operation step-by-step
- L05_CNNBackpropagation.ipynb — Backpropagation through convolutional layers
- L05_ConvNetsPyTorch.ipynb — Building CNNs in PyTorch
- L05_CNN-CIFAR-10.ipynb — Homework: Train and evaluate a CNN on CIFAR-10
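The convolution operation the notebook walks through, in one call: `F.conv2d` slides a kernel over the input, so with a 3x3 kernel, stride 1, and no padding a 5x5 input shrinks to 3x3. The Laplacian-style kernel used here is a standard example, not necessarily the notebook's.

```python
import torch
import torch.nn.functional as F

# A 3x3 kernel applied to a 5x5 input, stride 1, no padding.
img = torch.arange(25.0).reshape(1, 1, 5, 5)        # (batch, channels, H, W)
kernel = torch.tensor([[[[ 0.0, -1.0,  0.0],
                         [-1.0,  4.0, -1.0],
                         [ 0.0, -1.0,  0.0]]]])     # Laplacian-style kernel
out = F.conv2d(img, kernel)
print(out.shape)   # torch.Size([1, 1, 3, 3]): (5 - 3 + 1) per spatial dim

# A linear ramp has no edges, so the Laplacian response is zero everywhere.
print(out)
```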
- Lecture06.pdf — Common CNN Architectures and Transfer Learning
- BatchNorm.pdf — Batch Normalization (1D and 2D)
- Notebooks:
- L06_CNN_Transfer.ipynb — Example: applying transfer learning to an image dataset
- L06_CNN_Transfer_HW.ipynb — Homework: Fine-tune a pretrained CNN on a new dataset
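The transfer-learning recipe from the lecture, sketched on a stand-in backbone (the notebooks would use a pretrained torchvision model instead): freeze the pretrained weights, swap the classification head for the new task, and optimize only the new parameters.

```python
import torch

# Stand-in for a pretrained backbone with a 1000-class ImageNet-style head.
backbone = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(28 * 28, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 1000),
)

# 1) Freeze the pretrained weights.
for p in backbone.parameters():
    p.requires_grad = False

# 2) Replace the head for the new task (say, 5 classes).
backbone[-1] = torch.nn.Linear(128, 5)    # fresh layer trains by default

# 3) Optimize only the trainable parameters.
trainable = [p for p in backbone.parameters() if p.requires_grad]
opt = torch.optim.Adam(trainable, lr=1e-3)
print(sum(p.numel() for p in trainable))  # 645 = 128*5 weights + 5 biases
```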
- Computational resources: I strongly recommend creating (free) accounts on the following platforms:
- Deep Learning books:
- Artificial Intelligence Books:
- Large Language Models:
- Online courses:
