
Purpose

This repository follows along with the playlist created by Vizuara on YouTube (https://www.youtube.com/playlist?list=PLPTV0NXA_ZSj6tNyn_UadmUeU3Q3oR-hu).

The primary objective is to gain a foundational understanding of simple neural networks, including forward propagation, activation layers, backward propagation, gradient descent, learning rate decay, and momentum.
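
For orientation, here is a minimal NumPy sketch of those pieces on a toy problem. It is not taken from the lecture handouts; the network size, data, and hyperparameter values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn XOR with a tiny one-hidden-layer network (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters: 2 inputs -> 4 hidden units -> 1 output, sigmoid activations.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Momentum buffers and (arbitrary) hyperparameters.
velocity = {name: 0.0 for name in ("W1", "b1", "W2", "b2")}
base_lr, decay, beta = 0.5, 1e-3, 0.9

for step in range(5000):
    # Forward propagation.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)  # mean squared error

    # Backward propagation (chain rule through each layer).
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    grads = {"W2": h.T @ d_out, "b2": d_out.sum(axis=0, keepdims=True)}
    d_h = d_out @ W2.T * h * (1 - h)
    grads["W1"] = X.T @ d_h
    grads["b1"] = d_h.sum(axis=0, keepdims=True)

    # Gradient descent with momentum and learning rate decay.
    lr = base_lr / (1 + decay * step)
    params = {"W1": W1, "b1": b1, "W2": W2, "b2": b2}
    for name, p in params.items():
        velocity[name] = beta * velocity[name] - lr * grads[name]
        p += velocity[name]  # in-place update of the parameter array

print("final loss:", float(loss))
```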

Lecture Contents

Each group of lectures shares a single handout, kept in the matching directory:

- Lectures 1-2: `lecture01_02`
- Lectures 3-6: `lecture03_06`
- Lectures 7-11: `lecture07_11`
- Lecture 12: `lecture12`
- Lectures 13-17: `lecture13_17`
- Lectures 18-22: `lecture18_22`
- Lectures 23-24: `lecture23_24`
- Lectures 25-27: `lecture25_27`
- Lectures 28-31: `lecture28_31`