
Purpose

This repository follows along with the playlist created by Vizuara on YouTube: https://www.youtube.com/playlist?list=PLPTV0NXA_ZSj6tNyn_UadmUeU3Q3oR-hu

The primary objective is to gain a foundational understanding of simple neural networks, including forward propagation, activation layers, backward propagation, and gradient descent.
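The four concepts above can be illustrated with a single-neuron example. This is a minimal sketch for orientation only (it is not taken from the course handouts): a toy dataset, a linear step followed by a sigmoid activation, gradients computed by hand for a squared-error loss, and plain gradient descent updates.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy dataset: target is 0 for small inputs, 1 for large inputs
data = [(0.0, 0.0), (0.25, 0.0), (0.75, 1.0), (1.0, 1.0)]

w, b = 0.0, 0.0   # parameters of the single neuron
lr = 1.0          # learning rate

for epoch in range(2000):
    for x, y in data:
        # Forward propagation: linear step, then sigmoid activation
        z = w * x + b
        a = sigmoid(z)

        # Backward propagation for squared-error loss L = (a - y)^2,
        # applying the chain rule through the activation
        dL_da = 2.0 * (a - y)
        da_dz = a * (1.0 - a)        # derivative of the sigmoid
        grad_w = dL_da * da_dz * x
        grad_b = dL_da * da_dz

        # Gradient descent: step parameters against the gradient
        w -= lr * grad_w
        b -= lr * grad_b

print(sigmoid(w * 0.0 + b), sigmoid(w * 1.0 + b))
```

After training, the neuron's output is below 0.5 for x = 0 and above 0.5 for x = 1, showing that the hand-derived gradients drive the loss down.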

Lecture Contents

- Lectures 1-2 share one handout (`lecture01_02`).
- Lectures 3-6 share one handout (`lecture03_06`).
- Lectures 7-11 share one handout (`lecture07_11`).
- Lecture 12 has its own handout (`lecture12`).
- Lectures 13-17 share one handout (`lecture13_17`).
- Lectures 18-21 share one handout (`lecture18_21`).
- Lecture 22 has its own handout (`lecture22`).
- Lectures 23-24 share one handout (`lecture23_24`).
- Lectures 25-26 share one handout (`lecture25_26`).
- Lecture 27 has its own handout (`lecture27`).
- Lectures 28-31 share one handout (`lecture28_31`).