# Fundamentals of DL Library

This repository contains re-implementations of fundamental deep learning libraries, inspired by Andrej Karpathy's popular Makemore and Micrograd projects. These projects help users understand the inner workings of deep learning models, gradient-based optimization, and building neural networks from scratch.

## Contents

### 1. Makemore

Makemore is a character-level language model that learns to generate new text in the style of its training data. This project includes:

- An implementation of a simple feedforward neural network for text generation.
- Backpropagation and gradient calculation from scratch.
- Training a neural network on character-level data.
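As a rough illustration of these ideas (a sketch, not the repository's actual code), the snippet below trains a bigram-level character model: each character predicts the next via a single linear layer, with the softmax/cross-entropy gradient derived by hand. The word list, learning rate, and step count are invented for the example:

```python
import numpy as np

# Toy training data (hypothetical; the repository's dataset may differ).
words = ["emma", "olivia", "ava", "mia", "noah"]

# Build the character vocabulary, with '.' as a start/end token at index 0.
chars = sorted(set("".join(words)))
stoi = {c: i + 1 for i, c in enumerate(chars)}
stoi["."] = 0
V = len(stoi)

# Bigram pairs: each character is the input, the following character the target.
xs, ys = [], []
for w in words:
    seq = ["."] + list(w) + ["."]
    for c1, c2 in zip(seq, seq[1:]):
        xs.append(stoi[c1])
        ys.append(stoi[c2])
xs, ys = np.array(xs), np.array(ys)
N = len(xs)

# One linear layer over one-hot inputs, trained with hand-derived gradients.
rng = np.random.default_rng(0)
W = rng.normal(size=(V, V)) * 0.1

def loss_and_grad(W):
    logits = np.eye(V)[xs] @ W                    # one-hot encode, then linear layer
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(N), ys]).mean()
    # Gradient of mean cross-entropy w.r.t. logits is (probs - one_hot(ys)) / N.
    dlogits = probs.copy()
    dlogits[np.arange(N), ys] -= 1
    dlogits /= N
    dW = np.eye(V)[xs].T @ dlogits
    return loss, dW

losses = []
for _ in range(200):
    loss, dW = loss_and_grad(W)
    W -= 1.0 * dW   # plain gradient descent
    losses.append(loss)
```

After training, rows of `W` act as (log-)probability tables over next characters, so the model can be sampled character by character to generate new name-like strings.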

### 2. Micrograd

Micrograd is a minimalistic deep learning library implementing the core idea of automatic differentiation (autodiff) for small neural networks. This project includes:

- A basic tensor-like class with gradient support.
- Forward and backward passes through the network via autodiff.
- Training a neural network from scratch, without relying on deep learning libraries such as PyTorch or TensorFlow.
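The autodiff idea can be sketched with a minimal scalar `Value` class in the spirit of Micrograd (a simplified illustration supporting only `+` and `*`, not the repository's implementation):

```python
class Value:
    """A scalar that records the ops applied to it so gradients can flow back."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # set by the op that produced this node
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # Product rule: d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# c = a*b + a, so dc/da = b + 1 = 4 and dc/db = a = 2
a, b = Value(2.0), Value(3.0)
c = a * b + a
c.backward()
```

Gradients accumulate with `+=` so that a value used in several places (like `a` above) collects contributions from every path through the graph.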

## Acknowledgements

Special thanks to Andrej Karpathy for his inspirational deep-learning content, which served as the foundation for the Makemore and Micrograd projects. The original YouTube series is linked below.

Neural Networks: Zero to Hero