🧠 Micrograd-Mini

Micrograd-Mini is a tiny scalar-valued autograd engine with a basic neural network framework, built from scratch and inspired by Andrej Karpathy's micrograd.

It supports reverse-mode automatic differentiation (backpropagation) and can train a simple multi-layer perceptron (MLP) on tasks like XOR. Great for learning the internals of backprop!
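
To get a feel for the engine, here is a minimal sketch, assuming engine.py exposes a Value class with a micrograd-style API (operator overloading, a .tanh() method, and .backward()):

    from engine import Value

    # Build a tiny expression graph: y = tanh(a * b + c)
    a = Value(2.0)
    b = Value(-3.0)
    c = Value(10.0)
    y = (a * b + c).tanh()

    # Reverse-mode autodiff: fills in .grad on every node in the graph
    y.backward()

    print(y.data)  # forward value of y
    print(a.grad)  # dy/da
    print(b.grad)  # dy/db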


🔥 Features

  • ✅ Reverse-mode autodiff (backpropagation)
  • ✅ Dynamic computation graph (DAG)
  • ✅ Custom Value class with gradients
  • ✅ Tanh and ReLU activations
  • ✅ Basic Neuron, Layer, and MLP implementation
  • ✅ Trains on the XOR dataset (see the sketch after this list)
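
To see how these pieces fit together, here is a condensed training-loop sketch, assuming nn.py exposes an MLP class with a micrograd-style interface (a callable model, a parameters() list, and Value outputs); train.py contains the full version:

    from nn import MLP

    # XOR dataset: two inputs per sample, one scalar target
    xs = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
    ys = [0.0, 1.0, 1.0, 0.0]

    model = MLP(2, [4, 1])  # 2 inputs -> hidden layer of 4 -> 1 output

    for step in range(200):
        # Forward pass: sum of squared errors over the dataset
        preds = [model(x) for x in xs]
        loss = sum((p - y) ** 2 for p, y in zip(preds, ys))

        # Backward pass: zero stale grads, then backpropagate
        for p in model.parameters():
            p.grad = 0.0
        loss.backward()

        # Plain gradient-descent update
        for p in model.parameters():
            p.data -= 0.05 * p.grad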

πŸ“ Project Structure

micrograd-mini/
    ├── engine.py   # Core autodiff engine (Value class)
    ├── nn.py       # Neural network components (Neuron, Layer, MLP)
    ├── train.py    # Training loop for XOR
    ├── example.py  # A more complete example using the XOR dataset
    └── README.md   # Project info

🚀 Getting Started

🔧 Requirements

  • Python 3.7+
  • No external libraries needed (pure Python)

▶️ Run Training

python train.py

📈 Example Output

--- Final Predictions after Training ---

Input: [0.0, 0.0] => Predicted: 0.01 | Target: 0.0
Input: [0.0, 1.0] => Predicted: 0.98 | Target: 1.0
Input: [1.0, 0.0] => Predicted: 0.97 | Target: 1.0
Input: [1.0, 1.0] => Predicted: 0.03 | Target: 0.0

Training complete! 🎯

🧠 Learn by Building

Want to really understand backpropagation and gradients?

  • Dive into engine.py and explore the Value class

  • Inspect how operations dynamically build a computation graph

  • See how .backward() traverses it for gradient computation, just like real frameworks! (A simplified sketch of the idea follows.)
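
The core pattern is small enough to show here. This is a simplified illustration of how such engines work, not a copy of engine.py: each Value remembers its parent nodes and a _backward closure, and backward() replays those closures in reverse topological order:

    class Value:
        """Toy illustration of the reverse-mode pattern (not the real engine.py)."""
        def __init__(self, data, _parents=()):
            self.data = data
            self.grad = 0.0
            self._parents = _parents
            self._backward = lambda: None  # pushes this node's grad to its parents

        def __mul__(self, other):
            out = Value(self.data * other.data, (self, other))
            def _backward():
                # Chain rule: d(out)/d(self) = other.data, and vice versa
                self.grad += other.data * out.grad
                other.grad += self.data * out.grad
            out._backward = _backward
            return out

        def backward(self):
            # Topologically sort the graph, then apply the chain rule in reverse
            topo, visited = [], set()
            def build(v):
                if v not in visited:
                    visited.add(v)
                    for p in v._parents:
                        build(p)
                    topo.append(v)
            build(self)
            self.grad = 1.0  # seed: d(self)/d(self) = 1
            for v in reversed(topo):
                v._backward()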

πŸ™ Attribution

This project is heavily inspired by micrograd by Andrej Karpathy, licensed under the MIT License.


🪪 License

This project is licensed under the MIT License. See the LICENSE file for details.


✨ Author

Built with ❤️ by Muawiya, as part of a deep dive into AI, neural nets, and autodiff fundamentals.

🌐 Connect With Me