nanoautograd

nanoautograd is a lightweight Python library that provides a minimal automatic differentiation engine, inspired by PyTorch's autograd system. It builds a computational graph as operations are performed and computes gradients automatically via backpropagation. It supports basic tensor operations and activation functions.
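
As background on how such an engine works: each operation records its inputs together with a small function that propagates gradients back to them, and calling backward() walks the resulting graph in reverse topological order, applying the chain rule at every node. The sketch below is not nanoautograd's source code, just a minimal, self-contained illustration of the technique for scalars; the Value class and its _parents/_backward attributes are illustrative names, not this library's API.

    # Illustrative sketch of reverse-mode autodiff, not nanoautograd's source.
    class Value:
        def __init__(self, data, parents=()):
            self.data = data
            self.grad = 0.0
            self._parents = parents
            self._backward = lambda: None  # filled in by the op that creates the node

        def __add__(self, other):
            out = Value(self.data + other.data, (self, other))
            def _backward():
                # Addition passes the incoming gradient through unchanged.
                self.grad += out.grad
                other.grad += out.grad
            out._backward = _backward
            return out

        def __mul__(self, other):
            out = Value(self.data * other.data, (self, other))
            def _backward():
                # d(out)/d(self) = other.data, d(out)/d(other) = self.data
                self.grad += other.data * out.grad
                other.grad += self.data * out.grad
            out._backward = _backward
            return out

        def backward(self):
            # Topologically sort the graph, then apply the chain rule in reverse.
            order, seen = [], set()
            def visit(v):
                if v not in seen:
                    seen.add(v)
                    for p in v._parents:
                        visit(p)
                    order.append(v)
            visit(self)
            self.grad = 1.0
            for v in reversed(order):
                v._backward()

    # Mirrors the Usage example below: d = a*b + a
    a, b = Value(2.0), Value(3.0)
    d = a * b + a
    d.backward()
    print(a.grad, b.grad)  # 4.0 2.0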

Installation

  1. Clone the repository:

    git clone https://github.com/sindhu213/nanoautograd.git
    cd nanoautograd

  2. (Optional) Create and activate a virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate

  3. Install the required dependencies:

    pip install -r requirements.txt
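
To confirm the installation, you can try a quick import check (this assumes the nanoautograd.tensor module path used in the Usage example below):

    python -c "from nanoautograd.tensor import Tensor"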

Usage

    from nanoautograd.tensor import Tensor

    # Create input tensors
    a = Tensor(2.0, requires_grad=True)
    b = Tensor(3.0, requires_grad=True)

    # Perform operations
    c = a * b
    d = c + a

    # Compute gradients
    d.backward()

    # Access gradients
    print(f"Gradient of a: {a.grad}")
    print(f"Gradient of b: {b.grad}")

Contributing

Contributions are welcome! If you'd like to contribute, please fork the repository and submit a pull request.
