Week 6, 2025 (Feb 3 - Feb 9)
I completed my Autograd Engine project this week! (well, sort of)
When I first started, I was trying to think of a name for it, and I ultimately settled on chibigrad, since it's a tiny grad engine I built from scratch.
I did a lot of refactoring for the Tensor and Linear classes, and also added a simple ReLU activation function.
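To give a feel for what a ReLU looks like inside a tiny engine like this, here is a minimal sketch in the micrograd style. The `Tensor` class and method names here are my own illustration, not the actual chibigrad code: the key ideas are that each op records its parents and a backward closure, and that `backward()` walks the graph in reverse topological order.

```python
import numpy as np

class Tensor:
    """Minimal tensor holding data, a grad buffer, and a backward closure."""
    def __init__(self, data, _parents=()):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)
        self._parents = _parents
        self._backward = lambda: None

    def relu(self):
        out = Tensor(np.maximum(self.data, 0.0), (self,))
        def _backward():
            # Gradient flows only where the input was positive.
            self.grad += (self.data > 0) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topological sort so each node's grad is complete before it is used.
        topo, seen = [], set()
        def build(t):
            if id(t) not in seen:
                seen.add(id(t))
                for p in t._parents:
                    build(p)
                topo.append(t)
        build(self)
        self.grad = np.ones_like(self.data)
        for t in reversed(topo):
            t._backward()

x = Tensor([-1.0, 2.0, 3.0])
y = x.relu()
y.backward()
print(x.grad)  # → [0. 1. 1.]
```

Note the `+=` in the backward closure: accumulating instead of assigning is what makes gradients correct when a tensor is used more than once.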
The tests I had added revealed how many bugs and errors my project had (especially around gradient accumulation, broadcasting, backprop logic, handling of different data types, etc.), which led me to add a lot more tests to verify the correctness of my project. Thank god for language models, since I would not have been able to write all those tests by hand!
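The broadcasting bugs in particular come from one recurring issue: when NumPy broadcasts a small tensor (like a bias) up to a larger shape in the forward pass, the backward pass must sum the upstream gradient back down to the original shape. Here is a hedged sketch of such a reduction helper and the kind of test that catches the bug; `unbroadcast` is a hypothetical name for illustration, not necessarily what chibigrad calls it.

```python
import numpy as np

def unbroadcast(grad, shape):
    """Reduce grad back to `shape` by summing over broadcast axes."""
    # Sum out leading axes that broadcasting prepended.
    while grad.ndim > len(shape):
        grad = grad.sum(axis=0)
    # Sum over axes where the original size was 1 but grad is larger.
    for axis, size in enumerate(shape):
        if size == 1 and grad.shape[axis] != 1:
            grad = grad.sum(axis=axis, keepdims=True)
    return grad

# A bias of shape (3,) added to a (2, 3) matrix receives a (2, 3)
# upstream gradient; it must be reduced back to (3,).
g = unbroadcast(np.ones((2, 3)), (3,))
print(g)  # → [2. 2. 2.]
```

A test that forgets this reduction fails immediately with a shape mismatch when accumulating into the bias's grad buffer, which is exactly the kind of error the extra tests caught.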
So this week was mostly about refactoring and adding tests.
The engine works surprisingly well, and I am quite happy with the results.
The repo link is here. I yapped quite a bit on this page, as you can clearly notice, but you'll probably get a better understanding of the project by reading the repo code, since it's pretty simple and sweet.
From here on, I think I will start jumping into more neural network architecture design and optimization projects.