Did something similar a while back [1]; it's the best way to learn neural nets and backprop. Using just NumPy also makes sure you get the math right, without having to deal with higher-level frameworks or C++ libraries (rough sketch of the idea below).
[1] https://github.com/santinic/claudioflow
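To make that concrete, here is a minimal sketch of what "just NumPy" backprop looks like (toy data and layer sizes made up for illustration, not code from the linked repo or the book): forward pass, hand-derived chain rule, SGD update.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(16, 4))          # toy batch
y = rng.normal(size=(16, 1))

W1 = rng.normal(size=(4, 8)) * 0.1
W2 = rng.normal(size=(8, 1)) * 0.1
lr = 1e-2

for step in range(100):
    # forward
    h = np.maximum(0, X @ W1)         # ReLU hidden layer
    pred = h @ W2
    loss = np.mean((pred - y) ** 2)

    # backward: the chain rule written out by hand
    dpred = 2 * (pred - y) / len(X)
    dW2 = h.T @ dpred
    dh = dpred @ W2.T
    dh[h <= 0] = 0                    # ReLU gradient
    dW1 = X.T @ dh

    # SGD update
    W1 -= lr * dW1
    W2 -= lr * dW2
```

Everything a framework's autograd would do for you is the few lines between the loss and the update, which is exactly the part you want to see spelled out.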
It's nice! Yeah, a lot of the heavy lifting is done by NumPy.
Isn't this what Karpathy does in the Zero to Hero lecture series on YT as well? I am sure this is great too!
If you are asking about the "micrograd" video, then yes, a little bit. micrograd works with scalars, whereas we use tensors in the book. If you are reading the book, I would recommend first completing the series, or at least the "micrograd" video.
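For anyone unsure what the scalar-vs-tensor distinction means in practice, here is a rough illustration (not code from micrograd or the book): a micrograd-style engine wraps individual floats, while a tensor engine tracks whole arrays, so both the forward op and its gradient rule are written on arrays.

```python
import numpy as np

class Scalar:
    """micrograd-style node: one float plus its gradient."""
    def __init__(self, data):
        self.data = data
        self.grad = 0.0

# Scalar engine: a 3-element dot product is many little nodes and ops.
xs = [Scalar(1.0), Scalar(2.0), Scalar(3.0)]
ws = [Scalar(0.5), Scalar(-1.0), Scalar(2.0)]
out = sum(x.data * w.data for x, w in zip(xs, ws))

# Tensor engine: the same dot product is a single op, and its backward
# rule is also an array expression (d(out)/dx = w, d(out)/dw = x).
x = np.array([1.0, 2.0, 3.0])
w = np.array([0.5, -1.0, 2.0])
out_t = x @ w
grad_x = w
grad_w = x
```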
This is good. It's well positioned for software engineers who want to understand DL beyond the frameworks.
It's alright, but a C version would be even better for fully grasping the implementation details of tensors etc. Shelling out to NumPy isn't particularly exciting.
I agree! What NumPy is doing is actually quite beautiful. I was thinking of writing a custom C++ backend for this thing. Let's see what happens this year.
If someone is interested in low-level tensor implementation details, they could benefit from a "let's build NumPy in C" course or book; a tiny sketch of the idea follows. No need to complicate a discussion of DL library design with that stuff.
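If it helps, the core of that exercise is roughly this (sketched in Python rather than C, with a made-up TinyTensor class): an n-dimensional array is a flat buffer plus shape and strides, and indexing is just an offset computation.

```python
class TinyTensor:
    def __init__(self, data, shape):
        self.data = list(data)          # flat buffer
        self.shape = shape
        # row-major (C-order) strides: elements to skip per index step
        self.strides = []
        acc = 1
        for dim in reversed(shape):
            self.strides.insert(0, acc)
            acc *= dim

    def __getitem__(self, idx):
        offset = sum(i * s for i, s in zip(idx, self.strides))
        return self.data[offset]

t = TinyTensor(range(6), shape=(2, 3))
assert t[1, 2] == 5                     # offset = 1*3 + 2*1 = 5
```

Views, transposes, and broadcasting then fall out of manipulating shape and strides without copying the buffer, which is most of what makes NumPy's internals interesting.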
Perhaps obvious to some, but this does not seem to be about learning in the traditional sense, nor a library in the book sense, unfortunately.