Inspired by [@karpathy's micrograd](https://github.com/karpathy/micrograd).
An autograd engine is the technical implementation of the backpropagation algorithm that allows neural nets to learn.
And micrograd is the simplest implementation of an autograd engine, but it's only in Python.
This is a simple re-implementation of micrograd using C and C++.
It was made mainly for personal learning.
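
To make that concrete, here is a minimal, self-contained C++ sketch of what an autograd engine does: each operation records its inputs and how to push gradients back to them, and `backward()` walks the resulting graph in reverse, applying the chain rule. This is only an illustrative sketch; the names here (`Value`, `make`, `add`, `mul`, `backward`) are made up for the example and do not mirror the actual classes in `cpp-micrograd` or `c-micrograd`.

```cpp
#include <cstdio>
#include <functional>
#include <memory>
#include <unordered_set>
#include <vector>

// Each Value node stores its data, its gradient, the nodes it was computed
// from, and a small closure that pushes its gradient back to those parents.
struct Value {
    double data;
    double grad = 0.0;
    std::vector<std::shared_ptr<Value>> parents;
    std::function<void()> backward_fn = [] {};
    explicit Value(double d) : data(d) {}
};

using V = std::shared_ptr<Value>;

V make(double d) { return std::make_shared<Value>(d); }

V add(const V& a, const V& b) {
    V out = make(a->data + b->data);
    out->parents = {a, b};
    Value* o = out.get();
    // d(a+b)/da = 1 and d(a+b)/db = 1, so the output grad flows straight through.
    out->backward_fn = [a, b, o] { a->grad += o->grad; b->grad += o->grad; };
    return out;
}

V mul(const V& a, const V& b) {
    V out = make(a->data * b->data);
    out->parents = {a, b};
    Value* o = out.get();
    // d(a*b)/da = b and d(a*b)/db = a: the chain rule's "local derivative".
    out->backward_fn = [a, b, o] {
        a->grad += b->data * o->grad;
        b->grad += a->data * o->grad;
    };
    return out;
}

// Backpropagation: build a reverse topological order of the graph, seed the
// output's gradient with 1, and let each node push gradients to its parents.
void backward(const V& root) {
    std::vector<V> topo;
    std::unordered_set<const Value*> visited;
    std::function<void(const V&)> build = [&](const V& v) {
        if (!visited.insert(v.get()).second) return;
        for (const V& p : v->parents) build(p);
        topo.push_back(v);
    };
    build(root);
    root->grad = 1.0;
    for (auto it = topo.rbegin(); it != topo.rend(); ++it) (*it)->backward_fn();
}

int main() {
    V a = make(2.0), b = make(-3.0), c = make(10.0);
    V f = add(mul(a, b), c);  // f = a*b + c
    backward(f);
    std::printf("f = %.1f, df/da = %.1f, df/db = %.1f, df/dc = %.1f\n",
                f->data, a->grad, b->grad, c->grad);
    return 0;
}
```

Running this prints `f = 4.0` with `df/da = -3.0`, `df/db = 2.0`, `df/dc = 1.0`, exactly what the chain rule gives for `f = a*b + c`. Everything beyond that is just more operations and more nodes on the same kind of graph.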
### Getting started.
Dive into `cpp-micrograd` to get started with the C++ implementation.
There's a simple getting-started example that creates a basic neural net modeling the AND logic gate.
Dive into `c-micrograd` to get started with the C implementation.
There's a simple getting-started example that creates a basic neural net predicting whether a number is odd or even.
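
To give a flavour of what such a getting-started example involves, below is a self-contained C++ sketch that trains a single sigmoid neuron on the AND truth table with plain gradient descent and hand-derived gradients. It is written independently of this repo, so the actual example code in `cpp-micrograd` and `c-micrograd` (which build their nets on top of the autograd engine) will look different; the weight names and learning rate here are just illustrative choices.

```cpp
// Self-contained sketch: one sigmoid neuron learning the AND gate with plain
// gradient descent (illustrative only; not the actual cpp-micrograd API).
#include <cmath>
#include <cstdio>

double sigmoid(double z) { return 1.0 / (1.0 + std::exp(-z)); }

int main() {
    // Truth table for AND: inputs and targets.
    const double xs[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    const double ys[4]    = {0, 0, 0, 1};

    double w1 = 0.1, w2 = -0.1, b = 0.0;  // small initial weights
    const double lr = 0.5;                // learning rate

    for (int epoch = 0; epoch < 2000; ++epoch) {
        for (int i = 0; i < 4; ++i) {
            double p = sigmoid(w1 * xs[i][0] + w2 * xs[i][1] + b);
            // Binary cross-entropy loss; its gradient w.r.t. the
            // pre-activation z = w1*x1 + w2*x2 + b is simply (p - y).
            double dz = p - ys[i];
            w1 -= lr * dz * xs[i][0];
            w2 -= lr * dz * xs[i][1];
            b  -= lr * dz;
        }
    }
    // The learned outputs should be close to 0, 0, 0, 1.
    for (int i = 0; i < 4; ++i) {
        double p = sigmoid(w1 * xs[i][0] + w2 * xs[i][1] + b);
        std::printf("%.0f AND %.0f -> %.3f\n", xs[i][0], xs[i][1], p);
    }
    return 0;
}
```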
### Who will find this repo useful?
1. If you love micrograd but would also like a C or C++ version of it.
2. If you want crisp backprop theory and annotated code for the autograd engine.
3. If you want to learn C or C++ by building neural nets, this could be a good start (it was my purpose).
### Notable insightful material:
1. `digin-micrograd-theory` consists of fundamental "to-the-point" theory behind autograd. It's based on Karpathy's explanation, customized for getting started with this repository.
Given the recent breakthrough of C/C++ versions of neural nets, like gerganov's llama.cpp, it made a lot of sense to build some neural nets with C/C++, hence cpp-micrograd.
Albeit a toy version, it gives a good understanding of how C++ would implement basic neural nets. IMO it's a very good start to understanding and using C/C++ neural nets like ggml, because no matter how complex and versatile the network, the basic autograd computation graph is always the same and omnipresent.
**The vision** is to go from C/C++ all the way to a CUDA implementation, while making it as educational as possible. C and C++ are done, only CUDA remains, so stay tuned!
### Contributions
I am also a novice C/C++ programmer, so my implementations can be quite sub-optimal.
To make this repository actually useful, it will definitely need contributions from anyone who can make any part better.
Contributions are open to anyone interested, so feel free to open a PR or issue; it's free-form for now.
### Credits
1. [@karpathy](https://github.com/karpathy) for the perfect [NeuralNets](https://www.youtube.com/watch?v=VMj-3S1tku0&list=PLAqhIrjkxbuWI23v9cThsA9GvCAUhRvKZ) course
2. [ChatGPT](https://chat.openai.com/) for being the perfect co-pilot!