Commit d4fa1fb

Add C support
1 parent 4550885 commit d4fa1fb

1 file changed: +11 -27 lines

Diff for: README.md

@@ -4,17 +4,20 @@ Inspired by [@karpathy's micrograd](https://github.com/karpathy/micrograd).

The autograd engine is the technical implementation of the backpropagation algorithm that allows neural nets to learn.
And micrograd is the simplest implementation of an autograd engine, but it's only in Python.

-This is a simple re-implementation of micrograd using cpp.
-It was made with the major intention for personal learning. So..
+This is a simple re-implementation of micrograd using C and C++.
+It was made mainly for personal learning.

### Getting started.
-Dive into `cpp-micrograd` to get started.
+Dive into `cpp-micrograd` to get started with the C++ implementation.
There's a simple getting-started example that creates a basic neural net modeling the AND logic gate.

+Dive into `c-micrograd` to get started with the C implementation.
+There's a simple getting-started example that creates a basic neural net predicting whether a number is odd or even.
+
### Who will find this repo useful?
-1. If you love micrograd, but would wanna also have a cpp version for it.
+1. If you love micrograd, but would also like a C or C++ version of it.
2. If you want crisp backprop theory and annotated code for the autograd engine.
-3. If you wanna learn cpp by building neural nets, then this could be a good start (it was my purpose).
+3. If you want to learn C or C++ by building neural nets, this could be a good start (it was my purpose).

### Notable insightful material:
1. `digin-micrograd-theory` consists of fundamental "to-the-point" theory behind autograd. It's based on Karpathy's explanation, customized for getting started with this repository.
@@ -25,33 +28,14 @@ Given the recent breakthrough of C/C++ versions of neural nets, like gerganov's

Albeit a toy version, it gives a good understanding of how C++ would implement basic neural nets. IMO it is a very good start to understanding and using C/C++ neural nets like ggml, because no matter how complex and versatile the network, the basic autograd computation graph will always be the same and omnipresent.

-**The vision** is to make a C implementation too, and then go all the way to launching Cuda kernels, while making it as educational as possible.
+**The vision** is to go from C/C++ all the way to a CUDA implementation, while making it as educational as possible. C and C++ are done, only CUDA remains, so stay tuned!

### Contributions
-I am also a novice cpp programmer, so my implementations can be very sub-optimal.
+I am also a novice C/C++ programmer, so my implementations can be quite sub-optimal.
To make this repository actually useful, it will definitely need contributions from anyone who can make any part better.
Contributions are open to anyone interested.
So feel free to open a PR/issue; it is free-form for now.

### Credits
1. [@karpathy](https://github.com/karpathy) for the perfect [NeuralNets](https://www.youtube.com/watch?v=VMj-3S1tku0&list=PLAqhIrjkxbuWI23v9cThsA9GvCAUhRvKZ) course
-2. [ChatGPT](https://chat.openai.com/) for being the perfect co-pilot!
-
-<!-- I implemented cpp-micrograd a C++ version of Karpathy's autograd engine.
-
-https://github.com/10-zin/cpp-micrograd
-
-I have been working on this for almost 3 weeks on and off. Tho I really kicked it off last week once I got the main engine with backward pass working.
-
-Given the recent breakthrough of C/C++ versions of neural nets, like gerganov's llama.cpp, it made a lot of sense to build some neural nets with C/C++, hence cpp-micrograd.
-
-Albeit a toy version, it gives a good understanding of how c++ would implement basic neural nets . IMO a very good start to understanding and using c/c++ neural nets, like ggml as no matter how complex the network, the basic autograd computation graph will always be the core.
-
-The vision is to make a C implementation too, and then go all the way to launching Cuda kernels, while making it as educational as possible.
-Here is what value you can get from the repo currently.
-If you love micrograd, but would wanna also have a cpp version for it.
-If you want a crisp backprop theory and annotated code of the autograd engine.
-If you wanna learn cpp by building neural nets, then this could be a good start (it was my purpose).
-So.. if you find it interesting, do support the project with stars, and contributions. Thanks for reading! -->
+2. [ChatGPT](https://chat.openai.com/) for being the perfect co-pilot!
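
The README lines in this diff lean on the idea that the basic autograd computation graph is the same in every implementation: a scalar value node that remembers its parents and pushes gradients back through the chain rule. Below is a minimal, hypothetical C sketch of that idea; the `Value` struct and the `leaf`/`binop`/`backward` names are illustrative assumptions, not the actual c-micrograd API.

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical scalar node in the computation graph (not c-micrograd's real API). */
typedef struct Value {
    double data;          /* forward result */
    double grad;          /* d(output)/d(this), filled in by the backward pass */
    struct Value *a, *b;  /* parent nodes in the graph (NULL for leaves) */
    char op;              /* '+' or '*' in this toy sketch, 0 for leaves */
} Value;

static Value *leaf(double x) {
    Value *v = calloc(1, sizeof(Value));  /* grad, parents, op start at zero */
    v->data = x;
    return v;
}

static Value *binop(char op, Value *a, Value *b) {
    Value *v = calloc(1, sizeof(Value));
    v->a = a; v->b = b; v->op = op;
    v->data = (op == '+') ? a->data + b->data : a->data * b->data;
    return v;
}

/* Recursive backward pass: apply the chain rule to the parents.
   Works for this tree-shaped example; a real engine sorts the graph
   topologically so shared nodes accumulate gradients correctly. */
static void backward(Value *v) {
    if (!v->a) return;                 /* leaf: nothing to propagate */
    if (v->op == '+') {                /* d(a+b)/da = 1, d(a+b)/db = 1 */
        v->a->grad += v->grad;
        v->b->grad += v->grad;
    } else {                           /* d(a*b)/da = b, d(a*b)/db = a */
        v->a->grad += v->b->data * v->grad;
        v->b->grad += v->a->data * v->grad;
    }
    backward(v->a);
    backward(v->b);
}

int main(void) {
    /* y = (x * w) + b, a single neuron without the activation */
    Value *x = leaf(2.0), *w = leaf(-3.0), *b = leaf(1.0);
    Value *y = binop('+', binop('*', x, w), b);

    y->grad = 1.0;   /* seed dy/dy = 1, then backpropagate */
    backward(y);

    printf("y = %.1f, dy/dx = %.1f, dy/dw = %.1f, dy/db = %.1f\n",
           y->data, x->grad, w->grad, b->grad);  /* -5.0, -3.0, 2.0, 1.0 */
    return 0;
}
```

This sketch only covers `+` and `*` over a tree-shaped graph and deliberately skips memory cleanup; micrograd-style engines add a topological sort plus activations such as tanh or ReLU on top of the same mechanism, which is what the AND-gate and odd/even examples mentioned in the README build on.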
