name = Autograd
description = An implementation of autograd / backpropagation.
tags = Python
url = /projects/autograd
template = project.html
links = [
    href = https://github.com/compromyse/autograd
    label = SOURCE
]
priority = 1
---

All you need to run a simple neural network using autograd is the short script below.

The script defines a dataset `X` and the expected outputs (or ground truth) `y`, and constructs an MLP with 3 inputs and layers of 4, 4, and 1 neurons. Each training iteration runs a forward pass over every sample, measures the mean squared error against `y`, clears the previously accumulated gradients (`.zero_grad()`), backpropagates (`.backward()`), and applies the resulting gradients through `.optimise()` with a learning rate of `0.01`.

```py
from src.nn import MLP
from src.loss import mse

# Training data: four samples with three features each.
X = [
    [ 0.0, 1.0, 2.0 ],
    [ 2.0, 1.0, 0.0 ],
    [ 2.0, 2.0, 2.0 ],
    [ 3.0, 3.0, 3.0 ]
]

# Ground truth: one target per sample.
y = [ 1.0, -1.0, 1.0, -1.0 ]

# A multilayer perceptron: 3 inputs, two hidden layers of 4 neurons,
# and a single output.
n = MLP(3, [ 4, 4, 1 ])

for i in range(400):
    pred = [ n(x) for x in X ]   # forward pass over every sample
    loss = mse(y, pred)          # mean squared error against the targets
    loss.zero_grad()             # clear gradients left over from the last step
    loss.backward()              # backpropagate through the computation graph
    n.optimise(0.01)             # gradient descent step with learning rate 0.01

    print(pred)
```
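
Under the hood, every scalar flowing through the network is wrapped in a node that records the operation and operands that produced it. Calling `.backward()` then walks this recorded graph in reverse topological order, applying the chain rule at each node to accumulate gradients into the leaves (the weights). The sketch below illustrates the idea in the style popularised by micrograd; the class and method names here are illustrative assumptions and may not match the ones used in this repository's source.

```py
class Value:
    """A scalar that remembers how it was computed, so gradients can
    later flow backwards through the recorded graph.
    NOTE: illustrative sketch, not necessarily this repository's API."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # pushes this node's grad into its parents
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(a + b)/da = 1 and d(a + b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # d(a * b)/da = b and d(a * b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph rooted at this node, then apply
        # the chain rule from the output back to every leaf.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)

        self.grad = 1.0  # d(loss)/d(loss) = 1
        for v in reversed(topo):
            v._backward()
```

With a node like this, `(Value(2.0) * Value(3.0) + 1.0).backward()` leaves a gradient of `3.0` on the first factor and `2.0` on the second, and `.optimise()` reduces to nudging each parameter by `-learning_rate * grad` before the next iteration.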