Autograd in PyTorch
References
https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html
https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html#differentiation-in-autograd
https://www.youtube.com/watch?v=tIeHLnjs5U8 (3Blue1Brown)
[62]:
import torch
a = torch.tensor([2., 3.], requires_grad=True)
b = torch.tensor([6., 4.], requires_grad=True)
\begin{align*} y = a^3 + b^2 + 1 \end{align*}
[63]:
y = a**3 + b**2 + 1
y
[63]:
tensor([45., 44.], grad_fn=<AddBackward0>)
\begin{align*} \frac{\partial y}{\partial y} &= 1 \\ \frac{\partial y}{\partial a} &= 3a^2 \\ \frac{\partial y}{\partial b} &= 2b \end{align*}
[64]:
external_grad = torch.tensor([1., 1.])
y.backward(gradient=external_grad)
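Because y is a vector rather than a scalar, backward() needs a gradient argument of the same shape; the vector of ones plays the role of the seed \(\frac{\partial y}{\partial y} = 1\) for the vector-Jacobian product. As a minimal sketch (the a2/b2 names are just for illustration, not from the tutorial), the same per-element gradients can be obtained by reducing y to a scalar with sum() and calling backward() on that:
[ ]:
# Sketch (assumption, not from the tutorial): for a non-scalar output,
# backward() requires a gradient argument of the same shape. Passing ones
# is equivalent to reducing to a scalar with sum() and calling backward().
a2 = torch.tensor([2., 3.], requires_grad=True)
b2 = torch.tensor([6., 4.], requires_grad=True)
y2 = a2**3 + b2**2 + 1
y2.sum().backward()   # same gradients as y2.backward(gradient=torch.ones_like(y2))
print(a2.grad)        # 3 * a2**2 -> tensor([12., 27.])
print(b2.grad)        # 2 * b2    -> tensor([12.,  8.])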
[66]:
a.grad == 3 * a ** 2
[66]:
tensor([True, True])
[67]:
b.grad == 2 * b
[67]:
tensor([True, True])
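As a further check (a minimal sketch, not part of the original cells), torch.autograd.grad computes the same vector-Jacobian product functionally and returns the gradients as tensors instead of accumulating them into .grad. A fresh forward pass is needed here because the earlier backward() call freed the graph.
[ ]:
# Sketch (assumption): the functional autograd API returns gradients directly.
# Recompute y so a fresh graph is available after the earlier backward() call.
y_fresh = a**3 + b**2 + 1
grad_a, grad_b = torch.autograd.grad(y_fresh, (a, b),
                                     grad_outputs=torch.ones_like(y_fresh))
print(grad_a)   # tensor([12., 27.])
print(grad_b)   # tensor([12.,  8.])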