
Commit eadfce9: Followup CR

larroy committed Jul 19, 2019
1 parent b33fa14 commit eadfce9
Showing 1 changed file with 10 additions and 10 deletions.
docs/api/python/autograd/autograd.md (20 changes: 10 additions & 10 deletions)
@@ -100,10 +100,10 @@ The pattern to calculate higher order gradients is the following:
```python
from mxnet import ndarray as nd
from mxnet import autograd as ag
-x=nd.array([1,2,3])
+x = nd.array([1,2,3])
x.attach_grad()
def f(x):
-    # A function which supports higher oder gradients
+    # Any function which supports higher order gradients
    return nd.log(x)
```

@@ -117,28 +117,28 @@ Using mxnet.autograd.grad multiple times:
```python
with ag.record():
    y = f(x)
-    x_grad = ag.grad(y, x, create_graph=True, retain_graph=True)[0]
-    x_grad_grad = ag.grad(x_grad, x, create_graph=False, retain_graph=True)[0]
-print(f"dy/dx: {x_grad}")
-print(f"d2y/dx2: {x_grad_grad}")
+    x_grad = ag.grad(heads=y, variables=x, create_graph=True, retain_graph=True)[0]
+    x_grad_grad = ag.grad(heads=x_grad, variables=x, create_graph=False, retain_graph=False)[0]
+print(f"dL/dx: {x_grad}")
+print(f"d2L/dx2: {x_grad_grad}")
```

Running backward on the backward graph:

```python
with ag.record():
    y = f(x)
-    x_grad = ag.grad(y, x, create_graph=True, retain_graph=True)[0]
+    x_grad = ag.grad(heads=y, variables=x, create_graph=True, retain_graph=True)[0]
x_grad.backward()
x_grad_grad = x.grad
-print(f"dy/dx: {x_grad}")
-print(f"d2y/dx2: {x_grad_grad}")
+print(f"dL/dx: {x_grad}")
+print(f"d2L/dx2: {x_grad_grad}")

```

Both methods are equivalent, except that in the second case, `retain_graph` is set to False by
default when running backward. But both calls run a backward pass on the graph as usual to get the
-gradient of the first gradient `y_grad` with respect to `x` evaluated at the value of `x`.
+gradient of the first gradient `x_grad` with respect to `x` evaluated at the value of `x`.
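
As a quick sanity check, here is a minimal sketch (separate from the diff above) that runs the second method end to end and compares the results against the analytical derivatives of `nd.log`, namely 1/x and -1/x**2. It assumes an MXNet version that provides `mxnet.autograd.grad` and uses NumPy only for the comparison.

```python
import numpy as np
from mxnet import ndarray as nd
from mxnet import autograd as ag

x = nd.array([1, 2, 3])
x.attach_grad()

with ag.record():
    y = nd.log(x)  # same f(x) as in the snippets above
    x_grad = ag.grad(heads=y, variables=x, create_graph=True, retain_graph=True)[0]

x_grad.backward()      # backward pass over the backward graph
x_grad_grad = x.grad   # the second derivative is accumulated into x.grad

# d/dx log(x) = 1/x and d2/dx2 log(x) = -1/x**2
assert np.allclose(x_grad.asnumpy(), 1.0 / x.asnumpy())
assert np.allclose(x_grad_grad.asnumpy(), -1.0 / x.asnumpy() ** 2)
```

For `x = [1, 2, 3]` this gives approximately `dL/dx = [1, 0.5, 0.333]` and `d2L/dx2 = [-1, -0.25, -0.111]`.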


