This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

[DOC] Corrected some typos #21078

Open
wants to merge 1 commit into base: master
@@ -217,7 +217,7 @@ c.backward()
```

You can see that `b` is a linear function of `a`, and `c` is chosen from `b`.
-The gradient with respect to `a` be will be either `[c/a[0], 0]` or `[0,
+The gradient with respect to `a` will be either `[c/a[0], 0]` or `[0,
c/a[1]]`, depending on which element from `b` is picked. You see the results of
this example with this code:
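(The code referenced here is collapsed in this view. A minimal, self-contained sketch of the kind of check the passage describes, assuming MXNet's `mx.nd` and `mx.autograd` APIs and hypothetical arrays `a`, `b`, `c` rather than the collapsed snippet itself, might look like this:)

```python
import mxnet as mx

# Hypothetical reconstruction, not the collapsed code from this page:
# b is a linear function of a, and c is one element picked from b.
a = mx.nd.random.uniform(shape=(2,))
a.attach_grad()                      # allocate gradient storage for a
with mx.autograd.record():           # record operations for autograd
    b = a * 2
    c = b[0] if b[0].asscalar() >= b[1].asscalar() else b[1]
c.backward()

# Only the picked element's position receives a non-zero gradient,
# and that entry equals c / a at the same index.
print(a.grad)    # e.g. [c/a[0], 0] or [0, c/a[1]]
print(c / a)
```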

@@ -233,7 +233,7 @@ along this axis is the same as summing that axis and multiplying by `1/3`.
You can control gradients for different ndarray operations. For instance,
perhaps you want to check that the gradients are propagating properly?
the `attach_grad()` method automatically detaches itself from the gradient.
-Therefore, the input up until y will no longer look like it has `x`. To
+Therefore, the input up until `y` will no longer look like it has `x`. To
illustrate this notice that `x.grad` and `y.grad` is not the same in the second
example.
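
(The example code around this hunk is likewise collapsed. A short sketch of the detaching behaviour described above, assuming hypothetical `x`, `y`, `z` arrays built with `mx.nd`, might look like this:)

```python
import mxnet as mx

x = mx.nd.array([1.0, 2.0, 3.0])
x.attach_grad()                      # x is the tracked input
with mx.autograd.record():
    y = x * 2
    y.attach_grad()                  # attaching a gradient here detaches y from x
    z = y * y
z.backward()

# No gradient flows back past the detached y, so x.grad and y.grad differ:
print(x.grad)    # zeros, since y no longer "looks like" it came from x
print(y.grad)    # dz/dy = 2 * y
```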
