[xdoctest][task 200] reformat example code with google style in python/paddle/tensor/creation.py
#56685
Conversation
Your PR was submitted successfully. Thank you for contributing to the open-source project!
LGTMeow 🐾
The `clone` part feels a bit odd, though; the interface seems to have changed ~ see if you can update it ~
python/paddle/tensor/creation.py
Outdated
 >>> import paddle

-x = paddle.ones([2])
-x.stop_gradient = False
-clone_x = paddle.clone(x)
+>>> x = paddle.ones([2])
+>>> x.stop_gradient = False
+>>> clone_x = paddle.clone(x)

-y = clone_x**3
-y.backward()
-print(clone_x.grad) # [3]
-print(x.grad) # [3]
+>>> y = clone_x**3
+>>> y.backward()
+>>> print(clone_x.grad)
+None
+>>> print(x.grad.numpy())
+[3. 3.]
This example code references:
Paddle/test/legacy_test/test_assign_op.py
Lines 220 to 235 in 81659f7
def test_clone(self):
    self.python_api = paddle.clone
    x = paddle.ones([2])
    x.stop_gradient = False
    x.retain_grads()
    clone_x = paddle.clone(x)
    clone_x.retain_grads()
    y = clone_x**3
    y.backward()
    np.testing.assert_array_equal(x, [1, 1])
    np.testing.assert_array_equal(clone_x.grad.numpy(), [3, 3])
    np.testing.assert_array_equal(x.grad.numpy(), [3, 3])
    paddle.enable_static()
Please update it ~ mainly by adding `retain_grads`, otherwise the example is a bit counter-intuitive ~
In PyTorch, accessing the gradient of a non-leaf tensor without calling `retain_grad` produces a warning, so personally I don't think that usage is appropriate ... ...
Could you check whether my changes are correct? 😍😍😍
No need to make it that complicated, my fault ~~~ 🤣
>>> import paddle
>>> x = paddle.ones([2])
>>> x.stop_gradient = False
>>> x.retain_grads()
>>> clone_x = paddle.clone(x)
>>> clone_x.retain_grads()
>>> y = clone_x**3
>>> y.backward()
>>> print(clone_x.grad.numpy())
[3. 3.]
>>> print(x.grad.numpy())
[3. 3.]
LGTMeow 🐾
@megemini could you take another look?
LGTM ~ Nice!
…on/paddle/tensor/creation.py` (PaddlePaddle#56685)
* [Doctest] fix No.200, test=docs_preview
* fix output
* add retain_grads
* fix style
PR types
Others
PR changes
Others
Description
Reformat the example code in the following file so that it passes the xdoctest check: python/paddle/tensor/creation.py
Preview:
Related links
@sunzhongkai588 @SigureMo @megemini