[xdoctest] reformat example code with google style in No. 203 - 211 #56473
Conversation
Your PR was submitted successfully. Thanks for your contribution to the open-source project!
@SigureMo requesting a review 🐕
For the `print` calls in the distributed examples here, I'd suggest also keeping the original comments ~ It's best if CI can actually verify the `print` output; if it can't, the original comments are clearer anyway.
[[[1, 2, 3], [4, 5, 6]], [[13, 14, 15], [16, 17, 18]]] (2 GPUs, out for rank 0)
[[[7, 8, 9], [10, 11, 12]], [[19, 20, 21], [22, 23, 24]]] (2 GPUs, out for rank 1)
`(2 GPUs, out for rank 0)` shouldn't appear inside the `print` output, should it? ~ It could go on the statement above as a comment.
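The point above can be illustrated with the standard-library `doctest` module (Paddle's CI uses xdoctest, but both compare the expected output verbatim against what the code actually prints, so an annotation like `(2 GPUs, out for rank 0)` appended to the expected line can never match). A minimal sketch, using a plain list instead of a real distributed tensor:

```python
import doctest

parser = doctest.DocTestParser()
runner = doctest.DocTestRunner(verbose=False)

# Annotation baked into the expected output: the checker compares this
# line verbatim against what print() emits, so it can never match.
bad = ">>> print([0, 2])\n[0, 2] (2 GPUs, out for rank 0)\n"

# The suggested form: the per-rank note lives in a comment on the
# statement, and the expected output is exactly what print() produces.
good = ">>> print([0, 2])  # out for rank 0 on 2 GPUs\n[0, 2]\n"

failed_bad = runner.run(
    parser.get_doctest(bad, {}, "bad", None, 0), out=lambda s: None
).failed
failed_good = runner.run(
    parser.get_doctest(good, {}, "good", None, 0), out=lambda s: None
).failed

print(failed_bad, failed_good)  # 1 0
```

The annotated form fails verification while the comment form passes, which is why moving the rank note into a comment keeps the example both informative and checkable.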
[0, 2] (2 GPUs, out for rank 0)
[1, 3] (2 GPUs, out for rank 1)
Same as above.
[[0., 0.], [1., 1.]] (2 GPUs, out for rank 0)
[[0., 0.], [0., 0.], [1., 1.], [1., 1.]] (2 GPUs, out for rank 1)
Same as above.
>>> task.wait()
>>> out = data.numpy()
>>> print(out)
[[1, 2, 3], [1, 2, 3]] (2 GPUs)
Same as above.
python/paddle/distributed/communication/stream/reduce_scatter.py (outdated comment, resolved)
python/paddle/distributed/communication/stream/reduce_scatter.py (outdated comment, resolved)
@megemini please take another look.
LGTM ~
These are all examples under the distributed directory; I have no environment to run them at the moment, but the results look fine ~ 🫣
One thing, though: the Chinese docs have no distributed/communication directory.
…addlePaddle#56473) * 203 * 204 * 205 * 206 * 207 * 208 * 209 * 210 * 211 * Update all_to_all.py * Apply suggestions from code review
PR types
Others
PR changes
Others
Description
#55629