
[xdoctest] reformat example code with google style in No. 203 - 211 #56473

Merged
merged 11 commits into PaddlePaddle:develop on Aug 23, 2023

Conversation

Liyulingyue
Contributor

PR types

Others

PR changes

Others

Description

#55629

@paddle-bot

paddle-bot bot commented Aug 20, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@paddle-bot paddle-bot bot added contributor External developers status: proposed labels Aug 20, 2023
@Liyulingyue
Contributor Author

@SigureMo Requesting a review 🐕


@megemini megemini left a comment


For the print calls in the distributed examples here, I suggest also keeping the original output comments ~ It would be best if CI could verify the print output; if it can't, the original comments are also quite clear.
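A minimal sketch of that pattern, assuming Paddle's CI exposes a DISTRIBUTED environment flag for xdoctest's +REQUIRES directive (the tensor values here are illustrative):

>>> # doctest: +REQUIRES(env:DISTRIBUTED)
>>> import paddle
>>> import paddle.distributed as dist
>>> dist.init_parallel_env()
>>> data = paddle.to_tensor([1, 2, 3])
>>> dist.broadcast(data, src=0)
>>> print(data.numpy())
>>> # [1, 2, 3] (2 GPUs)

Because the expected result is kept as a comment rather than a doctest "want" block, the example still documents the output even when CI cannot actually run it on multiple GPUs.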

Comment on lines 172 to 173
[[[1, 2, 3], [4, 5, 6]], [[13, 14, 15], [16, 17, 18]]] (2 GPUs, out for rank 0)
[[[7, 8, 9], [10, 11, 12]], [[19, 20, 21], [22, 23, 24]]] (2 GPUs, out for rank 1)

(2 GPUs, out for rank 0) shouldn't appear in the print output ~ It can be moved into the statement above as a comment.
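A sketch of the suggested rewrite of those two lines, with the rank annotation moved out of the doctest output and into comments (the variable name out is an assumption taken from the surrounding examples):

>>> # Expected results on 2 GPUs:
>>> # rank 0: [[[1, 2, 3], [4, 5, 6]], [[13, 14, 15], [16, 17, 18]]]
>>> # rank 1: [[[7, 8, 9], [10, 11, 12]], [[19, 20, 21], [22, 23, 24]]]
>>> print(out)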

Comment on lines 309 to 310
[0, 2] (2 GPUs, out for rank 0)
[1, 3] (2 GPUs, out for rank 1)

Same as above.

Comment on lines 329 to 330
[[0., 0.], [1., 1.]] (2 GPUs, out for rank 0)
[[0., 0.], [0., 0.], [1., 1.], [1., 1.]] (2 GPUs, out for rank 1)

Same as above.

>>> task.wait()
>>> out = data.numpy()
>>> print(out)
[[1, 2, 3], [1, 2, 3]] (2 GPUs)

Same as above.
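Applied to this snippet, the suggestion would look roughly like the following (the comment wording is only a sketch):

>>> task.wait()
>>> out = data.numpy()
>>> # Expected on both ranks when run on 2 GPUs:
>>> # [[1, 2, 3], [1, 2, 3]]
>>> print(out)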

@luotao1 luotao1 added the HappyOpenSource Pro (advanced Happy Open Source event, with more challenging tasks) label Aug 21, 2023
@luotao1
Contributor

luotao1 commented Aug 22, 2023

@megemini Please review it again.


@megemini megemini left a comment


LGTM ~

These are all examples under the distributed directory. There is currently no environment to test them on, but the results look fine ~ 🫣

However, the Chinese docs have no distributed/communication directory:

English: (screenshot)

Chinese: (screenshot)

@SigureMo @sunzhongkai588

@luotao1 luotao1 merged commit 8fe86eb into PaddlePaddle:develop Aug 23, 2023
BeingGod pushed a commit to BeingGod/Paddle that referenced this pull request Sep 9, 2023
[xdoctest] reformat example code with google style in No. 203 - 211 (PaddlePaddle#56473)

* 203

* 204

* 205

* 206

* 207

* 208

* 209

* 210

* 211

* Update all_to_all.py

* Apply suggestions from code review