Commit 68a6130

Choi TaeHo authored: Update README.md
1 parent 0046c10 commit 68a6130

File tree

1 file changed (+1, -1)

README.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ CUDA_VISIBLE_DEVICES='0,1' python -m src.tools.check_dist --num-gpu 2
 
 ## Multi Machines
 ### Main Machine
-[For collective communication](https://tutorials.pytorch.kr/intermediate/dist_tuto.html#collective-communication) in PyTorch, a process must be launched on the main machine.
+[For collective communication](https://pytorch.org/tutorials/intermediate/dist_tuto.html#collective-communication) in PyTorch, a process must be launched on the main machine.
 The main machine's IP address and an unused port number for TCP communication are set automatically.
 
 For the main process, you must set `machine-rank` to zero and `num-machine` to the number of machines.
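
The collective-communication pattern in the linked tutorial amounts to every process connecting to the main machine over TCP and then exchanging tensors. Below is a minimal sketch of that idea, assuming a `gloo` backend and environment-variable rendezvous; the address, port, rank, and world-size values are placeholders, and the repository's own launcher sets the real address and port automatically.

```python
# Minimal collective-communication sketch, following the linked dist_tuto tutorial.
# MASTER_ADDR/MASTER_PORT, RANK, and WORLD_SIZE below are illustrative defaults,
# not the repository's actual configuration.
import os
import torch
import torch.distributed as dist

def run(rank: int) -> None:
    # Every process contributes a tensor; all_reduce sums them across processes.
    tensor = torch.ones(1) * rank
    dist.all_reduce(tensor, op=dist.ReduceOp.SUM)
    print(f"rank {rank}: reduced value = {tensor.item()}")

if __name__ == "__main__":
    # All processes, including those on worker machines, rendezvous at the main
    # machine's address and port over TCP.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")  # assumed: main machine IP
    os.environ.setdefault("MASTER_PORT", "29500")      # assumed: an unused port
    rank = int(os.environ.get("RANK", "0"))
    world_size = int(os.environ.get("WORLD_SIZE", "1"))
    dist.init_process_group("gloo", rank=rank, world_size=world_size)
    run(rank)
    dist.destroy_process_group()
```

On the main machine this corresponds to `machine-rank` 0; worker machines would point `MASTER_ADDR` at the main machine's IP and use higher ranks.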
