module llutil.launcher.launcher
Global Variables
- TYPE_CHECKING
function get_current_launcher
class ElasticLauncher
A helper Configurable class for torchrun and torch.distributed.launch.
PyTorch's elastic launch capability is embedded in this Configurable; see the torch.distributed.elastic documentation for details.
HyperGraph.run() uses this class to launch multiple processes. Direct usage is also possible (see the example below).
Example:
```python
from llutil.launcher.launcher import ElasticLauncher

def worker(launcher):
    # Each spawned process sees its own rank and assigned device.
    print("rank", launcher.rank)
    print("local_rank", launcher.local_rank)
    print("device", launcher.assigned_device)

if __name__ == "__main__":
    # Spawn one worker process per matching CUDA device.
    launcher = ElasticLauncher("cuda:*").freeze()
    launcher(worker, launcher)
```
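
The module listing above also names get_current_launcher. Assuming that function returns the ElasticLauncher governing the calling process, and that the launcher simply forwards any extra positional arguments to the worker, a worker could look the launcher up itself instead of receiving it as an argument. The following is a minimal sketch under those assumptions, not documented behavior:

```python
from llutil.launcher.launcher import ElasticLauncher, get_current_launcher

def worker():
    # Assumption: get_current_launcher() returns the launcher that
    # spawned this process, so it need not be passed in explicitly.
    launcher = get_current_launcher()
    print("rank", launcher.rank, "device", launcher.assigned_device)

if __name__ == "__main__":
    launcher = ElasticLauncher("cuda:*").freeze()
    launcher(worker)  # assumption: no extra worker arguments are required
```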