module llutil.launcher.launcher

Global Variables

  • TYPE_CHECKING

function get_current_launcher

get_current_launcher() → 'ElasticLauncher'

class ElasticLauncher

A helper Configurable class for torchrun and torch.distributed.launch.

PyTorch's elastic launch capability is embedded in this Configurable; for details, see the torch.distributed.elastic documentation.

HyperGraph.run() uses this class to launch multiple processes. Direct usage is also possible (see the example below).

Example:

def worker(launcher):
    print("rank", launcher.rank)
    print("local_rank", launcher.local_rank)
    print("device", launcher.assigned_device)


if __name__ == "__main__":
    launcher = ElasticLauncher("cuda:*").freeze()
    launcher(worker, launcher)

method __init__

__init__(*args, **kwds) → None

property assigned_device


property devices


property dist_backend


property group_rank


property group_world_size


property local_rank


property local_world_size


property master_addr


property master_port


property max_restarts


property rank


property rdzv_id


property restart_count


property role_name


property role_rank


property role_world_size


property world_size
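
The rank-related properties above follow torchrun's conventions. As a minimal sketch (plain Python arithmetic mirroring those conventions, not the ElasticLauncher API, and assuming a homogeneous setup with the same number of workers on every node), they relate as follows:

```python
def global_rank(group_rank: int, local_world_size: int, local_rank: int) -> int:
    """Under torchrun conventions: rank = group_rank * local_world_size + local_rank.

    group_rank is the node's index, local_rank is the worker's index on that node.
    """
    return group_rank * local_world_size + local_rank


def total_world_size(group_world_size: int, local_world_size: int) -> int:
    """world_size = group_world_size * local_world_size (homogeneous nodes only)."""
    return group_world_size * local_world_size


# Example: 2 nodes with 4 workers each.
assert total_world_size(2, 4) == 8
assert global_rank(1, 4, 2) == 6  # third worker on the second node
```

For heterogeneous jobs the per-node worker counts differ, so `world_size` is the sum over nodes rather than this product.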