@@ -8,66 +8,85 @@ The Dask-MPI project makes it easy to deploy Dask from within an existing MPI
 environment, such as one created with the common MPI command-line launchers
 ``mpirun`` or ``mpiexec``. Such environments are commonly found in high performance
 supercomputers, academic research institutions, and other clusters where MPI
-has already been installed. Dask-MPI provides a convenient interface for
-launching your cluster either from within a batch script or directly from the
-command-line.
+has already been installed.

-Example:
---------
+Dask-MPI provides two convenient interfaces to launch Dask, either from within
+a batch script or directly from the command-line.

-You can launch a Dask cluster directly from the command-line using the ``dask-mpi``
-command and specifying a scheduler JSON file.
+Batch Script Example
+--------------------
+
+You can turn your batch Python script into an MPI executable
+with the ``dask_mpi.initialize`` function.
+
+.. code-block:: python
+
+    from dask_mpi import initialize
+    initialize()
+
+    from dask.distributed import Client
+    client = Client()  # Connect this local process to remote workers
+
+This makes your Python script launchable directly with ``mpirun`` or ``mpiexec``.

 .. code-block:: bash

-    mpirun -np 4 dask-mpi --scheduler-file /path/to/scheduler.json
+    mpirun -np 4 python my_client_script.py

-You can then access this cluster from a batch script or an interactive session
-(such as a Jupyter Notebook) by referencing the scheduler file.
+This deploys the Dask scheduler and workers as well as the user's Client
+process within a single cohesive MPI computation.
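+
+As a sketch of how this fits together (assuming Dask-MPI's usual rank layout of
+the scheduler on rank 0, the client script on rank 1, and workers on the remaining
+ranks), a hypothetical ``my_client_script.py`` might look like this:
+
+.. code-block:: python
+
+    # Hypothetical my_client_script.py
+    from dask_mpi import initialize
+    initialize()  # rank 0 becomes the scheduler, ranks 2+ become workers
+
+    import dask.array as da
+    from dask.distributed import Client
+
+    client = Client()  # rank 1 continues here and drives the computation
+
+    # A small example workload spread across the worker ranks
+    x = da.random.random((10_000, 10_000), chunks=(1_000, 1_000))
+    print(x.mean().compute())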

-.. code-block:: python

-    from dask.distributed import Client
-    client = Client(scheduler_file='/path/to/scheduler.json')
+Command Line Example
+--------------------

+Alternatively, you can launch a Dask cluster directly from the command-line
+using the ``dask-mpi`` command and specifying a scheduler file where Dask can
+write connection information.

-Example:
---------
+.. code-block:: bash

-Alternatively, you can turn your batch Python script into an MPI executable
-simply by using the ``initialize`` function.
+    mpirun -np 4 dask-mpi --scheduler-file ~/dask-scheduler.json

-.. code-block:: python
+You can then access this cluster either from a separate batch script or from an
+interactive session (such as a Jupyter Notebook) by referencing the same scheduler
+file that ``dask-mpi`` created.

-    from dask_mpi import initialize
-    initialize()
+.. code-block:: python

     from dask.distributed import Client
-    client = Client()  # Connect this local process to remote workers
+    client = Client(scheduler_file='~/dask-scheduler.json')

-which makes your Python script launchable directly with ``mpirun`` or ``mpiexec``.

-.. code-block:: bash
+Use Job Queuing System Directly
+-------------------------------
+
+You can also use `Dask Jobqueue <https://jobqueue.dask.org>`_ to deploy Dask
+directly on a job queuing system like SLURM, SGE, PBS, LSF, Torque, or others.
+This can be especially useful when you want to dynamically scale your cluster
+during your computation, or for interactive use.
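+
+As a rough sketch (assuming a SLURM system and purely illustrative resource
+values), a Dask Jobqueue deployment might look like this:
+
+.. code-block:: python
+
+    from dask_jobqueue import SLURMCluster
+    from dask.distributed import Client
+
+    # Illustrative resources; adjust for your queue and nodes
+    cluster = SLURMCluster(cores=8, memory="16GB", walltime="01:00:00")
+    cluster.scale(jobs=4)  # ask the queue for four worker jobs
+
+    client = Client(cluster)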

-    mpirun -np 4 python my_client_script.py

 .. toctree::
     :maxdepth: 1
+    :hidden:
     :caption: Getting Started

     install
-    interactive
     batch
+    interactive

 .. toctree::
     :maxdepth: 1
+    :hidden:
     :caption: Detailed use

     cli
     api

 .. toctree::
     :maxdepth: 1
+    :hidden:
     :caption: Help & Reference

     howitworks