
runai port-forward


Forward one or more local ports to the selected job or a pod within the job. The forwarding session ends when the selected job terminates or the terminal is interrupted.


  1. Forward connections from localhost:8080 (localhost is the default) to the job on port 8090.

    runai port-forward <job-name> --port 8080:8090

  2. Forward connections from a specific local address to the job on port 8080.

    runai port-forward <job-name> --port 8080 --address <local-interface-ip>

  3. Forward multiple connections, from localhost:8080 to the job on port 8090 and from localhost:6443 to the job on port 443.

    runai port-forward <job-name> --port 8080:8090 --port 6443:443

  4. Forward ports to a specific pod in a multi-pod job.

    runai port-forward <job-name> --port 8080:8090 --pod <pod-name>

Global flags

--loglevel <string>—Set the logging level (default "info").

-p | --project <string>—Specify the project name. To change the default project use runai config project <project name>.


--address <string> | [local-interface-ip\host] | localhost | [privileged]—The listening address of your local machine (default "localhost").

-h | --help—Help for the command.

--port—Forward ports based on one of the following argument formats:

  • <stringArray>—a list of port forwarding combinations.

  • [local-port]:[remote-port]—different local and remote ports.

  • [local-port=remote-port]—the same port is used for both local and remote.
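Putting the two formats together, a single session can mix them. This is an illustrative sketch, assuming a job named my-job and taking the `=` form literally from the list above:

```shell
# Different local and remote ports (local 8080 -> remote 8090),
# plus the same port on both ends (9090), in one forwarding session.
runai port-forward my-job --port 8080:8090 --port 9090=9090
```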

--pod—Specify a pod of a running job. To get a list of the pods of a specific job, run the command runai describe <job-name>.

--pod-running-timeout—The length of time (for example 5s, 2m, or 3h; must be greater than zero) to wait until the pod is running. Default is 10 minutes.
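As a hypothetical invocation combining the two flags above (the job and pod names are illustrative, not taken from the reference):

```shell
# Forward localhost:8080 to port 8090 of one specific pod,
# waiting up to 2 minutes for that pod to reach the Running state.
runai port-forward my-job --port 8080:8090 --pod my-job-0-0 --pod-running-timeout 2m
```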

Filter based flags

--mpi—Search only for MPI jobs.

--interactive—Search only for interactive jobs.

--pytorch—Search only for PyTorch jobs.

--tf—Search only for TensorFlow jobs.

--train—Search only for training jobs.
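A filter flag narrows how `<job-name>` is resolved before forwarding begins. For instance (the job name here is illustrative):

```shell
# Resolve the name only among interactive jobs, then forward
# localhost:8888 to the job on port 8888.
runai port-forward my-notebook --port 8888:8888 --interactive
```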

Last update: 2023-04-27
Created: 2023-04-13