
Run:ai Documentation Library

Welcome to the Run:ai documentation area. For an introduction to the Run:ai platform, see the Run:ai platform page on the run.ai website.

The Run:ai documentation targets three personas:

  • Run:ai Administrator - Responsible for the setup and day-to-day administration of the product. Administrator documentation can be found here.

  • Researcher - Uses Run:ai to submit Jobs. Researcher documentation can be found here.

  • Developer - Uses various APIs to manipulate Jobs and integrate with other systems. Developer documentation can be found here.

How to get support

To get support, use one of the following channels:

  • On the Run:ai user interface at <company-name>.run.ai, use the 'Contact Support' link on the top right.

  • Or submit a ticket by clicking the button below:

Submit a Ticket

Community

Run:ai provides its customers with access to the Run:ai Customer Community portal, where they can submit tickets, track ticket progress, and update support cases.

Customer Community Portal

Reach out to customer support for credentials.

Run:ai Cloud Status Page

Run:ai cloud availability is monitored at status.run.ai.

Collect Logs to Send to Support

As an IT Administrator, you can collect Run:ai logs and send them to support.

Note

The tar file packages the logs of Run:ai components only. It does not include logs of researcher containers, which may contain private information.
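
If you manage the cluster with kubectl, the following is a minimal sketch of one way to gather component logs into a tar file. It assumes the Run:ai components run in a Kubernetes namespace named runai and that kubectl is already configured against the cluster; the exact procedure recommended by support may differ, so treat this only as an illustration.

```python
#!/usr/bin/env python3
"""Sketch: collect pod logs from the Run:ai namespace into a tar archive.

Assumes kubectl is installed and configured, and that Run:ai components
run in the 'runai' namespace (adjust NAMESPACE if your install differs).
"""
import subprocess
import tarfile
import tempfile
from pathlib import Path

NAMESPACE = "runai"            # assumption: default Run:ai namespace
ARCHIVE = "runai-logs.tar.gz"  # archive to attach to the support ticket


def kubectl(*args: str) -> str:
    """Run a kubectl command and return its stdout."""
    return subprocess.run(
        ["kubectl", *args], check=True, capture_output=True, text=True
    ).stdout


def main() -> None:
    # List pods in the namespace, e.g. "pod/runai-scheduler-...".
    pods = kubectl("get", "pods", "-n", NAMESPACE, "-o", "name").split()

    with tempfile.TemporaryDirectory() as tmp:
        for pod in pods:
            name = pod.split("/", 1)[1]
            # Collect logs from all containers in the pod.
            logs = kubectl("logs", "-n", NAMESPACE, pod, "--all-containers=true")
            (Path(tmp) / f"{name}.log").write_text(logs)

        # Package everything into a single tar.gz.
        with tarfile.open(ARCHIVE, "w:gz") as tar:
            tar.add(tmp, arcname="runai-logs")

    print(f"Wrote {ARCHIVE} with logs from {len(pods)} pods")


if __name__ == "__main__":
    main()
```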

Example Code

Code for the Docker images referred to on this site is available at https://github.com/run-ai/docs/tree/master/quickstart.

The following images are used throughout the documentation:

  • gcr.io/run-ai-demo/quickstart - Basic training image with multi-GPU support. Source: https://github.com/run-ai/docs/tree/master/quickstart/main

  • gcr.io/run-ai-demo/quickstart-distributed - Distributed training using MPI and Horovod. Source: https://github.com/run-ai/docs/tree/master/quickstart/distributed

  • zembutsu/docker-sample-nginx - Build (interactive) with Connected Ports. Source: https://github.com/zembutsu/docker-sample-nginx

  • gcr.io/run-ai-demo/quickstart-hpo - Hyperparameter Optimization. Source: https://github.com/run-ai/docs/tree/master/quickstart/hpo

  • gcr.io/run-ai-demo/quickstart-x-forwarding - X11 forwarding from a Docker image. Source: https://github.com/run-ai/docs/tree/master/quickstart/x-forwarding

  • gcr.io/run-ai-demo/pycharm-demo - Tool integration (PyCharm and VSCode). Source: https://github.com/run-ai/docs/tree/master/quickstart/python%2Bssh

  • gcr.io/run-ai-demo/example-triton-client and gcr.io/run-ai-demo/example-triton-server - Basic inference. Source: https://github.com/run-ai/models/tree/main/models/triton
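
As an illustration only, the short sketch below pulls the basic training image from the list above with the Docker SDK for Python and prints some of its metadata. It assumes the docker package is installed, a local Docker daemon is running, and the image is publicly pullable with a latest tag; this is just a way to inspect an image locally, not how training jobs are submitted through Run:ai.

```python
"""Sketch: pull one of the documented quickstart images locally for inspection.

Assumes the 'docker' Python package (pip install docker) and a running
Docker daemon. Training jobs themselves are submitted through Run:ai,
not run locally like this.
"""
import docker

IMAGE = "gcr.io/run-ai-demo/quickstart"  # basic training image from the list

client = docker.from_env()
# Assumption: the image exposes a 'latest' tag and is publicly pullable.
image = client.images.pull(IMAGE, tag="latest")

print(f"Pulled: {', '.join(image.tags)}")
print(f"Size:   {image.attrs['Size'] / 1e6:.1f} MB")
```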

Contributing to the documentation

This documentation is improved by contributions from our customer and partner community. If you see something worth fixing, please comment at the bottom of the page or create a pull request via GitHub. The public GitHub repository can be found on the top-right of this page.