Run:ai Documentation Library

Welcome to the Run:ai documentation area. For an introduction to the Run:ai platform, see Run:ai platform on the run.ai website.

The Run:ai documentation targets three personas:

  • Run:ai Administrator - Responsible for the setup and day-to-day administration of the product. Administrator documentation can be found here.

  • Researcher - Using Run:ai to submit Jobs. Researcher documentation can be found here.

  • Developer - Using various APIs to manipulate Jobs and integrate with other systems. Developer documentation can be found here.

How to get support

To get support, use one of the following channels:

  • Write to support@run.ai.

  • On the navigation bar of the Run:ai user interface at <company-name>.run.ai, use the 'Support' button.

  • Submit a ticket by clicking the button below:

Submit a Ticket

Community

Run:ai provides its customers with access to the Run:ai Customer Community portal to submit tickets, track ticket progress, and update support cases.

Customer Community Portal

Reach out to support@run.ai for credentials.

Run:ai Cloud Status Page

Run:ai cloud availability is monitored at status.run.ai.

Collect Logs to Send to Support

As an IT Administrator, you can collect Run:ai logs to send to support:

  • Install the Run:ai Administrator command-line interface.
  • Use one of the following two options:
    1. One-time collection: Run runai-adm collect-logs. The command generates a compressed file containing all existing Run:ai log files.
    2. Continuous send: Run runai-adm -d <HOURS_DURATION>. The command sends Run:ai logs directly to Run:ai support for the stated duration. The data sent does not include current logs; only logs created going forward are sent.
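In practice, the two options above amount to the following commands (the 24-hour duration is only an illustrative value; check your installed CLI for the exact flags it accepts):

```shell
# One-time collection: writes a compressed file containing all existing Run:ai logs
runai-adm collect-logs

# Continuous send: forwards logs created from now on to Run:ai support for 24 hours
runai-adm -d 24
```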

Note

Both options include logs of Run:ai components. They do not include logs of researcher containers that may contain private information.

Example Code

Code for the Docker images referred to on this site is available at https://github.com/run-ai/docs/tree/master/quickstart.

The following images are used throughout the documentation:

  • gcr.io/run-ai-demo/quickstart - Basic training image with multi-GPU support. Source: https://github.com/run-ai/docs/tree/master/quickstart/main
  • gcr.io/run-ai-demo/quickstart-distributed - Distributed training using MPI and Horovod. Source: https://github.com/run-ai/docs/tree/master/quickstart/distributed
  • zembutsu/docker-sample-nginx - Build (interactive) with connected ports. Source: https://github.com/zembutsu/docker-sample-nginx
  • gcr.io/run-ai-demo/quickstart-hpo - Hyperparameter optimization. Source: https://github.com/run-ai/docs/tree/master/quickstart/hpo
  • gcr.io/run-ai-demo/quickstart-x-forwarding - X11 forwarding from a Docker image. Source: https://github.com/run-ai/docs/tree/master/quickstart/x-forwarding
  • gcr.io/run-ai-demo/pycharm-demo - Tool integration (PyCharm and VSCode). Source: https://github.com/run-ai/docs/tree/master/quickstart/python%2Bssh
  • gcr.io/run-ai-demo/example-triton-client and gcr.io/run-ai-demo/example-triton-server - Basic inference. Source: https://github.com/run-ai/models/tree/main/models/triton

Last update: 2022-11-15
Created: 2020-07-16