Run:ai Documentation Library

Welcome to the Run:ai documentation area. For an introduction to the Run:ai platform, see Run:ai platform on the run.ai website.

The Run:ai documentation targets four personas:

  • Infrastructure Administrator - An IT person responsible for the installation, setup, and maintenance of the Run:ai product. Infrastructure Administrator documentation can be found here.

  • Platform Administrator - Responsible for the day-to-day administration of the product. Platform Administrator documentation can be found here.

  • Researcher - Using Run:ai to spin up notebooks, submit Workloads, prompt models, etc. Researcher documentation can be found here.

  • Developer - Using various APIs to automate work with Run:ai (a minimal example follows this list). Developer documentation can be found here.
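
The Developer APIs are documented separately; as a minimal sketch of what API automation can look like, the Python snippet below authenticates against the control plane and lists workloads. The base URL placeholder, endpoint paths, payload field names, and credential names are assumptions for illustration only; refer to the Developer documentation for the exact API reference for your Run:ai version.

```python
import requests

# Assumptions for illustration: replace <company-name>, the application
# credentials, and the endpoint paths/field names with the values given in
# the Run:ai Developer documentation for your control plane version.
BASE_URL = "https://<company-name>.run.ai"


def get_token(client_id: str, client_secret: str) -> str:
    """Exchange application credentials for a bearer token (assumed endpoint)."""
    resp = requests.post(
        f"{BASE_URL}/api/v1/token",
        json={
            "grantType": "app_token",
            "AppId": client_id,
            "AppSecret": client_secret,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["accessToken"]


def list_workloads(token: str) -> list:
    """List workloads visible to the authenticated application (assumed endpoint)."""
    resp = requests.get(
        f"{BASE_URL}/api/v1/workloads",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("workloads", [])


if __name__ == "__main__":
    token = get_token("my-app-id", "my-app-secret")
    for workload in list_workloads(token):
        print(workload.get("name"), workload.get("phase"))
```

The same pattern (obtain a token, then call REST endpoints with a bearer header) applies to any other automation task described in the Developer documentation.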

How to Get Support

To get support, use one of the following channels:

  • On the Run:ai user interface at <company-name>.run.ai, use the 'Contact Support' link on the top right.

  • Or submit a ticket by clicking the button below:

Submit a Ticket

Community

Run:ai provides its customers with access to the Run:ai Customer Community portal to submit tickets, track ticket progress and update support cases.

Customer Community Portal

Reach out to customer support for credentials.

Run:ai Cloud Status Page

Run:ai cloud availability is monitored at status.run.ai.

Collect Logs to Send to Support

As an IT Administrator, you can collect Run:ai logs to send to support. For more information, see logs collection.

Example Code

Code for the Docker images referred to on this site is available at https://github.com/run-ai/docs/tree/master/quickstart.

The following images are used throughout the documentation:

  • runai.jfrog.io/demo/quickstart - Basic training image with multi-GPU support. Source: https://github.com/run-ai/docs/tree/master/quickstart/main

  • runai.jfrog.io/demo/quickstart-distributed - Distributed training using MPI and Horovod. Source: https://github.com/run-ai/docs/tree/master/quickstart/distributed

  • zembutsu/docker-sample-nginx - Build (interactive) workload with connected ports. Source: https://github.com/zembutsu/docker-sample-nginx

  • runai.jfrog.io/demo/quickstart-x-forwarding - X11 forwarding from a Docker image. Source: https://github.com/run-ai/docs/tree/master/quickstart/x-forwarding

  • runai.jfrog.io/demo/pycharm-demo - Image used for tool integration (PyCharm and VSCode). Source: https://github.com/run-ai/docs/tree/master/quickstart/python%2Bssh

  • runai.jfrog.io/demo/example-triton-client and runai.jfrog.io/demo/example-triton-server - Basic inference. Source: https://github.com/run-ai/models/tree/main/models/triton
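
If you want to look at one of these images locally before referencing it in a workload, the sketch below pulls the basic training image with the Docker SDK for Python. This is an illustration only: it assumes Docker is running locally, the docker Python package is installed, and the demo registry allows anonymous pulls.

```python
import docker  # Docker SDK for Python: pip install docker

# Hypothetical local check of the basic training image. On a Run:ai cluster
# you would normally just reference the image name when submitting a workload
# rather than pulling it yourself.
IMAGE = "runai.jfrog.io/demo/quickstart"

client = docker.from_env()                    # requires a running local Docker daemon
image = client.images.pull(IMAGE, tag="latest")  # assumes anonymous pulls are allowed

print("Pulled:", image.tags)
print("Image ID:", image.id)
print("Size (bytes):", image.attrs.get("Size"))
```

On a Run:ai cluster, the image name is supplied when submitting the workload (through the UI, CLI, or API), so a local pull is never required; it is only a quick way to confirm the image is reachable from your environment.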

Contributing to the Documentation

This documentation is made better by individuals from our customer and partner community. If you see something worth fixing, please comment at the bottom of the page or open a pull request on GitHub; the public repository is linked at the top right of this page.