
Integrations with Run:ai

Integration support

Support for third-party integrations varies: integrations marked Supported below work with Run:ai out of the box. For integrations marked Community Support, our Customer Success team has prior experience assisting customers with setup, and the NVIDIA Enterprise Support Portal may include additional reference documentation, provided on an as-is basis.

Integrations

| Tool | Category | Run:ai support details | Additional Information |
|------|----------|------------------------|------------------------|
| Triton | Orchestration | Supported | Usage via Docker base image. Quickstart inference example |
| Spark | Orchestration | Community Support | It is possible to schedule Spark workflows with the Run:ai Scheduler. Sample code: How to Run Spark job with Run:ai |
| Kubeflow Pipelines | Orchestration | Community Support | It is possible to schedule Kubeflow pipelines with the Run:ai Scheduler. Sample code: How to integrate Run:ai with Kubeflow |
| Apache Airflow | Orchestration | Community Support | It is possible to schedule Airflow workflows with the Run:ai Scheduler. Sample code: How to integrate Run:ai with Apache Airflow |
| Argo Workflows | Orchestration | Community Support | It is possible to schedule Argo workflows with the Run:ai Scheduler. Sample code: How to integrate Run:ai with Argo Workflows |
| SeldonX | Orchestration | Community Support | It is possible to schedule Seldon Core workloads with the Run:ai Scheduler. Sample code: How to integrate Run:ai with Seldon Core |
| Jupyter Notebook | Development | Supported | Run:ai provides integrated support with Jupyter Notebooks. Quickstart example |
| JupyterHub | Development | Community Support | It is possible to submit Run:ai workloads via JupyterHub. Sample code: How to connect JupyterHub with Run:ai |
| PyCharm | Development | Supported | Containers created by Run:ai can be accessed via PyCharm. PyCharm example |
| Visual Studio Code | Development | Supported | Containers created by Run:ai can be accessed via Visual Studio Code (example). Visual Studio Code Web can also be launched automatically from the Run:ai console (example) |
| Kubeflow notebooks | Development | Community Support | It is possible to schedule Kubeflow notebooks with the Run:ai Scheduler. Sample code: How to integrate Run:ai with Kubeflow |
| Ray | Training, inference, data processing | Community Support | It is possible to schedule Ray jobs with the Run:ai Scheduler. Sample code: How to integrate Run:ai with Ray |
| TensorBoard | Experiment tracking | Supported | Run:ai comes with a preset TensorBoard Environment asset. TensorBoard example. Additional sample |
| Weights & Biases | Experiment tracking | Community Support | It is possible to schedule W&B workloads with the Run:ai Scheduler. Sample code: How to integrate with Weights and Biases. Additional samples here |
| ClearML | Experiment tracking | Community Support | It is possible to schedule ClearML workloads with the Run:ai Scheduler. Sample code: How to integrate Run:ai with ClearML |
| MLflow | Model Serving | Community Support | It is possible to use MLflow together with the Run:ai Scheduler. Sample code: How to integrate Run:ai with MLflow. Additional MLflow sample |
| Hugging Face | Repositories | Supported | Run:ai provides an out-of-the-box integration with Hugging Face |
| Docker Registry | Repositories | Supported | Run:ai allows using a Docker registry as a Credentials asset |
| S3 | Storage | Supported | Run:ai communicates with S3 by defining it as a data source asset |
| GitHub | Storage | Supported | Run:ai communicates with GitHub by defining it as a data source asset |
| TensorFlow | Training | Supported | Run:ai provides out-of-the-box support for submitting TensorFlow workloads via the API or the user interface |
| PyTorch | Training | Supported | Run:ai provides out-of-the-box support for submitting PyTorch workloads via the API or the user interface (see the sketch after this table) |
| Kubeflow MPI | Training | Supported | Run:ai provides out-of-the-box support for submitting MPI workloads via the API or the user interface |
| XGBoost | Training | Supported | Run:ai provides out-of-the-box support for submitting XGBoost workloads via the API or the user interface |
| Karpenter | Cost Optimization | Supported | Run:ai provides out-of-the-box support for Karpenter to save cloud costs. Integration notes with Karpenter can be found here |
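
The Supported training rows above note submission via the API or the user interface; such workloads can also be submitted from a script. The sketch below wraps the runai CLI from Python, following the pattern of the public quickstarts. The workload name, project, and flag spellings here are assumptions and may differ between Run:ai CLI versions.

```python
import subprocess

# Minimal sketch: submit a one-GPU training workload by wrapping the
# "runai" CLI, following the public quickstart pattern. The workload
# name, project, and image are placeholders; verify flag spellings
# against your installed CLI version.
subprocess.run(
    [
        "runai", "submit", "train1",             # workload name (placeholder)
        "-p", "team-a",                          # Run:ai project (placeholder)
        "-i", "runai.jfrog.io/demo/quickstart",  # quickstart training image
        "-g", "1",                               # request one GPU
    ],
    check=True,  # raise if the CLI reports an error
)
```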

Kubernetes Workloads Integration

Kubernetes has several built-in resources that encapsulate running Pods. These are called Kubernetes Workloads and should not be confused with Run:ai Workloads.

Examples of such resources are a Deployment that manages a stateless application, or a Job that runs tasks to completion.

Run:ai natively runs Run:ai Workloads. A Run:ai Workload encapsulates all the resources needed to run, creates them, and deletes them together. However, Run:ai, being an open platform, also allows the scheduling of any Kubernetes Workload.
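
As a minimal sketch of scheduling a plain Kubernetes Workload with Run:ai, the Python snippet below creates a standard Kubernetes Job whose pods target the Run:ai scheduler. The scheduler name (runai-scheduler), the project label, and the runai-team-a namespace are assumptions based on common Run:ai setups; verify them against your cluster's configuration.

```python
from kubernetes import client, config

config.load_kube_config()

# A standard Kubernetes Job, not a Run:ai Workload; Run:ai only schedules it.
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="plain-k8s-job"),
    spec=client.V1JobSpec(
        backoff_limit=0,
        template=client.V1PodTemplateSpec(
            # Project label associating the pod with a Run:ai project (assumption)
            metadata=client.V1ObjectMeta(labels={"project": "team-a"}),
            spec=client.V1PodSpec(
                scheduler_name="runai-scheduler",  # hand the pod to the Run:ai scheduler (assumption)
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="main",
                        image="ubuntu:22.04",
                        command=["sleep", "60"],
                        resources=client.V1ResourceRequirements(
                            limits={"nvidia.com/gpu": "1"}  # request one GPU
                        ),
                    )
                ],
            ),
        ),
    ),
)

# Run:ai typically maps a project to a "runai-<project>" namespace (assumption).
client.BatchV1Api().create_namespaced_job(namespace="runai-team-a", body=job)
```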

For more information see Kubernetes Workloads Integration.