Authentication Overview

To access Run:ai resources, you have to authenticate. The purpose of this document is to explain how authentication works at Run:ai.

Authentication Endpoints

Generally speaking, there are two authentication endpoints:

  • The Run:ai control plane.
  • Run:ai GPU clusters.

Both endpoints are accessible via APIs as well as a user interface.

Identity Service

Run:ai includes an internal identity service. The identity service verifies that users are who they claim to be and grants them the right level of access to Run:ai.

Users

Out of the box, the Run:ai identity service provides a way to create users and associate them with access roles.

It is also possible to configure the Run:ai identity service to connect to a company directory using the SAML protocol. For more information see single sign-on.

Authentication Method

Both endpoints described above are protected by time-limited, OAuth2-style JWT authentication tokens.
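Since the tokens are standard JWTs, their claims, including the expiry that enforces the time limit, can be inspected locally. The sketch below builds a sample payload, encodes it the way a JWT does (base64url, no padding), and decodes it back. The claim values are illustrative only, not a real Run:ai token.

```shell
# A JWT is three dot-separated, base64url-encoded parts: header.payload.signature.
# Build a sample payload; the "exp" claim is the Unix time after which
# the token is rejected.
PAYLOAD_JSON='{"sub":"user@example.com","exp":1648000000}'
PAYLOAD=$(printf '%s' "$PAYLOAD_JSON" | base64 | tr -d '=' | tr '+/' '-_')
TOKEN="header.$PAYLOAD.signature"

# Decode it back: take the middle part, restore base64 padding, decode.
PART=$(printf '%s' "$TOKEN" | cut -d '.' -f 2)
while [ $(( ${#PART} % 4 )) -ne 0 ]; do PART="$PART="; done
printf '%s\n' "$PART" | tr '_-' '/+' | base64 -d
```

Note that decoding a JWT only reads its claims; verifying the signature requires the issuer's public key, which is what the identity service does on each request.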

There are two ways to obtain a token:

Authentication Flows

Run:ai control plane

You can use the Run:ai user interface to provide a user name and password. These are validated against the identity service, and Run:ai returns a token with the appropriate access rights for continued operation.

You can also use a client application to obtain a token and then connect directly to the administration API endpoint.
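As an illustration of the client-application flow, the sketch below requests a token and then uses it against the administration API. The endpoint paths (`/api/v1/token`, `/api/v1/users`), the request fields, and the `accessToken` response field are assumptions made for this sketch, not taken from this document; consult the Run:ai API reference for the exact names your version uses.

```shell
# Hypothetical endpoint and credentials -- replace with your tenant's values.
RUNAI_URL="https://app.run.ai"
APP_ID="my-app-id"
APP_SECRET="my-app-secret"

# Exchange application credentials for a time-limited token
# (assumed request/response shape).
RESPONSE=$(curl -s -X POST "$RUNAI_URL/api/v1/token" \
  -H "Content-Type: application/json" \
  -d "{\"grantType\":\"app_token\",\"AppId\":\"$APP_ID\",\"AppSecret\":\"$APP_SECRET\"}")

# Extract the token field from the JSON response.
TOKEN=$(printf '%s' "$RESPONSE" | sed -n 's/.*"accessToken" *: *"\([^"]*\)".*/\1/p')

# Use the token as a bearer credential against the administration API.
curl -s -H "Authorization: Bearer $TOKEN" "$RUNAI_URL/api/v1/users"
```

Because the token is time-limited, a long-running client must repeat the exchange when the token expires rather than caching it indefinitely.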

Run:ai GPU Clusters

The Run:ai GPU cluster is a Kubernetes cluster. All communication into Kubernetes flows through the Kubernetes API server.

To facilitate authentication via Run:ai, the Kubernetes API server must be configured to use the Run:ai identity service to validate authentication tokens. For more information on how to configure the Kubernetes API server, see the Kubernetes configuration section under researcher authentication.
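As a sketch of what that configuration involves, the kube-apiserver OIDC flags below point token validation at an external identity issuer. The issuer URL and client ID shown are placeholders assumed for illustration; the exact values come from your Run:ai tenant and the researcher-authentication guide.

```shell
# Excerpt from the kube-apiserver command line (e.g. in its static pod manifest).
kube-apiserver \
  --oidc-issuer-url=https://app.run.ai/auth/realms/<tenant> \
  --oidc-client-id=runai \
  --oidc-username-claim=email \
  --oidc-groups-claim=groups
```

Once the API server trusts the issuer, a researcher can present the Run:ai-issued token to Kubernetes directly, for example via `kubectl --token="$TOKEN" get pods` or the `token` field of a kubeconfig user entry.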


Last update: March 23, 2022