
External access to Containers

Introduction

Researchers working with containers often need to access the container remotely. Some examples:

  • Using a Jupyter notebook that runs within the container
  • Using PyCharm to run Python commands remotely
  • Using TensorBoard to view machine learning visualizations

This requires exposing container ports. With Docker, Researchers expose ports by declaring them when starting the container; Run:ai provides similar syntax.
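For example, a Jupyter port can be published when the container is started. The sketch below is illustrative only: the image name and port numbers are placeholders, and the Run:ai flags shown are an assumption that may differ between CLI versions (check runai submit --help):

```shell
# Docker: publish container port 8888 (Jupyter's default) on local port 8888
docker run -p 8888:8888 jupyter/base-notebook

# Run:ai (assumed flags, may vary by CLI version): request the same exposure at submit time
runai submit jupyter-job -i jupyter/base-notebook --service-type=portforward --port 8888:8888
```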

Run:ai is based on Kubernetes. Kubernetes abstracts away the container's location, which makes exposing ports less straightforward. Kubernetes offers several options:

| Method | Description | Prerequisites |
|---|---|---|
| Port Forwarding | Simple port forwarding allows access to the container via a local and/or remote port. | None |
| NodePort | Exposes the service on each Node's IP at a static port (the NodePort). You can reach the NodePort service from outside the cluster by requesting <NODE-IP>:<NODE-PORT>, regardless of which node the container actually resides on. | None |
| LoadBalancer | Exposes the service externally using a cloud provider's load balancer. | Only available with cloud providers |

See https://kubernetes.io/docs/concepts/services-networking/service for further details on these options.
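To make the first two methods concrete, the following sketch assumes a running workload whose pod is named my-job in a namespace called runai-team-a and which listens on port 8888 (all three names are hypothetical):

```shell
# Port Forwarding: tunnel local port 8888 to the container's port 8888
kubectl port-forward -n runai-team-a pod/my-job 8888:8888

# NodePort: expose the pod through a Service of type NodePort; the cluster
# assigns a static port reachable at <NODE-IP>:<NODE-PORT> from outside
kubectl expose pod my-job -n runai-team-a --type=NodePort \
    --port=8888 --target-port=8888 --name=my-job-svc
kubectl get service my-job-svc -n runai-team-a   # shows the assigned NodePort

# LoadBalancer: same command with --type=LoadBalancer; the cloud provider
# then allocates an external IP for the Service
```

Note that these kubectl commands require cluster access and suitable permissions.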

