Researcher Library: Extended Reporting on Workload Progress

The Run:AI Researcher Library is a Python library you can add to your deep-learning Python code. The reporting module in the library externalizes information about the run, making it available to users of the Run:AI user interface (https://app.run.ai).

With the reporter module, you can externalize information such as progress, accuracy, and loss over time or epochs. In addition, you can externalize custom metrics of your choosing.

Sending Metrics

Python Deep-Learning Code

In your command line run:

pip install runai

In your python code add:

import runai.reporter

To send a number-based metric report, write:

runai.reporter.reportMetric(<reporter_metric_name>, <reporter_metric_value>)

For example,

runai.reporter.reportMetric("accuracy", 0.34)

To send a text-based metric report, write:

runai.reporter.reportParameter(<reporter_param_name>, <reporter_param_value>)

For example,

runai.reporter.reportParameter("state", "Training Model")
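The two calls above can be put together in a short end-to-end sketch. The fallback stub in the `except` branch is an assumption for environments where `runai` is not installed; with the real library installed, the `try` branch is used and the reports are sent to Run:AI.

```python
# Send one text parameter and one numeric metric.
# If the runai package is not installed, fall back to a local stand-in
# (an illustration-only stub, not part of the real library).
try:
    import runai.reporter as reporter
except ImportError:
    class reporter:
        @staticmethod
        def reportMetric(name, value):
            print(f"metric    {name} = {value}")

        @staticmethod
        def reportParameter(name, value):
            print(f"parameter {name} = {value}")

reporter.reportParameter("state", "Training Model")
reporter.reportMetric("accuracy", 0.34)
```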

For the sake of uniformity with the Keras implementation (see below), we recommend sending the following metrics:

Metric            Type     Frequency of Send  Description
accuracy          numeric  Each step          Current accuracy of the run
loss              numeric  Each step          Current value of the run's loss function
learning_rate     numeric  Once               Learning rate defined for the run
step              numeric  Each step          Current step of the run
number_of_layers  numeric  Once               Number of layers defined for the run
optimizer_name    text     Once               Name of the deep-learning optimizer
batch_size        numeric  Once               Batch size
epoch             numeric  Each epoch         Current epoch number
overall_epochs    numeric  Once               Total number of epochs

epoch and overall_epochs are especially important, since the job progress bar in the UI is computed by dividing epoch by overall_epochs.
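The table above can be sketched as a minimal training loop that sends the once-per-run values first and the per-step and per-epoch values inside the loop. The epoch and step counts, metric values, and optimizer name below are hypothetical; the reporting calls are passed in as plain callables so the sketch runs without the library, but with `runai` installed you would pass `runai.reporter.reportMetric` and `runai.reporter.reportParameter` instead.

```python
import random

def train(report_metric, report_parameter, overall_epochs=3, steps_per_epoch=5):
    # Once-per-run values (hypothetical numbers, for illustration only).
    report_parameter("optimizer_name", "SGD")
    report_metric("learning_rate", 0.01)
    report_metric("number_of_layers", 4)
    report_metric("batch_size", 32)
    report_metric("overall_epochs", overall_epochs)

    step = 0
    for epoch in range(1, overall_epochs + 1):
        # Progress bar in the UI = epoch / overall_epochs.
        report_metric("epoch", epoch)
        for _ in range(steps_per_epoch):
            step += 1
            report_metric("step", step)
            report_metric("loss", random.random())      # placeholder values
            report_metric("accuracy", random.random())  # placeholder values

# For demonstration, collect the reports locally instead of sending them.
sent = []
train(lambda n, v: sent.append((n, v)), lambda n, v: sent.append((n, v)))
```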

Automatic Sending of Metrics for Keras-Based Scripts

For Keras-based deep-learning runs, the library can automate the task of sending metrics. Install the library as above and import runai.reporter in your code. Then write:

runai.reporter.autolog()

The metrics listed above will then be sent automatically.

Adding the Metrics to the User Interface

The metrics appear in the Job list of the user interface. To add a metric to the UI:

  • Integrate the reporter library into your code.
  • Send metrics via the reporter library.
  • Run the workload once to send initial data.
  • Go to the Jobs list: https://app.run.ai/jobs
  • At the top right, use the settings wheel and select the metrics you have added.


Last update: August 3, 2020