Databricks Runtime ML

Databricks Runtime ML provides a ready-to-go environment for machine learning and data science. It includes many popular machine learning libraries, such as TensorFlow, Keras, and XGBoost, and it supports distributed TensorFlow training with Horovod.

Databricks Runtime ML lets you start a Databricks cluster with all of the libraries required for distributed TensorFlow training. It ensures the compatibility of the libraries included on the cluster (between TensorFlow and CUDA / cuDNN, for example) and substantially decreases the cluster start-up time compared to using init scripts.
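For example, a distributed Keras training job on such a cluster typically follows the standard Horovod pattern. The sketch below shows only the per-worker training logic; the model, data, and hyperparameters are placeholders rather than anything specific to Databricks Runtime ML, and how the job is launched across the cluster depends on your setup.

```python
# Minimal Horovod + Keras training sketch (illustrative only; the model and
# data are placeholders -- adapt them to your own workload).
import horovod.keras as hvd
import keras
import numpy as np

# Initialize Horovod; each training process learns its rank and world size.
hvd.init()

# Placeholder data: 1,000 samples with 32 features and binary labels.
x_train = np.random.rand(1000, 32).astype("float32")
y_train = np.random.randint(0, 2, size=(1000,))

model = keras.models.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(32,)),
    keras.layers.Dense(1, activation="sigmoid"),
])

# Scale the learning rate by the number of workers and wrap the optimizer
# so gradients are averaged across workers.
opt = keras.optimizers.SGD(lr=0.01 * hvd.size())
opt = hvd.DistributedOptimizer(opt)

model.compile(loss="binary_crossentropy", optimizer=opt, metrics=["accuracy"])

# Broadcast initial variable states from rank 0 so all workers start in sync.
callbacks = [hvd.callbacks.BroadcastGlobalVariablesCallback(0)]

model.fit(
    x_train,
    y_train,
    batch_size=32,
    epochs=2,
    callbacks=callbacks,
    verbose=2 if hvd.rank() == 0 else 0,  # only rank 0 prints progress
)
```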

Note

The current version of Databricks Runtime ML is in beta.

Note

Databricks Runtime ML is available only in the Azure Databricks Premium Plan.

What’s in Databricks Runtime ML?

Databricks Runtime ML is built on Databricks Runtime. For example, Databricks Runtime 4.1 ML Beta is built on Databricks Runtime 4.1. The libraries included in the base Databricks Runtime are listed in the Databricks Runtime Release Notes.

Databricks Runtime ML includes additional libraries to support machine learning. See Databricks Runtime Release Notes for an up-to-date list of libraries for the current release.
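If you want to confirm which versions of these libraries are present on a running cluster, a quick check from a Python notebook cell is usually enough. This is a simple sketch; adjust the package list to the libraries you care about, since the exact set varies by runtime version.

```python
# Print the versions of a few ML libraries bundled with the runtime.
# Any package not present in your runtime version is reported as missing.
import importlib

for name in ["tensorflow", "keras", "xgboost", "horovod"]:
    try:
        module = importlib.import_module(name)
        print(name, getattr(module, "__version__", "version attribute not found"))
    except ImportError:
        print(name, "not installed")
```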

Create a cluster using Databricks Runtime ML

When you create a cluster, select a Databricks Runtime ML version from the Databricks Runtime Version drop-down. Both CPU and GPU-enabled ML runtimes are available.

[Image: mlruntime-dbr-dropdown.png — the Databricks Runtime Version drop-down with an ML runtime selected]

If you select a GPU-enabled ML runtime, you are prompted to select a compatible Driver Type and Worker Type. Incompatible instance types are grayed out in the drop-downs. GPU-enabled instance types are listed under the GPU-Accelerated label.
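You can also create the cluster programmatically instead of through the UI: the Databricks Clusters REST API accepts the ML runtime as the spark_version field. The sketch below is illustrative only; the runtime key, node type, workspace URL, and token are placeholders to replace with values valid for your workspace.

```python
# Illustrative sketch: create a cluster with an ML runtime via the
# Databricks Clusters API (2.0). The runtime key and node type shown here
# are examples and are not guaranteed to exist in every region or release.
import requests

DOMAIN = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

payload = {
    "cluster_name": "ml-runtime-example",
    "spark_version": "4.1.x-ml-scala2.11",   # example ML runtime key
    "node_type_id": "Standard_NC6",          # example GPU-enabled Azure node type
    "num_workers": 2,
}

response = requests.post(
    f"{DOMAIN}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
response.raise_for_status()
print(response.json())  # contains the new cluster_id on success
```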

Warning

Libraries in your workspace that automatically attach to all clusters can conflict with the libraries included in Databricks Runtime ML. Before you create a cluster with Databricks Runtime ML, clear the Attach automatically to all clusters checkbox for conflicting libraries.

License

By using this version of Databricks Runtime, you agree to the terms and conditions outlined in the NVIDIA End User License Agreement (EULA) with respect to the CUDA, cuDNN, and Tesla libraries, and the NVIDIA End User License Agreement (with NCCL Supplement) for the NCCL library.