Databricks Runtime Versioning and Support Lifecycle

Azure Databricks offers two types of runtime for the clusters that you create:

  • Databricks Runtime is the set of core components that run on the clusters managed by Azure Databricks. It includes Apache Spark but also adds a number of components and updates that substantially improve the usability, performance, and security of big data analytics.
  • Databricks Light (also known as Data Engineering Light) is the Azure Databricks packaging of the open source Apache Spark runtime. It excludes many of the components that Databricks Runtime adds to open source Spark, providing a leaner runtime option for jobs that don’t need the advanced capabilities of Databricks Runtime.

You can choose from among many supported runtime versions when you create a cluster.

[Image: runtime-version.png]

Runtime components

Databricks Runtime consists of the following components:

  • Apache Spark: each runtime version contains a specific Apache Spark version.
  • Delta Lake: a next-generation storage layer built on top of Apache Spark that provides ACID transactions, optimized layouts and indexes, and execution engine improvements for building data pipelines.
  • Databricks Serverless: a layer on top of Apache Spark that provides fine-grained resource sharing to optimize cloud costs.
  • Ubuntu and its accompanying system libraries.
  • Pre-installed Java, Scala, Python, and R libraries.
  • GPU libraries for GPU-enabled clusters.
  • Databricks services that integrate with other components of the platform, such as notebooks, jobs, and the cluster manager.

Databricks Light is the Azure Databricks packaging of the open source Apache Spark runtime. It excludes many of the libraries and services listed above. For details, see Overview of Databricks Light.

The Databricks Runtime Release Notes list the library versions included in each runtime version.

Versioning

New versions of Databricks Runtime are released on a regular basis.

  • Major Releases are represented by an increment to the version number that precedes the decimal point (the jump from 3.5 to 4.0, for example). They are released when there are major changes, some of which may not be backwards-compatible.
  • Feature Releases are represented by an increment to the version number that follows the decimal point (the jump from 3.4 to 3.5, for example). Each major release includes multiple feature releases. Feature releases are always backwards compatible with previous releases within their major release.
  • Long Term Support releases are represented by an “-LTS” suffix (for example, 3.5-LTS). For some major releases, we declare a “canonical” feature version, for which we provide two full years of support. See Support lifecycle for Databricks Runtime versions for more information.

Databricks Light versions are released after a new open-source Apache Spark feature (x.y) version is released and are named after the Apache Spark release (for example, Databricks Light 2.4 is based on Apache Spark 2.4). For more information, see Support lifecycle for Databricks Light versions.
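The naming scheme above can be made concrete with a small parsing sketch. This is an illustration, not an official API: the function name and the tuple it returns are mine, and it assumes display versions always look like "M.F" with an optional "-LTS" suffix as described above.

```python
import re

def parse_runtime_version(version: str):
    """Split a Databricks Runtime display version such as "5.4" or
    "3.5-LTS" into (major, feature, is_lts).

    Assumption: versions follow the M.F[-LTS] scheme described in the
    Versioning section; anything else raises ValueError."""
    m = re.fullmatch(r"(\d+)\.(\d+)(-LTS)?", version)
    if m is None:
        raise ValueError(f"unrecognized version string: {version}")
    major, feature, lts = m.groups()
    return int(major), int(feature), lts is not None

# Feature releases sort naturally within and across major releases:
versions = ["3.4", "5.2", "3.5-LTS", "4.0"]
ordered = sorted(versions, key=lambda v: parse_runtime_version(v)[:2])
# ordered == ["3.4", "3.5-LTS", "4.0", "5.2"]
```

Sorting on the (major, feature) pair rather than on the raw string avoids the usual lexicographic trap where "10.0" would sort before "3.5".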

Support lifecycle for Databricks Runtime versions

Each Databricks Runtime version moves through the following support phases:

  • Beta: Support SLAs are not applicable. For more information, see Databricks Runtime releases.
  • Full Support: Major stability and security fixes are backported. Full support for Databricks Runtime versions lasts for six months, with the exception of Long Term Support (LTS) versions, which Databricks supports for two years.
  • End of support (EOS): The version is unsupported: workloads running on these versions receive no Databricks support, and Databricks does not backport fixes. The end-of-support date is announced and published on this page at release time.
  • End of life (EOL): Databricks reserves the right to completely remove a release version from the API at any time after support ends, without prior notice.

Support lifecycle for Databricks Light versions

Databricks Light versions move through the same phases, on a shorter timeline:

  • Beta: Support SLAs are not applicable. For more information, see Databricks Runtime releases.
  • Full Support: Major stability and security fixes are backported. Full support lasts until the earlier of (a) 12 months after release or (b) two months after the next Databricks Light minor release.
  • End of support (EOS): The version is unsupported: workloads running on these versions receive no Databricks support, and Databricks does not backport fixes. The end-of-support date is the earlier of (a) 12 months after release or (b) two months after the next Databricks Light minor release.
  • End of life (EOL): Databricks reserves the right to completely remove a release version from the API at any time after support ends, without prior notice.
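The "earlier of" rule for Databricks Light can be sketched as a short date calculation. This is an unofficial illustration of the policy stated above; the function names and the month-arithmetic helper are mine.

```python
import calendar
from datetime import date
from typing import Optional

def add_months(d: date, months: int) -> date:
    """Shift a date by whole months, clamping the day to the target
    month's length (e.g. Jan 31 + 1 month -> Feb 28)."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def light_end_of_support(release: date,
                         next_minor_release: Optional[date]) -> date:
    """End of support is the earlier of (a) 12 months after release or
    (b) two months after the next Databricks Light minor release."""
    eos = add_months(release, 12)
    if next_minor_release is not None:
        eos = min(eos, add_months(next_minor_release, 2))
    return eos

# Databricks Light 2.4 was released Feb 27, 2019; with no successor yet,
# the provisional end-of-support date is 12 months out:
print(light_end_of_support(date(2019, 2, 27), None))  # 2020-02-27
```

If a Databricks Light 2.5 shipped before the 12-month mark, the two-months-after-successor clause would win and pull the date forward.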

List of releases

Current Databricks Runtime releases

Version Spark Version Release Date End-of-Support Announcement End-of-Support Date
5.5 Spark 2.4 Jul 10, 2019 Jul 10, 2019 Jan 10, 2020
5.4 Spark 2.4 Jun 03, 2019 Jun 03, 2019 Dec 03, 2019
5.3 Spark 2.4 Apr 03, 2019 Aug 03, 2019 Dec 03, 2019
5.2 Spark 2.4 Jan 24, 2019 May 27, 2019 Sep 30, 2019
5.1 Spark 2.4 Dec 18, 2018 Apr 18, 2019 Aug 19, 2019
3.5-LTS Spark 2.2 Dec 21, 2017 Jan 02, 2019 Jan 02, 2020

Current Databricks Light releases

Version Spark Version Release Date End-of-Support Announcement End-of-Support Date
2.4 Spark 2.4 Feb 27, 2019

Unsupported Databricks Runtime releases

Version Spark Version Release Date End-of-Support Announcement End-of-Support Date
4.3 Spark 2.3 Aug 09, 2018 Dec 09, 2018 Apr 09, 2019
4.2 Spark 2.3 Jul 05, 2018 Nov 05, 2018 Mar 05, 2019
4.1 Spark 2.3 May 17, 2018 Sep 17, 2018 Jan 17, 2019
4.0 Spark 2.3 Mar 01, 2018 Jul 01, 2018 Nov 01, 2018
3.4 Spark 2.2 Nov 20, 2017 Mar 31, 2018 Jul 31, 2018

REST API version string

The structure of a Databricks runtime version string in the REST API is:

Databricks Runtime:

<M>.<F>.x[-cpu][-gpu][-ml][-hls][-conda]-scala<scala-version>

where

  • M - Databricks Runtime major release
  • F - Databricks Runtime feature release
  • cpu - CPU version (with -ml only)
  • gpu - GPU-enabled
  • ml - Machine learning
  • hls - Health and life sciences
  • conda - with Conda
  • scala-version - version of Scala used to compile Spark: 2.10 or 2.11

For example, 3.5.x-scala2.10 and 4.1.x-gpu-scala2.11. The List of releases tables map Databricks Runtime versions to the Spark version contained in the Runtime.
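A regex transcription of this grammar can make the fields explicit. This is a sketch based only on the format described above; the field names (`major`, `feature`, `variants`, `scala`) are mine, not part of any official API schema.

```python
import re

# Grammar: <M>.<F>.x[-cpu][-gpu][-ml][-hls][-conda]-scala<scala-version>
RUNTIME_RE = re.compile(
    r"^(?P<major>\d+)\.(?P<feature>\d+)\.x"
    r"(?P<variants>(?:-(?:cpu|gpu|ml|hls|conda))*)"
    r"-scala(?P<scala>\d+\.\d+)$"
)

def parse_rest_version(s: str) -> dict:
    """Break a Databricks Runtime REST API version string into its parts."""
    m = RUNTIME_RE.match(s)
    if m is None:
        raise ValueError(f"not a Databricks Runtime version string: {s}")
    d = m.groupdict()
    # Turn "-gpu" / "-gpu-ml" into a list of variant flags.
    d["variants"] = [v for v in d["variants"].split("-") if v]
    return d

print(parse_rest_version("4.1.x-gpu-scala2.11"))
# {'major': '4', 'feature': '1', 'variants': ['gpu'], 'scala': '2.11'}
```

A plain string like `3.5.x-scala2.10` parses with an empty variant list.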

Databricks Light:

apache-spark-<M>.<F>.x-scala<scala-version>

where

  • M - Apache Spark major release
  • F - Apache Spark feature release
  • scala-version - version of Scala used to compile Spark: 2.10 or 2.11

For example, apache-spark-2.4.x-scala2.11.
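The Databricks Light format can be checked the same way. Again a sketch: the pattern mirrors the grammar above (note the hyphen after `apache-spark`, matching the example), and the group names are mine.

```python
import re

# Grammar: apache-spark-<M>.<F>.x-scala<scala-version>
LIGHT_RE = re.compile(
    r"^apache-spark-(?P<major>\d+)\.(?P<feature>\d+)\.x"
    r"-scala(?P<scala>\d+\.\d+)$"
)

m = LIGHT_RE.match("apache-spark-2.4.x-scala2.11")
assert m is not None
assert m.group("major", "feature", "scala") == ("2", "4", "2.11")
```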