The following Databricks Runtime releases will be deprecated soon:
| Version | Spark Version | Release Date | Deprecation Date |
|---------|---------------|--------------|------------------|
| 4.0 | Spark 2.3 | Mar 01, 2018 | Nov 01, 2018 |
| 3.4 | Spark 2.2 | Nov 20, 2017 | Jul 30, 2018 |
For more information about the Databricks Runtime deprecation policy and schedule, see Databricks Runtime Versions.
Periodically, Azure Databricks releases maintenance updates to supported Databricks Runtime versions. For a complete list of maintenance updates for all supported versions, see Databricks Runtime Maintenance Updates.
Some Databricks cluster image versions are auto-updating. Auto-updating is not available in any cluster image or Databricks Runtime released after the Spark 2.1 (Auto-updating) cluster image.
- When should I use an auto-updating image?
- We encourage users to always upgrade to the latest version, because it often includes critical bug fixes. Use auto-updating images if you don’t want to track and upgrade versions manually.
- When should I not use an auto-updating image?
- Do not use auto-updating images if you use any of the following libraries.
These are Spark’s own internal libraries, and they use Spark internal APIs. Because Spark does not guarantee internal API compatibility even between maintenance releases, each of these libraries must be used with the exact Spark version it was built against.
Since these libraries are not included in Databricks cluster images, you usually must attach them yourself. An auto-updating image does not upgrade attached libraries when it automatically upgrades Spark, so the attached library version can become inconsistent with the Spark version, and you may see incompatibility errors such as NoSuchMethodError or ClassNotFoundException.
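To make the failure mode above concrete, here is a minimal sketch (the helper name and version strings are hypothetical, not a Databricks API): a library that uses Spark internal APIs is only safe when it was built against the exact running Spark version, so an automatic Spark upgrade silently breaks the match while the attached library stays put.

```python
def matches_spark_version(running_spark: str, library_built_against: str) -> bool:
    """A library using Spark internal APIs is only safe when it was built
    against the exact Spark version that is running; even a maintenance
    (patch) release may change internal APIs."""
    return running_spark == library_built_against

# Library attached while the image ran Spark 2.1.0: versions match, all is well.
assert matches_spark_version("2.1.0", "2.1.0")

# After the image auto-updates Spark to 2.1.1, the attached library is
# unchanged, so the versions diverge and errors like NoSuchMethodError
# or ClassNotFoundException can surface at runtime.
assert not matches_spark_version("2.1.1", "2.1.0")
```

The practical takeaway: if you attach such a library, pin your cluster to a fixed (non-auto-updating) runtime version and upgrade the library in lockstep with Spark.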