Secret Management Workflow Example

In this workflow example, we use the Secrets feature to set up JDBC credentials for connecting to an Azure Data Lake Store. All commands in this example use the Databricks CLI (version 0.7.1 and above) to manage secrets. Alternatively, you can manage secrets programmatically with the Secrets API. See Secrets API.

Set up the secret

We start by creating a secret scope called jdbc with secrets username and password to bootstrap the Spark JDBC data source.

First we create our scope:

databricks secrets create-scope --scope jdbc

Now we bootstrap our two secrets: username and password. We run the following commands, entering each secret value in the editor that opens.

databricks secrets put --scope jdbc --key username

databricks secrets put --scope jdbc --key password
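
To confirm that both keys exist in the scope, we can list them. The list command shows only key names and last-updated timestamps, never the secret values:

databricks secrets list --scope jdbc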

Use the secret in a notebook

In a notebook, we read the secrets that are stored in our secret scope jdbc to bootstrap our JDBC connector:

// Set this to the JDBC driver class for your data source,
// for example "com.microsoft.sqlserver.jdbc.SQLServerDriver".
val driverClass = ""
val connectionProperties = new java.util.Properties()
connectionProperties.setProperty("Driver", driverClass)

// Fetch the credentials from the secret scope at runtime.
val jdbcUsername = dbutils.secrets.get(scope = "jdbc", key = "username")
val jdbcPassword = dbutils.secrets.get(scope = "jdbc", key = "password")
connectionProperties.put("user", jdbcUsername)
connectionProperties.put("password", jdbcPassword)

We can now pass this connectionProperties object to the JDBC connector to talk to our data source. Note that the secret values fetched from the scope are never displayed in the notebook (see Notebook Secret Redaction).
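
As a sketch of that last step, the properties can be handed to Spark's JDBC reader. The URL, port, and table name below are placeholders, not values from this example; substitute your own endpoint:

// Hypothetical JDBC URL and table name for illustration only.
val jdbcUrl = "jdbc:sqlserver://<host>:1433;database=<database>"

// spark is the SparkSession provided in a Databricks notebook.
val df = spark.read.jdbc(jdbcUrl, "<table>", connectionProperties)
display(df)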

Grant access to another group

After verifying that the credentials were bootstrapped correctly, we want to share them with the datascience group for use in their analysis.

We grant the datascience group read-only permission to these credentials by making the following request:

databricks secrets put-acl --scope jdbc --principal datascience --permission READ
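
We can confirm the grant by listing the ACLs on the scope, which shows each principal and its permission level:

databricks secrets list-acls --scope jdbc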