Secret Workflow Example

In this workflow example, we use secrets to set up JDBC credentials for connecting to an Azure Data Lake Store.

Create a secret scope

Create a secret scope called jdbc.

Follow the instructions in Create an Azure Key Vault-backed secret scope.
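If you plan to use a Databricks-backed scope instead, you can create it with the Databricks CLI, using the same legacy command style as the examples below:

databricks secrets create-scope --scope jdbc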

Create secrets

The method for creating the secrets depends on whether you are using an Azure Key Vault-backed scope or a Databricks-backed scope.

Create the secrets in an Azure Key Vault-backed scope

Add the secrets username and password using the Azure SetSecret REST API or Azure portal UI:

[Image: the username and password secrets listed in the Azure Key Vault portal UI]
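The portal UI and the SetSecret REST API are the documented paths; as an alternative sketch, the Azure CLI can set the same secrets (the vault name and values below are placeholders):

az keyvault secret set --vault-name <your-key-vault> --name username --value <jdbc-username>
az keyvault secret set --vault-name <your-key-vault> --name password --value <jdbc-password>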

Create the secrets in a Databricks-backed scope

Add the secrets username and password. Run the following commands, entering each secret value in the editor that opens.

databricks secrets put --scope jdbc --key username
databricks secrets put --scope jdbc --key password
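To confirm that both secrets were stored, list the keys in the scope. The CLI returns only key names and metadata, never secret values:

databricks secrets list --scope jdbc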

Use the secrets in a notebook

In a notebook, read the secrets that are stored in the secret scope jdbc to configure a JDBC connector:

// Configure the JDBC driver for the connection.
val driverClass = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
val connectionProperties = new java.util.Properties()
connectionProperties.setProperty("Driver", driverClass)

// Fetch the credentials from the jdbc secret scope. The returned values
// are redacted if displayed in the notebook.
val jdbcUsername = dbutils.secrets.get(scope = "jdbc", key = "username")
val jdbcPassword = dbutils.secrets.get(scope = "jdbc", key = "password")
connectionProperties.setProperty("user", jdbcUsername)
connectionProperties.setProperty("password", jdbcPassword)

You can now pass connectionProperties to the JDBC connector to talk to your data source. The values fetched from the scope are never displayed in the notebook (see Secret Redaction).
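As a sketch, assuming a hypothetical Azure SQL server, database, and table name (jdbcHostname, jdbcDatabase, and my_table are placeholders, not part of this example), you could read a table through the connector like this:

// Hypothetical endpoint details; substitute your own server, database, and table.
val jdbcHostname = "<your-server>.database.windows.net"
val jdbcPort = 1433
val jdbcDatabase = "<your-database>"
val jdbcUrl = s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase}"

// Read a table using the credentials stored in the jdbc secret scope.
val df = spark.read.jdbc(jdbcUrl, "my_table", connectionProperties)

// Attempting to print a secret demonstrates redaction: this outputs [REDACTED].
println(jdbcPassword)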

Grant access to another group

After verifying that the credentials are configured correctly, you can share them with the datascience group for use in their analysis.

Grant the datascience group read-only permission to these credentials by running the following command:

databricks secrets put-acl --scope jdbc --principal datascience --permission READ
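To verify the grant, you can list the ACLs on the scope:

databricks secrets list-acls --scope jdbc

Members of the datascience group can now read both secrets from a notebook with dbutils.secrets.get, exactly as shown above.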