Access control is available only in the Azure Databricks Premium Plan.
By default, all users can create and modify jobs unless an administrator enables jobs access control. With jobs access control, individual permissions determine a user’s abilities. This topic describes the individual permissions and how to enable and configure jobs access control.
There are five permission levels for jobs: No Permissions, Can View, Can Manage Run, Is Owner, and Can Manage. The Can Manage permission is reserved for administrators. The table lists the abilities for each permission.
| Ability | No Permissions | Can View | Can Manage Run | Is Owner | Can Manage (admin) |
|---|---|---|---|---|---|
| View job details and settings | x | x | x | x | x |
| View results, Spark UI, logs of a job run | | x | x | x | x |
| Edit job settings | | | | x | x |
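The matrix above can be expressed as a small lookup, which is handy when reasoning about the minimum permission a user needs for a given action. This is an illustrative sketch only; the names below are ours, not part of any Databricks API.

```python
# Illustrative encoding of the jobs permission matrix above.
# Names are ours for illustration, not a Databricks API.

# Permission levels ordered from weakest to strongest.
LEVELS = ["No Permissions", "Can View", "Can Manage Run", "Is Owner", "Can Manage"]

# Minimum level required for each ability listed in the table.
MIN_LEVEL = {
    "view_job_details": "No Permissions",
    "view_run_results": "Can View",       # results, Spark UI, logs of a run
    "edit_job_settings": "Is Owner",
}

def allows(level: str, ability: str) -> bool:
    """True if `level` grants `ability` (higher levels include lower ones)."""
    return LEVELS.index(level) >= LEVELS.index(MIN_LEVEL[ability])
```

For example, `allows("Can Manage Run", "edit_job_settings")` is `False`: starting a run does not imply the right to change the job's configuration.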
- The creator of a job has Is Owner permission.
- A job cannot have more than one owner.
- A job cannot have a group as an owner.
- Jobs triggered through Run Now assume the permissions of the job owner and not the user who issued Run Now. For example, even if job A is configured to run on an existing cluster accessible only to the job owner (user A), a user (user B) with Can Manage Run permission can start a new run of the job.
- You can view notebook run results only if you have the Can View or higher permission on the job. This keeps jobs access control intact even if the job notebook is renamed, moved, or deleted.
- Jobs access control applies to jobs displayed in the Databricks Jobs UI and their runs. It doesn’t apply to runs spawned by notebook workflows or runs submitted by API whose ACLs are bundled with the notebooks.
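Beyond the UI, job permissions can be assigned programmatically through the Databricks Permissions API (`PATCH /api/2.0/permissions/jobs/{job_id}`). The sketch below builds the request body for granting a user Can Manage Run on a job; the user name, job ID, workspace URL, and token are placeholders.

```python
import json

def grant_payload(user_name: str, permission_level: str) -> str:
    """Build the JSON body for PATCH /api/2.0/permissions/jobs/{job_id}.

    permission_level is one of: CAN_VIEW, CAN_MANAGE_RUN, IS_OWNER, CAN_MANAGE.
    """
    body = {
        "access_control_list": [
            {"user_name": user_name, "permission_level": permission_level}
        ]
    }
    return json.dumps(body)

# Sending the request requires your workspace URL and a personal access token,
# e.g. (placeholders shown, not executed here):
#
# import urllib.request
# req = urllib.request.Request(
#     "https://<databricks-instance>/api/2.0/permissions/jobs/<job-id>",
#     data=grant_payload("userB@example.com", "CAN_MANAGE_RUN").encode(),
#     headers={"Authorization": "Bearer <token>",
#              "Content-Type": "application/json"},
#     method="PATCH",
# )
```

A `PATCH` adds or updates the listed entries without replacing the job's other access control entries; use `PUT` on the same endpoint to overwrite the full list.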
To enable jobs access control:

1. Go to the Admin Console.
2. Select the Access Control tab.
3. Click the Enable button next to Cluster and Jobs Access Control.
4. Click Confirm to confirm the change.