Here is the overall flow to mount the ADLS store in Databricks using OAuth.
Steps to mount the Data Lake file system in Azure Databricks
The first step is to register an app in Azure Active Directory.
This creates the application (client) ID and the directory (tenant) ID.
Within the Azure AD app registration -> create a client secret -> once generated, you have to copy the key value straight away.
Once it is hidden, it stays hidden forever, so it is very important to remember to store the secret.
This secret key gets exchanged for a token at the time we try to mount the file system.
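For the curious, a minimal sketch of what that exchange looks like under the hood. Databricks does an equivalent client-credentials call for you once the mount is configured, so you never run this yourself; the tenant ID, client ID and secret values below are placeholders.

```python
# Illustrative only: the client-credentials token exchange that happens behind
# the scenes when the OAuth mount is used. Placeholders, not real values.
import requests

tenant_id = "<directory-(tenant)-id>"      # from the app registration overview
client_id = "<application-(client)-id>"    # from the app registration overview
client_secret = "<client-secret-value>"    # the value copied when the secret was created

token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
resp = requests.post(
    token_url,
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://storage.azure.com/.default",  # Azure Storage / ADLS audience
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]
```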
- Next step: store the key in Key Vault.
Open up Key Vault -> click on Generate/Import -> paste in the secret generated in the previous step.
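The portal's Generate/Import blade is the simplest route, but the same thing can be done from code with the azure-keyvault-secrets SDK if you prefer; the vault and secret names below are made up for illustration.

```python
# Illustrative alternative to the portal: store the client secret in Key Vault
# programmatically. Requires: pip install azure-identity azure-keyvault-secrets
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

vault_url = "https://<your-key-vault-name>.vault.azure.net"  # placeholder vault name
client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

# "dbtravel-client-secret" is a placeholder secret name; the value is the client
# secret copied from the app registration in the previous step.
client.set_secret("dbtravel-client-secret", "<client-secret-value>")
```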
- Once this step is done, go to Databricks.
- Why there is no direct link to create a scope is beyond me, but there are two options: the web method or the Databricks CLI. I will use the web method to create the scope here and cover the Databricks CLI later; it is my preferred approach, but I have not covered it yet.
First step: go to Key Vault and get the DNS name and Resource ID.
Once you have these, go to the web page as shown in step 6 below and paste in the corresponding DNS name and Resource ID.
In this case we have created a scope called dbtravelscope.
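Once the scope exists, a quick way to sanity-check it from a notebook is via dbutils.secrets; the secret key name below is a placeholder.

```python
# Run in a Databricks notebook: confirm the Key Vault-backed scope is visible.
scopes = dbutils.secrets.listScopes()
print([s.name for s in scopes])  # should include "dbtravelscope"

# List the secret names available in the scope (values are never shown).
print([m.key for m in dbutils.secrets.list("dbtravelscope")])

# Secret values are redacted if printed, but can be read programmatically:
secret_value = dbutils.secrets.get(scope="dbtravelscope", key="<your-secret-name>")
```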
At this point we have created a scope with the client secret stored. We should be able to proceed with the steps outlined in the link below to get ADLS mounted on Databricks.
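For reference, here is a minimal sketch of what that mount step looks like with OAuth, following the standard ABFS OAuth configuration; the storage account, container, tenant/client IDs, mount point and secret key name are placeholders, not values from this walkthrough.

```python
# Minimal OAuth mount sketch for ADLS Gen2 (run in a Databricks notebook).
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-(client)-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="dbtravelscope", key="<your-secret-name>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<directory-(tenant)-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)

# Quick check that the mount worked:
display(dbutils.fs.ls("/mnt/<mount-name>"))
```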