
Nidhi Gupta
Feb 12, 2025

Storage Credentials and External Locations in a Unity Catalog-Enabled Azure Databricks Workspace


Storage credentials and external locations are subject to Unity Catalog access control policies, which determine which users can use a given storage credential or external location. In this way, Unity Catalog provides fine-grained access control over cloud storage.
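For example, once an admin has created a storage credential, an external location can be defined on top of it and granted only to specific principals. A minimal sketch, where sales_data_loc, my_storage_credential, and data_engineers are illustrative names:

# Define an external location that binds a storage path to an existing storage credential.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS sales_data_loc
  URL 'abfss://<container>@<storage-account-name>.dfs.core.windows.net/sales'
  WITH (STORAGE CREDENTIAL my_storage_credential)
""")

# Only principals granted the relevant privileges can access paths under this location.
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION sales_data_loc TO `data_engineers`")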

In the legacy Hive metastore setup, you access data lake storage (ADLS Gen2) directly from Spark using one of the following methods:

Access Keys:

spark.conf.set("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net","<access-keys>")

SAS Tokens:

spark.conf.set("fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net","SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account-name>.dfs.core.windows.net","org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account-name>.dfs.core.windows.net","<SAS-token>")

Service principals:

client_id = "*****"
tenant_id = "*****"
client_secret = "*****"

spark.conf.set("fs.azure.account.auth.type.<storage-account-name>.dfs.windows.core.windows.net","OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net","org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")…


