
HOTSPOT

You plan to deploy Azure Databricks to support a machine learning application. Data engineers will mount an Azure Data Lake Storage account to the Databricks file system. Permissions to folders are granted directly to the data engineers.

You need to recommend a design for the planned Azure Databricks deployment.

The solution must meet the following requirements:

✑ Ensure that the data engineers can only access folders to which they have permissions.

✑ Minimize development effort.

✑ Minimize costs.

What should you include in the recommendation? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.


Suggested Answer:

Explanation:

Box 1: Standard

Standard is the lower-cost pricing tier, which satisfies the requirement to minimize costs.

Box 2: Credential passthrough

Credential passthrough lets you authenticate automatically to Azure Data Lake Storage Gen1 (ADLS Gen1) and Azure Data Lake Storage Gen2 (ADLS Gen2) from Azure Databricks clusters using the same Azure Active Directory (Azure AD) identity that you use to log in to Azure Databricks. When you enable Azure Data Lake Storage credential passthrough for your cluster, commands that you run on that cluster can read and write data in Azure Data Lake Storage without requiring you to configure service principal credentials for access to storage.
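For reference, a minimal sketch of mounting an ADLS Gen2 container with credential passthrough from a Databricks notebook. The storage account name "examplelake", container name "engineering", and mount point below are placeholders, and the code assumes a cluster that already has Azure Data Lake Storage credential passthrough enabled:

    # Runs in a Databricks notebook, where spark and dbutils are predefined.
    # The passthrough token provider class is read from the cluster's Spark config.
    configs = {
        "fs.azure.account.auth.type": "CustomAccessToken",
        "fs.azure.account.custom.token.provider.class":
            spark.conf.get("spark.databricks.passthrough.adls.gen2.tokenProviderClassName")
    }

    # Mount the container; reads and writes under /mnt/engineering are then
    # authorized with the Azure AD identity of the user running the command.
    dbutils.fs.mount(
        source="abfss://engineering@examplelake.dfs.core.windows.net/",
        mount_point="/mnt/engineering",
        extra_configs=configs
    )

With such a mount in place, each data engineer can read only the folders their own Azure AD permissions allow, and no service principal secrets need to be configured or distributed.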


   