
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this scenario, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Storage account that contains 100 GB of files. The files contain text and numerical values. 75% of the rows contain description data that has an average length of 1.1 MB.

You plan to copy the data from the storage account to an Azure SQL Data Warehouse.

You need to prepare the files to ensure that the data copies quickly.

Solution: You modify the files to ensure that each row is less than 1 MB.

Does this meet the goal?

A. Yes
B. No


Suggested Answer: A

Explanation:

PolyBase, the fastest mechanism for loading data into Azure SQL Data Warehouse, cannot load rows larger than 1 MB, so modifying the files so that each row is less than 1 MB allows the data to copy quickly. Relatedly, when exporting data into an ORC file format, you might get Java out-of-memory errors when there are large text columns; to work around this limitation, export only a subset of the columns.

References: https://docs.microsoft.com/en-us/azure/sql-data-warehouse/guidance-for-loading-data
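As a rough illustration (not part of the original question), the Python sketch below scans a delimited file and trims any oversized description field so that every row stays under PolyBase's 1 MB limit. The file names and the description column index are hypothetical, and the size calculation is an approximation that ignores CSV quoting.

```python
import csv

MAX_ROW_BYTES = 1_000_000  # PolyBase can't load rows larger than ~1 MB
DESC_COL = 3               # hypothetical index of the long description column

def row_size(row, delimiter=","):
    """Approximate serialized size of a row in bytes (ignores quoting)."""
    return len(delimiter.join(row).encode("utf-8")) + 2  # +2 for \r\n

with open("input.csv", newline="", encoding="utf-8") as src, \
     open("output.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    for row in reader:
        excess = row_size(row) - MAX_ROW_BYTES
        if excess > 0:
            # Trim the description; assumes mostly single-byte text.
            # A production job might instead move long descriptions
            # into a separate table keyed by row ID.
            row[DESC_COL] = row[DESC_COL][:-excess]
        writer.writerow(row)
```

In practice, splitting long descriptions into a separate table keyed by row ID avoids losing data while still keeping each loaded row under the limit.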
 