
A company wants to migrate its existing on-premises Oracle database to Amazon Aurora PostgreSQL. The migration must be completed with minimal downtime using AWS DMS. A Database Specialist must validate that the data was migrated accurately from the source to the target before the cutover. The migration must have minimal impact on the performance of the source database.

Which approach will MOST effectively meet these requirements?

A. Use the AWS Schema Conversion Tool (AWS SCT) to convert source Oracle database schemas to the target Aurora DB cluster. Verify the data type of the columns.
B. Use the table metrics of the AWS DMS task created for migrating the data to verify the statistics for the tables being migrated and to verify that the data definition language (DDL) statements are completed.
C. Enable the AWS Schema Conversion Tool (AWS SCT) premigration validation and review the premigration checklist to make sure there are no issues with the conversion.
D. Enable AWS DMS data validation on the task so the AWS DMS task compares the source and target records, and reports any mismatches.


Suggested Answer: D

Explanation:

"To ensure that your data was migrated accurately from the source to the target, we highly recommend that you use data validation." https://docs.aws.amazon.com/dms/latest/userguide/CHAP_BestPractices.html

Reference: https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Validating.html
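To illustrate option D in practice: DMS data validation is switched on through the `ValidationSettings` section of the replication task's task-settings JSON. The sketch below builds a minimal settings document of that shape; the field names follow the DMS task-settings format, but the helper function and the specific values are illustrative assumptions, not recommended production settings.

```python
import json

def build_task_settings(enable_validation=True):
    """Sketch: minimal DMS task-settings dict with data validation enabled.

    Helper name and values are illustrative; only the ValidationSettings
    keys themselves come from the DMS task-settings JSON format.
    """
    return {
        "ValidationSettings": {
            # Turns on row-by-row comparison of source and target records.
            "EnableValidation": enable_validation,
            # Stop validating a table after this many failed rows.
            "FailureMaxCount": 10000,
            # Number of rows read per validation query partition.
            "PartitionSize": 10000,
        }
    }

print(json.dumps(build_task_settings(), indent=2))
```

A settings document like this would typically be passed when creating or modifying the task, for example via `aws dms create-replication-task --replication-task-settings file://settings.json`; validation progress and any mismatches then appear in the task's table statistics and the `awsdms_validation_failures_v1` table on the target.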
