DBS-C01 Practice Questions

AWS Certified Database - Specialty

PDF

$35

Number Of Questions

198

Last Update

3/08/2022

Certification

AWS Certified Database

Exam Code

DBS-C01

Following are the features that make us unique.

 Just 1 day study required to pass exam
 100% Passing Assurance
 100% Money Back Guarantee
 Free 3 Months Updates

Customer Support

90% Result Guarantee

Money Back Guarantee

Demo Questions

Question # 1
In North America, a business launched a mobile game that swiftly expanded to 10 million daily active players. The game’s backend is hosted on AWS and makes considerable use of a TTL-configured Amazon DynamoDB table. When an item is added or changed, its TTL is set to 600 seconds plus the current epoch time. The game logic relies on the purging of outdated data in order to compute rewards points properly. At times, items are read from the table many hours beyond their TTL expiration. How should a database administrator resolve this issue?

A. Use a client library that supports the TTL functionality for DynamoDB.
B. Include a query filter expression to ignore items with an expired TTL.
C. Set the ConsistentRead parameter to true when querying the table.
D. Create a local secondary index on the TTL attribute.
ANSWER : B
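The correct choice hinges on the fact that DynamoDB deletes expired items asynchronously, so reads can still return them and the application must filter them out itself. Below is a minimal boto3 sketch of option B's filter expression; the table name GameRewards, the partition key player_id, and the TTL attribute name ttl are illustrative assumptions, not details from the question.

```python
import time

import boto3
from boto3.dynamodb.conditions import Attr, Key

# Hypothetical table, key, and attribute names for illustration only.
table = boto3.resource("dynamodb").Table("GameRewards")

now = int(time.time())
response = table.query(
    KeyConditionExpression=Key("player_id").eq("player-123"),
    # Keep only items whose TTL epoch time is still in the future; DynamoDB
    # may take a while to physically delete items after their TTL passes.
    FilterExpression=Attr("ttl").gt(now),
)
unexpired_items = response["Items"]
```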

Question # 2
A Database Specialist needs to define a database migration strategy to migrate an on-premises Oracle database to an Amazon Aurora MySQL DB cluster. The company requires near-zero downtime for the data migration. The solution must also be cost-effective. Which approach should the Database Specialist take?

A. Dump all the tables from the Oracle database into an Amazon S3 bucket using Data Pump (expdp). Run data transformations in AWS Glue. Load the data from the S3 bucket to the Aurora DB cluster.
B. Order an AWS Snowball appliance and copy the Oracle backup to the Snowball appliance. Once the Snowball data is delivered to Amazon S3, create a new Aurora DB cluster. Enable the S3 integration to migrate the data directly from Amazon S3 to Amazon RDS.
C. Use the AWS Schema Conversion Tool (AWS SCT) to help rewrite database objects to MySQL during the schema migration. Use AWS DMS to perform the full load and change data capture (CDC) tasks.
D. Use AWS Server Migration Service (AWS SMS) to import the Oracle virtual machine image as an Amazon EC2 instance. Use the Oracle Logical Dump utility to migrate the Oracle data from Amazon EC2 to an Aurora DB cluster.
ANSWER : C
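Option C pairs AWS SCT for the heterogeneous schema conversion with an AWS DMS task that runs a full load followed by ongoing change data capture, which is what keeps downtime near zero. As a rough sketch of the DMS side only, the boto3 call below creates such a task; every ARN, the task identifier, and the APP schema name are placeholders.

```python
import json

import boto3

dms = boto3.client("dms")

# Include every table in the (hypothetical) APP schema.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-app-schema",
            "object-locator": {"schema-name": "APP", "table-name": "%"},
            "rule-action": "include",
        }
    ]
}

dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-aurora",  # placeholder task name
    SourceEndpointArn="arn:aws:dms:<region>:<account>:endpoint:<oracle-source>",
    TargetEndpointArn="arn:aws:dms:<region>:<account>:endpoint:<aurora-target>",
    ReplicationInstanceArn="arn:aws:dms:<region>:<account>:rep:<instance>",
    # Full load first, then ongoing replication (CDC) for near-zero downtime.
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps(table_mappings),
)
```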

Question # 3
A business is transferring its on-premises database workloads to the Amazon Web Services (AWS) Cloud. A database professional migrating an Oracle database with a huge table to Amazon RDS has picked AWS DMS. The database professional observes that AWS DMS is taking considerable time to migrate the data. Which activities would increase the pace of data migration? (Select three.)

A. Create multiple AWS DMS tasks to migrate the large table.
B. Configure the AWS DMS replication instance with Multi-AZ.
C. Increase the capacity of the AWS DMS replication server.
D. Establish an AWS Direct Connect connection between the on-premises data center and AWS.
E. Enable an Amazon RDS Multi-AZ configuration.
F. Enable full large binary object (LOB) mode to migrate all LOB data for all large tables.
ANSWER : A,C,D
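Option A speeds up the load by running several DMS tasks against the same large table, each restricted to a slice of the key range with a source filter, while options C and D remove replication-instance and network bottlenecks. A minimal sketch of the multi-task idea follows; the SALES.ORDERS table, the ORDER_ID column, the key ranges, and all ARNs are assumptions for illustration.

```python
import json

import boto3

dms = boto3.client("dms")

# Hypothetical key ranges that split SALES.ORDERS across three parallel tasks.
ranges = [("1", "50000000"), ("50000001", "100000000"), ("100000001", "150000000")]

for i, (start, end) in enumerate(ranges, start=1):
    mappings = {
        "rules": [
            {
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": f"orders-range-{i}",
                "object-locator": {"schema-name": "SALES", "table-name": "ORDERS"},
                "rule-action": "include",
                # A source filter restricts this task to one slice of the keys.
                "filters": [
                    {
                        "filter-type": "source",
                        "column-name": "ORDER_ID",
                        "filter-conditions": [
                            {
                                "filter-operator": "between",
                                "start-value": start,
                                "end-value": end,
                            }
                        ],
                    }
                ],
            }
        ]
    }
    dms.create_replication_task(
        ReplicationTaskIdentifier=f"orders-migration-{i}",  # placeholder name
        SourceEndpointArn="arn:aws:dms:<region>:<account>:endpoint:<oracle-source>",
        TargetEndpointArn="arn:aws:dms:<region>:<account>:endpoint:<rds-target>",
        ReplicationInstanceArn="arn:aws:dms:<region>:<account>:rep:<instance>",
        MigrationType="full-load",
        TableMappings=json.dumps(mappings),
    )
```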

Question # 4
A significant automotive manufacturer is switching a mission-critical finance application’s database to Amazon DynamoDB. According to the company’s risk and compliance policy, any update to the database must be documented as a log entry for auditing purposes. The system anticipates about 500,000 log entries per minute. Log entries should be kept in Apache Parquet files in batches of at least 100,000 records per file. How could a database professional approach these needs while using DynamoDB?

A. Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function triggered by the stream. Write the log entries to an Amazon S3 object.
B. Create a backup plan in AWS Backup to back up the DynamoDB table once a day. Create an AWS Lambda function that restores the backup in another table and compares both tables for changes. Generate the log entries and write them to an Amazon S3 object.
C. Enable AWS CloudTrail logs on the table. Create an AWS Lambda function that reads the log files once an hour and filters DynamoDB API actions. Write the filtered log files to Amazon S3.
D. Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function triggered by the stream. Write the log entries to an Amazon Kinesis Data Firehose delivery stream with buffering and Amazon S3 as the destination.
ANSWER : D
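Option D works because Kinesis Data Firehose, not the Lambda function, handles the batching: buffering and record-format conversion to Apache Parquet are configured on the delivery stream, so each S3 object can easily hold 100,000+ records. A minimal sketch of such a Lambda handler follows; the delivery stream name and the choice to forward the raw dynamodb change image are illustrative assumptions.

```python
import json

import boto3

firehose = boto3.client("firehose")
DELIVERY_STREAM = "dynamodb-audit-log"  # hypothetical delivery stream name


def handler(event, context):
    """Forward DynamoDB Streams change records to a Firehose delivery stream."""
    records = [
        {"Data": (json.dumps(record["dynamodb"]) + "\n").encode("utf-8")}
        for record in event["Records"]
    ]
    # PutRecordBatch accepts at most 500 records per call.
    for i in range(0, len(records), 500):
        firehose.put_record_batch(
            DeliveryStreamName=DELIVERY_STREAM,
            Records=records[i : i + 500],
        )
```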

Question # 5
A business needs a data warehouse system that stores data consistently and in a highly organized fashion. The organization demands rapid response times for end-user queries involving current-year data, and users must have access to the whole 15-year dataset when necessary. Additionally, this solution must be able to handle a variable volume of incoming queries. Costs associated with storing the 100 TB of data must be kept to a minimum. Which solution satisfies these criteria?

A. Leverage an Amazon Redshift data warehouse solution using a dense storage instance type while keeping all the data on local Amazon Redshift storage. Provision enough instances to support high demand.
B. Leverage an Amazon Redshift data warehouse solution using a dense storage instance to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Provision enough instances to support high demand.
C. Leverage an Amazon Redshift data warehouse solution using a dense storage instance to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Enable Amazon Redshift Concurrency Scaling.
D. Leverage an Amazon Redshift data warehouse solution using a dense storage instance to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Leverage Amazon Redshift elastic resize.
ANSWER : C
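Options B, C, and D all keep current-year data on the cluster and push the 15-year history to Amazon S3 behind Redshift Spectrum; option C wins because Concurrency Scaling absorbs the variable query volume, and it is enabled per WLM queue in the cluster's parameter group rather than by SQL. As a sketch of the Spectrum piece only, the call below registers a historical external schema through the Redshift Data API; the cluster, database, user, Glue catalog database, and IAM role are placeholders.

```python
import boto3

redshift_data = boto3.client("redshift-data")

# Register historical data already catalogued in AWS Glue as an external
# (Spectrum) schema, so older years are queried directly from Amazon S3
# while current-year data stays on the cluster's local storage.
redshift_data.execute_statement(
    ClusterIdentifier="warehouse-cluster",  # placeholder cluster name
    Database="analytics",                   # placeholder database
    DbUser="dbadmin",                       # placeholder database user
    Sql=(
        "CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum_history "
        "FROM DATA CATALOG DATABASE 'history_db' "
        "IAM_ROLE 'arn:aws:iam::<account>:role/<spectrum-role>' "
        "CREATE EXTERNAL DATABASE IF NOT EXISTS;"
    ),
)
```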

--TESTIMONIALS--

What our clients say about us