We provide real DBS-C01 exam questions and answers in two formats: downloadable PDF and VCE practice tests. Pass the Amazon Web Services DBS-C01 exam quickly and easily. The PDF version can be read and printed, so you can practice as many times as you need. With the help of our Amazon Web Services DBS-C01 PDF and VCE materials, you can easily pass the DBS-C01 exam.

Check DBS-C01 free dumps before getting the full version:

NEW QUESTION 1
A company just migrated to Amazon Aurora PostgreSQL from an on-premises Oracle database. After the migration, the company discovered that every day around 3:00 PM the application's response time is noticeably slower. The company has narrowed the cause of this issue down to the database and not the application.
Which set of steps should the Database Specialist take to most efficiently find the problematic PostgreSQL query?

  • A. Create an Amazon CloudWatch dashboard to show the number of connections, CPU usage, and disk space consumption. Watch these dashboards during the next slow period.
  • B. Launch an Amazon EC2 instance, and install and configure an open-source PostgreSQL monitoring tool that will run reports based on the output error logs.
  • C. Modify the logging database parameter to log all the queries related to locking in the database and then check the logs after the next slow period for this information.
  • D. Enable Amazon RDS Performance Insights on the PostgreSQL database. Use the metrics to identify any queries that are related to spikes in the graph during the next slow period.

Answer: D
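
Performance Insights (option D) can be switched on with a single API call. Below is a minimal boto3 sketch; the instance identifier is a placeholder, and the 7-day retention period is the free tier.

    import boto3

    # Minimal sketch: enable Performance Insights on an existing instance.
    rds = boto3.client("rds")
    rds.modify_db_instance(
        DBInstanceIdentifier="aurora-pg-instance-1",  # placeholder
        EnablePerformanceInsights=True,
        PerformanceInsightsRetentionPeriod=7,  # days; 7 is the free tier
        ApplyImmediately=True,
    )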

NEW QUESTION 2
A company is looking to migrate a 1 TB Oracle database from on-premises to an Amazon Aurora PostgreSQL DB cluster. The company’s Database Specialist discovered that the Oracle database is storing 100 GB of large binary objects (LOBs) across multiple tables. The Oracle database has a maximum LOB size of 500 MB with an average LOB size of 350 MB. The Database Specialist has chosen AWS DMS to migrate the data with the largest replication instances.
How should the Database Specialist optimize the database migration using AWS DMS?

  • A. Create a single task using full LOB mode with a LOB chunk size of 500 MB to migrate the data and LOBs together.
  • B. Create two tasks: task 1 with LOB tables using full LOB mode with a LOB chunk size of 500 MB, and task 2 without LOBs.
  • C. Create two tasks: task 1 with LOB tables using limited LOB mode with a maximum LOB size of 500 MB, and task 2 without LOBs.
  • D. Create a single task using limited LOB mode with a maximum LOB size of 500 MB to migrate data and LOBs together.

Answer: C
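
For context, the LOB behavior in option C is controlled by the TargetMetadata block of the DMS task settings. A hedged sketch follows; all ARNs are placeholders, and note that LobMaxSize is expressed in KB, so 500 MB is 512,000 KB.

    import boto3
    import json

    dms = boto3.client("dms")

    # Limited LOB mode with a 500 MB cap (LobMaxSize is in KB).
    task_settings = {
        "TargetMetadata": {
            "SupportLobs": True,
            "FullLobMode": False,
            "LimitedSizeLobMode": True,
            "LobMaxSize": 512000,
        }
    }

    # Task 1 covers the LOB tables; task 2 (not shown) would use a different
    # table mapping and disable LOB support entirely.
    dms.create_replication_task(
        ReplicationTaskIdentifier="task1-lob-tables",
        SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SRC",
        TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TGT",
        ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:INST",
        MigrationType="full-load",
        TableMappings=json.dumps({"rules": [{
            "rule-type": "selection", "rule-id": "1", "rule-name": "1",
            "object-locator": {"schema-name": "APP", "table-name": "%"},
            "rule-action": "include"}]}),
        ReplicationTaskSettings=json.dumps(task_settings),
    )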

NEW QUESTION 3
An ecommerce company has tasked a Database Specialist with creating a reporting dashboard that visualizes critical business metrics that will be pulled from the core production database running on Amazon Aurora. Data that is read by the dashboard should be available within 100 milliseconds of an update.
The Database Specialist needs to review the current configuration of the Aurora DB cluster and develop a cost-effective solution. The solution needs to accommodate the unpredictable read workload from the reporting dashboard without any impact on the write availability and performance of the DB cluster.
Which solution meets these requirements?

  • A. Turn on the serverless option in the DB cluster so it can automatically scale based on demand.
  • B. Provision a clone of the existing DB cluster for the new Application team.
  • C. Create a separate DB cluster for the new workload, refresh from the source DB cluster, and set up ongoing replication using AWS DMS change data capture (CDC).
  • D. Add an automatic scaling policy to the DB cluster to add Aurora Replicas to the cluster based on CPU consumption.

Answer: A
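
For context, option A refers to Aurora Serverless. Assuming Serverless v1, a cluster that scales capacity with demand is created as sketched below; identifiers, credentials, and capacity bounds are placeholders.

    import boto3

    rds = boto3.client("rds")

    # Sketch: an Aurora Serverless v1 cluster with on-demand scaling.
    rds.create_db_cluster(
        DBClusterIdentifier="reporting-serverless",  # placeholder
        Engine="aurora-mysql",
        EngineMode="serverless",
        MasterUsername="admin",
        MasterUserPassword="change-me-immediately",  # placeholder
        ScalingConfiguration={
            "MinCapacity": 1,
            "MaxCapacity": 16,
            "AutoPause": True,
            "SecondsUntilAutoPause": 300,
        },
    )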

NEW QUESTION 4
A company wants to migrate its existing on-premises Oracle database to Amazon Aurora PostgreSQL. The migration must be completed with minimal downtime using AWS DMS. A Database Specialist must validate that the data was migrated accurately from the source to the target before the cutover. The migration must have minimal impact on the performance of the source database.
Which approach will MOST effectively meet these requirements?

  • A. Use the AWS Schema Conversion Tool (AWS SCT) to convert source Oracle database schemas to the target Aurora DB cluster. Verify the data types of the columns.
  • B. Use the table metrics of the AWS DMS task created for migrating the data to verify the statistics for the tables being migrated and to verify that the data definition language (DDL) statements are completed.
  • C. Enable the AWS Schema Conversion Tool (AWS SCT) premigration validation and review the premigration checklist to make sure there are no issues with the conversion.
  • D. Enable AWS DMS data validation on the task so the AWS DMS task compares the source and target records, and reports any mismatches.

Answer: D
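
AWS DMS data validation (option D) is a task-settings flag rather than a separate service. A minimal sketch of the relevant JSON fragment, which would be passed as ReplicationTaskSettings when creating or modifying the task; the thread and failure counts are illustrative.

    import json

    # DMS task settings fragment that turns on row-by-row validation.
    # Mismatches are reported to the awsdms_validation_failures_v1 table
    # on the target.
    validation_settings = {
        "ValidationSettings": {
            "EnableValidation": True,
            "ThreadCount": 5,          # parallel validation threads
            "FailureMaxCount": 10000,  # stop after this many failures
        }
    }
    print(json.dumps(validation_settings, indent=2))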

NEW QUESTION 5
A company is using Amazon Aurora with Aurora Replicas for read-only workload scaling. A Database Specialist needs to split up two read-only applications so each application always connects to a dedicated replica. The Database Specialist wants to implement load balancing and high availability for the read-only applications.
Which solution meets these requirements?

  • A. Use a specific instance endpoint for each replica and add the instance endpoint to each read-only application connection string.
  • B. Use reader endpoints for both the read-only workload applications.
  • C. Use a reader endpoint for one read-only application and use an instance endpoint for the other read-only application.
  • D. Use custom endpoints for the two read-only applications.

Answer: B
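
For comparison, Aurora custom endpoints (option D) pin an application to a chosen set of replicas while Aurora load-balances and handles failover within that set. A hedged sketch with placeholder identifiers:

    import boto3

    rds = boto3.client("rds")

    # One custom READER endpoint per application, each pinned to a
    # dedicated replica. All identifiers are placeholders.
    for app, members in [("app1", ["replica-1"]), ("app2", ["replica-2"])]:
        rds.create_db_cluster_endpoint(
            DBClusterIdentifier="my-aurora-cluster",
            DBClusterEndpointIdentifier=f"{app}-reader",
            EndpointType="READER",
            StaticMembers=members,
        )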

NEW QUESTION 6
A team of Database Specialists is currently investigating performance issues on an Amazon RDS for MySQL DB instance and is reviewing related metrics. The team wants to narrow the possibilities down to specific database wait events to better understand the situation.
How can the Database Specialists accomplish this?

  • A. Enable the option to push all database logs to Amazon CloudWatch for advanced analysis
  • B. Create appropriate Amazon CloudWatch dashboards to contain specific periods of time
  • C. Enable Amazon RDS Performance Insights and review the appropriate dashboard
  • D. Enable Enhanced Monitoring with the appropriate settings

Answer: C
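
Once Performance Insights is enabled, wait events can also be pulled programmatically through the pi API. A minimal sketch; the Identifier is the instance's DbiResourceId, shown here as a placeholder.

    import boto3
    from datetime import datetime, timedelta, timezone

    pi = boto3.client("pi")
    end = datetime.now(timezone.utc)

    # Average active sessions over the last hour, grouped by wait event.
    resp = pi.get_resource_metrics(
        ServiceType="RDS",
        Identifier="db-ABCDEFGHIJKLMNO",  # placeholder DbiResourceId
        StartTime=end - timedelta(hours=1),
        EndTime=end,
        PeriodInSeconds=60,
        MetricQueries=[{
            "Metric": "db.load.avg",
            "GroupBy": {"Group": "db.wait_event"},
        }],
    )
    for series in resp["MetricList"]:
        print(series["Key"])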

NEW QUESTION 7
A company is looking to move an on-premises IBM Db2 database running on AIX on an IBM POWER7 server. Due to escalating support and maintenance costs, the company is exploring the option of moving the workload to an Amazon Aurora PostgreSQL DB cluster.
What is the quickest way for the company to gather data on the migration compatibility?

  • A. Perform a logical dump from the Db2 database and restore it to an Aurora DB cluster. Identify the gaps and compatibility of the objects migrated by comparing row counts from source and target tables.
  • B. Run AWS DMS from the Db2 database to an Aurora DB cluster. Identify the gaps and compatibility of the objects migrated by comparing the row counts from source and target tables.
  • C. Run native PostgreSQL logical replication from the Db2 database to an Aurora DB cluster to evaluate the migration compatibility.
  • D. Run the AWS Schema Conversion Tool (AWS SCT) from the Db2 database to an Aurora DB cluster. Create a migration assessment report to evaluate the migration compatibility.

Answer: D

NEW QUESTION 8
A gaming company has recently acquired a successful iOS game, which is particularly popular during the holiday season. The company has decided to add a leaderboard to the game that uses Amazon DynamoDB. The application load is expected to ramp up over the holiday season.
Which solution will meet these requirements at the lowest cost?

  • A. DynamoDB Streams
  • B. DynamoDB with DynamoDB Accelerator
  • C. DynamoDB with on-demand capacity mode
  • D. DynamoDB with provisioned capacity mode with Auto Scaling

Answer: C
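
On-demand capacity mode (option C) is chosen per table. A minimal sketch with placeholder names:

    import boto3

    dynamodb = boto3.client("dynamodb")

    # Pay-per-request table: no capacity planning needed for the ramp-up.
    dynamodb.create_table(
        TableName="Leaderboard",  # placeholder
        AttributeDefinitions=[
            {"AttributeName": "player_id", "AttributeType": "S"},
        ],
        KeySchema=[{"AttributeName": "player_id", "KeyType": "HASH"}],
        BillingMode="PAY_PER_REQUEST",
    )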

NEW QUESTION 9
A company is running its line-of-business application on AWS, which uses Amazon RDS for MySQL as the persistent data store. The company wants to minimize downtime when it migrates the database to Amazon Aurora.
Which migration method should a Database Specialist use?

  • A. Take a snapshot of the RDS for MySQL DB instance and create a new Aurora DB cluster with the option to migrate snapshots.
  • B. Make a backup of the RDS for MySQL DB instance using the mysqldump utility, create a new Aurora DB cluster, and restore the backup.
  • C. Create an Aurora Replica from the RDS for MySQL DB instance and promote the Aurora DB cluster.
  • D. Create a clone of the RDS for MySQL DB instance and promote the Aurora DB cluster.

Answer: A

NEW QUESTION 10
A company wants to automate the creation of secure test databases with random credentials to be stored safely for later use. The credentials should have sufficient information about each test database to initiate a connection and perform automated credential rotations. The credentials should not be logged or stored anywhere in an unencrypted form.
Which steps should a Database Specialist take to meet these requirements using an AWS CloudFormation template?

  • A. Create the database with the MasterUserName and MasterUserPassword properties set to the default values. Then, create the secret with the user name and password set to the same default values. Add a Secret Target Attachment resource with the SecretId and TargetId properties set to the Amazon Resource Names (ARNs) of the secret and the database. Finally, update the secret’s password value with a randomly generated string set by the GenerateSecretString property.
  • B. Add a Mapping property from the database Amazon Resource Name (ARN) to the secret ARN. Then, create the secret with a chosen user name and a randomly generated password set by the GenerateSecretString property. Add the database with the MasterUserName and MasterUserPassword properties set to the user name of the secret.
  • C. Add a resource of type AWS::SecretsManager::Secret and specify the GenerateSecretString property. Then, define the database user name in the SecureStringTemplate template. Create a resource for the database and reference the secret string for the MasterUserName and MasterUserPassword properties. Then, add a resource of type AWS::SecretsManager::SecretTargetAttachment with the SecretId and TargetId properties set to the Amazon Resource Names (ARNs) of the secret and the database.
  • D. Create the secret with a chosen user name and a randomly generated password set by the GenerateSecretString property. Add a SecretTargetAttachment resource with the SecretId property set to the Amazon Resource Name (ARN) of the secret and the TargetId property set to a parameter value matching the desired database ARN. Then, create a database with the MasterUserName and MasterUserPassword properties set to the previously created values in the secret.

Answer: C
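
A rough CloudFormation skeleton for option C, expressed here as a Python dict for readability. Resource names and engine settings are illustrative; the credentials are pulled at deploy time through the documented {{resolve:secretsmanager:...}} dynamic reference, so they never appear in plain text.

    import json

    template = {
        "Resources": {
            "TestDBSecret": {
                "Type": "AWS::SecretsManager::Secret",
                "Properties": {
                    "GenerateSecretString": {
                        "SecretStringTemplate": '{"username": "testadmin"}',
                        "GenerateStringKey": "password",
                        "PasswordLength": 32,
                        "ExcludeCharacters": '"@/\\',
                    }
                },
            },
            "TestDB": {
                "Type": "AWS::RDS::DBInstance",
                "Properties": {
                    "Engine": "mysql",
                    "DBInstanceClass": "db.t3.micro",
                    "AllocatedStorage": "20",
                    "MasterUsername": {"Fn::Sub":
                        "{{resolve:secretsmanager:${TestDBSecret}:SecretString:username}}"},
                    "MasterUserPassword": {"Fn::Sub":
                        "{{resolve:secretsmanager:${TestDBSecret}:SecretString:password}}"},
                },
            },
            "SecretAttachment": {
                "Type": "AWS::SecretsManager::SecretTargetAttachment",
                "Properties": {
                    "SecretId": {"Ref": "TestDBSecret"},
                    "TargetId": {"Ref": "TestDB"},
                    "TargetType": "AWS::RDS::DBInstance",
                },
            },
        }
    }
    print(json.dumps(template, indent=2))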

NEW QUESTION 11
A Database Specialist is working with a company to launch a new website built on Amazon Aurora with several Aurora Replicas. This new website will replace an on-premises website connected to a legacy relational database. Due to stability issues in the legacy database, the company would like to test the resiliency of Aurora.
Which action can the Database Specialist take to test the resiliency of the Aurora DB cluster?

  • A. Stop the DB cluster and analyze how the website responds
  • B. Use Aurora fault injection to crash the master DB instance
  • C. Remove the DB cluster endpoint to simulate a master DB instance failure
  • D. Use Aurora Backtrack to crash the DB cluster

Answer: B
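
Aurora MySQL fault injection queries are plain SQL statements. A minimal sketch using pymysql; the endpoint and credentials are placeholders, and the connection is expected to drop when the crash fires.

    import pymysql

    # Simulate a writer crash on an Aurora MySQL cluster (fault injection).
    conn = pymysql.connect(
        host="my-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com",  # placeholder
        user="admin",
        password="secret",
        autocommit=True,
    )
    try:
        with conn.cursor() as cur:
            cur.execute("ALTER SYSTEM CRASH INSTANCE;")
    except pymysql.err.OperationalError:
        print("Connection dropped: crash injected, failover under way")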

NEW QUESTION 12
A Database Specialist is setting up a new Amazon Aurora DB cluster with one primary instance and three Aurora Replicas for a highly intensive, business-critical application. The Aurora DB cluster has one medium-sized primary instance, one large-sized replica, and two medium-sized replicas. The Database Specialist did not assign a promotion tier to the replicas.
In the event of a primary failure, what will occur?

  • A. Aurora will promote an Aurora Replica that is of the same size as the primary instance
  • B. Aurora will promote an arbitrary Aurora Replica
  • C. Aurora will promote the largest-sized Aurora Replica
  • D. Aurora will not promote an Aurora Replica

Answer: A
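
For comparison, failover priority can be made explicit per instance with the promotion tier (0 is the highest priority). A one-call sketch with a placeholder identifier:

    import boto3

    rds = boto3.client("rds")

    # Pin a replica as the preferred failover target (tier 0).
    rds.modify_db_instance(
        DBInstanceIdentifier="aurora-replica-large",  # placeholder
        PromotionTier=0,
        ApplyImmediately=True,
    )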

NEW QUESTION 13
A company has a production Amazon Aurora DB cluster that serves both online transaction processing (OLTP) transactions and compute-intensive reports. The reports run for 10% of the total cluster uptime while the OLTP transactions run all the time. The company has benchmarked its workload and determined that a six-node Aurora DB cluster is appropriate for the peak workload.
The company is now looking at cutting costs for this DB cluster, but needs to have a sufficient number of nodes in the cluster to support the workload at different times. The workload has not changed since the previous benchmarking exercise.
How can a Database Specialist address these requirements with minimal user involvement?

  • A. Split up the DB cluster into two different clusters: one for OLTP and the other for reporting. Monitor and set up replication between the two clusters to keep data consistent.
  • B. Review and evaluate the peak combined workload. Ensure that utilization of the DB cluster node is at an acceptable level. Adjust the number of instances, if necessary.
  • C. Use the stop cluster functionality to stop all the nodes of the DB cluster during times of minimal workload. The cluster can be restarted again depending on the workload at the time.
  • D. Set up automatic scaling on the DB cluster. This will allow the number of reader nodes to adjust automatically to the reporting workload, when needed.

Answer: D
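
Aurora replica auto scaling (option D) is configured through Application Auto Scaling. A hedged sketch; the cluster name, capacity bounds, and 60% CPU target are placeholders.

    import boto3

    autoscaling = boto3.client("application-autoscaling")

    # Register the cluster's replica count as a scalable target...
    autoscaling.register_scalable_target(
        ServiceNamespace="rds",
        ResourceId="cluster:prod-aurora",  # placeholder
        ScalableDimension="rds:cluster:ReadReplicaCount",
        MinCapacity=1,
        MaxCapacity=5,
    )

    # ...then scale readers in and out against average reader CPU.
    autoscaling.put_scaling_policy(
        PolicyName="reporting-reader-scaling",
        ServiceNamespace="rds",
        ResourceId="cluster:prod-aurora",
        ScalableDimension="rds:cluster:ReadReplicaCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 60.0,
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
            },
        },
    )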

NEW QUESTION 14
The Development team recently executed a database script containing several data definition language (DDL) and data manipulation language (DML) statements on an Amazon Aurora MySQL DB cluster. The release accidentally deleted thousands of rows from an important table and broke some application functionality. This was discovered 4 hours after the release. Upon investigation, a Database Specialist tracked the issue to a DELETE command in the script with an incorrect WHERE clause filtering the wrong set of rows.
The Aurora DB cluster has Backtrack enabled with an 8-hour backtrack window. The Database Administrator also took a manual snapshot of the DB cluster before the release started. The database needs to be returned to the correct state as quickly as possible to resume full application functionality. Data loss must be minimal.
How can the Database Specialist accomplish this?

  • A. Quickly rewind the DB cluster to a point in time before the release using Backtrack.
  • B. Perform a point-in-time recovery (PITR) of the DB cluster to a time before the release and copy the deleted rows from the restored database to the original database.
  • C. Restore the DB cluster using the manual backup snapshot created before the release and change the application configuration settings to point to the new DB cluster.
  • D. Create a clone of the DB cluster with Backtrack enabled. Rewind the cloned cluster to a point in time before the release. Copy deleted rows from the clone to the original database.

Answer: D
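
The clone-and-rewind flow described in option D maps to two API calls. A sketch with placeholder identifiers and timestamp; the clone also needs a DB instance added before it can be queried, and waiters are omitted for brevity.

    import boto3
    from datetime import datetime, timezone

    rds = boto3.client("rds")

    # 1. Fast, copy-on-write clone of the production cluster.
    rds.restore_db_cluster_to_point_in_time(
        SourceDBClusterIdentifier="prod-aurora",  # placeholder
        DBClusterIdentifier="prod-aurora-clone",
        RestoreType="copy-on-write",
        UseLatestRestorableTime=True,
    )

    # 2. Once the clone is available, rewind it to before the bad release.
    rds.backtrack_db_cluster(
        DBClusterIdentifier="prod-aurora-clone",
        BacktrackTo=datetime(2024, 6, 1, 9, 55, tzinfo=timezone.utc),  # placeholder
    )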

NEW QUESTION 15
A large company is using an Amazon RDS for Oracle Multi-AZ DB instance with a Java application. As part of its annual disaster recovery testing, the company would like to simulate an Availability Zone failure and record how the application reacts during the DB instance failover activity. The company does not want to make any code changes for this activity.
What should the company do to achieve this in the shortest amount of time?

  • A. Use a blue-green deployment with a complete application-level failover test
  • B. Use the RDS console to reboot the DB instance by choosing the option to reboot with failover
  • C. Use RDS fault injection queries to simulate the primary node failure
  • D. Add a rule to the NACL to deny all traffic on the subnets associated with a single Availability Zone

Answer: C
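
A related operation that can be scripted is the reboot with failover in option B, which forces a Multi-AZ failover with a single call (placeholder identifier):

    import boto3

    rds = boto3.client("rds")

    # Reboot the primary and force a failover to the standby.
    rds.reboot_db_instance(
        DBInstanceIdentifier="oracle-prod",  # placeholder
        ForceFailover=True,
    )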

NEW QUESTION 16
A company is running a finance application on an Amazon RDS for MySQL DB instance. The application is governed by multiple financial regulatory agencies. The RDS DB instance is set up with security groups to allow access to certain Amazon EC2 servers only. AWS KMS is used for encryption at rest.
Which step will provide additional security?

  • A. Set up NACLs that allow the entire EC2 subnet to access the DB instance
  • B. Disable the master user account
  • C. Set up a security group that blocks SSH to the DB instance
  • D. Set up RDS to use SSL for data in transit

Answer: D
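
On the client side, enforcing SSL for RDS MySQL largely comes down to pinning the RDS CA bundle. A minimal pymysql sketch; host, credentials, and bundle path are placeholders. Server side, the require_secure_transport parameter can reject unencrypted sessions outright.

    import pymysql

    # Connect to RDS for MySQL over TLS using the RDS CA bundle.
    conn = pymysql.connect(
        host="finance-db.xxxx.us-east-1.rds.amazonaws.com",  # placeholder
        user="app_user",
        password="secret",
        database="finance",
        ssl={"ca": "/opt/certs/global-bundle.pem"},  # CA bundle from AWS
    )
    with conn.cursor() as cur:
        cur.execute("SHOW STATUS LIKE 'Ssl_cipher';")
        print(cur.fetchone())  # non-empty cipher confirms TLS is in use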

NEW QUESTION 17
A company is about to launch a new product, and test databases must be re-created from production data. The company runs its production databases on an Amazon Aurora MySQL DB cluster. A Database Specialist needs to deploy a solution to create these test databases as quickly as possible with the least amount of administrative effort.
What should the Database Specialist do to meet these requirements?

  • A. Restore a snapshot from the production cluster into test clusters
  • B. Create logical dumps of the production cluster and restore them into new test clusters
  • C. Use database cloning to create clones of the production cluster
  • D. Add an additional read replica to the production cluster and use that node for testing

Answer: D

NEW QUESTION 18
An online gaming company is planning to launch a new game with Amazon DynamoDB as its data store. The database should be designed to support the following use cases:
  • Update scores in real time whenever a player is playing the game.
  • Retrieve a player’s score details for a specific game session.
A Database Specialist decides to implement a DynamoDB table. Each player has a unique user_id and each game has a unique game_id.
Which choice of keys is recommended for the DynamoDB table?

  • A. Create a global secondary index with game_id as the partition key
  • B. Create a global secondary index with user_id as the partition key
  • C. Create a composite primary key with game_id as the partition key and user_id as the sort key
  • D. Create a composite primary key with user_id as the partition key and game_id as the sort key

Answer: B
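
For comparison, option D's composite primary key and the per-session lookup it supports look like this; table, names, and values are placeholders.

    import boto3

    dynamodb = boto3.client("dynamodb")

    # Composite key: user_id as partition key, game_id as sort key.
    dynamodb.create_table(
        TableName="GameScores",  # placeholder
        AttributeDefinitions=[
            {"AttributeName": "user_id", "AttributeType": "S"},
            {"AttributeName": "game_id", "AttributeType": "S"},
        ],
        KeySchema=[
            {"AttributeName": "user_id", "KeyType": "HASH"},
            {"AttributeName": "game_id", "KeyType": "RANGE"},
        ],
        BillingMode="PAY_PER_REQUEST",
    )

    # Retrieve one player's score for a specific game session.
    resp = dynamodb.query(
        TableName="GameScores",
        KeyConditionExpression="user_id = :u AND game_id = :g",
        ExpressionAttributeValues={
            ":u": {"S": "player-123"},
            ":g": {"S": "session-456"},
        },
    )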

NEW QUESTION 19
A Database Specialist has migrated an on-premises Oracle database to Amazon Aurora PostgreSQL. The schema and the data have been migrated successfully. The on-premises database server was also being used to run database maintenance cron jobs written in Python to perform tasks including data purging and generating data exports. The logs for these jobs show that, most of the time, the jobs completed within 5 minutes, but a few jobs took up to 10 minutes to complete. These maintenance jobs need to be set up for Aurora PostgreSQL.
How can the Database Specialist schedule these jobs so the setup requires minimal maintenance and provides high availability?

  • A. Create cron jobs on an Amazon EC2 instance to run the maintenance jobs following the required schedule.
  • B. Connect to the Aurora host and create cron jobs to run the maintenance jobs following the requiredschedule.
  • C. Create AWS Lambda functions to run the maintenance jobs and schedule them with Amazon CloudWatchEvents.
  • D. Create the maintenance job using the Amazon CloudWatch job scheduling plugin.

Answer: D
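
Option C's pattern, Lambda functions triggered on a schedule by Amazon CloudWatch Events (now EventBridge), wires up as sketched below; the function name, account ID, and schedule are placeholders. Lambda's 15-minute limit comfortably covers the 10-minute worst case mentioned in the question.

    import boto3

    events = boto3.client("events")
    lambda_client = boto3.client("lambda")

    FUNCTION_ARN = ("arn:aws:lambda:us-east-1:111122223333"
                    ":function:db-maintenance")  # placeholder

    # Nightly schedule rule (02:00 UTC).
    rule = events.put_rule(
        Name="nightly-db-maintenance",
        ScheduleExpression="cron(0 2 * * ? *)",
    )

    # Allow CloudWatch Events to invoke the function, then attach it.
    lambda_client.add_permission(
        FunctionName="db-maintenance",
        StatementId="allow-cloudwatch-events",
        Action="lambda:InvokeFunction",
        Principal="events.amazonaws.com",
        SourceArn=rule["RuleArn"],
    )
    events.put_targets(
        Rule="nightly-db-maintenance",
        Targets=[{"Id": "1", "Arn": FUNCTION_ARN}],
    )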

NEW QUESTION 20
A Database Specialist migrated an existing production MySQL database from on-premises to an Amazon RDS for MySQL DB instance. However, after the migration, the database needed to be encrypted at rest using AWS KMS. Due to the size of the database, reloading the data into an encrypted database would be too time-consuming, so it is not an option.
How should the Database Specialist satisfy this new requirement?

  • A. Create a snapshot of the unencrypted RDS DB instance. Create an encrypted copy of the unencrypted snapshot. Restore the encrypted snapshot copy.
  • B. Modify the RDS DB instance. Enable the AWS KMS encryption option that leverages the AWS CLI.
  • C. Restore an unencrypted snapshot into a MySQL RDS DB instance that is encrypted.
  • D. Create an encrypted read replica of the RDS DB instance. Promote it to the master.

Answer: A
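
The snapshot-copy-restore path in option A is three API calls; each step must complete before the next, and waiters are omitted here for brevity. Identifiers and the KMS key alias are placeholders.

    import boto3

    rds = boto3.client("rds")

    # 1. Snapshot the unencrypted instance.
    rds.create_db_snapshot(
        DBInstanceIdentifier="mysql-prod",
        DBSnapshotIdentifier="mysql-prod-plain",
    )
    # 2. Copy the snapshot with a KMS key: the copy is encrypted.
    rds.copy_db_snapshot(
        SourceDBSnapshotIdentifier="mysql-prod-plain",
        TargetDBSnapshotIdentifier="mysql-prod-encrypted",
        KmsKeyId="alias/aws/rds",
    )
    # 3. Restore the encrypted copy as a new, encrypted instance.
    rds.restore_db_instance_from_db_snapshot(
        DBInstanceIdentifier="mysql-prod-enc",
        DBSnapshotIdentifier="mysql-prod-encrypted",
    )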

NEW QUESTION 21
A Database Specialist needs to define a database migration strategy to migrate an on-premises Oracle database to an Amazon Aurora MySQL DB cluster. The company requires near-zero downtime for the data migration. The solution must also be cost-effective.
Which approach should the Database Specialist take?

  • A. Dump all the tables from the Oracle database into an Amazon S3 bucket using datapump (expdp). Run data transformations in AWS Glue. Load the data from the S3 bucket to the Aurora DB cluster.
  • B. Order an AWS Snowball appliance and copy the Oracle backup to the Snowball appliance. Once the Snowball data is delivered to Amazon S3, create a new Aurora DB cluster. Enable the S3 integration to migrate the data directly from Amazon S3 to Amazon RDS.
  • C. Use the AWS Schema Conversion Tool (AWS SCT) to help rewrite database objects to MySQL during the schema migration. Use AWS DMS to perform the full load and change data capture (CDC) tasks.
  • D. Use AWS Server Migration Service (AWS SMS) to import the Oracle virtual machine image as an Amazon EC2 instance. Use the Oracle Logical Dump utility to migrate the Oracle data from Amazon EC2 to an Aurora DB cluster.

Answer: D

NEW QUESTION 22
A company developed an AWS CloudFormation template used to create all new Amazon DynamoDB tables in its AWS account. The template configures provisioned throughput capacity using hard-coded values. The company wants to change the template so that the tables it creates in the future have independently configurable read and write capacity units assigned.
Which solution will enable this change?

  • A. Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. Configure DynamoDB to provision throughput capacity using the stack’s mappings.
  • B. Add values for two Number parameters, rcuCount and wcuCount, to the template. Replace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.
  • C. Add values for the rcuCount and wcuCount parameters as outputs of the template. Configure DynamoDB to provision throughput capacity using the stack outputs.
  • D. Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. Replace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.

Answer: B
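
Option B in template form, shown here as a Python dict for brevity; logical names are illustrative.

    import json

    template = {
        "Parameters": {
            "rcuCount": {"Type": "Number", "Default": 5},
            "wcuCount": {"Type": "Number", "Default": 5},
        },
        "Resources": {
            "AppTable": {
                "Type": "AWS::DynamoDB::Table",
                "Properties": {
                    "AttributeDefinitions": [
                        {"AttributeName": "pk", "AttributeType": "S"},
                    ],
                    "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
                    "ProvisionedThroughput": {
                        "ReadCapacityUnits": {"Ref": "rcuCount"},
                        "WriteCapacityUnits": {"Ref": "wcuCount"},
                    },
                },
            },
        },
    }
    print(json.dumps(template, indent=2))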

NEW QUESTION 23
A media company is using Amazon RDS for PostgreSQL to store user data. The RDS DB instance currently has a publicly accessible setting enabled and is hosted in a public subnet. Following a recent AWS Well-Architected Framework review, a Database Specialist was given new security requirements:
  • Only certain on-premises corporate network IPs should connect to the DB instance.
  • Connectivity is allowed from the corporate network only.
Which combination of steps does the Database Specialist need to take to meet these new requirements? (Choose three.)

  • A. Modify the pg_hba.conf file. Add the required corporate network IPs and remove the unwanted IPs.
  • B. Modify the associated security group. Add the required corporate network IPs and remove the unwanted IPs.
  • C. Move the DB instance to a private subnet using AWS DMS.
  • D. Enable VPC peering between the application host running on the corporate network and the VPC associated with the DB instance.
  • E. Disable the publicly accessible setting.
  • F. Connect to the DB instance using private IPs and a VPN.

Answer: DEF
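
Two of these steps are directly scriptable: disabling the publicly accessible setting, and, as commonly combined with it, restricting inbound access to corporate CIDRs in the security group. The security group ID, CIDR, and instance name below are placeholders; the VPN piece is network configuration rather than a single API call.

    import boto3

    ec2 = boto3.client("ec2")
    rds = boto3.client("rds")

    # Allow PostgreSQL only from the corporate network range.
    ec2.authorize_security_group_ingress(
        GroupId="sg-0123456789abcdef0",  # placeholder
        IpPermissions=[{
            "IpProtocol": "tcp", "FromPort": 5432, "ToPort": 5432,
            "IpRanges": [{"CidrIp": "203.0.113.0/24",
                          "Description": "corporate network"}],
        }],
    )

    # Turn off public accessibility on the DB instance.
    rds.modify_db_instance(
        DBInstanceIdentifier="media-postgres",  # placeholder
        PubliclyAccessible=False,
        ApplyImmediately=True,
    )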

NEW QUESTION 24
A Database Specialist is migrating an on-premises Microsoft SQL Server application database to Amazon RDS for PostgreSQL using AWS DMS. The application requires minimal downtime when the RDS DB instance goes live.
What change should the Database Specialist make to enable the migration?

  • A. Configure the on-premises application database to act as a source for an AWS DMS full load with ongoing change data capture (CDC)
  • B. Configure the AWS DMS replication instance to allow both full load and ongoing change data capture (CDC)
  • C. Configure the AWS DMS task to generate full logs to allow for ongoing change data capture (CDC)
  • D. Configure the AWS DMS connections to allow two-way communication to allow for ongoing change data capture (CDC)

Answer: A
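
Option A corresponds to a DMS task of type full-load-and-cdc. A skeletal sketch with placeholder ARNs and a catch-all table mapping:

    import boto3
    import json

    dms = boto3.client("dms")

    # Full load first, then ongoing change data capture until cutover.
    dms.create_replication_task(
        ReplicationTaskIdentifier="sqlserver-to-rds-pg",
        SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SRC",
        TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TGT",
        ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:INST",
        MigrationType="full-load-and-cdc",
        TableMappings=json.dumps({"rules": [{
            "rule-type": "selection", "rule-id": "1", "rule-name": "1",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include"}]}),
    )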

NEW QUESTION 25
A company is running an Amazon RDS for PostgreSQL DB instance and wants to migrate it to an Amazon Aurora PostgreSQL DB cluster. The current database is 1 TB in size. The migration needs to have minimal downtime.
What is the FASTEST way to accomplish this?

  • A. Create an Aurora PostgreSQL DB cluster. Set up replication from the source RDS for PostgreSQL DB instance using AWS DMS to the target DB cluster.
  • B. Use the pg_dump and pg_restore utilities to extract and restore the RDS for PostgreSQL DB instance to the Aurora PostgreSQL DB cluster.
  • C. Create a database snapshot of the RDS for PostgreSQL DB instance and use this snapshot to create the Aurora PostgreSQL DB cluster.
  • D. Migrate data from the RDS for PostgreSQL DB instance to an Aurora PostgreSQL DB cluster using an Aurora Replica. Promote the replica during the cutover.

Answer: C

NEW QUESTION 26
......

Recommend!! Get the Full DBS-C01 dumps in VCE and PDF From Certshared, Welcome to Download: https://www.certshared.com/exam/DBS-C01/ (New 85 Q&As Version)