Exam Code: AWS-Certified-Big-Data-Specialty (Practice Exam Latest Test Questions VCE PDF)
Exam Name: Amazon AWS Certified Big Data - Specialty
Certification Provider: Amazon
Free Today! Guaranteed Training - Pass AWS-Certified-Big-Data-Specialty Exam.

Check AWS-Certified-Big-Data-Specialty free dumps before getting the full version:

NEW QUESTION 1
A company needs to monitor the read and write IOPS metrics for its MySQL Amazon RDS instance and
send real-time alerts to its operations team. Which AWS services can accomplish this? Choose 2 answers

  • A. Amazon Simple Email Service
  • B. Amazon CloudWatch
  • C. Amazon Simple Queue Service
  • D. Amazon Route 53
  • E. Amazon Simple Notification Service

Answer: BE
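For illustration, a minimal boto3 sketch of the CloudWatch-alarm-to-SNS pattern behind answers B and E; the alarm threshold, DB identifier, and topic ARN are placeholders, not values from the question:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm on the RDS WriteIOPS metric; an analogous alarm covers ReadIOPS.
    cloudwatch.put_metric_alarm(
        AlarmName="rds-write-iops-high",
        Namespace="AWS/RDS",
        MetricName="WriteIOPS",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "my-mysql-db"}],  # placeholder
        Statistic="Average",
        Period=60,
        EvaluationPeriods=1,
        Threshold=1000.0,  # example threshold
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder topic
    )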

NEW QUESTION 2
A company is deploying a two-tier, highly available web application to AWS. Which service provides
durable storage for static content while utilizing lower overall CPU resources for the web tier?

  • A. Amazon EBS volume
  • B. Amazon S3
  • C. Amazon EC2 instance store
  • D. Amazon RDS instance

Answer: B

NEW QUESTION 3
An administrator is deploying Spark on Amazon EMR for two distinct use cases: machine learning
algorithms and ad hoc querying. All data will be stored in Amazon S3, and a separate cluster will be deployed for each use case. The data volumes in Amazon S3 are less than 10 GB.
How should the administrator align instance types with each cluster’s purpose?

  • A. Machine Learning on C instance types and ad-hoc queries on R instance types
  • B. Machine Learning on R instance types and ad-hoc queries on G2 instance types
  • C. Machine Learning on T instance types and ad-hoc queries on M instance types
  • D. Machine Learning on D instance types and ad-hoc queries on I instance types

Answer: A
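A sketch of how the two clusters might be requested with boto3, assuming compute-optimized instances for the machine-learning cluster and memory-optimized instances for ad hoc Spark SQL; the release label, sizes, and names are illustrative only:

    import boto3

    emr = boto3.client("emr")

    # One cluster per use case; the instance types illustrate the
    # compute- vs. memory-optimized split from the answer.
    for name, itype in [("ml-cluster", "c4.2xlarge"), ("adhoc-cluster", "r4.2xlarge")]:
        emr.run_job_flow(
            Name=name,
            ReleaseLabel="emr-5.30.0",
            Applications=[{"Name": "Spark"}],
            Instances={
                "MasterInstanceType": itype,
                "SlaveInstanceType": itype,
                "InstanceCount": 3,
                "KeepJobFlowAliveWhenNoSteps": True,
            },
            JobFlowRole="EMR_EC2_DefaultRole",
            ServiceRole="EMR_DefaultRole",
        )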

NEW QUESTION 4
A data engineer is running a data warehouse on a 25-node Redshift cluster for a SaaS service. The data engineer
needs to build a dashboard that will be used by customers. Five big customers represent 80% of usage, and there is a long tail of dozens of smaller customers. The data engineer has already selected the dashboarding tool.
How should the data engineer make sure that the larger customer workloads do NOT interfere with the smaller customer workloads?

  • A. Apply query filters based on customer_id that cannot be changed by the user, and apply distribution keys on customer_id
  • B. Place the largest customers into a single user group with a dedicated query queue and place the rest of the customers into a different query queue
  • C. Push aggregations into an RDS for Aurora instance. Connect the dashboard application to Aurora rather than Redshift for faster queries.
  • D. Route the largest customers to a dedicated Redshift cluster. Raise the concurrency of the multi-tenant Redshift cluster to accommodate the remaining customers.

Answer: D
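The second half of answer D (raising concurrency on the multi-tenant cluster) is typically done through workload management (WLM). A hedged boto3 sketch, with the queue layout and names invented for illustration:

    import boto3, json

    redshift = boto3.client("redshift")

    # Two WLM queues on the multi-tenant cluster: one sized for the long
    # tail of small customers, one default. Values are illustrative only,
    # and this static parameter requires a cluster reboot to take effect.
    wlm = [
        {"user_group": ["small_customers"], "query_concurrency": 15},
        {"query_concurrency": 5},
    ]
    redshift.modify_cluster_parameter_group(
        ParameterGroupName="multi-tenant-params",  # placeholder
        Parameters=[{
            "ParameterName": "wlm_json_configuration",
            "ParameterValue": json.dumps(wlm),
            "ApplyType": "static",
        }],
    )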

NEW QUESTION 5
Your company wants to start working with AWS, but has not yet opened an account. With which of
the following services should you begin local development?

  • A. Amazon DynamoDB
  • B. Amazon Simple Queue Service
  • C. Amazon Simple Email Service
  • D. Amazon CloudSearch

Answer: A
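Answer A reflects that DynamoDB offers a downloadable local edition (DynamoDB Local), so development can start before any AWS account exists. A minimal sketch, assuming the local endpoint on its default port 8000 and dummy credentials:

    import boto3

    # DynamoDB Local listens on port 8000 by default; the credentials are
    # dummies because no AWS account is involved.
    dynamodb = boto3.resource(
        "dynamodb",
        endpoint_url="http://localhost:8000",
        region_name="us-east-1",
        aws_access_key_id="dummy",
        aws_secret_access_key="dummy",
    )
    table = dynamodb.create_table(
        TableName="dev-table",
        KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
        ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
    )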

NEW QUESTION 6
You have been tasked with deploying a solution for your company that will store images, which the
marketing department will use for its campaigns. Employees are able to upload images via a web interface, and once uploaded, each image must be resized and watermarked with the company logo. Image resizing and watermarking are not time-sensitive and can be completed days after upload if required.
How should you design this solution in the most highly available and cost-effective way?

  • A. Configure your web application to upload images to the Amazon Elastic Transcoder service. Use the Amazon Elastic Transcoder watermark feature to add the company logo as a watermark on your images and then upload the final image into an Amazon S3 bucket.
  • B. Configure your web application to upload images to Amazon S3, and send the Amazon S3 bucket URI to an Amazon SQS queue. Create an Auto Scaling group and configure it to use Spot Instances, specifying a price you are willing to pay. Configure the instances in this Auto Scaling group to poll the SQS queue for new images and then resize and watermark the image before uploading the final images into Amazon S3.
  • C. Configure your web application to upload images to Amazon S3, and send the S3 object URI to an Amazon SQS queue. Create an Auto Scaling launch configuration that uses Spot Instances, specifying a price you are willing to pay. Configure the instances in this Auto Scaling group to poll the Amazon SQS queue for new images and then resize and watermark the image before uploading the new images into Amazon S3 and deleting the message from the Amazon SQS queue.
  • D. Configure your web application to upload images to the local storage of the web server. Create a cron job to execute a script daily that scans this directory for new files and then uses the Amazon EC2 Service API to launch 10 new Amazon EC2 instances, which will resize and watermark the images daily.

Answer: C
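A minimal sketch of the worker loop from answer C; the queue URL is a placeholder and resize_and_watermark is a hypothetical helper standing in for the actual image processing:

    import boto3

    sqs = boto3.client("sqs")
    queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/image-jobs"  # placeholder

    def resize_and_watermark(s3_uri):
        ...  # hypothetical: download from S3, resize, watermark, re-upload

    while True:
        resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1,
                                   WaitTimeSeconds=20)  # long polling
        for msg in resp.get("Messages", []):
            resize_and_watermark(msg["Body"])
            # Delete only after successful processing, so an interrupted Spot
            # instance leaves the message visible for another worker.
            sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])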

NEW QUESTION 7
An Amazon Kinesis stream needs to be encrypted. Which approach should be used to accomplish this task?

  • A. Perform a client-side encryption of the data before it enters the Amazon Kinesis stream on the producer
  • B. Use a partition key to segment the data by MD5 hash functions, which makes it indecipherable while in transit
  • C. Perform a client-side encryption of the data before it enters the Amazon Kinesis stream on the consumer
  • D. Use a shard to segment the data which has built-in functionality to make it indecipherable while in transit

Answer: A
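A sketch of answer A's producer-side encryption, here using the third-party cryptography package's Fernet cipher as one possible choice; the stream name, key handling, and payload are illustrative:

    import boto3
    from cryptography.fernet import Fernet  # one possible client-side cipher

    key = Fernet.generate_key()  # in practice, manage this key securely
    cipher = Fernet(key)
    kinesis = boto3.client("kinesis")

    # Encrypt on the producer, before the record ever reaches the stream.
    ciphertext = cipher.encrypt(b'{"event": "click", "user": "u-123"}')
    kinesis.put_record(StreamName="my-stream",  # placeholder
                       Data=ciphertext,
                       PartitionKey="u-123")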

NEW QUESTION 8
A company is building a new application in AWS. The architect needs to design a system to collect application log events. The design should be a repeatable pattern that minimizes data loss if an application instance fails, and keeps a durable copy of all log data for at least 30 days.
What is the simplest architecture that will allow the architect to analyze the logs?

  • A. Write them directly to a Kinesis Firehose. Configure Kinesis Firehose to load the events into an Amazon Redshift cluster for analysis.
  • B. Write them to a file on Amazon Simple Storage Service (S3). Write an AWS Lambda function that runs in response to the S3 events to load the events into Amazon Elasticsearch Service for analysis.
  • C. Write them to the local disk and configure the Amazon CloudWatch Logs agent to load the data into CloudWatch Logs and subsequently into Amazon Elasticsearch Service.
  • D. Write them to CloudWatch Logs and use an AWS Lambda function to load them into HDFS on an Amazon Elastic MapReduce (EMR) cluster for analysis.

Answer: A
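For reference, a minimal producer call for answer A; the delivery stream name is a placeholder, and the stream itself would be configured separately to deliver into Redshift via an S3 staging location it manages:

    import boto3

    firehose = boto3.client("firehose")

    # Firehose buffers and durably delivers the events downstream.
    firehose.put_record(
        DeliveryStreamName="app-log-stream",  # placeholder
        Record={"Data": b'{"level": "INFO", "msg": "user signed in"}\n'},
    )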

NEW QUESTION 9
You are deploying an application to collect votes for a very popular television show. Millions of users
will submit votes using mobile devices. The votes must be collected into a durable, scalable, and highly available data store for real-time public tabulation. Which service should you use?

  • A. Amazon DynamoDB
  • B. Amazon Redshift
  • C. Amazon Kinesis
  • D. Amazon Simple Queue Service

Answer: A
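A sketch of the write path answer A implies, using a DynamoDB atomic counter so concurrent votes are not lost; the table and key names are invented:

    import boto3

    table = boto3.resource("dynamodb").Table("votes")  # placeholder table

    # DynamoDB serializes the ADD, so concurrent updates all count.
    table.update_item(
        Key={"candidate": "contestant-7"},
        UpdateExpression="ADD vote_count :one",
        ExpressionAttributeValues={":one": 1},
    )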

NEW QUESTION 10
You have launched an Amazon Elastic Compute Cloud (EC2) instance into a public subnet with a primary private IP address assigned, an internet gateway is attached to the VPC, and the public subnet’s route table is configured to send all internet-bound traffic to the internet gateway. Why is the internet unreachable from this instance?

  • A. The Internet gateway security group must allow all outbound traffic
  • B. The instance does not have a public IP address
  • C. The instance “Source/Destination check” property must be enabled
  • D. The instance security group must allow all inbound traffic

Answer: B

NEW QUESTION 11
A company’s social media manager requests more staff on the weekends to handle an increase in
customer contacts from a particular region. The company needs a report to visualize the trends on weekends over the past 6 months using QuickSight.
How should the data be represented?

  • A. A line graph plotting customer contacts vs. time, with a line for each region
  • B. A pie chart per region plotting customer contacts per day of the week
  • C. A map of the regions with a heatmap overlay to show the volume of customer contacts
  • D. A bar graph plotting region vs. volume of social media contacts

Answer: A

NEW QUESTION 12
When an Auto Scaling group is running in Amazon Elastic Compute Cloud (EC2), your application
rapidly scales up and down in response to load within a 10-minute window; however, after the load peaks, you begin to see problems in your configuration management system where previously terminated Amazon EC2 resources are still showing as active.
What would be a reliable and efficient way to handle the cleanup of Amazon EC2 resources in your configuration management system? Choose 2 answers

  • A. Write a script that is run by a daily cron job on an Amazon EC2 instance and that executes API Describe calls of the EC2 Auto Scaling group and removes terminated instances from the configuration management system
  • B. Configure an Amazon Simple Queue Service (SQS) queue for Auto Scaling actions that has a script that listens for new messages and removes terminated instances from the configuration management system
  • C. Use your existing configuration management system to control the launching and bootstrapping of instances to reduce the number of moving parts in the automation
  • D. Write a small script that is run during Amazon EC2 instance shutdown to de-register the resource from the configuration management system
  • E. Use Amazon Simple Workflow Service (SWF) to maintain an Amazon DynamoDB database that contains a whitelist of instances that have been previously launched, and allow the Amazon SWF worker to remove information from the configuration management system

Answer: AD
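A hedged sketch of answer D's shutdown-time deregistration; the instance ID comes from the EC2 metadata service, while the CMDB endpoint and payload shape are hypothetical:

    import json, urllib.request

    # Runs from a shutdown hook (e.g. a systemd unit ordered before
    # network teardown). The CMDB endpoint below is hypothetical.
    INSTANCE_ID_URL = "http://169.254.169.254/latest/meta-data/instance-id"
    CMDB_ENDPOINT = "https://cmdb.example.com/api/deregister"  # hypothetical

    instance_id = urllib.request.urlopen(INSTANCE_ID_URL, timeout=2).read().decode()
    req = urllib.request.Request(
        CMDB_ENDPOINT,
        data=json.dumps({"instance_id": instance_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req, timeout=5)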

NEW QUESTION 13
A company uses Amazon Redshift for its enterprise data warehouse. A new on-premises PostgreSQL
OLTP DB must be integrated into the data warehouse. Each table in the PostgreSQL DB has an indexed last_modified timestamp column. The data warehouse has a staging layer to load source data into the data warehouse environment for further processing.
The data lag between the source PostgreSQL DB and the Amazon Redshift staging layer should NOT exceed four hours.
What is the most efficient technique to meet these requirements?

  • A. Create a DBLINK on the source DB to connect to Amazon Redshift. Use a PostgreSQL trigger on the source table to capture the new insert/update/delete event and execute the event on the Amazon Redshift staging table.
  • B. Use a PostgreSQL trigger on the source table to capture the new insert/update/delete event and write it to an Amazon Kinesis stream. Use a KCL application to execute the event on the Amazon Redshift staging table.
  • C. Extract the incremental changes periodically using a SQL query. Upload the changes to multiple Amazon Simple Storage Service (S3) objects and run the COPY command to load the Amazon Redshift staging table.
  • D. Extract the incremental changes periodically using a SQL query. Upload the changes to a single Amazon Simple Storage Service (S3) object and run the COPY command to load to the Amazon Redshift staging layer.

Answer: C
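A sketch of answer C's periodic incremental load, assuming psycopg2 as the client for both databases; connection strings, table names, the watermark value, and the IAM role ARN are all placeholders:

    import boto3
    import psycopg2  # assumed client library for both PostgreSQL and Redshift

    last_watermark = "2016-01-01 00:00:00"  # persisted between runs in practice

    # 1) Pull only the rows changed since the last run from the source DB;
    #    the indexed last_modified column makes this query cheap.
    src = psycopg2.connect("host=onprem-pg dbname=oltp user=etl")  # placeholder DSN
    cur = src.cursor()
    cur.execute("SELECT * FROM orders WHERE last_modified > %s", (last_watermark,))
    csv_bytes = "\n".join(",".join(map(str, row)) for row in cur).encode()

    # 2) Upload the changes as multiple S3 objects so COPY can load them in
    #    parallel across slices (a single part is shown for brevity).
    boto3.client("s3").put_object(Bucket="staging-bucket",
                                  Key="orders/part-0000.csv", Body=csv_bytes)

    # 3) COPY the whole prefix into the Redshift staging table.
    rs = psycopg2.connect("host=redshift-endpoint dbname=dwh user=etl")  # placeholder
    rs.cursor().execute(
        "COPY staging.orders FROM 's3://staging-bucket/orders/part-' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy' CSV"
    )
    rs.commit()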

NEW QUESTION 14
An organization is designing an application architecture. The application will have over 100 TB of data
and will support transactions that arrive at rates from hundreds per second to tens of thousands per second, depending on the day of the week and time of day. All transaction data must be durably and reliably stored. Certain read operations must be performed with strong consistency.
Which solution meets these requirements?

  • A. Use Amazon DynamoDB as the data store and use strongly consistent reads when necessary.
  • B. Use an Amazon Relational Database Service (RDS) instance sized to meet the maximum transaction rate and with the High Availability option enabled.
  • C. Deploy a NoSQL data store on top of an Amazon Elastic MapReduce (EMR) cluster, and select the HDFS High Durability option.
  • D. Use Amazon Redshift with synchronous replication to Amazon Simple Storage Service (S3) and row- level locking for strong consistency.

Answer: A
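The per-request opt-in that answer A relies on looks like this in boto3 (table and key are placeholders); note that strongly consistent reads consume twice the read capacity of eventually consistent ones:

    import boto3

    table = boto3.resource("dynamodb").Table("transactions")  # placeholder

    # Reads are eventually consistent by default; opt in per request when
    # the application needs read-after-write semantics.
    item = table.get_item(Key={"txn_id": "t-42"}, ConsistentRead=True)["Item"]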

NEW QUESTION 15
A web-hosting company is building a web analytics tool to capture clickstream data from all of the
websites hosted within its platform and to provide near-real-time business intelligence. This entire system is built on AWS services. The web-hosting company is interested in using Amazon Kinesis to collect this data and perform sliding window analytics. What is the most reliable and fault-tolerant technique to get each website to send data to Amazon Kinesis with every click?

  • A. After receiving a request, each web server sends it to Amazon Kinesis using the Amazon Kinesis PutRecord API. Use the SessionID as a partition key and set up a loop to retry until a success response is received.
  • B. After receiving a request, each web server sends it to Amazon Kinesis using the Amazon Kinesis Producer Library addRecord method.
  • C. Each web server buffers the requests until the count reaches 500 and sends them to Amazon Kinesis using the Amazon Kinesis PutRecord API call.
  • D. After receiving a request, each web server sends it to Amazon Kinesis using the Amazon Kinesis PutRecord API. Use the exponential back-off algorithm for retries until a successful response is received.

Answer: A
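Answer A in sketch form: PutRecord keyed on the SessionID, retried in a loop until Kinesis acknowledges the record. The short sleep between attempts is an addition for politeness that the option itself does not mandate, and the stream name is a placeholder:

    import time
    import boto3
    from botocore.exceptions import ClientError

    kinesis = boto3.client("kinesis")

    def send_click(event_bytes, session_id):
        # Retry until Kinesis acknowledges the record, so no click is lost.
        while True:
            try:
                return kinesis.put_record(StreamName="clickstream",  # placeholder
                                          Data=event_bytes,
                                          PartitionKey=session_id)
            except ClientError:
                time.sleep(0.1)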

NEW QUESTION 16
Your customers located around the globe require low-latency access to private video files. Which
configuration meets these requirements?

  • A. Use Amazon CloudFront with signed URLs
  • B. Use Amazon EC2 with provisioned IOPS Amazon EBS volumes
  • C. Use Amazon S3 with signed URLs
  • D. Use Amazon S3 with access control lists

Answer: A
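For reference, the botocore CloudFrontSigner pattern behind answer A, assuming the third-party rsa package and an existing CloudFront key pair; the key-pair ID, key file, and URL are placeholders:

    import datetime
    import rsa  # third-party package used to sign with the CloudFront key
    from botocore.signers import CloudFrontSigner

    def rsa_signer(message):
        with open("cf-private-key.pem", "rb") as f:  # placeholder key file
            private_key = rsa.PrivateKey.load_pkcs1(f.read())
        return rsa.sign(message, private_key, "SHA-1")

    signer = CloudFrontSigner("KEYPAIRID12345", rsa_signer)  # placeholder key-pair ID
    url = signer.generate_presigned_url(
        "https://d111111abcdef8.cloudfront.net/videos/intro.mp4",  # placeholder
        date_less_than=datetime.datetime(2030, 1, 1),
    )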

NEW QUESTION 17
The operations team and the development team want a single place to view both operating system
and application logs.
How should you implement this using AWS services? Choose 2 answers

  • A. Using AWS CloudFormation, create a CloudWatch Logs LogGroup and send the operating system and application logs of interest using the CloudWatch Logs agent
  • B. Using AWS CloudFormation and configuration management, set up remote logging to send events via UDP packets to CloudTrail
  • C. Using configuration management, set up remote logging to send events to Amazon Kinesis and insert these into Amazon CloudSearch or Amazon Redshift, depending on available analytic tools
  • D. Using AWS CloudFormation, create a CloudWatch Logs LogGroup. Because the CloudWatch Logs agent automatically sends all operating system logs, you only have to configure the application logs for sending off-machine.
  • E. Using AWS CloudFormation, merge the application logs with the operating system logs, and use IAM roles to allow both teams to have access to view console output from Amazon EC2

Answer: AC
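A classic awslogs agent configuration in the spirit of answer A, pairing one operating-system log and one application log into shared log groups; every path, group name, and format string here is illustrative:

    [general]
    state_file = /var/lib/awslogs/agent-state

    [/var/log/messages]
    file = /var/log/messages
    log_group_name = shared-ops-logs
    log_stream_name = {instance_id}/var/log/messages
    datetime_format = %b %d %H:%M:%S

    [/var/log/myapp/app.log]
    file = /var/log/myapp/app.log
    log_group_name = shared-app-logs
    log_stream_name = {instance_id}/app.log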

NEW QUESTION 18
Which of the following notification endpoints or clients are supported by Amazon Simple Notification Service? Choose 2 answers

  • A. Email
  • B. CloudFront distribution
  • C. File Transfer Protocol
  • D. Short Message Service
  • E. Simple Network Management Protocol

Answer: AD
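The two supported endpoint types from the answer, subscribed via boto3 (the topic ARN, email address, and phone number are placeholders):

    import boto3

    sns = boto3.client("sns")
    topic = "arn:aws:sns:us-east-1:123456789012:alerts"  # placeholder topic ARN

    sns.subscribe(TopicArn=topic, Protocol="email", Endpoint="ops@example.com")
    sns.subscribe(TopicArn=topic, Protocol="sms", Endpoint="+15555550100")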

NEW QUESTION 19
You are managing the AWS account of a big organization. The organization has more than
1,000 employees, and it wants to provide access to various AWS services for most of them. Which of the below mentioned options is the best possible solution in this case?

  • A. Create a separate IAM user for each employee and provide access to them as per the policy
  • B. Create an IAM role and attach STS to the role. Attach that role to the EC2 instance and set up AWS authentication on that server.
  • C. Create IAM groups as per the organization’s departments and add each user to a group for better access control
  • D. Attach an IAM role with the organization’s authentication service to authorize each user for various AWS services

Answer: D
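Answer D describes identity federation. A hedged sketch of the STS side, where the SAML assertion would come from the organization's own identity provider after the employee authenticates there; the ARNs and assertion are placeholders:

    import boto3

    sts = boto3.client("sts")

    saml_assertion_b64 = "..."  # base64 SAML response from the corporate IdP

    creds = sts.assume_role_with_saml(
        RoleArn="arn:aws:iam::123456789012:role/EmployeeAccess",      # placeholder
        PrincipalArn="arn:aws:iam::123456789012:saml-provider/CorpIdP",  # placeholder
        SAMLAssertion=saml_assertion_b64,
    )["Credentials"]  # temporary keys the employee uses instead of an IAM user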

NEW QUESTION 20
You are working with a customer who has 10 TB of archival data that they want to migrate to Amazon Glacier. The customer has a 1 Mbps connection to the Internet. Which service or feature provides the fastest method of getting the data into Amazon Glacier?

  • A. Amazon Glacier multipart upload
  • B. AWS Storage Gateway
  • C. VM Import/Export
  • D. AWS Import/Export

Answer: D
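The arithmetic behind answer D: at 1 Mbps the upload alone would take years, which is why a physical transfer wins. A quick back-of-the-envelope check:

    # Transfer time for 10 TB over a 1 Mbps link, ignoring protocol overhead:
    bits_to_send = 10 * 10**12 * 8        # 10 TB expressed in bits
    seconds = bits_to_send / 1_000_000    # at 1 Mbps
    print(seconds / 86_400)               # ~926 days, i.e. over 2.5 years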

NEW QUESTION 21
A company is using Amazon Machine Learning as part of a medical software application. The application will predict the most likely blood type for a patient based on a variety of other clinical tests that are available when blood type knowledge is unavailable.
What is the appropriate model choice and target attribute combination for the problem?

  • A. Multi-class classification model with a categorical target attribute
  • B. Regression model with a numeric target attribute
  • C. Binary Classification with a categorical target attribute
  • D. K-Nearest Neighbors model with a multi-class target attribute

Answer: A
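A sketch of how the (since-deprecated) Amazon Machine Learning API expressed answer A's choice of a multi-class model; the model and data source IDs are placeholders:

    import boto3

    ml = boto3.client("machinelearning")  # service has since been deprecated

    # A MULTICLASS model suits a categorical target with more than two
    # values (the blood type).
    ml.create_ml_model(
        MLModelId="ml-blood-type-1",              # placeholder
        MLModelName="blood-type-predictor",
        MLModelType="MULTICLASS",
        TrainingDataSourceId="ds-clinical-tests",  # placeholder
    )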

NEW QUESTION 22
......

Recommend!! Get the Full AWS-Certified-Big-Data-Specialty dumps in VCE and PDF From DumpSolutions, Welcome to Download: https://www.dumpsolutions.com/AWS-Certified-Big-Data-Specialty-dumps/ (New 243 Q&As Version)