Master the Google Cloud Certified - Associate Cloud Engineer (Associate-Cloud-Engineer) content and be ready for exam-day success quickly with these Passleader Associate-Cloud-Engineer exams. We guarantee it! We make it a reality and give you real Associate-Cloud-Engineer questions in our Google Associate-Cloud-Engineer braindumps. The latest 100% valid Google Associate-Cloud-Engineer exam question dumps are on the page below. You can use our Google Associate-Cloud-Engineer braindumps and pass your exam.

Free demo questions for Google Associate-Cloud-Engineer Exam Dumps Below:

NEW QUESTION 1
You have a Compute Engine instance hosting an application used between 9 AM and 6 PM on weekdays. You want to back up this instance daily for disaster recovery purposes. You want to keep the backups for 30 days. You want the Google-recommended solution with the least management overhead and the least number of services. What should you do?

  • A. 1. Update your instances’ metadata to add the following value: snapshot-schedule: 0 1 * * *  2. Update your instances’ metadata to add the following value: snapshot-retention: 30
  • B. 1. In the Cloud Console, go to the Compute Engine Disks page and select your instance’s disk. 2. In the Snapshot Schedule section, select Create Schedule and configure the following parameters: Schedule frequency: Daily; Start time: 1:00 AM - 2:00 AM; Autodelete snapshots after: 30 days
  • C. 1. Create a Cloud Function that creates a snapshot of your instance’s disk. 2. Create a Cloud Function that deletes snapshots that are older than 30 days. 3. Use Cloud Scheduler to trigger both Cloud Functions daily at 1:00 AM.
  • D. 1. Create a bash script in the instance that copies the content of the disk to Cloud Storage. 2. Create a bash script in the instance that deletes data older than 30 days in the backup Cloud Storage bucket. 3. Configure the instance’s crontab to execute these scripts daily at 1:00 AM.

Answer: B
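
For reference, the scheduled-snapshot setup described in answer B can also be done from the command line. This is a minimal sketch only; the policy name, disk name, region, and zone below are placeholders, not values from the question:
# Hypothetical names; adjust region/zone to match your disk.
gcloud compute resource-policies create snapshot-schedule daily-backup \
    --region=us-central1 --start-time=01:00 --daily-schedule \
    --max-retention-days=30
gcloud compute disks add-resource-policies my-app-disk \
    --resource-policies=daily-backup --zone=us-central1-a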

NEW QUESTION 2
You are hosting an application from Compute Engine virtual machines (VMs) in us-central1-a. You want to adjust your design to support the failure of a single Compute Engine zone, eliminate downtime, and minimize cost. What should you do?

  • A. Create Compute Engine resources in us-central1-b. Balance the load across both us-central1-a and us-central1-b.
  • B. Create a Managed Instance Group and specify us-central1-a as the zone. Configure the Health Check with a short Health Interval.
  • C. Create an HTTP(S) Load Balancer. Create one or more global forwarding rules to direct traffic to your VMs.
  • D. Perform regular backups of your application. Create a Cloud Monitoring Alert and be notified if your application becomes unavailable. Restore from backups when notified.

Answer: C
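
For context, the global forwarding rule mentioned in answer C is created roughly as follows. This sketch assumes a target HTTP proxy (here called web-proxy) and its backend service already exist; all names are hypothetical:
# Assumes an existing URL map and target proxy; names are placeholders.
gcloud compute forwarding-rules create web-forwarding-rule \
    --global --target-http-proxy=web-proxy --ports=80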

NEW QUESTION 3
You are managing a Data Warehouse on BigQuery. An external auditor will review your company's processes, and multiple external consultants will need view access to the data. You need to provide them with view access while following Google-recommended practices. What should you do?

  • A. Grant each individual external consultant the role of BigQuery Editor
  • B. Grant each individual external consultant the role of BigQuery Viewer
  • C. Create a Google Group that contains the consultants and grant the group the role of BigQuery Editor
  • D. Create a Google Group that contains the consultants, and grant the group the role of BigQuery Viewer

Answer: D
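
As a rough illustration of answer D, view access can be granted to the group with a single project-level IAM binding; the project ID and group address below are made up:
# Hypothetical project and group; grants read-only access to BigQuery data.
gcloud projects add-iam-policy-binding acme-warehouse-prod \
    --member=group:external-consultants@example.com \
    --role=roles/bigquery.dataViewer
Granting the same role on individual datasets instead of the whole project is an even narrower variant of the same idea.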

NEW QUESTION 4
Your company wants to standardize the creation and management of multiple Google Cloud resources using Infrastructure as Code. You want to minimize the amount of repetitive code needed to manage the environment. What should you do?

  • A. Create a bash script that contains all required steps as gcloud commands
  • B. Develop templates for the environment using Cloud Deployment Manager
  • C. Use curl in a terminal to send a REST request to the relevant Google API for each individual resource.
  • D. Use the Cloud Console interface to provision and manage all related resources

Answer: B
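
To illustrate answer B, a Deployment Manager configuration is deployed with one command; the deployment name and config file below are placeholders:
# config.yaml would declare the resources (VMs, buckets, etc.), optionally via reusable templates.
gcloud deployment-manager deployments create sandbox-env --config=config.yaml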

NEW QUESTION 5
The core business of your company is to rent out construction equipment at a large scale. All the equipment that is being rented out has been equipped with multiple sensors that send event information every few seconds. These signals can vary from engine status, distance traveled, fuel level, and more. Customers are billed based on the consumption monitored by these sensors. You expect high throughput – up to thousands of events per hour per device – and need to retrieve consistent data based on the time of the event. Storing and retrieving individual signals should be atomic. What should you do?

  • A. Create a file in Cloud Storage per device and append new data to that file.
  • B. Create a file in Cloud Filestore per device and append new data to that file.
  • C. Ingest the data into Datastore. Store data in an entity group based on the device.
  • D. Ingest the data into Cloud Bigtable. Create a row key based on the event timestamp.

Answer: D

NEW QUESTION 6
You are running multiple VPC-native Google Kubernetes Engine clusters in the same subnet. The IPs available for the nodes are exhausted, and you want to ensure that the clusters can grow in nodes when needed. What should you do?

  • A. Create a new subnet in the same region as the subnet being used.
  • B. Add an alias IP range to the subnet used by the GKE clusters.
  • C. Create a new VPC, and set up VPC peering with the existing VPC.
  • D. Expand the CIDR range of the relevant subnet for the cluster.

Answer: C

Explanation:
To create a VPC peering connection, first create a request to peer with another VPC.
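
For reference, VPC Network Peering as described in answer C is set up by creating a peering from each side; the network and project names below are hypothetical:
# Run an equivalent command from the other project as well;
# the peering becomes active once both sides exist.
gcloud compute networks peerings create peer-to-other-vpc \
    --network=vpc-a --peer-project=other-project --peer-network=vpc-b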

NEW QUESTION 7
You are assisting a new Google Cloud user who just installed the Google Cloud SDK on their VM. The server needs access to Cloud Storage. The user wants your help to create a new storage bucket. You need to make this change in multiple environments. What should you do?

  • A. Use a Deployment Manager script to automate creating storage buckets in an appropriate region
  • B. Use a local SSD to improve performance of the VM for the targeted workload
  • C. Use the gsutil command to create a storage bucket in the same region as the VM
  • D. Use a Persistent Disk SSD in the same zone as the VM to improve performance of the VM

Answer: A
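
Whichever option is chosen, the underlying bucket-creation step looks roughly like this; the bucket name and region are placeholders:
# Creates a bucket in the same region as the VM (region shown is only an example).
gsutil mb -l us-central1 gs://example-app-assets-bucket/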

NEW QUESTION 8
The sales team has a project named Sales Data Digest that has the ID acme-data-digest. You need to set up similar Google Cloud resources for the marketing team, but their resources must be organized independently of the sales team. What should you do?

  • A. Grant the Project Editor role to the Marketing team for acme-data-digest.
  • B. Create a Project Lien on acme-data-digest and then grant the Project Editor role to the Marketing team.
  • C. Create another project with the ID acme-marketing-data-digest for the Marketing team and deploy the resources there.
  • D. Create a new project named Marketing Data Digest and use the ID acme-data-digest. Grant the Project Editor role to the Marketing team.

Answer: C
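
A sketch of answer C from the command line; the project ID comes from the option text, while the display name is an assumption:
# Creates an independent project for the marketing team.
gcloud projects create acme-marketing-data-digest --name="Marketing Data Digest"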

NEW QUESTION 9
You need to add a group of new users to Cloud Identity. Some of the users already have existing Google accounts. You want to follow one of Google's recommended practices and avoid conflicting accounts. What should you do?

  • A. Invite the user to transfer their existing account
  • B. Invite the user to use an email alias to resolve the conflict
  • C. Tell the user that they must delete their existing account
  • D. Tell the user to remove all personal email from the existing account

Answer: B

NEW QUESTION 10
You are analyzing Google Cloud Platform service costs from three separate projects. You want to use this information to create service cost estimates by service type, daily and monthly, for the next six months using standard query syntax. What should you do?

  • A. Export your bill to a Cloud Storage bucket, and then import into Cloud Bigtable for analysis.
  • B. Export your bill to a Cloud Storage bucket, and then import into Google Sheets for analysis.
  • C. Export your transactions to a local file, and perform analysis with a desktop tool.
  • D. Export your bill to a BigQuery dataset, and then write time window-based SQL queries for analysis.

Answer: D
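
As a rough sketch of answer D: once billing export to BigQuery is enabled, a standard-SQL query can roll costs up by service and day. The dataset and table names below are placeholders for your actual billing export table:
# The real export table is named gcp_billing_export_v1_<BILLING_ACCOUNT_ID>; this one is hypothetical.
bq query --use_legacy_sql=false '
SELECT service.description AS service,
       DATE(usage_start_time) AS usage_day,
       SUM(cost) AS daily_cost
FROM `billing_ds.gcp_billing_export_v1_XXXXXX`
GROUP BY service, usage_day
ORDER BY usage_day'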

NEW QUESTION 11
You need to enable traffic between multiple groups of Compute Engine instances that are currently running in two different GCP projects. Each group of Compute Engine instances is running in its own VPC. What should you do?

  • A. Verify that both projects are in a GCP Organization. Create a new VPC and add all instances.
  • B. Verify that both projects are in a GCP Organization. Share the VPC from one project and request that the Compute Engine instances in the other project use this shared VPC.
  • C. Verify that you are the Project Administrator of both projects. Create two new VPCs and add all instances.
  • D. Verify that you are the Project Administrator of both projects. Create a new VPC and add all instances.

Answer: B
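
For reference, setting up a Shared VPC as in answer B means designating a host project and attaching service projects. This sketch assumes you hold the Shared VPC Admin role at the organization level; the project IDs are placeholders:
# Hypothetical project IDs; requires the Shared VPC Admin (compute.xpnAdmin) role.
gcloud compute shared-vpc enable host-project-id
gcloud compute shared-vpc associated-projects add service-project-id \
    --host-project=host-project-id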

NEW QUESTION 12
You are the team lead of a group of 10 developers. You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment. What should you do?

  • A. Create a single budget for all projects and configure budget alerts on this budget.
  • B. Create a separate billing account per sandbox project and enable BigQuery billing export. Create a Data Studio dashboard to plot the spending per billing account.
  • C. Create a budget per project and configure budget alerts on all of these budgets.
  • D. Create a single billing account for all sandbox projects and enable BigQuery billing export. Create a Data Studio dashboard to plot the spending per project.

Answer: C
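
A sketch of answer C using the gcloud billing budgets commands (previously under gcloud beta); the billing account ID and project number below are placeholders, and one such budget would be created per sandbox project:
# Hypothetical IDs; repeat for each developer's sandbox project.
gcloud billing budgets create \
    --billing-account=0X0X0X-0X0X0X-0X0X0X \
    --display-name="dev1-sandbox-budget" \
    --budget-amount=500USD \
    --threshold-rule=percent=1.0 \
    --filter-projects=projects/123456789012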

NEW QUESTION 13
You are asked to set up application performance monitoring on Google Cloud projects A, B, and C as a single pane of glass. You want to monitor CPU, memory, and disk. What should you do?

  • A. Enable API and then share charts from project A, B, and C.
  • B. Enable API and then give the metrics.reader role to projects A, B, and C.
  • C. Enable API and then use default dashboards to view all projects in sequence.
  • D. Enable API, create a workspace under project A, and then add project B and C.

Answer: D

NEW QUESTION 14
Your company publishes large files on an Apache web server that runs on a Compute Engine instance. The Apache web server is not the only application running in the project. You want to receive an email when the egress network costs for the server exceed 100 dollars for the current month as measured by Google Cloud Platform (GCP). What should you do?

  • A. Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
  • B. Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
  • C. Export the billing data to BigQuery. Create a Cloud Function that uses BigQuery to sum the egress network costs of the exported billing data for the Apache web server for the current month and sends an email if it is over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.
  • D. Use the Stackdriver Logging Agent to export the Apache web server logs to Stackdriver Logging. Create a Cloud Function that uses BigQuery to parse the HTTP response log data in Stackdriver for the current month and sends an email if the size of all HTTP responses, multiplied by current GCP egress prices, totals over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.

Answer: D
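
For context, the hourly trigger mentioned in the Cloud Function options can be wired up with Cloud Scheduler roughly like this; the job name and function URL are hypothetical:
# Hypothetical function URL; fires at the top of every hour.
gcloud scheduler jobs create http egress-cost-check \
    --schedule="0 * * * *" \
    --uri=https://us-central1-my-project.cloudfunctions.net/checkEgressCost \
    --http-method=GET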

NEW QUESTION 15
You need to run an important query in BigQuery but expect it to return a lot of records. You want to find out how much it will cost to run the query. You are using on-demand pricing. What should you do?

  • A. Arrange to switch to Flat-Rate pricing for this query, then move back to on-demand.
  • B. Use the command line to run a dry run query to estimate the number of bytes read. Then convert that bytes estimate to dollars using the Pricing Calculator.
  • C. Use the command line to run a dry run query to estimate the number of bytes returned. Then convert that bytes estimate to dollars using the Pricing Calculator.
  • D. Run a select count(*) to get an idea of how many records your query will look through. Then convert that number of rows to dollars using the Pricing Calculator.

Answer: B
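
To illustrate answer B, the dry run can be done with the bq tool; the query, dataset, and table below are only examples:
# Reports how many bytes the query would process without actually running it.
bq query --use_legacy_sql=false --dry_run \
    'SELECT name, signup_date FROM `my_dataset.my_table`'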

NEW QUESTION 16
You have an application that uses Cloud Spanner as a backend database. The application has a very predictable traffic pattern. You want to automatically scale up or down the number of Spanner nodes depending on traffic. What should you do?

  • A. Create a cron job that runs on a scheduled basis to review Stackdriver Monitoring metrics, and then resize the Spanner instance accordingly.
  • B. Create a Stackdriver alerting policy to send an alert to on-call SRE emails when Cloud Spanner CPU exceeds the threshold. SREs would scale resources up or down accordingly.
  • C. Create a Stackdriver alerting policy to send an alert to Google Cloud Support email when Cloud Spanner CPU exceeds your threshold. Google Support would scale resources up or down accordingly.
  • D. Create a Stackdriver alerting policy to send an alert to a webhook when Cloud Spanner CPU is over or under your threshold. Create a Cloud Function that listens to HTTP and resizes Spanner resources accordingly.

Answer: D
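
Whatever triggers it, the resize step that the Cloud Function in answer D performs boils down to a single instance update; the instance name and node count here are placeholders:
# Hypothetical instance name; the Cloud Function would compute the target node count.
gcloud spanner instances update my-spanner-instance --nodes=5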

NEW QUESTION 17
You want to configure a solution for archiving data in a Cloud Storage bucket. The solution must be
cost-effective. Data with multiple versions should be archived after 30 days. Previous versions are accessed once a month for reporting. This archive data is also occasionally updated at month-end. What should you do?

  • A. Add a bucket lifecycle rule that archives data with newer versions after 30 days to Coldline Storage.
  • B. Add a bucket lifecycle rule that archives data with newer versions after 30 days to Nearline Storage.
  • C. Add a bucket lifecycle rule that archives data from regional storage after 30 days to Coldline Storage.
  • D. Add a bucket lifecycle rule that archives data from regional storage after 30 days to Nearline Storage.

Answer: B
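
As a sketch of answer B, a lifecycle rule that moves noncurrent (older) object versions to Nearline after 30 days can be applied with gsutil; the bucket name is a placeholder and the condition fields should be checked against your own versioning setup:
# lifecycle.json (hypothetical): move noncurrent versions older than 30 days to Nearline.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30, "isLive": false}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://example-archive-bucket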

NEW QUESTION 18
......

Thanks for reading the newest Associate-Cloud-Engineer exam dumps! We recommend that you try the PREMIUM Thedumpscentre.com Associate-Cloud-Engineer dumps in VCE and PDF here: https://www.thedumpscentre.com/Associate-Cloud-Engineer-dumps/ (190 Q&As Dumps)