
Professional-Cloud-DevOps-Engineer Exam Dumps - Google Cloud DevOps Engineer Questions and Answers

Question # 34

You are using Terraform to manage infrastructure as code within a CI/CD pipeline. You notice that multiple copies of the entire infrastructure stack exist in your Google Cloud project, and a new copy is created each time a change to the existing infrastructure is made. You need to optimize your cloud spend by ensuring that only a single instance of your infrastructure stack exists at a time. You want to follow Google-recommended practices. What should you do?

Options:

A.

Create a new pipeline to delete old infrastructure stacks when they are no longer needed

B.

Confirm that the pipeline is storing and retrieving the terraform.tfstate file from Cloud Storage with the Terraform gcs backend

C.

Verify that the pipeline is storing and retrieving the terraform.tfstate file from source control

D.

Update the pipeline to remove any existing infrastructure before you apply the latest configuration
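For context on option B: when the Terraform gcs backend is configured, every pipeline run reads and writes the same terraform.tfstate object in Cloud Storage, so applies update the existing stack instead of creating a new copy. The following is a minimal sketch, not part of the exam material, that checks for that remote state object with the google-cloud-storage client; the project, bucket, and object names are hypothetical placeholders.

```python
# Minimal sketch: confirm that Terraform state lives in the GCS backend bucket
# rather than being recreated on every pipeline run. Names are hypothetical.
from google.cloud import storage

client = storage.Client(project="my-project")  # assumed project ID
blob = client.bucket("my-tf-state-bucket").blob("env/prod/default.tfstate")

if blob.exists():
    blob.reload()
    print(f"Remote state found, generation {blob.generation}")
else:
    print("No remote state object: the pipeline is not using the gcs backend")
```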

Question # 35

Your team is running microservices in Google Kubernetes Engine (GKE). You want to detect consumption of an error budget to protect customers and define release policies. What should you do?

Options:

A.

Create SLIs from metrics. Enable Alert Policies if the services do not pass.

B.

Use the metrics from Anthos Service Mesh to measure the health of the microservices

C.

Create an SLO. Create an Alert Policy on select_slo_burn_rate.

D.

Create an SLO and configure uptime checks for your services. Enable Alert Policies if the services do not pass.
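For reference on option C: select_slo_burn_rate is the Cloud Monitoring time-series selector for error-budget burn rate on an existing SLO. Below is a minimal sketch of creating such an alert policy with the monitoring_v3 client; the project, service, SLO IDs, lookback window, and threshold are hypothetical.

```python
# Minimal sketch: alert when the error-budget burn rate of an SLO is too high.
# All resource names and thresholds are hypothetical.
from google.cloud import monitoring_v3

client = monitoring_v3.AlertPolicyServiceClient()
slo_name = ("projects/my-project/services/my-service/"
            "serviceLevelObjectives/my-slo")

policy = monitoring_v3.AlertPolicy(
    display_name="Error budget burn rate too high",
    combiner=monitoring_v3.AlertPolicy.ConditionCombinerType.AND,
    conditions=[
        monitoring_v3.AlertPolicy.Condition(
            display_name="Fast burn over 60 minutes",
            condition_threshold=monitoring_v3.AlertPolicy.Condition.MetricThreshold(
                # select_slo_burn_rate(SLO, lookback) selects the burn-rate series.
                filter=f'select_slo_burn_rate("{slo_name}", "60m")',
                comparison=monitoring_v3.ComparisonType.COMPARISON_GT,
                threshold_value=10.0,
                duration={"seconds": 0},
            ),
        )
    ],
)

client.create_alert_policy(name="projects/my-project", alert_policy=policy)
```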

Question # 36

You need to enforce several constraint templates across your Google Kubernetes Engine (GKE) clusters. The constraints include policy parameters, such as restricting the Kubernetes API. You must ensure that the policy parameters are stored in a GitHub repository and automatically applied when changes occur. What should you do?  

Options:

A.

Set up a GitHub action to trigger Cloud Build when there is a parameter change. In Cloud Build, run a gcloud CLI command to apply the change.

B.

When there is a change in GitHub, use a webhook to send a request to Cloud Service Mesh, and apply the change.

C.

Configure Config Sync with the GitHub repository. When there is a change in the repository, use Config Sync to apply the change.

D.

Configure Config Connector with the GitHub repository. When there is a change in the repository, use Config Connector to apply the change.
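For context on option C: Config Sync watches a Git repository and continuously applies the manifests it finds there, including constraint templates and constraints. The sketch below shows one way a RootSync object pointing at a GitHub repository could be created with the Kubernetes Python client; the repository URL, branch, directory, and Secret name are hypothetical.

```python
# Minimal sketch: a RootSync object that keeps the cluster in sync with a
# policy repository on GitHub. Repo details and credentials are hypothetical.
from kubernetes import client, config

config.load_kube_config()

root_sync = {
    "apiVersion": "configsync.gke.io/v1beta1",
    "kind": "RootSync",
    "metadata": {"name": "root-sync", "namespace": "config-management-system"},
    "spec": {
        "sourceFormat": "unstructured",
        "git": {
            "repo": "https://github.com/example-org/gke-constraints",
            "branch": "main",
            "dir": "policies",
            "auth": "token",
            "secretRef": {"name": "git-creds"},
        },
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="configsync.gke.io",
    version="v1beta1",
    namespace="config-management-system",
    plural="rootsyncs",
    body=root_sync,
)
```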

Question # 37

You are running a web application that connects to an AlloyDB cluster by using a private IP address in your default VPC. You need to run a database schema migration in your CI/CD pipeline by using Cloud Build before deploying a new version of your application. You want to follow Google-recommended security practices. What should you do?  

Options:

A.

Set up a Cloud Build private pool to access the database through a static external IP address. Configure the database to only allow connections from this IP address. Execute the schema migration script in the private pool.

B.

Create a service account that has permission to access the database. Configure Cloud Build to use this service account and execute the schema migration script in a private pool.

C.

Add the database username and encrypted password to the application configuration file. Use these credentials in Cloud Build to execute the schema migration script.

D.

Add the database username and password to Secret Manager. When running the schema migration script, retrieve the username and password from Secret Manager.
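For context on option D: credentials stored in Secret Manager can be read at build time by the Cloud Build service account instead of being baked into configuration files. Below is a minimal sketch of that retrieval step as it might run before the schema migration; the project ID and secret names are hypothetical.

```python
# Minimal sketch: fetch database credentials from Secret Manager inside a
# build step before running the schema migration. Names are hypothetical.
from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()


def access_secret(secret_id: str, project_id: str = "my-project") -> str:
    """Return the latest version of a secret as a UTF-8 string."""
    name = f"projects/{project_id}/secrets/{secret_id}/versions/latest"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")


db_user = access_secret("alloydb-migration-user")
db_password = access_secret("alloydb-migration-password")
# The migration tool would then connect to AlloyDB over its private IP
# using these credentials.
```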

Question # 38

Your application runs on Google Cloud Platform (GCP). You need to implement Jenkins for deploying application releases to GCP. You want to streamline the release process, lower operational toil, and keep user data secure. What should you do?

Options:

A.

Implement Jenkins on local workstations.

B.

Implement Jenkins on Kubernetes on-premises.

C.

Implement Jenkins on Google Cloud Functions.

D.

Implement Jenkins on Compute Engine virtual machines.

Question # 39

You are managing an application that exposes an HTTP endpoint without using a load balancer. The latency of the HTTP responses is important for the user experience. You want to understand what HTTP latencies all of your users are experiencing. You use Stackdriver Monitoring. What should you do?

Options:

A.

• In your application, create a metric with a metricKind set to DELTA and a valueType set to DOUBLE.
• In Stackdriver's Metrics Explorer, use a Stacked Bar graph to visualize the metric.

B.

• In your application, create a metric with a metricKind set to CUMULATIVE and a valueType set to DOUBLE.
• In Stackdriver's Metrics Explorer, use a Line graph to visualize the metric.

C.

• In your application, create a metric with a metricKind set to GAUGE and a valueType set to DISTRIBUTION.
• In Stackdriver's Metrics Explorer, use a Heatmap graph to visualize the metric.

D.

• In your application, create a metric with a metricKind set to METRIC_KIND_UNSPECIFIED and a valueType set to INT64.
• In Stackdriver's Metrics Explorer, use a Stacked Area graph to visualize the metric.
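For reference, the metricKind and valueType values in these options correspond to the MetricDescriptor enums in the Cloud Monitoring API, and a distribution-valued metric is what a Heatmap chart visualizes. The sketch below registers a custom latency metric descriptor with a DISTRIBUTION value type, as referenced in option C; the metric type and project ID are hypothetical.

```python
# Minimal sketch: define a custom latency metric whose valueType is
# DISTRIBUTION. Metric type and project ID are hypothetical.
from google.api import metric_pb2
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()

descriptor = metric_pb2.MetricDescriptor(
    type="custom.googleapis.com/http/response_latency",
    metric_kind=metric_pb2.MetricDescriptor.MetricKind.GAUGE,
    value_type=metric_pb2.MetricDescriptor.ValueType.DISTRIBUTION,
    unit="ms",
    description="HTTP response latency observed by the application",
)

client.create_metric_descriptor(
    name="projects/my-project", metric_descriptor=descriptor
)
```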

Question # 40

You need to create a Cloud Monitoring SLO for a service that will be published soon. You want to verify that requests to the service will be addressed in fewer than 300 ms at least 90% of the time per calendar month. You need to identify the metric and evaluation method to use. What should you do?

Options:

A.

Select a latency metric for a request-based method of evaluation.

B.

Select a latency metric for a window-based method of evaluation.

C.

Select an availability metric for a request-based method of evaluation.

D.

Select an availability metric for a window-based method of evaluation.
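To make the distinction in these options concrete, the sketch below creates a latency SLO with a request-based SLI (the combination in option A): good requests are those under 300 ms, with a 90% goal per calendar month. The project, service, and metric filter are hypothetical.

```python
# Minimal sketch: a request-based latency SLO (90% of requests under 300 ms
# per calendar month). Resource names and the metric filter are hypothetical.
from google.cloud import monitoring_v3
from google.type import calendar_period_pb2

client = monitoring_v3.ServiceMonitoringServiceClient()

slo = monitoring_v3.ServiceLevelObjective(
    display_name="90% of requests under 300 ms",
    goal=0.90,
    calendar_period=calendar_period_pb2.CalendarPeriod.MONTH,
    service_level_indicator=monitoring_v3.ServiceLevelIndicator(
        request_based=monitoring_v3.RequestBasedSli(
            distribution_cut=monitoring_v3.DistributionCut(
                distribution_filter=(
                    'metric.type="custom.googleapis.com/http/response_latency" '
                    'resource.type="k8s_container"'
                ),
                # Requests whose latency falls in [0, 300) ms count as "good".
                range=monitoring_v3.Range(min=0, max=300),
            )
        )
    ),
)

client.create_service_level_objective(
    parent="projects/my-project/services/my-service",
    service_level_objective=slo,
)
```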

Question # 41

You are ready to deploy a new feature of a web-based application to production. You want to use Google Kubernetes Engine (GKE) to perform a phased rollout to half of the web server pods.

What should you do?

Options:

A.

Use a partitioned rolling update.

B.

Use Node taints with NoExecute.

C.

Use a replica set in the deployment specification.

D.

Use a stateful set with parallel pod management policy.
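For context on option A: Kubernetes exposes a partition field on StatefulSet rolling updates, where pods with an ordinal greater than or equal to the partition value receive the new revision and the rest keep the old one. The sketch below sets the partition to half the replicas with the Kubernetes Python client; the StatefulSet name, namespace, and replica count are hypothetical.

```python
# Minimal sketch: roll a new revision out to only half of the pods by setting
# a StatefulSet rolling-update partition. Names and counts are hypothetical.
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

replicas = 10
patch = {
    "spec": {
        "updateStrategy": {
            "type": "RollingUpdate",
            # Pods with ordinal >= partition get the new revision;
            # pods 0..4 keep the old one.
            "rollingUpdate": {"partition": replicas // 2},
        }
    }
}

apps.patch_namespaced_stateful_set(name="web", namespace="default", body=patch)
```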

Question # 42

You are configuring your CI/CD pipeline natively on Google Cloud. You want builds in a pre-production Google Kubernetes Engine (GKE) environment to be automatically load-tested before being promoted to the production GKE environment. You need to ensure that only builds that have passed this test are deployed to production. You want to follow Google-recommended practices. How should you configure this pipeline with Binary Authorization?

Options:

A.

Create an attestation for the builds that pass the load test by requiring the lead quality assurance engineer to sign the attestation by using a key stored in Cloud Key Management Service (Cloud KMS).

B.

Create an attestation for the builds that pass the load test by using a private key stored in Cloud Key Management Service (Cloud KMS) authenticated through Workload Identity.

C.

Create an attestation for the builds that pass the load test by using a private key stored in Cloud Key Management Service (Cloud KMS) with a service account JSON key stored as a Kubernetes Secret.

D.

Create an attestation for the builds that pass the load test by requiring the lead quality assurance engineer to sign the attestation by using their personal private key.
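Options B and C differ mainly in how the signing identity is obtained; in both, the attestation payload is signed with a private key held in Cloud KMS. Below is a minimal sketch of that signing step with the KMS client; the key resource names and image digest are hypothetical, and attaching the resulting signature as a Binary Authorization attestation is a separate Container Analysis step not shown here.

```python
# Minimal sketch: sign a Binary Authorization attestation payload with an
# asymmetric key held in Cloud KMS. Key names and digest are hypothetical.
import hashlib
import json

from google.cloud import kms

client = kms.KeyManagementServiceClient()
key_version = client.crypto_key_version_path(
    "my-project", "global", "binauthz-ring", "attestor-key", "1"
)

payload = json.dumps(
    {
        "critical": {
            "identity": {"docker-reference": "us-docker.pkg.dev/my-project/app/web"},
            "image": {"docker-manifest-digest": "sha256:0000000000000000"},
            "type": "Google cloud binauthz container signature",
        }
    },
    separators=(",", ":"),
).encode("utf-8")

digest = {"sha256": hashlib.sha256(payload).digest()}
response = client.asymmetric_sign(request={"name": key_version, "digest": digest})
print(f"Signature is {len(response.signature)} bytes")
```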

Question # 43

Your application artifacts are being built and deployed via a CI/CD pipeline. You want the CI/CD pipeline to securely access application secrets. You also want to more easily rotate secrets in case of a security breach. What should you do?

Options:

A.

Prompt developers for secrets at build time. Instruct developers to not store secrets at rest.

B.

Store secrets in a separate configuration file on Git. Provide select developers with access to the configuration file.

C.

Store secrets in Cloud Storage encrypted with a key from Cloud KMS. Provide the CI/CD pipeline with access to Cloud KMS via IAM.

D.

Encrypt the secrets and store them in the source code repository. Store a decryption key in a separate repository and grant your pipeline access to it.
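For context on option C: the pipeline downloads a ciphertext object from Cloud Storage and decrypts it with a Cloud KMS key it has IAM access to, so rotating a secret means re-encrypting and re-uploading a single object. The sketch below shows that decryption step; the bucket, object, and key names are hypothetical.

```python
# Minimal sketch: fetch an encrypted secret from Cloud Storage and decrypt it
# with Cloud KMS. Bucket, object, and key names are hypothetical.
from google.cloud import kms, storage

ciphertext = (
    storage.Client()
    .bucket("my-ci-secrets")
    .blob("db-password.enc")
    .download_as_bytes()
)

kms_client = kms.KeyManagementServiceClient()
key_name = kms_client.crypto_key_path(
    "my-project", "global", "ci-keyring", "pipeline-secrets"
)
plaintext = kms_client.decrypt(
    request={"name": key_name, "ciphertext": ciphertext}
).plaintext.decode("utf-8")
```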

Exam Name: Google Cloud Certified - Professional Cloud DevOps Engineer Exam
Last Update: Jun 15, 2025
Questions: 194