
Pass the Google Professional Cloud Developer (Professional-Cloud-Developer) exam with ExamsMirror questions and answers

Practice at least 50% of the questions to maximize your chances of passing.

Question #1:

You are developing an application that will run on Compute Engine instances in multiple distinct projects, each corresponding to an environment in your software development process (development, QA, staging, and production). The instances in each project have the same application code but a different configuration. During deployment, each instance should receive the application’s configuration based on the environment it serves. You want to minimize the number of steps needed to configure this flow.

What should you do?

Options:

A.

When creating your instances, configure a startup script using the gcloud command to determine the project name that indicates the correct environment.

B.

In each project, configure a metadata key “environment” whose value is the environment it serves. Use your deployment tool to query the instance metadata and configure the application based on the “environment” value.

C.

Deploy your chosen deployment tool on an instance in each project. Use a deployment job to retrieve the appropriate configuration file from your version control system, and apply the configuration when deploying the application on each instance.

D.

During each instance launch, configure an instance custom-metadata key named “environment” whose value is the environment the instance serves. Use your deployment tool to query the instance metadata, and configure the application based on the “environment” value.
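
For illustration only, a minimal Go sketch of the custom-metadata approach described in option D: the deployment tool (or the application itself) reads the "environment" key from the metadata server and configures itself accordingly. The client library, key value, and error handling shown here are assumptions, not part of the question.

package main

import (
    "fmt"
    "log"

    "cloud.google.com/go/compute/metadata"
)

func main() {
    // Read the instance-level custom metadata key "environment",
    // set at launch time (e.g. --metadata environment=staging).
    env, err := metadata.InstanceAttributeValue("environment")
    if err != nil {
        log.Fatalf("could not read metadata key %q: %v", "environment", err)
    }
    fmt.Printf("configuring application for environment: %s\n", env)
}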

Question #2:

You want to upload files from an on-premises virtual machine to Google Cloud Storage as part of a data migration. These files will be consumed by a Cloud Dataproc Hadoop cluster in a GCP environment.

Which command should you use?

Options:

A.

gsutil cp [LOCAL_OBJECT] gs://[DESTINATION_BUCKET_NAME]/

B.

gcloud cp [LOCAL_OBJECT] gs://[DESTINATION_BUCKET_NAME]/

C.

hadoop fs cp [LOCAL_OBJECT] gs://[DESTINATION_BUCKET_NAME]/

D.

gcloud dataproc cp [LOCAL_OBJECT] gs://[DESTINATION_BUCKET_NAME]/
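
As a reference point, the upload in this question is a straightforward local-to-bucket copy; the same operation could also be done programmatically with the Cloud Storage Go client, as in the sketch below. The bucket and file names are placeholders.

package main

import (
    "context"
    "io"
    "log"
    "os"

    "cloud.google.com/go/storage"
)

func main() {
    ctx := context.Background()

    // Equivalent in spirit to: gsutil cp data.csv gs://my-migration-bucket/
    // (bucket and file names are placeholders)
    client, err := storage.NewClient(ctx)
    if err != nil {
        log.Fatalf("storage.NewClient: %v", err)
    }
    defer client.Close()

    f, err := os.Open("data.csv")
    if err != nil {
        log.Fatalf("os.Open: %v", err)
    }
    defer f.Close()

    w := client.Bucket("my-migration-bucket").Object("data.csv").NewWriter(ctx)
    if _, err := io.Copy(w, f); err != nil {
        log.Fatalf("io.Copy: %v", err)
    }
    if err := w.Close(); err != nil {
        log.Fatalf("Writer.Close: %v", err)
    }
    log.Println("upload complete")
}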

Question #3:

You are developing an ecommerce web application that uses the App Engine standard environment and Memorystore for Redis. When a user logs into the app, the application caches the user’s information (e.g., session, name, address, preferences), which is stored for quick retrieval during checkout.

While testing your application in a browser, you get a 502 Bad Gateway error. You have determined that the application is not connecting to Memorystore. What is the reason for this error?

Options:

A.

Your Memorystore for Redis instance was deployed without a public IP address.

B.

You configured your Serverless VPC Access connector in a different region than your App Engine instance.

C.

The firewall rule allowing a connection between App Engine and Memorystore was removed during an infrastructure update by the DevOps team.

D.

You configured your application to use a Serverless VPC Access connector on a different subnet in a different availability zone than your App Engine instance.
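
For background, App Engine standard reaches a Memorystore for Redis instance over its private IP through a Serverless VPC Access connector, which must be in the same region as the app. The sketch below shows only the application side of that connection using the github.com/redis/go-redis/v9 client; the IP address and key names are placeholders, and the connector itself is declared separately in app.yaml.

package main

import (
    "context"
    "log"

    "github.com/redis/go-redis/v9"
)

func main() {
    ctx := context.Background()

    // Memorystore only exposes a private IP; traffic from App Engine
    // standard reaches it via the Serverless VPC Access connector.
    // 10.0.0.3 is a placeholder address.
    rdb := redis.NewClient(&redis.Options{Addr: "10.0.0.3:6379"})

    if err := rdb.Set(ctx, "session:123", "cached-user-profile", 0).Err(); err != nil {
        log.Fatalf("redis SET failed: %v", err)
    }
    val, err := rdb.Get(ctx, "session:123").Result()
    if err != nil {
        log.Fatalf("redis GET failed: %v", err)
    }
    log.Printf("cached value: %s", val)
}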

Question #4:

You are monitoring a web application that is written in Go and deployed in Google Kubernetes Engine. You notice an increase in CPU and memory utilization. You need to determine which function is consuming the most CPU and memory resources. What should you do?

Options:

A.

Import the Cloud Profiler package into your application, and initialize the Profiler agent. Review the generated flame graph in the Google Cloud console to identify time-intensive functions.

B.

Create a Cloud Logging query that gathers the web application's logs. Write a Python script that calculates the difference between the timestamps from the beginning and the end of the application's longest functions to identify time-intensive functions.

C.

Import OpenTelemetry and Trace export packages into your application, and create the trace provider. Review the latency data for your application on the Trace overview page, and identify which functions cause the most latency.

D.

Add print commands to the application source code to log when each function is called, and redeploy the application.
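
For illustration of option A, enabling Cloud Profiler in a Go service is a small code change made before the server starts handling traffic; the generated CPU and heap flame graphs then appear in the Google Cloud console. The service name, version, and HTTP handler below are placeholders.

package main

import (
    "log"
    "net/http"

    "cloud.google.com/go/profiler"
)

func main() {
    // Start the Cloud Profiler agent before serving traffic.
    // Service name and version are placeholders.
    if err := profiler.Start(profiler.Config{
        Service:        "web-frontend",
        ServiceVersion: "1.0.0",
    }); err != nil {
        log.Fatalf("profiler.Start: %v", err)
    }

    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        w.Write([]byte("ok"))
    })
    log.Fatal(http.ListenAndServe(":8080", nil))
}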

Question #5:

You are building a CI/CD pipeline that consists of a version control system, Cloud Build, and Container Registry. Each time a new tag is pushed to the repository, a Cloud Build job is triggered, which runs unit tests on the new code, builds a new Docker container image, and pushes it to Container Registry. The last step of your pipeline should deploy the new container to your production Google Kubernetes Engine (GKE) cluster. You need to select a tool and deployment strategy that meets the following requirements:

• Zero downtime is incurred

• Testing is fully automated

• Allows for testing before being rolled out to users

• Can quickly rollback if needed

What should you do?

Options:

A.

Trigger a Spinnaker pipeline configured as an A/B test of your new code and, if it is successful, deploy the container to production.

B.

Trigger a Spinnaker pipeline configured as a canary test of your new code and, if it is successful, deploy the container to production.

C.

Trigger another Cloud Build job that uses the Kubernetes CLI tools to deploy your new container to your GKE cluster, where you can perform a canary test.

D.

Trigger another Cloud Build job that uses the Kubernetes CLI tools to deploy your new container to your GKE cluster, where you can perform a shadow test.

Question #6:

You are developing a single-player mobile game backend that has unpredictable traffic patterns as users interact with the game throughout the day and night. You want to optimize costs by ensuring that you have enough resources to handle requests, but minimize over-provisioning. You also want the system to handle traffic spikes efficiently. Which compute platform should you use?

Options:

A.

Cloud Run

B.

Compute Engine with managed instance groups

C.

Compute Engine with unmanaged instance groups

D.

Google Kubernetes Engine using cluster autoscaling

Question #7:

You work for a web development team at a small startup. Your team is developing a Node.js application using Google Cloud services, including Cloud Storage and Cloud Build. The team uses a Git repository for version control. Your manager calls you over the weekend and instructs you to make an emergency update to one of the company’s websites, and you’re the only developer available. You need to access Google Cloud to make the update, but you don’t have your work laptop. You are not allowed to store source code locally on a non-corporate computer. How should you set up your developer environment?

Options:

A.

Use a text editor and the Git command line to send your source code updates as pull requests from a public computer.

B.

Use a text editor and the Git command line to send your source code updates as pull requests from a virtual machine running on a public computer.

C.

Use Cloud Shell and the built-in code editor for development. Send your source code updates as pull requests.

D.

Use a Cloud Storage bucket to store the source code that you need to edit. Mount the bucket to a public computer as a drive, and use a code editor to update the code. Turn on versioning for the bucket, and point it to the team’s Git repository.

Question #8:

Your web application is deployed to the corporate intranet. You need to migrate the web application to Google Cloud. The web application must be available only to company employees and accessible to employees as they travel. You need to ensure the security and accessibility of the web application while minimizing application changes. What should you do?

Options:

A.

Configure the application to check authentication credentials for each HTTP(S) request to the application.

B.

Configure Identity-Aware Proxy to allow employees to access the application through its public IP address.

C.

Configure a Compute Engine instance that requests users to log in to their corporate account. Change the web application DNS to point to the proxy Compute Engine instance. After authenticating, the Compute Engine instance forwards requests to and from the web application.

D.

Configure a Compute Engine instance that requests users to log in to their corporate account. Change the web application DNS to point to the proxy Compute Engine instance. After authenticating, the Compute Engine instance issues an HTTP redirect to a public IP address hosting the web application.

Question #9:

Your application is composed of a set of loosely coupled services orchestrated by code executed on Compute Engine. You want your application to easily bring up new Compute Engine instances that find and use a specific version of a service. How should this be configured?

Options:

A.

Define your service endpoint information as metadata that is retrieved at runtime and used to connect to the desired service.

B.

Define your service endpoint information as label data that is retrieved at runtime and used to connect to the desired service.

C.

Define your service endpoint information to be retrieved from an environment variable at runtime and used to connect to the desired service.

D.

Define your service to use a fixed hostname and port to connect to the desired service. Replace the service at the endpoint with your new version.
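
To make option A concrete, a newly launched instance could read a project-level metadata attribute that holds the endpoint of the desired service version, as in the sketch below; the attribute name and endpoint format are assumptions for illustration.

package main

import (
    "fmt"
    "log"

    "cloud.google.com/go/compute/metadata"
)

func main() {
    // Project-wide metadata is visible to every instance in the project,
    // so a new instance can discover a specific service version at startup.
    // The attribute name "orders-service-v2-endpoint" is a placeholder.
    endpoint, err := metadata.ProjectAttributeValue("orders-service-v2-endpoint")
    if err != nil {
        log.Fatalf("could not read project metadata: %v", err)
    }
    fmt.Printf("connecting to service at %s\n", endpoint)
}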

Question #10:

Your company has a data warehouse that keeps your application information in BigQuery. The BigQuery data warehouse stores 2 PB of user data. Recently, your company expanded its user base to include EU users and needs to comply with these requirements:

• Your company must be able to delete all user account information upon user request.

• All EU user data must be stored in a single region specifically for EU users.

Which two actions should you take? (Choose two.)

Options:

A.

Use BigQuery federated queries to query data from Cloud Storage.

B.

Create a dataset in the EU region that will keep information about EU users only.

C.

Create a Cloud Storage bucket in the EU region to store information for EU users only.

D.

Re-upload your data using a Cloud Dataflow pipeline, filtering your user records out.

E.

Use DML statements in BigQuery to update/delete user records based on their requests.
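
For illustration of option E, deleting a user's records in BigQuery can be done with a parameterized DML statement; the sketch below uses the BigQuery Go client, and the project, dataset, table, and column names are all placeholders.

package main

import (
    "context"
    "log"

    "cloud.google.com/go/bigquery"
)

func main() {
    ctx := context.Background()

    // Project, dataset, table, and column names are placeholders.
    client, err := bigquery.NewClient(ctx, "my-project")
    if err != nil {
        log.Fatalf("bigquery.NewClient: %v", err)
    }
    defer client.Close()

    q := client.Query("DELETE FROM `my-project.eu_users.accounts` WHERE user_id = @user_id")
    q.Parameters = []bigquery.QueryParameter{{Name: "user_id", Value: "12345"}}

    job, err := q.Run(ctx)
    if err != nil {
        log.Fatalf("query run: %v", err)
    }
    status, err := job.Wait(ctx)
    if err != nil {
        log.Fatalf("job wait: %v", err)
    }
    if err := status.Err(); err != nil {
        log.Fatalf("delete job failed: %v", err)
    }
    log.Println("user records deleted")
}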
