GCP Associate Cloud Engineer Practice Exam Part 4(2)

  1. You want to list all the internal and external IP addresses of all compute instances. Which of the commands below should you run to retrieve this information?

A. gcloud compute instances list.
B. gcloud compute networks list.
C. gcloud compute networks list-ip.
D. gcloud compute instances list-ip.

  2. You are designing an application that lets users upload and share photos. You expect your application to grow rapidly, and you are targeting a worldwide audience. You want to delete uploaded photos after 30 days. You want to minimize costs while ensuring your application is highly available. Which GCP storage solution should you choose?

A. Persistent SSD on VM instances.
B. Cloud Filestore.
C. Multiregional Cloud Storage bucket.
D. Cloud Datastore database.

  3. You have a web application deployed as a managed instance group. You noticed some of the compute instances are running low on memory. You suspect this is due to a JVM memory leak, and you want to restart the compute instances to reclaim the leaked memory. Your web application is currently serving live web traffic. You want to ensure that the available capacity does not drop below 80% at any time during the restarts, and you want to do this as soon as possible. What should you do?

A. Perform a rolling-action reboot with max-surge set to 20%.
B. Perform a rolling-action restart with max-unavailable set to 20%.
C. Stop instances in the managed instance group (MIG) one at a time and rely on autohealing to bring them back up.
D. Perform a rolling-action replace with max-unavailable set to 20%.
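
Note: a rolling restart of a MIG can be sketched as below; the group name and zone are placeholders, and the exact flag set should be checked against current gcloud documentation.

```shell
# Restart instances in place, never taking more than 20% of the group
# out of service at once (keeps >= 80% of capacity serving traffic).
gcloud compute instance-groups managed rolling-action restart my-mig \
    --zone=us-central1-a \
    --max-unavailable=20%
```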

  4. You are migrating a Python application from your on-premises data centre to Google Cloud. You want to deploy the application to Google App Engine, and you have modified the Python application to use Cloud Pub/Sub instead of RabbitMQ. The application uses a specific service account which has the necessary permissions to publish and subscribe on Cloud Pub/Sub; however, the operations team have not enabled the Cloud Pub/Sub API yet. What should you do?

A. Grant roles/pubsub.admin IAM role to the service account and modify the application code to enable the API before publishing or subscribing.
B. Configure the App Engine Application in GCP Console to use the specific Service Account with the necessary IAM permissions and rely on the automatic enablement of the Cloud Pub/Sub API on the first request to publish or subscribe.
C. Navigate to the APIs & Services section in GCP console and enable Cloud Pub/Sub API.
D. Use deployment manager to configure the App Engine Application to use the specific Service Account with the necessary IAM permissions and rely on the automatic enablement of the Cloud Pub/Sub API on the first request to publish or subscribe.

  5. You created a Kubernetes deployment by running kubectl run nginx --image=nginx --labels="app=prod". Your Kubernetes cluster is also used by a number of other deployments. How can you find the identifier of the pods for this nginx deployment?

A. kubectl get deployments --output=pods
B. gcloud get pods --selector="app=prod"
C. kubectl get pods -l "app=prod"
D. gcloud list gke-deployments --filter={pod}
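
Note: for reference, label selection with kubectl looks like the sketch below (assumes a working cluster context; the label is the one applied at deployment time).

```shell
# List only the pods whose labels match the selector.
kubectl get pods --selector="app=prod"

# -l is the short form of --selector.
kubectl get pods -l app=prod
```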

  6. You developed an application that lets users upload statistical files and subsequently run analytics on this data. You chose to use Google Cloud Storage and BigQuery respectively for these requirements as they are highly available and scalable. You have a Docker image for your application code, and you plan to deploy on your on-premises Kubernetes clusters. Your on-prem Kubernetes cluster needs to connect to Google Cloud Storage and BigQuery, and you want to do this in a secure way following Google recommended practices. What should you do?

A. Create a new service account, with editor permissions, generate and download a key. Use the key to authenticate inside the application.
B. Create a new service account, grant it the least privileges needed for the required services, generate and download a JSON key. Use the JSON key to authenticate inside the application.
C. Use the default service account for App Engine, which already has the required permissions.
D. Use the default service account for Compute Engine, which already has the required permissions.

  7. Your company has deployed several production applications across many Google Cloud Projects. Your operations team requires a consolidated monitoring dashboard for all the projects. What should you do?

A. Set up a shared VPC across all production GCP projects and configure Cloud Monitoring dashboard on one of the projects.
B. Create a Stackdriver account in each project and configure all accounts to use the same service account. Create a monitoring dashboard in one of the projects.
C. Create a single Stackdriver account and link all production GCP projects to it. Configure a monitoring dashboard in the Stackdriver account.
D. Create a Stackdriver account and a Stackdriver group in one of the production GCP projects. Add all other projects as members of the group. Configure a monitoring dashboard in the Stackdriver account.

  8. An engineer from your team accidentally deployed several new versions of a NodeJS application on Google App Engine Standard. You are concerned the new versions are serving traffic. You have been asked to produce a list of all the versions of the application that are receiving traffic, as well as the percentage of traffic split between them. What should you do?

A. gcloud app versions list --hide-no-traffic
B. gcloud app versions list --show-traffic
C. gcloud app versions list --traffic
D. gcloud app versions list

  9. You work for a big multinational financial company that has several hundred Google Cloud projects for various development, test and production workloads. Financial regulations require your company to store all audit files for three years. What should you do to implement a log retention solution while minimizing storage cost?

A. Export audit logs from Cloud Logging to Cloud Pub/Sub via an export sink. Configure a Cloud Dataflow pipeline to process these messages and store them in Cloud SQL for MySQL.
B. Write a script that exports audit logs from Cloud Logging to BigQuery. Use Cloud Scheduler to trigger the script every hour.
C. Export audit logs from Cloud Logging to Coldline Storage bucket via an export sink.
D. Export audit logs from Cloud Logging to BigQuery via an export sink.

  10. Your company has three GCP projects – for development, test and production environments. The budgeting team in the finance department needs to know the cost estimates for the next financial year to include it in the budget. They have years of experience using SQL and need to group costs by parameters such as duration (day/week/month/quarter), service type, region, etc. How can you enable this?

A. Export billing data to a Google Cloud Storage bucket. Manually copy the data from the Cloud Storage bucket to a Google sheet. Ask the budgeting team to apply formulas in the Google sheet to analyze current costs and estimate future costs.
B. Export billing data to a BigQuery dataset. Ask the budgeting team to run queries against BigQuery to analyze current costs and estimate future costs.
C. Download the costs as a CSV file from the Cost Table page. Ask the budgeting team to open this file in Microsoft Excel and apply formulas to analyze current costs and estimate future costs.
D. Export billing data to a Google Cloud Storage bucket. Trigger a Cloud Function that reads the data and inserts into Cloud BigTable. Ask the budgeting team to run queries against BigTable to analyze current costs and estimate future costs.

  11. You have files in a Cloud Storage bucket that you need to share with your suppliers. You want to restrict the time that the files are available to your suppliers to 1 hour. You want to follow Google recommended practices. What should you do?

A. Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -m 1h gs:///.
B. Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -d 1h gs:///.
C. Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -p 60m gs:///.
D. Create a JSON key for the Default Compute Engine Service Account. Execute the command gsutil signurl -t 60m gs:///
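
Note: a signed-URL sketch for reference, assuming a service account key file key.json and placeholder bucket/object names; -d sets how long the URL remains valid.

```shell
# Generate a URL that grants time-limited read access to one object.
gsutil signurl -d 1h key.json gs://example-bucket/report.pdf
```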

  12. You want to find a list of regions and the prebuilt images offered by Google Compute Engine. Which commands should you execute to retrieve this information?

A. gcloud compute regions list, gcloud images list
B. gcloud compute regions list, gcloud compute images list
C. gcloud regions list, gcloud images list
D. gcloud regions list, gcloud compute images list

  13. Your company produces documentary videos for a reputed television channel and stores its videos in Google Cloud Storage for long term archival. Videos older than 90 days are accessed only in exceptional circumstances and videos older than one year are no longer needed. How should you optimise the storage to reduce costs?

A. Use a Cloud Function to rewrite the storage class to Coldline for objects older than 90 days. Use another Cloud Function to delete objects older than 365 days from Coldline Storage Class.
B. Use a Cloud Function to rewrite the storage class to Coldline for objects older than 90 days. Use another Cloud Function to delete objects older than 275 days from Coldline Storage Class.
C. Configure a lifecycle rule to transition objects older than 90 days to Coldline Storage Class. Configure another lifecycle rule to delete objects older than 275 days from Coldline Storage Class.
D. Configure a lifecycle rule to transition objects older than 90 days to Coldline Storage Class. Configure another lifecycle rule to delete objects older than 365 days from Coldline Storage Class.
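
Note: lifecycle rules of this shape can be expressed in a gsutil lifecycle configuration file; the sketch below is illustrative. Keep in mind the age condition always counts from object creation, not from the last storage-class change.

```json
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365, "matchesStorageClass": ["COLDLINE"]}
    }
  ]
}
```

Applied with gsutil lifecycle set lifecycle.json gs://[BUCKET_NAME].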

  14. You want to use Google Cloud Storage to host a static website on www.example.com for your staff. You created a bucket example-static-website and uploaded index.html and CSS files to it. You turned on static website hosting on the bucket and set up a CNAME record on www.example.com to point to c.storage.googleapis.com. You access the static website by navigating to www.example.com in the browser, but your index page is not displayed. What should you do?

A. Reload the Cloud Storage static website server to load the objects.
B. In example.com zone, modify the CNAME record to storage.googleapis.com/example-static-website
C. Delete the existing bucket, create a new bucket with the name www.example.com and upload the html/css files.
D. In example.com zone, delete the existing CNAME record and set up an A record instead to point to c.storage.googleapis.com.

  15. You developed an enhancement to a production application deployed in the App Engine Standard service. Unit testing and user acceptance testing have succeeded, and you deployed the new version to production. Users have started complaining of slow performance after the recent update, and you need to revert to the previous version immediately. How can you do this?

A. Deploy the previous version as a new App Engine Application and use traffic splitting feature to send all traffic to the new application.
B. In the App Engine Console, identify the App Engine application and select Revert.
C. In the App Engine Console, identify the App Engine application versions and make the previous version the default to route all traffic to it.
D. Execute gcloud app restore to rollback to the previous version.

  16. You deployed a Java application on a single Google Cloud Compute Engine VM. During peak usage, the application CPU is maxed out, resulting in stuck threads that ultimately make the system unresponsive and require a reboot. Your operations team want to receive an email alert when the CPU utilization is greater than 95% for more than 10 minutes so they can manually change the instance type to one that offers more CPU. What should you do?

A. Link the GCP project to a Cloud Monitoring workspace. Configure an Alerting policy based on CPU utilization in Cloud Monitoring and trigger an email notification when the utilization exceeds the threshold.
B. Write a custom script to monitor CPU usage and send an email notification when the usage exceeds the threshold.
C. In Cloud Logging, create logs based metric for CPU usage and store it as a custom metric in Cloud Monitoring. Create an Alerting policy based on CPU utilization in Cloud Monitoring and trigger an email notification when the utilization exceeds the threshold.
D. Link the project to a Cloud Monitoring workspace. Write a custom script that captures CPU utilization every minute and sends it to Cloud Monitoring as a custom metric. Add an uptime check based on the CPU utilization.

  17. You plan to deploy an application on an autoscaled managed instance group. The application uses a Tomcat server and runs on port 8080. You want to access the application on https://www.example.com. You want to follow Google recommended practices. What services would you use?

A. Google DNS, Google CDN, SSL Proxy Load Balancer
B. Google Domains, Cloud DNS private zone, SSL Proxy Load Balancer
C. Google Domains, Cloud DNS private zone, HTTP(S) Load Balancer
D. Google Domains, Cloud DNS, HTTP(S) Load Balancer

  18. You are hosting a new application on https://www.my-new-gcp-ace-website.com. The static content of the application is served from the /static path and is hosted in a Cloud Storage bucket. The dynamic content is served from the /dynamic path and is hosted on a fleet of Compute Engine instances belonging to a Managed Instance Group. How can you configure a single GCP Load Balancer to serve content from both paths?

A. Use HAProxy Alpine Docker images to deploy to a GKE cluster. Configure HAProxy to route /dynamic/ to the Managed Instance Group (MIG) and /static/ to the GCS bucket. Create a service of type LoadBalancer. Create a DNS A record on www.my-new-gcp-ace-website.com to point to the address of the LoadBalancer.
B. Configure an HTTP(S) Load Balancer and configure it to route requests on /dynamic/ to the Managed Instance Group (MIG) and /static/ to the GCS bucket. Create a DNS A record on www.my-new-gcp-ace-website.com to point to the address of the Load Balancer.
C. Configure an HTTP(S) Load Balancer for the Managed Instance Group (MIG). Configure the necessary TXT DNS records on www.my-new-gcp-ace-website.com to route requests on /dynamic/ to the Managed Instance Group (MIG) and /static/ to the GCS bucket.
D. Create a CNAME DNS record on www.my-new-gcp-ace-website.com to point to storage.googleapis.com. Configure an HTTP(S) Load Balancer for the Managed Instance Group (MIG). Set up redirection rules in the Cloud Storage bucket to forward requests for non-static content to the Load Balancer address.

  19. Your company recently migrated all infrastructure to Google Cloud Platform (GCP), and you want to use Google Cloud Build to build all container images. You want to store the build logs in Google Cloud Storage. You also have a requirement to push the images to Google Container Registry. You wrote a Cloud Build YAML configuration file with the following contents:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/[PROJECT_ID]/[IMAGE_NAME]', '.']
images: ['gcr.io/[PROJECT_ID]/[IMAGE_NAME]']

How should you execute Cloud Build to satisfy these requirements?

A. Execute gcloud builds run --config=[CONFIG_FILE_PATH] --gcs-log-dir=[GCS_LOG_DIR] [SOURCE]
B. Execute gcloud builds push --config=[CONFIG_FILE_PATH] [SOURCE]
C. Execute gcloud builds submit --config=[CONFIG_FILE_PATH] [SOURCE]
D. Execute gcloud builds submit --config=[CONFIG_FILE_PATH] --gcs-log-dir=[GCS_LOG_DIR] [SOURCE]

  20. Your operations team have deployed an update to a production application running in the Google Cloud App Engine Standard service. The deployment was successful, but your operations team are unable to find this deployment in the production GCP project. What should you do?

A. Review the project settings in the App Engine application configuration files.
B. Review the properties of the active gcloud configurations by executing gcloud config list.
C. Review the project settings in the App Engine deployment YAML file.
D. Review the project settings in the Deployment Manager console.

  21. You have a Cloud Function that is triggered every night by Cloud Scheduler. The Cloud Function creates a snapshot of VMs running in all projects in the department. Your team created a new project ptech-vm, and you now need to provide IAM access to the service account used by the Cloud Function to let it create snapshots of VMs in the new ptech-vm project. You want to follow Google recommended practices. What should you do?

A. Grant Compute Storage Admin IAM role on the ptech-vm project to the service account used by the Cloud Function.
B. Use gcloud to generate a JSON key for the existing service account used by the Cloud Function. Add a metadata tag to all compute engine instances in the ptech-vm project with key: service-account and value: .
C. Set the scope of the service account to Read/Write when provisioning compute engine instances in the ptech-vm project.
D. Use gcloud to generate a JSON key for the existing service account used by the Cloud Function. Register the JSON key as SSH key on all VM instances in the ptech-vm project.

  22. Your company has multiple GCP projects in several regions, and your operations team have created numerous gcloud configurations for most common operational needs. They have asked your help to retrieve an inactive gcloud configuration and the GKE clusters that use it, using the least number of steps. What command should you execute to retrieve this information?

A. Execute gcloud config configurations describe.
B. Execute gcloud config configurations activate, then gcloud config list.
C. Execute kubectl config use-context, then kubectl config view.
D. Execute kubectl config get-contexts.

  23. Your team uses Splunk for centralized logging and you have a number of reports and dashboards based on the logs in Splunk. You want to install Splunk forwarder on all nodes of your new Kubernetes Engine Autoscaled Cluster. The Splunk forwarder forwards the logs to a centralized Splunk Server. You want to minimize operational overhead. What is the best way to install Splunk Forwarder on all nodes in the cluster?

A. SSH to each node and run a script to install the forwarder agent.
B. Include the forwarder agent in a DaemonSet deployment.
C. Use Deployment Manager to orchestrate the deployment of forwarder agents on all nodes.
D. Include the forwarder agent in a StatefulSet deployment.
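
Note: a DaemonSet schedules one copy of a pod on every node, including nodes added later by the autoscaler. A minimal sketch for reference (the image name and tag are assumptions):

```yaml
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: splunk-forwarder
spec:
  selector:
    matchLabels:
      app: splunk-forwarder
  template:
    metadata:
      labels:
        app: splunk-forwarder
    spec:
      containers:
      - name: forwarder
        image: splunk/universalforwarder:9.0   # assumed image/tag
```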

  24. You work for a leading retail platform that enables its retailers to sell their items to over 200 million users worldwide. You persist all analytics data captured during user navigation to BigQuery. A business analyst wants to run a query to identify products that were popular with buyers in the recent Thanksgiving sale. The analyst understands the query needs to iterate through billions of rows to fetch the required information, but is not sure of the costs involved in the on-demand pricing model, and has asked you to help estimate the query cost. What should you do?

A. Run the query using bq with the --dry_run flag to estimate the number of bytes read by the query. Make use of the pricing calculator to estimate the query cost.
B. Run the query using bq with the --dry_run flag to estimate the number of bytes returned by the query. Make use of the pricing calculator to estimate the query cost.
C. Execute the query using bq to estimate the number of rows returned by the query. Make use of the pricing calculator to estimate the query cost.
D. Switch to BigQuery flat-rate pricing. Coordinate with the analyst to run the query while on flat-rate pricing and switch back to on-demand pricing.
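
Note: a dry run with the bq CLI reports the bytes the query would scan without executing it; the project, dataset and query below are placeholders.

```shell
# Validates the query and prints the estimated bytes processed;
# no slots are used and nothing is billed.
bq query --use_legacy_sql=false --dry_run \
  'SELECT product_id, COUNT(*) AS views
   FROM `myproject.analytics.page_views`
   GROUP BY product_id'
```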

  25. Your company has many Citrix services deployed in the on-premises data centre, and they all connect to the Citrix Licensing Server on 10.10.10.10 in the same data centre. Your company wants to migrate the Citrix Licensing Server and all Citrix services to Google Cloud Platform. You want to minimize changes while ensuring the services can continue to connect to the Citrix Licensing Server. How should you do this in Google Cloud?

A. Use gcloud compute addresses create to reserve 10.10.10.10 as a static external IP and assign it to the Citrix Licensing Server VM Instance.
B. Deploy the Citrix Licensing Server on a Google Compute Engine instance and set its ephemeral IP address to 10.10.10.10.
C. Deploy the Citrix Licensing Server on a Google Compute Engine instance with an ephemeral IP address. Once the server is responding to requests, promote the ephemeral IP address to a static internal IP address.
D. Use gcloud compute addresses create to reserve 10.10.10.10 as a static internal IP and assign it to the Citrix Licensing Server VM Instance.

  26. Your company hosts a number of applications in Google Cloud and requires that log messages from all applications be archived for 10 years to comply with local regulatory requirements. Which approach should you use?

A. Grant the security team access to the logs in each Project
B. 1. Enable Stackdriver Logging API, 2. Configure web applications to send logs to Stackdriver, 3. Export logs to BigQuery
C. 1. Enable Stackdriver Logging API, 2. Configure web applications to send logs to Stackdriver, 3. Export logs to Google Cloud Storage
D. 1. Enable Stackdriver Logging API, 2. Configure web applications to send logs to Stackdriver

  27. You want to deploy a Python application to an autoscaled managed instance group on Compute Engine. You want to use GCP Deployment Manager to do this. What is the fastest way to get the application onto the instances without introducing undue complexity?

A. Include a startup script to bootstrap the Python application when creating an instance template by running gcloud compute instance-templates create app-template --startup-script=/scripts/install_app.sh
B. Include a startup script to bootstrap the Python application when creating an instance template by running gcloud compute instance-templates create app-template --metadata-from-file startup-script=/scripts/install_app.sh
C. Once the instance starts up, connect over SSH and install the application.
D. Include a startup script to bootstrap the Python application when creating an instance template by running gcloud compute instance-templates create app-template --metadata-from-file startup-script-url=/scripts/install_app.sh

  28. You deployed your application to a default node pool on the GKE cluster and you want to configure cluster autoscaling for this GKE cluster. For your application to be profitable, you must limit the number of Kubernetes nodes to 10. You want to start small and scale up as traffic increases and scale down when the traffic goes down. What should you do?

A. Update the existing GKE cluster to enable autoscaling by running the command gcloud container clusters update [CLUSTER_NAME] --enable-autoscaling --min-nodes=1 --max-nodes=10
B. Set up a Stackdriver alert to detect slowness in the application. When the alert is triggered, increase nodes in the cluster by running the command gcloud container clusters resize [CLUSTER_NAME] --size .
C. Create a new GKE cluster by running the command gcloud container clusters create [CLUSTER_NAME] --enable-autoscaling --min-nodes=1 --max-nodes=10.
D. To enable autoscaling, redeploy your application and add a tag to the instances in the cluster by running the command gcloud compute instances add-tags [INSTANCE] --tags=enable-autoscaling,min-nodes=1,max-nodes=1
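
Note: the two autoscaling commands referenced in the options look roughly like the sketch below; the cluster name is a placeholder, and flags should be checked against current gcloud documentation.

```shell
# Enable autoscaling on an existing cluster's node pool...
gcloud container clusters update my-cluster \
    --enable-autoscaling --min-nodes=1 --max-nodes=10

# ...or create a new cluster with autoscaling from the start.
gcloud container clusters create my-cluster \
    --enable-autoscaling --min-nodes=1 --max-nodes=10
```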

  29. Your company plans to store sensitive PII data in a cloud storage bucket. Your compliance department has asked you to ensure the objects in this bucket are encrypted by customer-managed encryption keys. What should you do?

A. In the bucket advanced settings, select Customer-supplied key and then select a Cloud KMS encryption key.
B. In the bucket advanced settings, select Customer-managed key and then select a Cloud KMS encryption key.
C. Recreate the bucket to use a Customer-managed key. Encryption can only be specified at the time of bucket creation.
D. In the bucket advanced settings, select Google-managed key and then select a Cloud KMS encryption key.

  30. You have developed an enhancement for a photo compression application running on the App Engine Standard service in Google Cloud Platform, and you want to canary test this enhancement on a small percentage of live users. How can you do this?

A. Deploy the enhancement as a new App Engine Application in the existing GCP project. Make use of App Engine native routing to have the old App Engine application proxy 1% of the requests to the new App Engine application.
B. Use gcloud app deploy to deploy the enhancement as a new version in the existing application, and use the --splits flag to split the traffic between the old version and the new version. Assign a weight of 1 to the new version and 99 to the old version.
C. Use gcloud app deploy to deploy the enhancement as a new version in the existing application with the --migrate flag.
D. Deploy the enhancement as a new App Engine Application in the existing GCP project. Configure the network load balancer to route 99% of the requests to the old (existing) App Engine Application and 1% to the new App Engine Application.
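
Note: App Engine traffic splitting happens at the service level, not via a load balancer; a sketch with assumed version IDs v1 (current) and v2 (canary):

```shell
# Deploy the new version without routing any traffic to it.
gcloud app deploy --version=v2 --no-promote

# Send 1% of traffic to the canary, 99% to the current version.
gcloud app services set-traffic default --splits=v1=99,v2=1
```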