
Pass the Amazon Web Services AWS Certified DevOps Engineer - Professional (DOP-C02) exam questions and answers with ExamsMirror

Viewing page 1 of 10 (questions 1-10)
Question # 1:

A company runs an application on one Amazon EC2 instance. Application metadata is stored in Amazon S3 and must be retrieved if the instance is restarted. The instance must restart or relaunch automatically if the instance becomes unresponsive.

Which solution will meet these requirements?

Options:

A.

Create an Amazon CloudWatch alarm for the StatusCheckFailed metric. Use the recover action to stop and start the instance. Use an S3 event notification to push the metadata to the instance when the instance is back up and running.

B.

Configure AWS OpsWorks, and use the auto healing feature to stop and start the instance. Use a lifecycle event in OpsWorks to pull the metadata from Amazon S3 and update it on the instance.

C.

Use EC2 Auto Recovery to automatically stop and start the instance in case of a failure. Use an S3 event notification to push the metadata to the instance when the instance is back up and running.

D.

Use AWS CloudFormation to create an EC2 instance that includes the UserData property for the EC2 resource. Add a command in UserData to retrieve the application metadata from Amazon S3.
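For background on the user data approach mentioned above, here is a minimal sketch of a boot-time script that pulls application metadata from Amazon S3. It is illustrative only and is not an answer key; the bucket name, object key, and local path are placeholders.

# Illustrative boot-time script: fetch application metadata from S3.
# The bucket, key, and local path below are placeholders.
import boto3

S3_BUCKET = "example-app-metadata-bucket"
S3_KEY = "metadata/app-config.json"
LOCAL_PATH = "/opt/app/app-config.json"

def fetch_metadata():
    """Download the application metadata object from S3 to local disk."""
    s3 = boto3.client("s3")
    s3.download_file(S3_BUCKET, S3_KEY, LOCAL_PATH)

if __name__ == "__main__":
    fetch_metadata()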

Question # 2:

A company uses an organization in AWS Organizations to manage multiple AWS accounts. The company's internal auditors have administrative access to a single audit account within the organization. A DevOps engineer needs to provide a solution to give the auditors read-only access to all accounts within the organization, including new accounts created in the future.

Which solution will meet these requirements?

Options:

A.

Enable AWS IAM Identity Center for the organization. Create a read-only access permission set. Create a permission group that includes the auditors. Grant access to every account in the organization to the auditor permission group by using the read-only access permission set.

B.

Create an AWS CloudFormation stack set to deploy an IAM role that trusts the audit account and allows read-only access. Enable automatic deployment for the stack set. Set the organization root as a deployment target.

C.

Create an SCP that provides read-only access for users in the audit account. Apply the policy to the organization root.

D.

Enable AWS Config in the organization management account. Create an AWS managed rule to check for a role in each account that trusts the audit account and allows read-only access. Enable automated remediation to create the role if it does not exist.
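For background on the cross-account role pattern referenced in the options, here is a minimal boto3 sketch of how an auditor in the audit account could assume a read-only role in a member account. It is illustrative only; the account ID, role name, and session name are placeholders.

import boto3

MEMBER_ACCOUNT_ID = "111122223333"            # placeholder
ROLE_NAME = "OrganizationAuditReadOnlyRole"   # placeholder

sts = boto3.client("sts")

# Assume the read-only role in a member account from the audit account.
resp = sts.assume_role(
    RoleArn=f"arn:aws:iam::{MEMBER_ACCOUNT_ID}:role/{ROLE_NAME}",
    RoleSessionName="auditor-readonly-session",
)
creds = resp["Credentials"]

# Use the temporary credentials for read-only API calls in the member account.
ec2 = boto3.client(
    "ec2",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
print(ec2.describe_instances())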

Question # 3:

A company has an application and a CI/CD pipeline. The CI/CD pipeline consists of an AWS CodePipeline pipeline and an AWS CodeBuild project. The CodeBuild project runs tests against the application as part of the build process and outputs a test report. The company must keep the test reports for 90 days.

Which solution will meet these requirements?

Options:

A.

Add a new stage in the CodePipeline pipeline after the stage that contains the CodeBuild project. Create an Amazon S3 bucket to store the reports. Configure an S3 deploy action type in the new CodePipeline stage with the appropriate path and format for the reports.

B.

Add a report group in the CodeBuild project buildspec file with the appropriate path and format for the reports. Create an Amazon S3 bucket to store the reports. Configure an Amazon EventBridge rule that invokes an AWS Lambda function to copy the reports to the S3 bucket when a build is completed. Create an S3 Lifecycle rule to expire the objects after 90 days.

C.

Add a new stage in the CodePipeline pipeline. Configure a test action type with the appropriate path and format for the reports. Configure the report expiration time to be 90 days in the CodeBuild project buildspec file.

D.

Add a report group in the CodeBuild project buildspec file with the appropriate path and format for the reports. Create an Amazon S3 bucket to store the reports. Configure the report group as an artifact in the CodeBuild project buildspec file. Configure the S3 bucket as the artifact destination. Set the object expiration to 90 days.
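For background on expiring objects after 90 days, here is a minimal boto3 sketch of an S3 Lifecycle rule. It is illustrative only; the bucket name and prefix are placeholders.

import boto3

REPORTS_BUCKET = "example-test-reports-bucket"  # placeholder

s3 = boto3.client("s3")

# Expire report objects 90 days after creation.
s3.put_bucket_lifecycle_configuration(
    Bucket=REPORTS_BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-test-reports-after-90-days",
                "Filter": {"Prefix": "reports/"},
                "Status": "Enabled",
                "Expiration": {"Days": 90},
            }
        ]
    },
)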

Question # 4:

A company has a web application that publishes logs that contain metadata for transactions, with a status of success or failure for each log. The logs are in JSON format. The application publishes the logs to an Amazon CloudWatch Logs log group.

The company wants to create a dashboard that displays the number of successful transactions.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.

Create an Amazon OpenSearch Service cluster and an OpenSearch Service subscription filter to send the log group data to the cluster. Create a dashboard within the Dashboards feature in the OpenSearch Service cluster by using a search query for transactions that have a status of success.

B.

Create a CloudWatch subscription filter for the log group that uses an AWS Lambda function. Configure the Lambda function to parse the JSON logs and publish a custom metric to CloudWatch for transactions that have a status of success. Create a CloudWatch dashboard by using a metric graph that displays the custom metric.

C.

Create a CloudWatch metric filter for the log groups with a filter pattern that matches the transaction status property and a value of success. Create a CloudWatch dashboard by using a metric graph that displays the new metric.

D.

Create an Amazon Kinesis data stream that is subscribed to the log group. Configure the data stream to filter incoming log data based on a status of success and to send the filtered logs to an AWS Lambda function. Configure the Lambda function to publish a custom metric to CloudWatch. Create a CloudWatch dashboard by using a metric graph that displays the custom metric.
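For background on metric filters, here is a minimal boto3 sketch that creates a CloudWatch Logs metric filter which counts JSON log events with a status of success. It is illustrative only; the log group name, metric name, and namespace are placeholders.

import boto3

LOG_GROUP = "/example/web-app/transactions"  # placeholder

logs = boto3.client("logs")

# Emit a metric data point of 1 for every log event whose JSON "status" field is "success".
logs.put_metric_filter(
    logGroupName=LOG_GROUP,
    filterName="successful-transactions",
    filterPattern='{ $.status = "success" }',
    metricTransformations=[
        {
            "metricName": "SuccessfulTransactions",
            "metricNamespace": "ExampleApp",
            "metricValue": "1",
        }
    ],
)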

Question # 5:

A company is storing 100 GB of log data in .csv format in an Amazon S3 bucket. SQL developers want to query this data and generate graphs to visualize it. The SQL developers also need an efficient, automated way to store metadata from the .csv file.

Which combination of steps will meet these requirements with the LEAST amount of effort? (Select THREE.)

Options:

A.

Filter the data through AWS X-Ray to visualize the data.

B.

Filter the data through Amazon QuickSight to visualize the data.

C.

Query the data with Amazon Athena.

D.

Query the data with Amazon Redshift.

E.

Use the AWS Glue Data Catalog as the persistent metadata store.

F.

Use Amazon DynamoDB as the persistent metadata store.
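For background on querying S3 data with Athena, here is a minimal boto3 sketch that runs a SQL query against a table assumed to be registered in the AWS Glue Data Catalog. It is illustrative only; the database, table, and results bucket are placeholders.

import boto3

athena = boto3.client("athena")

# The database, table, and output bucket below are placeholders assumed to exist.
resp = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS total FROM example_logs GROUP BY status",
    QueryExecutionContext={"Database": "example_log_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(resp["QueryExecutionId"])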

Question # 6:

A company has configured an Amazon S3 event source on an AWS Lambda function. The company needs the Lambda function to run when a new object is created or an existing object is modified in a particular S3 bucket. The Lambda function will use the S3 bucket name and the S3 object key of the incoming event to read the contents of the created or modified S3 object. The Lambda function will parse the contents and save the parsed contents to an Amazon DynamoDB table.

The Lambda function's execution role has permissions to read from the S3 bucket and to write to the DynamoDB table. During testing, a DevOps engineer discovers that the Lambda function does not run when objects are added to the S3 bucket or when existing objects are modified.

Which solution will resolve this problem?

Options:

A.

Increase the memory of the Lambda function to give the function the ability to process large files from the S3 bucket.

B.

Create a resource policy on the Lambda function to grant Amazon S3 the permission to invoke the Lambda function for the S3 bucket.

C.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as an OnFailure destination for the Lambda function.

D.

Provision space in the /tmp folder of the Lambda function to give the function the ability to process large files from the S3 bucket.
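For background on Lambda resource-based policies, here is a minimal boto3 sketch that grants Amazon S3 permission to invoke a function for events from a specific bucket. It is illustrative only; the function name, bucket ARN, and account ID are placeholders.

import boto3

FUNCTION_NAME = "example-object-parser"                   # placeholder
SOURCE_BUCKET_ARN = "arn:aws:s3:::example-input-bucket"   # placeholder
ACCOUNT_ID = "111122223333"                               # placeholder

lambda_client = boto3.client("lambda")

# Add a statement to the function's resource-based policy so that
# Amazon S3 is allowed to invoke it for events from the source bucket.
lambda_client.add_permission(
    FunctionName=FUNCTION_NAME,
    StatementId="AllowInvokeFromS3Bucket",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=SOURCE_BUCKET_ARN,
    SourceAccount=ACCOUNT_ID,
)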

Question # 7:

A company wants to use AWS CloudFormation for infrastructure deployment. The company has strict tagging and resource requirements and wants to limit the deployment to two Regions. Developers will need to deploy multiple versions of the same application.

Which solution ensures resources are deployed in accordance with company policy?

Options:

A.

Create AWS Trusted Advisor checks to find and remediate unapproved CloudFormation StackSets.

B.

Create a CloudFormation drift detection operation to find and remediate unapproved CloudFormation StackSets.

C.

Create CloudFormation StackSets with approved CloudFormation templates.

D.

Create AWS Service Catalog products with approved CloudFormation templates.
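For background on CloudFormation StackSets, here is a minimal boto3 sketch that creates a stack set from an approved template and deploys stack instances into exactly two Regions. It is illustrative only; the template URL, account ID, and Region names are placeholders.

import boto3

cfn = boto3.client("cloudformation")

# The template URL, account ID, and Regions below are placeholders.
cfn.create_stack_set(
    StackSetName="approved-app-stack-set",
    TemplateURL="https://example-bucket.s3.amazonaws.com/approved-template.yaml",
)

# Deploy stack instances only into the two approved Regions.
cfn.create_stack_instances(
    StackSetName="approved-app-stack-set",
    Accounts=["111122223333"],
    Regions=["us-east-1", "us-west-2"],
)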

Question # 8:

A company wants to use AWS development tools to replace its current bash deployment scripts. The company currently deploys a LAMP application to a group of Amazon EC2 instances behind an Application Load Balancer (ALB). During the deployments, the company unit tests the committed application, stops and starts services, unregisters and re-registers instances with the load balancer, and updates file permissions. The company wants to maintain the same deployment functionality through the shift to using AWS services.

Which solution will meet these requirements?

Options:

A.

Use AWS CodeBuild to test the application. Use bash scripts invoked by AWS CodeDeploy's appspec.yml file to restart services, and deregister and register instances with the ALB. Use the appspec.yml file to update file permissions without a custom script.

B.

Use AWS CodePipeline to move the application from the AWS CodeCommit repository to AWS CodeDeploy. Use CodeDeploy's deployment group to test the application, unregister and re-register instances with the ALB, and restart services. Use the appspec.yml file to update file permissions without a custom script.

C.

Use AWS CodePipeline to move the application source code from the AWS CodeCommit repository to AWS CodeDeploy. Use CodeDeploy to test the application. Use CodeDeploy's appspec.yml file to restart services and update permissions without a custom script. Use AWS CodeBuild to unregister and re-register instances with the ALB.

D.

Use AWS CodePipeline to trigger AWS CodeBuild to test the application. Use bash scripts invoked by AWS CodeDeploy's appspec.yml file to restart services. Unregister and re-register the instances in the AWS CodeDeploy deployment group with the ALB. Update the appspec.yml file to update file permissions without a custom script.
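For background on the load balancer steps described in this scenario, here is a minimal boto3 sketch of the kind of deregister and register calls a deployment hook script could make against an ALB target group. It is illustrative only; the target group ARN and instance ID are placeholders.

import boto3

TARGET_GROUP_ARN = "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/example/abc123"  # placeholder
INSTANCE_ID = "i-0123456789abcdef0"  # placeholder

elbv2 = boto3.client("elbv2")

# Take the instance out of service before updating it...
elbv2.deregister_targets(
    TargetGroupArn=TARGET_GROUP_ARN,
    Targets=[{"Id": INSTANCE_ID}],
)

# ...and put it back once the deployment steps on the instance are finished.
elbv2.register_targets(
    TargetGroupArn=TARGET_GROUP_ARN,
    Targets=[{"Id": INSTANCE_ID}],
)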

Question # 9:

A company has an application that runs on AWS Lambda and sends logs to Amazon CloudWatch Logs. An Amazon Kinesis data stream is subscribed to the log groups in CloudWatch Logs. A single consumer Lambda function processes the logs from the data stream and stores the logs in an Amazon S3 bucket.

The company's DevOps team has noticed high latency during the processing and ingestion of some logs.

Which combination of steps will reduce the latency? (Select THREE.)

Options:

A.

Create a data stream consumer with enhanced fan-out. Set the Lambda function that processes the logs as the consumer.

B.

Increase the ParallelizationFactor setting in the Lambda event source mapping.

C.

Configure reserved concurrency for the Lambda function that processes the logs.

D.

Increase the batch size in the Kinesis data stream.

E.

Turn off the ReportBatchItemFailures setting in the Lambda event source mapping.

F.

Increase the number of shards in the Kinesis data stream.
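For background on the Kinesis and Lambda tuning knobs mentioned in the options, here is a minimal boto3 sketch that raises the ParallelizationFactor on an event source mapping and increases the stream's shard count. It is illustrative only; the mapping UUID, stream name, and target values are placeholders.

import boto3

EVENT_SOURCE_MAPPING_UUID = "00000000-0000-0000-0000-000000000000"  # placeholder
STREAM_NAME = "example-log-stream"                                  # placeholder

lambda_client = boto3.client("lambda")
kinesis = boto3.client("kinesis")

# Process more batches per shard concurrently (valid values are 1-10).
lambda_client.update_event_source_mapping(
    UUID=EVENT_SOURCE_MAPPING_UUID,
    ParallelizationFactor=10,
)

# Add shards to increase the stream's overall throughput.
kinesis.update_shard_count(
    StreamName=STREAM_NAME,
    TargetShardCount=4,
    ScalingType="UNIFORM_SCALING",
)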

Question # 10:

A company configured an Amazon S3 event source for an AWS Lambda function. The company needs the Lambda function to run when a new object is created or an existing object is modified in a specific S3 bucket. The Lambda function will use the S3 bucket name and the S3 object key of the incoming event to read the contents of the new or modified S3 object. The Lambda function will parse the contents and save the parsed contents to an Amazon DynamoDB table.

The Lambda function's execution role has permissions to read from the S3 bucket and to write to the DynamoDB table. During testing, a DevOps engineer discovers that the Lambda function does not run when objects are added to the S3 bucket or when existing objects are modified.

Which solution will resolve these problems?

Options:

A.

Create an S3 bucket policy for the S3 bucket that grants the S3 bucket permission to invoke the Lambda function.

B.

Create a resource policy for the Lambda function to grant Amazon S3 permission to invoke the Lambda function on the S3 bucket.

C.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as an OnFailure destination for the Lambda function. Update the Lambda function to process messages from the SQS queue and the S3 event notifications.

D.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as the destination for the S3 bucket event notifications. Update the Lambda function's execution role to have permission to read from the SQS queue. Update the Lambda function to consume messages from the SQS queue.
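For background on S3 event notifications, here is a minimal boto3 sketch that configures a bucket to send object-created events to an SQS queue. It is illustrative only; the bucket name and queue ARN are placeholders.

import boto3

BUCKET = "example-input-bucket"                                         # placeholder
QUEUE_ARN = "arn:aws:sqs:us-east-1:111122223333:example-object-events"  # placeholder

s3 = boto3.client("s3")

# Send a notification to the SQS queue whenever an object is created
# (new uploads and overwrites of existing objects both raise ObjectCreated events).
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": QUEUE_ARN,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)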
