
SAA-C03 Exam Dumps - Amazon Web Services AWS Certified Associate Questions and Answers

Question # 224

A gaming company uses Amazon DynamoDB to store user information such as geographic location, player data, and leaderboards. The company needs to configure continuous backups to an Amazon S3 bucket with a minimal amount of coding. The backups must not affect availability of the application and must not affect the read capacity units (RCUs) that are defined for the table.

Which solution meets these requirements?

Options:

A.

Use an Amazon EMR cluster. Create an Apache Hive job to back up the data to Amazon S3.

B.

Export the data directly from DynamoDB to Amazon S3 with continuous backups. Turn on point-in-time recovery for the table.

C.

Configure Amazon DynamoDB Streams. Create an AWS Lambda function to consume the stream and export the data to an Amazon S3 bucket.

D.

Create an AWS Lambda function to export the data from the database tables to Amazon S3 on a regular basis. Turn on point-in-time recovery for the table.
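
For readers unfamiliar with the native DynamoDB backup features named in the options above, here is a minimal boto3 sketch of enabling point-in-time recovery on a table and requesting an export to S3. The table name, account ID, and bucket name are hypothetical placeholders.

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Enable point-in-time recovery (continuous backups) on a hypothetical table.
dynamodb.update_continuous_backups(
    TableName="PlayerData",
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)

# Request a full table export to S3. Exports read from the continuous backup,
# so they do not consume the table's read capacity units.
dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:111122223333:table/PlayerData",
    S3Bucket="example-dynamodb-exports",
    ExportFormat="DYNAMODB_JSON",
)
```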

Question # 225

A company needs to implement a new data retention policy for regulatory compliance. As part of this policy, sensitive documents that are stored in an Amazon S3 bucket must be protected from deletion or modification for a fixed period of time.

Which solution will meet these requirements?

Options:

A.

Activate S3 Object Lock on the required objects and enable governance mode.

B.

Activate S3 Object Lock on the required objects and enable compliance mode.

C.

Enable versioning on the S3 bucket. Set a lifecycle policy to delete the objects after a specified period.

D.

Configure an S3 Lifecycle policy to transition objects to S3 Glacier Flexible Retrieval for the retention duration.
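
As background on S3 Object Lock, the sketch below applies a compliance-mode retention period to an object with boto3. Object Lock must first be enabled on the bucket; the bucket name, key, and retention date here are hypothetical.

```python
import boto3
from datetime import datetime, timezone

s3 = boto3.client("s3")

# Apply a compliance-mode retention period to an existing object version.
# In compliance mode, no user (including the root user) can delete or
# overwrite the object version until the retain-until date passes.
s3.put_object_retention(
    Bucket="example-compliance-docs",
    Key="reports/2024-q1.pdf",
    Retention={
        "Mode": "COMPLIANCE",
        "RetainUntilDate": datetime(2030, 1, 1, tzinfo=timezone.utc),
    },
)
```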

Question # 226

A company that uses AWS needs a solution to predict the resources needed for manufacturing processes each month. The solution must use historical values that are currently stored in an Amazon S3 bucket. The company has no machine learning (ML) experience and wants to use a managed service for the training and predictions.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Deploy an Amazon SageMaker model. Create a SageMaker endpoint for inference.

B.

Use Amazon SageMaker to train a model by using the historical data in the S3 bucket.

C.

Configure an AWS Lambda function with a function URL that uses Amazon SageMaker endpoints to create predictions based on the inputs.

D.

Configure an AWS Lambda function with a function URL that uses an Amazon Forecast predictor to create a prediction based on the inputs.

E.

Train an Amazon Forecast predictor by using the historical data in the S3 bucket.

Question # 227

An application uses an Amazon RDS MySQL DB instance. The RDS database is becoming low on disk space. A solutions architect wants to increase the disk space without downtime.

Which solution meets these requirements with the LEAST amount of effort?

Options:

A.

Enable storage autoscaling in RDS.

B.

Increase the RDS database instance size.

C.

Change the RDS database instance storage type to Provisioned IOPS.

D.

Back up the RDS database, increase the storage capacity, restore the database, and stop the previous instance.
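
For context, RDS storage autoscaling (the feature referenced in option A) is enabled by setting a maximum allocated storage threshold on the instance. A minimal boto3 sketch, with a hypothetical instance identifier and limit, follows.

```python
import boto3

rds = boto3.client("rds")

# Enable storage autoscaling by setting MaxAllocatedStorage (in GiB).
# RDS then grows the allocated storage automatically, without downtime,
# whenever free space runs low.
rds.modify_db_instance(
    DBInstanceIdentifier="example-mysql-instance",
    MaxAllocatedStorage=1000,
    ApplyImmediately=True,
)
```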

Question # 228

An ecommerce company has an order-processing application that uses Amazon API Gateway and an AWS Lambda function. The application stores data in an Amazon Aurora PostgreSQL database. During a recent sales event, a sudden surge in customer orders occurred. Some customers experienced timeouts, and the application did not process the orders of those customers. A solutions architect determined that the CPU utilization and memory utilization were high on the database because of a large number of open connections. The solutions architect needs to prevent the timeout errors while making the least possible changes to the application.

Which solution will meet these requirements?

Options:

A.

Configure provisioned concurrency for the Lambda function. Modify the database to be a global database in multiple AWS Regions.

B.

Use Amazon RDS Proxy to create a proxy for the database. Modify the Lambda function to use the RDS Proxy endpoint instead of the database endpoint.

C.

Create a read replica for the database in a different AWS Region. Use query string parameters in API Gateway to route traffic to the read replica.

D.

Migrate the data from Aurora PostgreSQL to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS). Modify the Lambda function to use the DynamoDB table.
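
As a reference for the RDS Proxy approach mentioned in option B, the sketch below creates a proxy for a PostgreSQL-family database with boto3. The proxy name, secret ARN, role ARN, and subnet IDs are hypothetical placeholders.

```python
import boto3

rds = boto3.client("rds")

# Create a proxy that pools and reuses database connections, shielding the
# database from connection storms caused by bursts of Lambda invocations.
rds.create_db_proxy(
    DBProxyName="orders-db-proxy",
    EngineFamily="POSTGRESQL",
    Auth=[
        {
            "AuthScheme": "SECRETS",
            "SecretArn": "arn:aws:secretsmanager:us-east-1:111122223333:secret:orders-db-creds",
            "IAMAuth": "DISABLED",
        }
    ],
    RoleArn="arn:aws:iam::111122223333:role/rds-proxy-secrets-role",
    VpcSubnetIds=["subnet-0abc1234", "subnet-0def5678"],
)
# The Lambda function would then connect to the proxy endpoint instead of
# the database endpoint; no other application changes are required.
```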

Question # 229

A company needs to connect several VPCs in the us-east-1 Region that span hundreds of AWS accounts. The company's networking team has its own AWS account to manage the cloud network.

What is the MOST operationally efficient solution to connect the VPCs?

Options:

A.

Set up VPC peering connections between each VPC. Update each associated subnet's route table.

B.

Configure a NAT gateway and an internet gateway in each VPC to connect each VPC through the internet.

C.

Create an AWS Transit Gateway in the networking team's AWS account. Configure static routes from each VPC.

D.

Deploy VPN gateways in each VPC. Create a transit VPC in the networking team's AWS account to connect to each VPC.
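
For background on the hub-and-spoke pattern referenced in option C, the sketch below creates a transit gateway and attaches one VPC to it with boto3; in practice the gateway would be shared to the other accounts through AWS Resource Access Manager. All IDs are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create the hub transit gateway in the networking team's account.
tgw = ec2.create_transit_gateway(Description="shared-network-hub")
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# Attach one spoke VPC (repeated per VPC, typically after sharing the
# transit gateway to the other accounts).
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0abc1234",
    SubnetIds=["subnet-0abc1234"],
)
```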

Question # 230

A bicycle sharing company is developing a multi-tier architecture to track the location of its bicycles during peak operating hours. The company wants to use these data points in its existing analytics platform. A solutions architect must determine the most viable multi-tier option to support this architecture. The data points must be accessible from the REST API.

Which action meets these requirements for storing and retrieving location data?

Options:

A.

Use Amazon Athena with Amazon S3.

B.

Use Amazon API Gateway with AWS Lambda.

C.

Use Amazon QuickSight with Amazon Redshift.

D.

Use Amazon API Gateway with Amazon Kinesis Data Analytics.

Question # 231

A company has data collection sensors at different locations. The data collection sensors stream a high volume of data to the company. The company wants to design a platform on AWS to ingest and process high-volume streaming data. The solution must be scalable and support data collection in near real time. The company must store the data in Amazon S3 for future reporting.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon Kinesis Data Firehose to deliver streaming data to Amazon S3.

B.

Use AWS Glue to deliver streaming data to Amazon S3.

C.

Use AWS Lambda to deliver streaming data and store the data to Amazon S3.

D.

Use AWS Database Migration Service (AWS DMS) to deliver streaming data to Amazon S3.
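
As background on the delivery-to-S3 pattern in option A, here is a minimal boto3 sketch that creates a Kinesis Data Firehose delivery stream with an S3 destination and sends it a record. The stream name, role ARN, and bucket ARN are hypothetical.

```python
import boto3

firehose = boto3.client("firehose")

# Create a delivery stream that buffers incoming records and writes them
# to S3 in near real time; the service is fully managed and scales itself.
firehose.create_delivery_stream(
    DeliveryStreamName="sensor-ingest-stream",
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleArn": "arn:aws:iam::111122223333:role/firehose-s3-delivery-role",
        "BucketARN": "arn:aws:s3:::example-sensor-data",
        "BufferingHints": {"IntervalInSeconds": 60, "SizeInMBs": 5},
    },
)

# The ingest layer then calls put_record or put_record_batch per reading.
firehose.put_record(
    DeliveryStreamName="sensor-ingest-stream",
    Record={"Data": b'{"sensor_id": "s-42", "reading": 19.7}\n'},
)
```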

Question # 232

A law firm needs to share information with the public. The information includes hundreds of files that must be publicly readable. Modifications or deletions of the files by anyone before a designated future date are prohibited.

Which solution will meet these requirements in the MOST secure way?

Options:

A.

Upload all files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the designated date.

B.

Create a new Amazon S3 bucket with S3 Versioning enabled. Use S3 Object Lock with a retention period in accordance with the designated date. Configure the S3 bucket for static website hosting. Set an S3 bucket policy to allow read-only access to the objects.

C.

Create a new Amazon S3 bucket with S3 Versioning enabled. Configure an event trigger to run an AWS Lambda function in case of object modification or deletion. Configure the Lambda function to replace the objects with the original versions from a private S3 bucket.

D.

Upload all files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period in accordance with the designated date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.

Question # 233

A company hosts multiple production applications. One of the applications consists of resources from Amazon EC2, AWS Lambda, Amazon RDS, Amazon Simple Notification Service (Amazon SNS), and Amazon Simple Queue Service (Amazon SQS) across multiple AWS Regions. All company resources are tagged with a tag name of “application” and a value that corresponds to each application. A solutions architect must provide the quickest solution for identifying all of the tagged components.

Which solution meets these requirements?

Options:

A.

Use AWS CloudTrail to generate a list of resources with the application tag.

B.

Use the AWS CLI to query each service across all Regions to report the tagged components.

C.

Run a query in Amazon CloudWatch Logs Insights to report on the components with the application tag.

D.

Run a query with the AWS Resource Groups Tag Editor to report on the resources globally with the application tag.
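
For context, the Tag Editor console is backed by the Resource Groups Tagging API, which can also be queried programmatically; a minimal boto3 sketch with a hypothetical tag value follows. The API is Regional, so a multi-Region inventory requires iterating over the Regions in use.

```python
import boto3

# The Tagging API is Regional; loop over the Regions the application spans.
for region in ["us-east-1", "eu-west-1"]:
    tagging = boto3.client("resourcegroupstaggingapi", region_name=region)
    paginator = tagging.get_paginator("get_resources")
    for page in paginator.paginate(
        TagFilters=[{"Key": "application", "Values": ["order-service"]}]
    ):
        for resource in page["ResourceTagMappingList"]:
            print(region, resource["ResourceARN"])
```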

Question # 234

A company runs production workloads in its AWS account. Multiple teams create and maintain the workloads.

The company needs to be able to detect changes in resource configurations. The company needs to capture changes as configuration items without changing or modifying the existing resources.

Which solution will meet these requirements?

Options:

A.

Use AWS Config. Start the configuration recorder for AWS resources to detect changes in resource configurations.

B.

Use AWS CloudFormation. Initiate drift detection to capture changes in resource configurations.

C.

Use Amazon Detective to detect, analyze, and investigate changes in resource configurations.

D.

Use AWS Audit Manager to capture management events and global service events for resource configurations.
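
As a reference for how an AWS Config configuration recorder is set up, the boto3 sketch below creates and starts a recorder for all supported resource types. The role ARN and delivery bucket are hypothetical.

```python
import boto3

config = boto3.client("config")

# Record configuration changes for all supported resource types.
config.put_configuration_recorder(
    ConfigurationRecorder={
        "name": "default",
        "roleARN": "arn:aws:iam::111122223333:role/aws-config-recorder-role",
        "recordingGroup": {"allSupported": True, "includeGlobalResourceTypes": True},
    }
)

# Configuration items are delivered to an S3 bucket via a delivery channel.
config.put_delivery_channel(
    DeliveryChannel={"name": "default", "s3BucketName": "example-config-history"}
)

# Start recording; existing resources are only observed, never modified.
config.start_configuration_recorder(ConfigurationRecorderName="default")
```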

Question # 235

A company has two VPCs named Management and Production. The Management VPC uses VPNs through a customer gateway to connect to a single device in the data center. The Production VPC uses a virtual private gateway with AWS Direct Connect connections. The Management and Production VPCs both use a single VPC peering connection to allow communication between the two VPCs.

What should a solutions architect do to mitigate any single point of failure in this architecture?

Options:

A.

Add a set of VPNs between the Management and Production VPCs.

B.

Add a second virtual private gateway and attach it to the Management VPC.

C.

Add a second set of VPNs to the Management VPC from a second customer gateway device.

D.

Add a second VPC peering connection between the Management VPC and the Production VPC.

Question # 236

An ecommerce company is running a seasonal online sale. The company hosts its website on Amazon EC2 instances spanning multiple Availability Zones. The company wants its website to manage sudden traffic increases during the sale.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an Auto Scaling group that is large enough to handle peak traffic load. Stop half of the Amazon EC2 instances. Configure the Auto Scaling group to use the stopped instances to scale out when traffic increases.

B.

Create an Auto Scaling group for the website. Set the minimum size of the Auto Scaling group so that it can handle high traffic volumes without the need to scale out.

C.

Use Amazon CloudFront and Amazon ElastiCache to cache dynamic content with an Auto Scaling group set as the origin. Configure the Auto Scaling group with the instances necessary to populate CloudFront and ElastiCache. Scale in after the cache is fully populated.

D.

Configure an Auto Scaling group to scale out as traffic increases. Create a launch template to start new instances from a preconfigured Amazon Machine Image (AMI).

Question # 237

A company uses Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS) to run its self-managed database. The company has 350 TB of data spread across all EBS volumes. The company takes daily EBS snapshots and keeps the snapshots for 1 month. The daily change rate is 5% of the EBS volumes.

Because of new regulations, the company needs to keep the monthly snapshots for 7 years. The company needs to change its backup strategy to comply with the new regulations and to ensure that data is available with minimal administrative effort.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Keep the daily snapshot in the EBS snapshot standard tier for 1 month. Copy the monthly snapshot to Amazon S3 Glacier Deep Archive with a 7-year retention period.

B.

Continue with the current EBS snapshot policy. Add a new policy to move the monthly snapshot to Amazon EBS Snapshots Archive with a 7-year retention period.

C.

Keep the daily snapshot in the EBS snapshot standard tier for 1 month. Keep the monthly snapshot in the standard tier for 7 years. Use incremental snapshots.

D.

Keep the daily snapshot in the EBS snapshot standard tier. Use EBS direct APIs to take snapshots of all the EBS volumes every month. Store the snapshots in an Amazon S3 bucket in the Infrequent Access tier for 7 years.
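
For background on EBS Snapshots Archive, mentioned in option B, a snapshot can be moved to the archive tier with a single API call; Amazon Data Lifecycle Manager can automate this on a schedule. The snapshot ID below is hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

# Move a monthly snapshot to the lower-cost archive storage tier.
# The snapshot remains restorable (restore can take up to 72 hours) and can
# be retained for years without keeping it in the standard tier.
ec2.modify_snapshot_tier(
    SnapshotId="snap-0123456789abcdef0",
    StorageTier="archive",
)
```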

Question # 238

A company hosts a website on Amazon EC2 instances behind an Application Load Balancer (ALB). The website serves static content. Website traffic is increasing, and the company is concerned about a potential increase in cost.

What should a solutions architect do to reduce the cost of the website?

Options:

A.

Create an Amazon CloudFront distribution to cache static files at edge locations.

B.

Create an Amazon ElastiCache cluster. Connect the ALB to the ElastiCache cluster to serve cached files.

C.

Create an AWS WAF web ACL and associate it with the ALB. Add a rule to the web ACL to cache static files.

D.

Create a second ALB in an alternative AWS Region. Route user traffic to the closest Region to minimize data transfer costs.

Question # 239

A company runs an application on Amazon EC2 instances. The company needs to implement a disaster recovery (DR) solution for the application. The DR solution needs to have a recovery time objective (RTO) of less than 4 hours. The DR solution also needs to use the fewest possible AWS resources during normal operations.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Create Amazon Machine Images (AMIs) to back up the EC2 instances. Copy the AMIs to a secondary AWS Region. Automate infrastructure deployment in the secondary Region by using AWS Lambda and custom scripts.

B.

Create Amazon Machine Images (AMIs) to back up the EC2 instances. Copy the AMIs to a secondary AWS Region. Automate infrastructure deployment in the secondary Region by using AWS CloudFormation.

C.

Launch EC2 instances in a secondary AWS Region. Keep the EC2 instances in the secondary Region active at all times.

D.

Launch EC2 instances in a secondary Availability Zone. Keep the EC2 instances in the secondary Availability Zone active at all times.

Question # 240

A company moved its on-premises PostgreSQL database to an Amazon RDS for PostgreSQL DB instance. The company successfully launched a new product. The workload on the database has increased.

The company wants to accommodate the larger workload without adding infrastructure.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Buy reserved DB instances for the total workload. Make the Amazon RDS for PostgreSQL DB instance larger.

B.

Make the Amazon RDS for PostgreSQL DB instance a Multi-AZ DB instance.

C.

Buy reserved DB instances for the total workload. Add another Amazon RDS for PostgreSQL DB instance.

D.

Make the Amazon RDS for PostgreSQL DB instance an on-demand DB instance.

Question # 241

A company runs demonstration environments for its customers on Amazon EC2 instances. Each environment is isolated in its own VPC. The company's operations team needs to be notified when RDP or SSH access to an environment has been established.

Which solution will meet this requirement?

Options:

A.

Configure Amazon CloudWatch Application Insights to create AWS Systems Manager OpsItems when RDP or SSH access is detected.

B.

Configure the EC2 instances with an IAM instance profile that has an IAM role with the AmazonSSMManagedInstanceCore policy attached.

C.

Publish VPC flow logs to Amazon CloudWatch Logs. Create required metric filters. Create an Amazon CloudWatch metric alarm with a notification action for when the alarm is in the ALARM state.

D.

Configure an Amazon EventBridge rule to listen for events of type EC2 Instance State-change Notification. Configure an Amazon Simple Notification Service (Amazon SNS) topic as a target. Subscribe the operations team to the topic.
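
As background on option C, VPC Flow Logs delivered to CloudWatch Logs can be matched with a metric filter on the SSH (22) and RDP (3389) destination ports and alarmed on. The boto3 sketch below uses hypothetical log group, namespace, and SNS topic names.

```python
import boto3

logs = boto3.client("logs")
cloudwatch = boto3.client("cloudwatch")

# Count accepted flow-log records destined for SSH (22) or RDP (3389).
logs.put_metric_filter(
    logGroupName="/vpc/flow-logs/demo-env",
    filterName="ssh-rdp-access",
    filterPattern="[version, account, eni, source, destination, srcport, "
                  "dstport = 22 || dstport = 3389, protocol, packets, bytes, "
                  "start, end, action = ACCEPT, status]",
    metricTransformations=[
        {
            "metricName": "RemoteAccessConnections",
            "metricNamespace": "DemoEnvironments",
            "metricValue": "1",
        }
    ],
)

# Alarm (with an SNS notification action) when any matching record appears.
cloudwatch.put_metric_alarm(
    AlarmName="demo-env-remote-access",
    Namespace="DemoEnvironments",
    MetricName="RemoteAccessConnections",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:111122223333:ops-notifications"],
)
```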

Question # 242

A business application is hosted on Amazon EC2 and uses Amazon S3 for encrypted object storage. The chief information security officer has directed that no application traffic between the two services should traverse the public internet.

Which capability should the solutions architect use to meet the compliance requirements?

Options:

A.

AWS Key Management Service (AWS KMS)

B.

VPC endpoint

C.

Private subnet

D.

Virtual private gateway
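
For context, a gateway VPC endpoint for S3 keeps EC2-to-S3 traffic on the AWS network; the boto3 sketch below creates one, with hypothetical VPC and route table IDs.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A gateway endpoint adds an S3 prefix-list route to the chosen route tables,
# so traffic to S3 never traverses the public internet.
ec2.create_vpc_endpoint(
    VpcId="vpc-0abc1234",
    ServiceName="com.amazonaws.us-east-1.s3",
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0abc1234"],
)
```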

Question # 243

A solutions architect needs to copy files from an Amazon S3 bucket to an Amazon Elastic File System (Amazon EFS) file system and another S3 bucket. The files must be copied continuously. New files are added to the original S3 bucket consistently. The copied files should be overwritten only if the source file changes.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS DataSync location for both the destination S3 bucket and the EFS file system. Create a task for the destination S3 bucket and the EFS file system. Set the transfer mode to transfer only data that has changed.

B.

Create an AWS Lambda function. Mount the file system to the function. Set up an S3 event notification to invoke the function when files are created and changed in Amazon S3. Configure the function to copy files to the file system and the destination S3 bucket.

C.

Create an AWS DataSync location for both the destination S3 bucket and the EFS file system. Create a task for the destination S3 bucket and the EFS file system. Set the transfer mode to transfer all data.

D.

Launch an Amazon EC2 instance in the same VPC as the file system. Mount the file system. Create a script to routinely synchronize all objects that changed in the origin S3 bucket to the destination S3 bucket and the mounted file system.
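
As background on the DataSync approach in options A and C, the sketch below creates an S3 location, an EFS location, and a task whose transfer mode copies only changed data; a second task would cover the second destination. All ARNs and role names are hypothetical.

```python
import boto3

datasync = boto3.client("datasync")

source_s3 = datasync.create_location_s3(
    S3BucketArn="arn:aws:s3:::example-source-bucket",
    S3Config={"BucketAccessRoleArn": "arn:aws:iam::111122223333:role/datasync-s3-role"},
)

dest_efs = datasync.create_location_efs(
    EfsFilesystemArn="arn:aws:elasticfilesystem:us-east-1:111122223333:file-system/fs-0abc1234",
    Ec2Config={
        "SubnetArn": "arn:aws:ec2:us-east-1:111122223333:subnet/subnet-0abc1234",
        "SecurityGroupArns": ["arn:aws:ec2:us-east-1:111122223333:security-group/sg-0abc1234"],
    },
)

# TransferMode=CHANGED copies only data and metadata that differ between the
# source and the destination, so unchanged files are not overwritten.
datasync.create_task(
    SourceLocationArn=source_s3["LocationArn"],
    DestinationLocationArn=dest_efs["LocationArn"],
    Options={"TransferMode": "CHANGED"},
)
```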

Exam Code: SAA-C03
Exam Name: AWS Certified Solutions Architect - Associate (SAA-C03)
Last Update: Jul 3, 2025
Questions: 1168