SAA-C03 Exam Questions

Total 825 Questions

Last Updated: 30-Dec-2024

Topic 1: Exam Pool A

A survey company has gathered data for several years from areas in the United States. The company hosts the data in an Amazon S3 bucket that is 3 TB in size and growing. The company has started to share the data with a European marketing firm that has S3 buckets. The company wants to ensure that its data transfer costs remain as low as possible. Which solution will meet these requirements?


A. Configure the Requester Pays feature on the company's S3 bucket


B. Configure S3 Cross-Region Replication from the company’s S3 bucket to one of the marketing firm's S3 buckets.


C. Configure cross-account access for the marketing firm so that the marketing firm has access to the company’s S3 bucket.


D. Configure the company's S3 bucket to use S3 Intelligent-Tiering. Sync the S3 bucket to one of the marketing firm's S3 buckets.





A.
  Configure the Requester Pays feature on the company's S3 bucket

Explanation: "Typically, you configure buckets to be Requester Pays buckets when you want to share data but not incur charges associated with others accessing the data. For example, you might use Requester Pays buckets when making available large datasets, such as zip code directories, reference data, geospatial information, or web crawling data."
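As a sketch of how option A would be applied, the helper below builds the parameters for S3's PutBucketRequestPayment API call (the bucket name is a hypothetical example). With the payer set to "Requester", the marketing firm pays the request and data-transfer charges, while the bucket owner continues to pay only for storage.

```python
def requester_pays_config(bucket_name):
    """Build parameters for the S3 PutBucketRequestPayment API call.

    Setting Payer to "Requester" shifts request and data-transfer
    costs to whoever downloads the objects; the bucket owner still
    pays for storage.
    """
    return {
        "Bucket": bucket_name,
        "RequestPaymentConfiguration": {"Payer": "Requester"},
    }

# With boto3 the call would look like:
#   s3 = boto3.client("s3")
#   s3.put_bucket_request_payment(**requester_pays_config("survey-data"))
```

The requester must also include the `x-amz-request-payer` header (or `RequestPayer="requester"` in boto3) on each request, acknowledging that they will be billed.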

A company hosts a multi-tier web application on Amazon Linux Amazon EC2 instances behind an Application Load Balancer. The instances run in an Auto Scaling group across multiple Availability Zones. The company observes that the Auto Scaling group launches more On-Demand Instances when the application's end users access high volumes of static web content. The company wants to optimize cost. What should a solutions architect do to redesign the application MOST cost-effectively?


A. Update the Auto Scaling group to use Reserved Instances instead of On-Demand Instances.


B. Update the Auto Scaling group to scale by launching Spot Instances instead of On- Demand Instances.


C. Create an Amazon CloudFront distribution to host the static web contents from an Amazon S3 bucket.


D. Create an AWS Lambda function behind an Amazon API Gateway API to host the static website contents.





C.
  Create an Amazon CloudFront distribution to host the static web contents from an Amazon S3 bucket.

Explanation: This answer is correct because it meets the requirements of optimizing cost and reducing the workload on the database. Amazon CloudFront is a content delivery network (CDN) service that speeds up distribution of your static and dynamic web content, such as .html, .css, .js, and image files, to your users. CloudFront delivers your content through a worldwide network of data centers called edge locations. When a user requests content that you’re serving with CloudFront, the request is routed to the edge location that provides the lowest latency (time delay), so that content is delivered with the best possible performance. You can create an Amazon CloudFront distribution to host the static web contents from an Amazon S3 bucket, which is an origin that you define for CloudFront. This way, you can offload the requests for static web content from your EC2 instances to CloudFront, which can improve the performance and availability of your website, and reduce the cost of running your EC2 instances.
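A minimal sketch of the distribution described in option C is shown below as the configuration dictionary a CreateDistribution call would take (the bucket domain name and origin access control ID are hypothetical placeholders). The cache policy ID shown is AWS's managed CachingOptimized policy.

```python
def cloudfront_s3_origin_config(bucket_domain, oac_id):
    """Minimal CloudFront distribution config that serves static
    content from an S3 origin, offloading those requests from the
    EC2 instances behind the ALB."""
    return {
        "CallerReference": "static-content-offload",
        "Comment": "Serve static web content from S3",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [{
                "Id": "static-s3-origin",
                "DomainName": bucket_domain,
                # Origin access control keeps the bucket private
                "OriginAccessControlId": oac_id,
                "S3OriginConfig": {"OriginAccessIdentity": ""},
            }],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "static-s3-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            # AWS managed CachingOptimized cache policy
            "CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6",
        },
    }
```

In practice the website would reference static assets through the distribution's domain name, so only dynamic requests reach the Auto Scaling group.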

A company's website uses an Amazon EC2 instance store for its catalog of items. The company wants to make sure that the catalog is highly available and that the catalog is stored in a durable location.

What should a solutions architect do to meet these requirements?


A. Move the catalog to Amazon ElastiCache for Redis.


B. Deploy a larger EC2 instance with a larger instance store.


C. Move the catalog from the instance store to Amazon S3 Glacier Deep Archive.


D. Move the catalog to an Amazon Elastic File System (Amazon EFS) file system.





D.
  Move the catalog to an Amazon Elastic File System (Amazon EFS) file system.

Explanation: Moving the catalog to an Amazon Elastic File System (Amazon EFS) file system provides both high availability and durability. Amazon EFS is a fully-managed, highly-available, and durable file system that is built to scale on demand. With Amazon EFS, the catalog data can be stored and accessed from multiple EC2 instances in different availability zones, ensuring high availability. Also, Amazon EFS automatically stores files redundantly within and across multiple availability zones, making it a durable storage option.

A company has containerized a Windows job that runs on the .NET 6 framework in a Windows container. The company wants to run this job in the AWS Cloud. The job runs every 10 minutes, and its runtime varies between 1 minute and 3 minutes. Which solution will meet these requirements MOST cost-effectively?


A. Create an AWS Lambda function based on the container image of the job. Configure Amazon EventBridge to invoke the function every 10 minutes.


B. Use AWS Batch to create a job that uses AWS Fargate resources. Configure the job scheduling to run every 10 minutes.


C. Use Amazon Elastic Container Service (Amazon ECS) on AWS Fargate to run the job. Create a scheduled task based on the container image of the job to run every 10 minutes.


D. Use Amazon Elastic Container Service (Amazon ECS) on AWS Fargate to run the job. Create a standalone task based on the container image of the job. Use Windows task scheduler to run the job every 10 minutes.





A.
  Create an AWS Lambda function based on the container image of the job. Configure Amazon EventBridge to invoke the function every 10 minutes.

Explanation: AWS Lambda supports container images as a packaging format for functions. You can use existing container development workflows to package and deploy Lambda functions as container images of up to 10 GB in size. You can also use familiar tools such as Docker CLI to build, test, and push your container images to Amazon Elastic Container Registry (Amazon ECR). You can then create an AWS Lambda function based on the container image of your job and configure Amazon EventBridge to invoke the function every 10 minutes using a cron expression. This solution will be cost-effective as you only pay for the compute time you consume when your function runs.
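The schedule in option A can be sketched as the parameters for EventBridge's PutRule and PutTargets calls (the rule name and Lambda ARN below are hypothetical examples).

```python
def schedule_rule_params(rule_name, lambda_arn):
    """Build the EventBridge parameters for a rule that fires every
    10 minutes and targets the container-image Lambda function."""
    rule = {
        "Name": rule_name,
        "ScheduleExpression": "rate(10 minutes)",  # or a cron expression
        "State": "ENABLED",
    }
    targets = {
        "Rule": rule_name,
        "Targets": [{"Id": "job-lambda", "Arn": lambda_arn}],
    }
    return rule, targets
```

EventBridge also needs a resource-based permission on the function (lambda:AddPermission) so the rule is allowed to invoke it.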

A company is developing an application that provides order shipping statistics for retrieval by a REST API. The company wants to extract the shipping statistics, organize the data into an easy-to-read HTML format, and send the report to several email addresses at the same time every morning.

Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)


A. Configure the application to send the data to Amazon Kinesis Data Firehose.


B. Use Amazon Simple Email Service (Amazon SES) to format the data and to send the report by email.


C. Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Glue job to query the application's API for the data.


D. Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Lambda function to query the application's API for the data.


E. Store the application data in Amazon S3. Create an Amazon Simple Notification Service (Amazon SNS) topic as an S3 event destination to send the report by email.





B.
  Use Amazon Simple Email Service (Amazon SES) to format the data and to send the report by email.

D.
  Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Lambda function to query the application's API for the data.
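The scheduled Lambda function from option D could format the statistics into HTML and hand them to SES as in this sketch, which builds the parameters for an SES send_email call (the sender and recipient addresses are hypothetical placeholders).

```python
def build_report_email(rows, sender, recipients):
    """Build Amazon SES send_email parameters that deliver shipping
    statistics as a simple HTML table to several recipients at once."""
    body = "<table>" + "".join(
        f"<tr><td>{order}</td><td>{status}</td></tr>"
        for order, status in rows
    ) + "</table>"
    return {
        "Source": sender,
        "Destination": {"ToAddresses": recipients},
        "Message": {
            "Subject": {"Data": "Daily shipping report"},
            "Body": {"Html": {"Data": body}},
        },
    }

# Inside the Lambda handler:
#   ses = boto3.client("ses")
#   ses.send_email(**build_report_email(stats, sender, recipients))
```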

A company needs to store data from its healthcare application. The application's data frequently changes. A new regulation requires auditing of access at all levels of the stored data. The company hosts the application on an on-premises infrastructure that is running out of storage capacity. A solutions architect must securely migrate the existing data to AWS while satisfying the new regulation. Which solution will meet these requirements?


A. Use AWS DataSync to move the existing data to Amazon S3. Use AWS CloudTrail to log data events.


B. Use AWS Snowcone to move the existing data to Amazon S3. Use AWS CloudTrail to log management events.


C. Use Amazon S3 Transfer Acceleration to move the existing data to Amazon S3. Use AWS CloudTrail to log data events.


D. Use AWS Storage Gateway to move the existing data to Amazon S3. Use AWS CloudTrail to log management events.





A.
  Use AWS DataSync to move the existing data to Amazon S3. Use AWS CloudTrail to log data events.

Explanation: This answer is correct because it meets the requirements of securely migrating the existing data to AWS and satisfying the new regulation. AWS DataSync is a service that makes it easy to move large amounts of data online between on-premises storage and Amazon S3. DataSync automatically encrypts data in transit and verifies data integrity during transfer. AWS CloudTrail is a service that records AWS API calls for your account and delivers log files to Amazon S3. CloudTrail can log data events, which show the resource operations performed on or within a resource in your AWS account, such as S3 object-level API activity. By using CloudTrail to log data events, you can audit access at all levels of the stored data.
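The data-event logging from option A can be sketched as the event selector passed to CloudTrail's PutEventSelectors call (the bucket ARN below is a hypothetical example). The trailing "/" on the resource value captures every object in the bucket.

```python
def s3_data_event_selector(bucket_arn):
    """CloudTrail event selector that records object-level (data)
    events -- reads and writes -- for every object in the bucket."""
    return [{
        "ReadWriteType": "All",
        "IncludeManagementEvents": True,
        "DataResources": [{
            "Type": "AWS::S3::Object",
            "Values": [f"{bucket_arn}/"],  # all objects in the bucket
        }],
    }]

# With boto3:
#   cloudtrail.put_event_selectors(
#       TrailName="audit-trail",
#       EventSelectors=s3_data_event_selector("arn:aws:s3:::health-data"))
```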

A company is hosting a web application on AWS using a single Amazon EC2 instance that stores user-uploaded documents in an Amazon EBS volume. For better scalability and availability, the company duplicated the architecture and created a second EC2 instance and EBS volume in another Availability Zone, placing both behind an Application Load Balancer. After completing this change, users reported that each time they refreshed the website, they could see one subset of their documents or the other, but never all of the documents at the same time.
What should a solutions architect propose to ensure users see all of their documents at once?


A. Copy the data so both EBS volumes contain all the documents.


B. Configure the Application Load Balancer to direct a user to the server with the documents


C. Copy the data from both EBS volumes to Amazon EFS. Modify the application to save new documents to Amazon EFS.


D. Configure the Application Load Balancer to send the request to both servers. Return each document from the correct server.





C.
  Copy the data from both EBS volumes to Amazon EFS. Modify the application to save new documents to Amazon EFS.

A company wants to use high-performance computing and artificial intelligence to improve its fraud prevention and detection technology. The company requires distributed processing to complete a single workload as quickly as possible. Which solution will meet these requirements?


A. Use Amazon Elastic Kubernetes Service (Amazon EKS) and multiple containers.


B. Use AWS ParallelCluster and the Message Passing Interface (MPI) libraries.


C. Use an Application Load Balancer and Amazon EC2 instances.


D. Use AWS Lambda functions.





B.
  Use AWS ParallelCluster and the Message Passing Interface (MPI) libraries.

Explanation: AWS ParallelCluster is a service that allows you to create and manage high-performance computing (HPC) clusters on AWS. It supports multiple schedulers, including AWS Batch, which can run distributed workloads across multiple EC2 instances. MPI is a standard for message passing between processes in parallel computing. It provides functions for sending and receiving data, synchronizing processes, and managing communication groups. By using AWS ParallelCluster and MPI libraries, you can take advantage of the following benefits: You can easily create and configure HPC clusters that meet your specific requirements, such as instance type, number of nodes, network configuration, and storage options. You can leverage the scalability and elasticity of AWS to run large-scale parallel workloads without worrying about provisioning or managing servers. You can use MPI libraries to optimize the performance and efficiency of your parallel applications by enabling inter-process communication and data exchange. You can choose from a variety of MPI implementations that are compatible with AWS ParallelCluster, such as Open MPI, Intel MPI, and MPICH.
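Running real MPI code requires an MPI runtime on the cluster, but the scatter/gather shape such programs use can be sketched locally with Python's multiprocessing module (this is an illustration of the pattern only, not the ParallelCluster or MPI API; the fraud-scoring rule is a made-up stand-in).

```python
from multiprocessing import Pool

def score_chunk(transactions):
    # Stand-in for the fraud-scoring work each cluster node would do:
    # flag transactions above a threshold amount.
    return sum(1 for amount in transactions if amount > 900)

def parallel_fraud_scan(transactions, workers=4):
    """Split one workload across worker processes and gather the
    partial results -- the same scatter/gather shape MPI programs
    use across cluster nodes."""
    size = max(1, len(transactions) // workers)
    chunks = [transactions[i:i + size]
              for i in range(0, len(transactions), size)]
    with Pool(workers) as pool:
        return sum(pool.map(score_chunk, chunks))
```

On a ParallelCluster deployment, the equivalent program would use MPI primitives (scatter, gather, reduce) so the chunks run on separate EC2 instances rather than local processes.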

A company is storing sensitive user information in an Amazon S3 bucket. The company wants to provide secure access to this bucket from the application tier running on Amazon EC2 instances inside a VPC.
Which combination of steps should a solutions architect take to accomplish this? (Select TWO.)


A. Configure a VPC gateway endpoint for Amazon S3 within the VPC


B. Create a bucket policy to make the objects in the S3 bucket public


C. Create a bucket policy that limits access to only the application tier running in the VPC


D. Create an IAM user with an S3 access policy and copy the IAM credentials to the EC2 instance


E. Create a NAT instance and have the EC2 instances use the NAT instance to access the S3 bucket





A.
  Configure a VPC gateway endpoint for Amazon S3 within the VPC

C.
  Create a bucket policy that limits access to only the application tier running in the VPC
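The bucket policy in option C can be sketched as below: it denies any request that does not arrive through the VPC gateway endpoint created in option A (the bucket name and endpoint ID are hypothetical placeholders).

```python
import json

def vpc_only_bucket_policy(bucket_name, vpce_id):
    """S3 bucket policy that denies all access unless the request
    comes through the given VPC gateway endpoint."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyOutsideVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket_name}",
                f"arn:aws:s3:::{bucket_name}/*",
            ],
            # Deny everything that did not traverse the endpoint
            "Condition": {"StringNotEquals": {"aws:sourceVpce": vpce_id}},
        }],
    })
```

The EC2 instances themselves should get S3 permissions through an instance profile (IAM role) rather than copied credentials, which is why option D is wrong.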

A company hosts an application on AWS Lambda functions that are invoked by an Amazon API Gateway API. The Lambda functions save customer data to an Amazon Aurora MySQL database. Whenever the company upgrades the database, the Lambda functions fail to establish database connections until the upgrade is complete. The result is that customer data is not recorded for some of the events.
A solutions architect needs to design a solution that stores customer data that is created during database upgrades.
Which solution will meet these requirements?


A. Provision an Amazon RDS proxy to sit between the Lambda functions and the database. Configure the Lambda functions to connect to the RDS proxy.


B. Increase the runtime of the Lambda functions to the maximum. Create a retry mechanism in the code that stores the customer data in the database.


C. Persist the customer data to Lambda local storage. Configure new Lambda functions to scan the local storage to save the customer data to the database.


D. Store the customer data in an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Create a new Lambda function that polls the queue and stores the customer data in the database.





D.
  Store the customer data in an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Create a new Lambda function that polls the queue and stores the customer data in the database.
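The buffering step in option D can be sketched as the parameters the API-facing Lambda function would pass to SQS's send_message call (the queue URL, record payload, and deduplication scheme below are hypothetical examples).

```python
def buffered_write_params(queue_url, customer_record, order_id):
    """Build SQS send_message parameters that buffer a customer record
    in a FIFO queue while the database is unavailable.

    MessageGroupId preserves per-order ordering; the deduplication ID
    (illustrative here) guards against duplicate sends on retry.
    """
    return {
        "QueueUrl": queue_url,
        "MessageBody": customer_record,
        "MessageGroupId": order_id,
        "MessageDeduplicationId": f"{order_id}-v1",
    }
```

A second Lambda function, triggered by the queue, then drains the buffered records into Aurora once the upgrade completes, so no customer data is lost.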

