
SAA-C03 Exam Questions - Online Test



Proper study for the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) exam begins with SAA-C03 preparation products designed to deliver validated SAA-C03 questions and help you pass the SAA-C03 test on your first attempt. Try the free SAA-C03 demo right now.

Amazon Web Services SAA-C03 Free Dumps Questions Online. Read and Test Now.

NEW QUESTION 1
A company wants to establish connectivity between its on-premises data center and AWS for an existing workload. The workload runs on Amazon EC2 instances in two VPCs in different AWS Regions. The VPCs need to communicate with each other. The company needs to provide connectivity from its data center to both VPCs. The solution must support a bandwidth of 600 Mbps to the data center.
Which solution will meet these requirements?

  • A. Set up an AWS Site-to-Site VPN connection between the data center and one VPC. Create a VPC peering connection between the VPCs.
  • B. Set up an AWS Site-to-Site VPN connection between the data center and each VPC. Create a VPC peering connection between the VPCs.
  • C. Set up an AWS Direct Connect connection between the data center and one VPC. Create a VPC peering connection between the VPCs.
  • D. Create a transit gateway. Attach both VPCs to the transit gateway. Create an AWS Site-to-Site VPN tunnel to the transit gateway.

Answer: B
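For illustration, a minimal boto3 sketch of the VPC peering portion of the keyed answer; the Regions and VPC IDs are hypothetical, and the subnet route tables in both VPCs would still need routes that point at the peering connection:

```python
import boto3

# Hypothetical Regions and VPC IDs for the two workload VPCs.
requester_region = "us-east-1"
peer_region = "eu-west-1"
requester_vpc = "vpc-0123456789abcdef0"
peer_vpc = "vpc-0fedcba9876543210"

ec2 = boto3.client("ec2", region_name=requester_region)

# Request a cross-Region VPC peering connection between the two VPCs.
resp = ec2.create_vpc_peering_connection(
    VpcId=requester_vpc,
    PeerVpcId=peer_vpc,
    PeerRegion=peer_region,
)
peering_id = resp["VpcPeeringConnection"]["VpcPeeringConnectionId"]

# The request must be accepted from the peer's Region.
peer_ec2 = boto3.client("ec2", region_name=peer_region)
peer_ec2.accept_vpc_peering_connection(VpcPeeringConnectionId=peering_id)
```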

NEW QUESTION 2
A company's reporting system delivers hundreds of CSV files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket.
Which solution will meet these requirements with the LEAST development effort?

  • A. Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.
  • B. Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.
  • C. Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.
  • D. Create an AWS Lambda function to transform the data and output the data to the transformed data bucket. Configure an event notification for the S3 bucket. Specify the Lambda function as the destination for the event notification.

Answer: D
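To make the keyed answer concrete, a minimal sketch of a Lambda handler that converts an uploaded CSV object to Parquet. It assumes the function is the destination of an S3 ObjectCreated event notification, that pandas and pyarrow are packaged with the function (for example, in a layer), and that the output bucket name is hypothetical:

```python
import io
import urllib.parse

import boto3
import pandas as pd  # pandas + pyarrow assumed available in a Lambda layer

s3 = boto3.client("s3")
TRANSFORMED_BUCKET = "transformed-data-bucket"  # hypothetical bucket name

def handler(event, context):
    # Each record describes one newly created CSV object.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Read the CSV object into a DataFrame.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        df = pd.read_csv(io.BytesIO(body))

        # Write the same rows back out in Parquet format.
        buf = io.BytesIO()
        df.to_parquet(buf, index=False)  # pyarrow does the encoding
        out_key = key.rsplit(".", 1)[0] + ".parquet"
        s3.put_object(Bucket=TRANSFORMED_BUCKET, Key=out_key, Body=buf.getvalue())
```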

NEW QUESTION 3
A company has a business system that generates hundreds of reports each day. The business system saves the reports to a network share in CSV format. The company needs to store this data in the AWS Cloud in near-real time for analysis. Which solution will meet these requirements with the LEAST administrative overhead?

  • A. Use AWS DataSync to transfer the files to Amazon S3. Create a scheduled task that runs at the end of each day.
  • B. Create an Amazon S3 File Gateway. Update the business system to use a new network share from the S3 File Gateway.
  • C. Use AWS DataSync to transfer the files to Amazon S3. Create an application that uses the DataSync API in the automation workflow.
  • D. Deploy an AWS Transfer for SFTP endpoint. Create a script that checks for new files on the network share and uploads the new files by using SFTP.

Answer: B

NEW QUESTION 4
A company has an application with a REST-based interface that allows data to be received in near-real time from a third-party vendor. Once received, the application processes and stores the data for further analysis. The application is running on Amazon EC2 instances.
The third-party vendor has received many 503 Service Unavailable errors when sending data to the application. When the data volume spikes, the compute capacity reaches its maximum limit and the application is unable to process all requests.
Which design should a solutions architect recommend to provide a more scalable solution?

  • A. Use Amazon Kinesis Data Streams to ingest the data. Process the data using AWS Lambda functions.
  • B. Use Amazon API Gateway on top of the existing application. Create a usage plan with a quota limit for the third-party vendor.
  • C. Use Amazon Simple Notification Service (Amazon SNS) to ingest the data. Put the EC2 instances in an Auto Scaling group behind an Application Load Balancer.
  • D. Repackage the application as a container. Deploy the application using Amazon Elastic Container Service (Amazon ECS) using the EC2 launch type with an Auto Scaling group.

Answer: A
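For context on the keyed answer, a minimal producer-side sketch that writes one record to Kinesis Data Streams; the stream name and partition key scheme are hypothetical:

```python
import json

import boto3

kinesis = boto3.client("kinesis")
STREAM_NAME = "vendor-ingest-stream"  # hypothetical stream name

def send_event(payload: dict, source_id: str) -> None:
    # The partition key determines the shard; a per-source key spreads load.
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(payload).encode("utf-8"),
        PartitionKey=source_id,
    )

send_event({"status": "ok", "value": 42}, source_id="vendor-17")
```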

NEW QUESTION 5
A company has a web application that runs on Amazon EC2 instances. The company wants end users to authenticate themselves before they use the web application. The web application accesses AWS resources, such as Amazon S3 buckets, on behalf of users who are logged on.
Which combination of actions must a solutions architect take to meet these requirements? (Select TWO).

  • A. Configure AWS App Mesh to log on users.
  • B. Enable and configure AWS Single Sign-On in AWS Identity and Access Management (IAM).
  • C. Define a default IAM role for authenticated users.
  • D. Use AWS Identity and Access Management (IAM) for user authentication.
  • E. Use Amazon Cognito for user authentication.

Answer: BE
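To make option E concrete, a minimal sketch of authenticating an end user against an Amazon Cognito user pool; the app client ID and credentials are hypothetical, and the USER_PASSWORD_AUTH flow must be enabled on the app client:

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# InitiateAuth is an unauthenticated API, so no AWS credentials are needed.
cognito = boto3.client(
    "cognito-idp",
    region_name="us-east-1",
    config=Config(signature_version=UNSIGNED),
)

resp = cognito.initiate_auth(
    ClientId="example-app-client-id",  # hypothetical user pool app client
    AuthFlow="USER_PASSWORD_AUTH",
    AuthParameters={"USERNAME": "alice", "PASSWORD": "correct-horse-battery"},
)
id_token = resp["AuthenticationResult"]["IdToken"]
# The ID token can then be exchanged (for example, through a Cognito identity
# pool) for temporary AWS credentials that allow S3 access on the user's behalf.
```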

NEW QUESTION 6
A solutions architect must design a highly available infrastructure for a website. The website is powered by Windows web servers that run on Amazon EC2 instances. The solutions architect must implement a solution that can mitigate a large-scale DDoS attack that originates from thousands of IP addresses. Downtime is not acceptable for the website.
Which actions should the solutions architect take to protect the website from such an attack? (Select TWO.)

  • A. Use AWS Shield Advanced to stop the DDoS attack.
  • B. Configure Amazon GuardDuty to automatically block the attackers.
  • C. Configure the website to use Amazon CloudFront for both static and dynamic content.
  • D. Use an AWS Lambda function to automatically add attacker IP addresses to VPC network ACLs.
  • E. Use EC2 Spot Instances in an Auto Scaling group with a target tracking scaling policy that is set to 80% CPU utilization.

Answer: AC

NEW QUESTION 7
A company is expecting rapid growth in the near future. A solutions architect needs to configure existing users and grant permissions to new users on AWS. The solutions architect has decided to create IAM groups and will add the new users to IAM groups based on department.
Which additional action is the MOST secure way to grant permissions to the new users?

  • A. Apply service control policies (SCPs) to manage access permissions.
  • B. Create IAM roles that have least privilege permissions. Attach the roles to the IAM groups.
  • C. Create an IAM policy that grants least privilege permissions. Attach the policy to the IAM groups.
  • D. Create IAM roles. Associate the roles with a permissions boundary that defines the maximum permissions.

Answer: C
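A minimal sketch of the keyed answer: create a least-privilege customer managed policy and attach it to a department IAM group. The group name, bucket name, and policy contents are hypothetical:

```python
import json

import boto3

iam = boto3.client("iam")

# Hypothetical least-privilege policy for a "data-analysts" department group:
# read-only access to a single reporting bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-reporting-bucket",
                "arn:aws:s3:::example-reporting-bucket/*",
            ],
        }
    ],
}

policy = iam.create_policy(
    PolicyName="DataAnalystsLeastPrivilege",
    PolicyDocument=json.dumps(policy_document),
)
iam.attach_group_policy(
    GroupName="data-analysts",
    PolicyArn=policy["Policy"]["Arn"],
)
```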

NEW QUESTION 8
An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) queue. The SQS queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email.
Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that SQS messages are invoking the Lambda function more than once, resulting in multiple email messages.
What should the solutions architect do to resolve this issue with the LEAST operational overhead?

  • A. Set up long polling in the SQS queue by increasing the ReceiveMessage wait time to 30 seconds.
  • B. Change the SQS standard queue to an SQS FIFO queue. Use the message deduplication ID to discard duplicate messages.
  • C. Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.
  • D. Modify the Lambda function to delete each message from the SQS queue immediately after the message is read, before processing.

Answer: B
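Per the keyed answer, a minimal sketch that creates an SQS FIFO queue with content-based deduplication; the queue name is hypothetical, and the Lambda trigger would need to be re-created against the new queue:

```python
import boto3

sqs = boto3.client("sqs")

# FIFO queue names must end in ".fifo". With content-based deduplication,
# SQS computes a deduplication ID from the message body and drops duplicates
# that arrive within the 5-minute deduplication interval.
resp = sqs.create_queue(
    QueueName="image-events.fifo",  # hypothetical queue name
    Attributes={
        "FifoQueue": "true",
        "ContentBasedDeduplication": "true",
    },
)
print(resp["QueueUrl"])
```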

NEW QUESTION 9
A company has an on-premises application that generates a large amount of time-sensitive data that is backed up to Amazon S3. The application has grown, and there are user complaints about internet bandwidth limitations. A solutions architect needs to design a long-term solution that allows for timely backups to Amazon S3 with minimal impact on internet connectivity for internal users.
Which solution meets these requirements?

  • A. Establish AWS VPN connections and proxy all traffic through a VPC gateway endpoint.
  • B. Establish a new AWS Direct Connect connection and direct backup traffic through this new connection.
  • C. Order daily AWS Snowball devices. Load the data onto the Snowball devices and return the devices to AWS each day.
  • D. Submit a support ticket through the AWS Management Console. Request the removal of S3 service limits from the account.

Answer: B

NEW QUESTION 10
A company is planning to build a high performance computing (HPC) workload as a service solution that is hosted on AWS. A group of 16 Amazon EC2 Linux instances requires the lowest possible latency for node-to-node communication. The instances also need a shared block device volume for high-performing storage.
Which solution will meet these requirements?

  • A. Use a cluster placement group. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.
  • B. Use a cluster placement group. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
  • C. Use a partition placement group. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
  • D. Use a spread placement group. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.

Answer: A
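For illustration, a minimal boto3 sketch of the keyed answer: a cluster placement group plus an io2 volume shared through EBS Multi-Attach. All IDs, sizes, and the device name are hypothetical; Multi-Attach requires Nitro-based instances in the same Availability Zone as the volume:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Cluster placement groups pack instances close together for the lowest
# node-to-node network latency.
ec2.create_placement_group(GroupName="hpc-cluster", Strategy="cluster")

# Provisioned IOPS (io2) volumes support Multi-Attach, so one block device
# can be attached to several instances at once.
vol = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    VolumeType="io2",
    Size=500,  # GiB, hypothetical
    Iops=10000,
    MultiAttachEnabled=True,
)

# Attach the same volume to each instance in the group (hypothetical IDs).
for instance_id in ["i-0123456789abcdef0", "i-0123456789abcdef1"]:
    ec2.attach_volume(
        VolumeId=vol["VolumeId"],
        InstanceId=instance_id,
        Device="/dev/sdf",
    )
```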

NEW QUESTION 11
A company has a web-based map application that provides status information about ongoing repairs. The application sometimes has millions of users. Repair teams have a mobile app that sends current location and status in a JSON message to a REST-based endpoint.
Few repairs occur on most days. The company wants the application to be highly available and to scale when large numbers of repairs occur after natural disasters. Customers use the application most often during these times. The company does not want to pay for idle capacity.
Which solution will meet these requirements?

  • A. Create a webpage that is based on Amazon S3 to display information. Use Amazon API Gateway and AWS Lambda to receive the JSON status data. Store the JSON data in Amazon S3.
  • B. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use Amazon API Gateway and AWS Lambda to receive the JSON status data. Store the JSON data in Amazon S3.
  • C. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use a REST endpoint on the EC2 instances to receive the JSON status data. Store the JSON data in an Amazon RDS Multi-AZ DB instance.
  • D. Use Amazon EC2 instances as web servers across multiple Availability Zones. Run the EC2 instances in an Auto Scaling group. Use a REST endpoint on the EC2 instances to receive the JSON status data. Store the JSON data in an Amazon DynamoDB table.

Answer: D
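As a sketch of the storage half of the keyed answer, writing one JSON status message to DynamoDB; the table name and key schema are hypothetical:

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("repair-status")  # hypothetical table

# Store one status message from a repair team's mobile app.
table.put_item(
    Item={
        "team_id": "team-42",                   # assumed partition key
        "reported_at": "2024-01-01T12:00:00Z",  # assumed sort key
        "lat": "40.7128",
        "lon": "-74.0060",
        "status": "in_progress",
    }
)
```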

NEW QUESTION 12
A company needs to move data from an Amazon EC2 instance to an Amazon S3 bucket. The company must ensure that no API calls and no data are routed through public internet routes. Only the EC2 instance can have access to upload data to the S3 bucket.
Which solution will meet these requirements?

  • A. Create an interface VPC endpoint for Amazon S3 in the subnet where the EC2 instance is located. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
  • B. Create a gateway VPC endpoint for Amazon S3 in the Availability Zone where the EC2 instance is located. Attach appropriate security groups to the endpoint. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
  • C. Run the nslookup tool from inside the EC2 instance to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
  • D. Use the AWS-provided, publicly available ip-ranges.json file to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.

Answer: B
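A minimal sketch of the keyed answer with boto3: a gateway VPC endpoint for S3 plus a bucket policy that restricts uploads to the instance's role through that endpoint. The VPC, route table, account, role, and bucket identifiers are all hypothetical:

```python
import json

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
s3 = boto3.client("s3")

# Gateway endpoints attach to route tables, so S3 traffic from the
# instance's subnet stays on the AWS network.
endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],
)
vpce_id = endpoint["VpcEndpoint"]["VpcEndpointId"]

# Allow uploads only from the instance's role and only via the endpoint.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:role/uploader-role"},
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::example-upload-bucket/*",
            "Condition": {"StringEquals": {"aws:sourceVpce": vpce_id}},
        }
    ],
}
s3.put_bucket_policy(Bucket="example-upload-bucket", Policy=json.dumps(policy))
```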

NEW QUESTION 13
A company has a stateless asynchronous application that runs in an Apache Hadoop cluster. The application is invoked on demand to run extract, transform, and load (ETL) jobs several times a day.
A solutions architect needs to migrate this application to the AWS Cloud by designing an Amazon EMR cluster for the workload. The cluster must be available immediately to process jobs.
Which implementation meets these requirements MOST cost-effectively?

  • A. Use zonal Reserved Instances for the master nodes and the core nodes. Use a Spot Fleet for the task nodes.
  • B. Use zonal Reserved Instances for the master nodes. Use Spot Instances for the core nodes and the task nodes.
  • C. Use regional Reserved Instances for the master nodes. Use a Spot Fleet for the core nodes and the task nodes.
  • D. Use regional Reserved Instances for the master nodes. Use On-Demand Capacity Reservations for the core nodes and the task nodes.

Answer: A

NEW QUESTION 14
A company has a three-tier web application that is deployed on AWS. The web servers are deployed in a public subnet in a VPC. The application servers and database servers are deployed in private subnets in the same VPC. The company has deployed a third-party virtual firewall appliance from AWS Marketplace in an inspection VPC. The appliance is configured with an IP interface that can accept IP packets.
A solutions architect needs to integrate the web application with the appliance to inspect all traffic to the application before the traffic reaches the web servers. Which solution will meet these requirements with the LEAST operational overhead?

  • A. Create a Network Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
  • B. Create an Application Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
  • C. Deploy a transit gateway in the inspection VPC. Configure route tables to route the incoming packets through the transit gateway.
  • D. Deploy a Gateway Load Balancer in the inspection VPC. Create a Gateway Load Balancer endpoint to receive the incoming packets and forward the packets to the appliance.

Answer: D
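For illustration, a minimal sketch of the keyed answer: a Gateway Load Balancer in the inspection VPC, exposed as an endpoint service, with a Gateway Load Balancer endpoint in the application VPC. All IDs are hypothetical, and the route table changes that steer traffic through the endpoint are omitted:

```python
import boto3

elbv2 = boto3.client("elbv2", region_name="us-east-1")
ec2 = boto3.client("ec2", region_name="us-east-1")

# Gateway Load Balancer in the inspection VPC (hypothetical subnet).
glb = elbv2.create_load_balancer(
    Name="inspection-glb",
    Type="gateway",
    Subnets=["subnet-0123456789abcdef0"],
)
glb_arn = glb["LoadBalancers"][0]["LoadBalancerArn"]

# Publish the GWLB as an endpoint service, then create a Gateway Load
# Balancer endpoint for it in the application VPC.
svc = ec2.create_vpc_endpoint_service_configuration(
    GatewayLoadBalancerArns=[glb_arn],
    AcceptanceRequired=False,
)
ec2.create_vpc_endpoint(
    VpcEndpointType="GatewayLoadBalancer",
    VpcId="vpc-0123456789abcdef1",  # application VPC, hypothetical
    ServiceName=svc["ServiceConfiguration"]["ServiceName"],
    SubnetIds=["subnet-0123456789abcdef1"],
)
```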

NEW QUESTION 15
A company collects temperature, humidity, and atmospheric pressure data in cities across multiple continents. The average volume of data collected per site each day is 500 GB. Each site has a high-speed internet connection. The company's weather forecasting applications are based in a single Region and analyze the data daily.
What is the FASTEST way to aggregate data from all of these global sites?

  • A. Enable Amazon S3 Transfer Acceleration on the destination bucket. Use multipart uploads to directly upload site data to the destination bucket.
  • B. Upload site data to an Amazon S3 bucket in the closest AWS Region. Use S3 cross-Region replication to copy objects to the destination bucket.
  • C. Schedule AWS Snowball jobs daily to transfer data to the closest AWS Region. Use S3 cross-Region replication to copy objects to the destination bucket.
  • D. Upload the data to an Amazon EC2 instance in the closest Region. Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. Once a day, take an EBS snapshot and copy it to the centralized Region. Restore the EBS volume in the centralized Region and run an analysis on the data daily.

Answer: A

Explanation:
You might want to use Transfer Acceleration on a bucket for various reasons, including the following:

  • You have customers that upload to a centralized bucket from all over the world.
  • You transfer gigabytes to terabytes of data on a regular basis across continents.
  • You are unable to utilize all of your available bandwidth over the internet when uploading to Amazon S3.

https://docs.aws.amazon.com/AmazonS3/latest/dev/transfer-acceleration.html
https://aws.amazon.com/s3/transfer-acceleration/
"Amazon S3 Transfer Acceleration can speed up content transfers to and from Amazon S3 by as much as 50-500% for long-distance transfer of larger objects. Customers who have either web or mobile applications with widespread users or applications hosted far away from their S3 bucket can experience long and variable upload and download speeds over the Internet."
https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html
"Improved throughput - You can upload parts in parallel to improve throughput."

NEW QUESTION 16
A company runs its ecommerce application on AWS. Every new order is published as a message in a RabbitMQ queue that runs on an Amazon EC2 instance in a single Availability Zone. These messages are processed by a different application that runs on a separate EC2 instance. This application stores the details in a PostgreSQL database on another EC2 instance. All the EC2 instances are in the same Availability Zone.
The company needs to redesign its architecture to provide the highest availability with the least operational overhead.
What should a solutions architect do to meet these requirements?

  • A. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Create another Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
  • B. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
  • C. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
  • D. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Create a third Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.

Answer: C
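As a sketch of the database portion of the keyed answer, creating a Multi-AZ Amazon RDS for PostgreSQL instance; identifiers and credentials are hypothetical, and a real deployment should source the password from AWS Secrets Manager:

```python
import boto3

rds = boto3.client("rds")

# Multi-AZ keeps a synchronous standby in a second Availability Zone with
# automatic failover.
rds.create_db_instance(
    DBInstanceIdentifier="orders-postgres",  # hypothetical
    Engine="postgres",
    DBInstanceClass="db.m5.large",
    AllocatedStorage=100,
    MultiAZ=True,
    MasterUsername="app_admin",
    MasterUserPassword="example-only-password",
)
```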

NEW QUESTION 17
A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand.
Which migration solution will meet these requirements?

  • A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.
  • B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.
  • C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.
  • D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.

Answer: C
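To illustrate the Aurora Auto Scaling half of the keyed answer, a minimal Application Auto Scaling sketch that scales Aurora Replicas on average reader CPU; the cluster name and limits are hypothetical:

```python
import boto3

autoscaling = boto3.client("application-autoscaling")
CLUSTER_RESOURCE = "cluster:mysql-compat-aurora"  # hypothetical Aurora cluster

# Register the cluster's replica count as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="rds",
    ResourceId=CLUSTER_RESOURCE,
    ScalableDimension="rds:cluster:ReadReplicaCount",
    MinCapacity=1,
    MaxCapacity=15,
)

# Target tracking: add or remove Aurora Replicas to hold reader CPU near 60%.
autoscaling.put_scaling_policy(
    PolicyName="aurora-cpu-target",
    ServiceNamespace="rds",
    ResourceId=CLUSTER_RESOURCE,
    ScalableDimension="rds:cluster:ReadReplicaCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 60.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
        },
    },
)
```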

NEW QUESTION 18
A gaming company hosts a browser-based application on AWS. The users of the application consume a large number of videos and images that are stored in Amazon S3. This content is the same for all users.
The application has increased in popularity, and millions of users worldwide are accessing these media files. The company wants to provide the files to the users while reducing the load on the origin.
Which solution meets these requirements MOST cost-effectively?

  • A. Deploy an AWS Global Accelerator accelerator in front of the web servers.
  • B. Deploy an Amazon CloudFront web distribution in front of the S3 bucket.
  • C. Deploy an Amazon ElastiCache for Redis instance in front of the web servers.
  • D. Deploy an Amazon ElastiCache for Memcached instance in front of the web servers.

Answer: B

Explanation:
CloudFront uses Edge Locations to cache content while Global Accelerator uses Edge Locations to find an optimal pathway to the nearest regional endpoint.

NEW QUESTION 19
A company needs to ingest and handle large amounts of streaming data that its application generates. The application runs on Amazon EC2 instances and sends data to Amazon Kinesis Data Streams, which is configured with default settings. Every other day, the application consumes the data and writes the data to an Amazon S3 bucket for business intelligence (BI) processing. The company observes that Amazon S3 is not receiving all the data that the application sends to Kinesis Data Streams.
What should a solutions architect do to resolve this issue?

  • A. Update the Kinesis Data Streams default settings by modifying the data retention period.
  • B. Update the application to use the Kinesis Producer Library (KPL) to send the data to Kinesis Data Streams.
  • C. Update the number of Kinesis shards to handle the throughput of the data that is sent to Kinesis Data Streams.
  • D. Turn on S3 Versioning within the S3 bucket to preserve every version of every object that is ingested in the S3 bucket.

Answer: A
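Consistent with the keyed answer, a one-call sketch that raises the stream's retention above the consumer's every-other-day cadence; the stream name is hypothetical:

```python
import boto3

kinesis = boto3.client("kinesis")

# The default retention period is 24 hours; a consumer that runs every
# other day needs at least 48 hours of retention to see every record.
kinesis.increase_stream_retention_period(
    StreamName="app-ingest-stream",  # hypothetical stream name
    RetentionPeriodHours=72,
)
```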

NEW QUESTION 20
......

100% Valid and Newest Version SAA-C03 Questions & Answers shared by Dumps-hub.com, Get Full Dumps HERE: https://www.dumps-hub.com/SAA-C03-dumps.html (New 0 Q&As)