Amazon SAP-C01 Exam Dumps

AWS Certified Solutions Architect - Professional

318 Questions & Answers with Explanation
Update Date: March 06, 2024
PDF + Test Engine
$65 $95
Test Engine
$55 $85
PDF Only
$45 $75

Money back Guarantee

We do not compromise on the bright future of our respected customers. PassExam4Sure takes its clients' success seriously, and we make sure that our SAP-C01 exam dumps get you over the line. If our questions and answers do not help you with the exam paper and you fail it somehow, we will happily return your invested money with a full 100% refund.

100% Real Questions

We verify and assure the authenticity of our Amazon SAP-C01 exam dump PDFs, which contain 100% real, exam-oriented questions. Our questions and answers are drawn from the latest and most recent exams, the same exams you are going to take. Our extensive library of Amazon SAP-C01 exam dumps will push you forward on the path to success.

Free Demo Papers

Free downloadable Amazon SAP-C01 demo papers are available so our customers can verify the authenticity of our sample exam papers and see exactly what they will be getting from PassExam4Sure. Many of our daily visitors try this process before making their purchase of the Amazon SAP-C01 exam dumps.



Last Week SAP-C01 Exam Results

215

Customers Passed Amazon SAP-C01 Exam

96%

Average Score In Real SAP-C01 Exam

99%

Questions came from our SAP-C01 dumps.



Authentic SAP-C01 Exam Dumps


Prepare for the Amazon SAP-C01 Exam like a Pro

PassExam4Sure is famous for its top-notch service, providing the most helpful, accurate, and up-to-date material for the Amazon SAP-C01 exam in PDF form. Our SAP-C01 dumps are regularly reviewed for content updates, format changes, and the addition of new questions from recently conducted exams. Our highly qualified professionals guarantee that you will pass your exam with at least 85% marks overall. PassExam4Sure's proven Amazon SAP-C01 dumps are the best possible way to prepare for and pass your certification exam.

Easy Access and Friendly UI

PassExam4Sure is your best buddy, providing you with the latest and most accurate material without any hidden charges or pointless scrolling. We value your time, and we strive hard to provide the best possible formatting of our PDFs, with accurate, to-the-point, and vital information about Amazon SAP-C01. PassExam4Sure is your 24/7 guide partner, and our exam material is curated to be easily readable on all smartphones, tablets, and laptop PCs.

PassExam4Sure - The Undisputed King of SAP-C01 Exam Preparation

We have a sheer focus on providing you with the best course material for Amazon SAP-C01, so that you can prepare for your exam like a pro and get certified in no time. Our practice exam material will give you the confidence you need to sit, relax, and take the exam as if in a real exam environment. If you truly crave success, simply sign up for PassExam4Sure's Amazon SAP-C01 exam material. Millions of people all over the globe have completed their certifications using PassExam4Sure exam dumps for Amazon SAP-C01.

100% Authentic Amazon SAP-C01 – Study Guide (Update 2024)

Our Amazon SAP-C01 exam questions and answers are reviewed on a weekly basis. Our team of highly qualified Amazon professionals, who themselves cleared the exams using our certification content, analyzes all of our recent exam dumps. The team makes sure that you get the latest and greatest exam content to practice with and polish your skills the right way. All you have to do now is practice, and practice a lot, by taking our demo question exams and making sure that you prepare well for the final examination. The Amazon SAP-C01 test is going to test you and play with your mind and psychology, so be prepared for what's coming. PassExam4Sure is here to help and guide you through every step of your preparation for glory. Our free downloadable demo content can be checked out if you feel like testing us before investing your hard-earned money. PassExam4Sure guarantees your success in the Amazon SAP-C01 exam because we have the newest and most authentic exam material, which cannot be found anywhere else on the internet.


Amazon SAP-C01 Sample Questions

Question # 1

A mobile gaming company is expanding into the global market. The company's game servers run in the us-east-1 Region. The game's client application uses UDP to communicate with the game servers and needs to be able to connect to a set of static IP addresses. The company wants its game to be accessible on multiple continents. The company also wants the game to maintain its network performance and global availability. Which solution meets these requirements? 

A. Provision an Application Load Balancer (ALB) in front of the game servers. Create an Amazon CloudFront distribution that has no geographical restrictions. Set the ALB as the origin. Perform DNS lookups for the cloudfront.net domain name. Use the resulting IP addresses in the game's client application.
B. Provision game servers in each AWS Region. Provision an Application Load Balancer in front of the game servers. Create an Amazon Route 53 latency-based routing policy for the game's client application to use with DNS lookups.
C. Provision game servers in each AWS Region. Provision a Network Load Balancer (NLB) in front of the game servers. Create an accelerator in AWS Global Accelerator, and configure endpoint groups in each Region. Associate the NLBs with the corresponding Regional endpoint groups. Point the game client's application to the Global Accelerator endpoints.
D. Provision game servers in each AWS Region. Provision a Network Load Balancer (NLB) in front of the game servers. Create an Amazon CloudFront distribution that has no geographical restrictions. Set the NLB as the origin. Perform DNS lookups for the cloudfront.net domain name. Use the resulting IP addresses in the game's client application.
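
For study purposes, here is how the AWS Global Accelerator setup described in option C might be provisioned with boto3. Global Accelerator provides two static anycast IP addresses and supports UDP. This is a minimal sketch only: the port, names, and NLB ARNs are hypothetical placeholders.

```python
import boto3

# The Global Accelerator API is only available in us-west-2.
ga = boto3.client("globalaccelerator", region_name="us-west-2")

# Create the accelerator; it exposes two static anycast IP addresses.
accelerator = ga.create_accelerator(
    Name="game-accelerator",  # hypothetical name
    IpAddressType="IPV4",
    Enabled=True,
)
acc_arn = accelerator["Accelerator"]["AcceleratorArn"]

# UDP listener for the game traffic (port is an assumption).
listener = ga.create_listener(
    AcceleratorArn=acc_arn,
    Protocol="UDP",
    PortRanges=[{"FromPort": 3000, "ToPort": 3000}],
)

# One endpoint group per Region, each pointing at that Region's NLB.
nlb_arns = {
    "us-east-1": "arn:aws:elasticloadbalancing:us-east-1:111111111111:loadbalancer/net/game-nlb/abc",
    "eu-west-1": "arn:aws:elasticloadbalancing:eu-west-1:111111111111:loadbalancer/net/game-nlb/def",
}
for region, nlb_arn in nlb_arns.items():
    ga.create_endpoint_group(
        ListenerArn=listener["Listener"]["ListenerArn"],
        EndpointGroupRegion=region,
        EndpointConfigurations=[{"EndpointId": nlb_arn, "Weight": 128}],
    )
```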



Question # 2

A software company is using three AWS accounts for each of its 10 development teams. The company has developed an AWS CloudFormation standard VPC template that includes three NAT gateways. The template is added to each account for each team. The company is concerned that network costs will increase each time a new development team is added. A solutions architect must maintain the reliability of the company's solutions and minimize operational complexity. What should the solutions architect do to reduce the network costs while meeting these requirements?

A. Create a single VPC with three NAT gateways in a shared services account. Configure each account VPC with a default route through a transit gateway to the NAT gateway in the shared services account VPC. Remove all NAT gateways from the standard VPC template.
B. Create a single VPC with three NAT gateways in a shared services account. Configure each account VPC with a default route through a VPC peering connection to the NAT gateway in the shared services account VPC. Remove all NAT gateways from the standard VPC template.
C. Remove two NAT gateways from the standard VPC template. Rely on the NAT gateway SLA to cover reliability for the remaining NAT gateway.
D. Create a single VPC with three NAT gateways in a shared services account. Configure a Site-to-Site VPN connection from each account to the shared services account. Remove all NAT gateways from the standard VPC template.
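
A point worth remembering for this scenario: VPC peering does not support routing through a NAT gateway in a peer VPC, which is why centralized NAT designs use a transit gateway hub. As a minimal sketch with hypothetical IDs, the spoke-VPC side of option A's routing could look like this:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# In each spoke VPC, send internet-bound traffic to the transit gateway;
# the shared services VPC then forwards it through its NAT gateways.
ec2.create_route(
    RouteTableId="rtb-0spoke1234567890",          # hypothetical
    DestinationCidrBlock="0.0.0.0/0",
    TransitGatewayId="tgw-0abc1234567890def",     # hypothetical
)
```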



Question # 3

A life sciences company is using a combination of open source tools to manage data analysis workflows and Docker containers running on servers in its on-premises data center to process genomics data. Sequencing data is generated and stored on a local storage area network (SAN), and then the data is processed. The research and development teams are running into capacity issues and have decided to re-architect their genomics analysis platform on AWS to scale based on workload demands and reduce the turnaround time from weeks to days. The company has a high-speed AWS Direct Connect connection. Sequencers will generate around 200 GB of data for each genome, and individual jobs can take several hours to process the data with ideal compute capacity. The end result will be stored in Amazon S3. The company is expecting 10-15 job requests each day. Which solution meets these requirements?

A. Use regularly scheduled AWS Snowball Edge devices to transfer the sequencing data into AWS. When AWS receives the Snowball Edge device and the data is loaded into Amazon S3, use S3 events to trigger an AWS Lambda function to process the data.
B. Use AWS Data Pipeline to transfer the sequencing data to Amazon S3. Use S3 events to trigger an Amazon EC2 Auto Scaling group to launch custom-AMI EC2 instances running the Docker containers to process the data.
C. Use AWS DataSync to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow. Store the Docker images in Amazon Elastic Container Registry (Amazon ECR) and trigger AWS Batch to run the container and process the sequencing data.
D. Use an AWS Storage Gateway file gateway to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Batch job that runs on Amazon EC2 instances running the Docker containers to process the data.
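
To make the event chain in option C concrete, here is a minimal sketch of the Lambda function that an S3 event could trigger to start a Step Functions workflow. The state machine ARN is a hypothetical placeholder supplied through an environment variable.

```python
import json
import os

import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    """Triggered by an S3 event; starts one workflow per uploaded object."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        sfn.start_execution(
            # Hypothetical state machine that submits an AWS Batch job.
            stateMachineArn=os.environ["STATE_MACHINE_ARN"],
            input=json.dumps({"bucket": bucket, "key": key}),
        )
```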



Question # 4

A company is planning to migrate an application from on premises to the AWS Cloud. The company will begin the migration by moving the application's underlying data storage to AWS. The application data is stored on a shared file system on premises, and the application servers connect to the shared file system through SMB. A solutions architect must implement a solution that uses an Amazon S3 bucket for shared storage. Until the application is fully migrated and its code is rewritten to use native Amazon S3 APIs, the application must continue to have access to the data through SMB. The solutions architect must migrate the application data to its new location in AWS while still allowing the on-premises application to access the data. Which solution will meet these requirements?

A. Create a new Amazon FSx for Windows File Server file system. Configure AWS DataSync with one location for the on-premises file share and one location for the new Amazon FSx file system. Create a new DataSync task to copy the data from the on-premises file share location to the Amazon FSx file system.
B. Create an S3 bucket for the application. Copy the data from the on-premises storage to the S3 bucket.
C. Deploy an AWS Server Migration Service (AWS SMS) VM to the on-premises environment. Use AWS SMS to migrate the file storage server from on premises to an Amazon EC2 instance.
D. Create an S3 bucket for the application. Deploy a new AWS Storage Gateway file gateway on an on-premises VM. Create a new file share that stores data in the S3 bucket and is associated with the file gateway. Copy the data from the on-premises storage to the new file gateway endpoint.
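
Option D's file gateway keeps SMB access while the data lands in S3. Assuming a gateway has already been activated on the on-premises VM, creating the SMB file share might look roughly like the sketch below; all ARNs are hypothetical placeholders.

```python
import uuid

import boto3

sgw = boto3.client("storagegateway", region_name="us-east-1")

# Create an SMB file share, backed by S3, on an already-activated file gateway.
response = sgw.create_smb_file_share(
    ClientToken=str(uuid.uuid4()),
    GatewayARN="arn:aws:storagegateway:us-east-1:111111111111:gateway/sgw-12345678",
    Role="arn:aws:iam::111111111111:role/StorageGatewayS3Access",
    LocationARN="arn:aws:s3:::app-shared-storage-bucket",
    Authentication="ActiveDirectory",  # so existing SMB clients keep working
)
print(response["FileShareARN"])
```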



Question # 5

A company has an application. Once a month, the application creates a compressed file that contains every object within an Amazon S3 bucket. The total size of the objects before compression is 1 TB. The application runs by using a scheduled cron job on an Amazon EC2 instance that has a 5 TB Amazon Elastic Block Store (Amazon EBS) volume attached. The application downloads all the files from the source S3 bucket to the EBS volume, compresses the file, and uploads the file to a target S3 bucket. Every invocation of the application takes 2 hours from start to finish. Which combination of actions should a solutions architect take to OPTIMIZE costs for this application? (Select TWO.)

A. Migrate the application to run as an AWS Lambda function. Use Amazon EventBridge (Amazon CloudWatch Events) to schedule the Lambda function to run once each month.
B. Configure the application to download the source files by using streams. Direct the streams into a compression library. Direct the output of the compression library into a target object in Amazon S3.
C. Configure the application to download the source files from Amazon S3 and save the files to local storage. Compress the files and upload them to Amazon S3.
D. Configure the application to run as a container in AWS Fargate. Use Amazon EventBridge (Amazon CloudWatch Events) to schedule the task to run once each month.
E. Provision an Amazon Elastic File System (Amazon EFS) file system. Attach the file system to the AWS Lambda function.
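
Worth noting when weighing these options: AWS Lambda invocations are capped at 15 minutes, so a 2-hour job cannot run as a single Lambda function. As a minimal sketch of option D's scheduling piece, a monthly EventBridge rule targeting a Fargate task might look like this; the ARNs, subnet, and names are hypothetical.

```python
import boto3

events = boto3.client("events", region_name="us-east-1")

# Fire at 00:00 UTC on the first day of every month.
events.put_rule(
    Name="monthly-archive-job",
    ScheduleExpression="cron(0 0 1 * ? *)",
)

events.put_targets(
    Rule="monthly-archive-job",
    Targets=[{
        "Id": "archive-task",
        "Arn": "arn:aws:ecs:us-east-1:111111111111:cluster/archive-cluster",
        "RoleArn": "arn:aws:iam::111111111111:role/EventBridgeEcsRunTask",
        "EcsParameters": {
            "TaskDefinitionArn": "arn:aws:ecs:us-east-1:111111111111:task-definition/archiver:1",
            "LaunchType": "FARGATE",
            "NetworkConfiguration": {
                "awsvpcConfiguration": {
                    "Subnets": ["subnet-0123456789abcdef0"],
                    "AssignPublicIp": "ENABLED",
                }
            },
        },
    }],
)
```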



Question # 6

A retail company needs to provide a series of data files to another company, which is its business partner. These files are saved in an Amazon S3 bucket under Account A, which belongs to the retail company. The business partner company wants one of its IAM users, User_DataProcessor, to access the files from its own AWS account (Account B). Which combination of steps must the companies take so that User_DataProcessor can access the S3 bucket successfully? (Select TWO.)

A. Turn on the cross-origin resource sharing (CORS) feature for the S3 bucket in Account A.



E. In Account B, set the permissions of User_DataProcessor to the following:
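
The policy documents shown in the remaining options are not reproduced here. As a rough illustration of the cross-account pattern this question revolves around (a resource-based bucket policy in Account A plus a matching identity policy in Account B), a sketch follows; the account ID, bucket name, and policy name are hypothetical.

```python
import json

import boto3

# Account A: bucket policy granting the Account B user read access.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::222222222222:user/User_DataProcessor"},
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::retail-data-files",
        },
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::222222222222:user/User_DataProcessor"},
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::retail-data-files/*",
        },
    ],
}
boto3.client("s3").put_bucket_policy(
    Bucket="retail-data-files", Policy=json.dumps(bucket_policy)
)

# Account B: the user also needs an identity policy allowing the same calls.
user_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:ListBucket", "s3:GetObject"],
        "Resource": [
            "arn:aws:s3:::retail-data-files",
            "arn:aws:s3:::retail-data-files/*",
        ],
    }],
}
boto3.client("iam").put_user_policy(
    UserName="User_DataProcessor",
    PolicyName="AllowPartnerBucketRead",
    PolicyDocument=json.dumps(user_policy),
)
```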



Question # 7

A company has a media metadata extraction pipeline running on AWS. Notifications containing a reference to a file in Amazon S3 are sent to an Amazon Simple Notification Service (Amazon SNS) topic. The pipeline consists of a number of AWS Lambda functions that are subscribed to the SNS topic. The Lambda functions extract the S3 file and write metadata to an Amazon RDS PostgreSQL DB instance. Users report that updates to the metadata are sometimes slow to appear or are lost. During these times, the CPU utilization on the database is high and the number of failed Lambda invocations increases. Which combination of actions should a solutions architect take to help resolve this issue? (Select TWO.)

A. Enable message delivery status on the SNS topic. Configure the SNS topic delivery policy to enable retries with exponential backoff.
B. Create an Amazon Simple Queue Service (Amazon SQS) FIFO queue and subscribe the queue to the SNS topic. Configure the Lambda functions to consume messages from the SQS queue.
C. Create an RDS proxy for the RDS instance. Update the Lambda functions to connect to the RDS instance by using the proxy.
D. Enable the RDS Data API for the RDS instance. Update the Lambda functions to connect to the RDS instance by using the Data API.
E. Create an Amazon Simple Queue Service (Amazon SQS) standard queue for each Lambda function and subscribe the queues to the SNS topic. Configure the Lambda functions to consume messages from their respective SQS queues.
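
To make option C concrete: RDS Proxy pools and reuses database connections, which protects the database from the connection storms that bursts of concurrent Lambda invocations can cause. A minimal creation sketch, with hypothetical ARNs and subnet IDs:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# The proxy authenticates to the database with credentials stored in
# Secrets Manager; Lambda then connects to the proxy endpoint instead
# of the DB instance directly.
rds.create_db_proxy(
    DBProxyName="metadata-db-proxy",
    EngineFamily="POSTGRESQL",
    Auth=[{
        "AuthScheme": "SECRETS",
        "SecretArn": "arn:aws:secretsmanager:us-east-1:111111111111:secret:db-creds",
        "IAMAuth": "DISABLED",
    }],
    RoleArn="arn:aws:iam::111111111111:role/RdsProxySecretsRole",
    VpcSubnetIds=["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
    RequireTLS=True,
)
```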



Question # 8

A company has more than 10,000 sensors that send data to an on-premises Apache Kafka server by using the Message Queuing Telemetry Transport (MQTT) protocol. The on-premises Kafka server transforms the data and then stores the results as objects in an Amazon S3 bucket. Recently, the Kafka server crashed. The company lost sensor data while the server was being restored. A solutions architect must create a new design on AWS that is highly available and scalable to prevent a similar occurrence. Which solution will meet these requirements?

A. Launch two Amazon EC2 instances to host the Kafka server in an active/standby configuration across two Availability Zones. Create a domain name in Amazon Route 53. Create a Route 53 failover policy. Route the sensors to send the data to the domain name.
B. Migrate the on-premises Kafka server to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Create a Network Load Balancer (NLB) that points to the Amazon MSK broker. Enable NLB health checks. Route the sensors to send the data to the NLB.
C. Deploy AWS IoT Core, and connect it to an Amazon Kinesis Data Firehose delivery stream. Use an AWS Lambda function to handle data transformation. Route the sensors to send the data to AWS IoT Core.
D. Deploy AWS IoT Core, and launch an Amazon EC2 instance to host the Kafka server. Configure AWS IoT Core to send the data to the EC2 instance. Route the sensors to send the data to AWS IoT Core.
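
AWS IoT Core natively speaks MQTT and can forward messages to Amazon Kinesis Data Firehose through a topic rule, which is the mechanism behind option C. A minimal sketch, with a hypothetical role ARN, topic filter, and stream name:

```python
import boto3

iot = boto3.client("iot", region_name="us-east-1")

# Route every message published under sensors/ into a Firehose stream.
iot.create_topic_rule(
    ruleName="sensors_to_firehose",
    topicRulePayload={
        "sql": "SELECT * FROM 'sensors/#'",
        "awsIotSqlVersion": "2016-03-23",
        "ruleDisabled": False,
        "actions": [{
            "firehose": {
                "roleArn": "arn:aws:iam::111111111111:role/IotToFirehoseRole",
                "deliveryStreamName": "sensor-data-stream",
                "separator": "\n",
            }
        }],
    },
)
```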



Question # 9

A development team is deploying new APIs as serverless applications within a company. The team is currently using the AWS Management Console to provision Amazon API Gateway, AWS Lambda, and Amazon DynamoDB resources. A solutions architect has been tasked with automating the future deployments of these serverless APIs. How can this be accomplished?

A. Use AWS CloudFormation with a Lambda-backed custom resource to provision API Gateway. Use the AWS::DynamoDB::Table and AWS::Lambda::Function resources to create the Amazon DynamoDB table and Lambda functions. Write a script to automate the deployment of the CloudFormation template.
B. Use the AWS Serverless Application Model to define the resources. Upload a YAML template and application files to the code repository. Use AWS CodePipeline to connect to the code repository and to create an action to build using AWS CodeBuild. Use the AWS CloudFormation deployment provider in CodePipeline to deploy the solution.
C. Use AWS CloudFormation to define the serverless application. Implement versioning on the Lambda functions and create aliases to point to the versions. When deploying, configure weights to implement shifting traffic to the newest version, and gradually update the weights as traffic moves over.
D. Commit the application code to the AWS CodeCommit code repository. Use AWS CodePipeline and connect to the CodeCommit code repository. Use AWS CodeBuild to build and deploy the Lambda functions using AWS CodeDeploy. Specify the deployment preference type in CodeDeploy to gradually shift traffic over to the new version.



Question # 10

A company is planning to migrate an application from on premises to AWS. The application currently uses an Oracle database, and the company can tolerate a brief downtime of 1 hour when performing the switch to the new infrastructure. As part of the migration, the database engine will be changed to MySQL. A solutions architect needs to determine which AWS services can be used to perform the migration while minimizing the amount of work and time required. Which of the following will meet the requirements?

A. Use AWS SCT to generate the schema scripts and apply them on the target prior to migration. Use AWS DMS to analyze the current schema and provide a recommendation for the optimal database engine. Then, use AWS DMS to migrate to the recommended engine. Use AWS SCT to identify what embedded SQL code in the application can be converted and what has to be done manually.
B. Use AWS SCT to generate the schema scripts and apply them on the target prior to migration. Use AWS DMS to begin moving data from the on-premises database to AWS. After the initial copy, continue to use AWS DMS to keep the databases in sync until cutting over to the new database. Use AWS SCT to identify what embedded SQL code in the application can be converted and what has to be done manually.
C. Use AWS DMS to help identify the best target deployment between installing the database engine on Amazon EC2 directly or moving to Amazon RDS. Then, use AWS DMS to migrate to the platform. Use AWS Application Discovery Service to identify what embedded SQL code in the application can be converted and what has to be done manually.
D. Use AWS DMS to begin moving data from the on-premises database to AWS. After the initial copy, continue to use AWS DMS to keep the databases in sync until cutting over to the new database. Use AWS Application Discovery Service to identify what embedded SQL code in the application can be converted and what has to be done manually.



Question # 11

A car rental company has built a serverless REST API to provide data to its mobile app. The app consists of an Amazon API Gateway API with a Regional endpoint, AWS Lambda functions, and an Amazon Aurora MySQL Serverless DB cluster. The company recently opened the API to mobile apps of partners. A significant increase in the number of requests resulted, causing sporadic database memory errors. Analysis of the API traffic indicates that clients are making multiple HTTP GET requests for the same queries in a short period of time. Traffic is concentrated during business hours, with spikes around holidays and other events. The company needs to improve its ability to support the additional usage while minimizing the increase in costs associated with the solution. Which strategy meets these requirements?

A. Convert the API Gateway Regional endpoint to an edge-optimized endpoint. Enable caching in the production stage.
B. Implement an Amazon ElastiCache for Redis cache to store the results of the database calls. Modify the Lambda functions to use the cache.
C. Modify the Aurora Serverless DB cluster configuration to increase the maximum amount of available memory.
D. Enable throttling in the API Gateway production stage. Set the rate and burst values to limit the incoming calls.
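
Options A and B both revolve around caching repeated GET requests. For illustration, enabling stage-level caching on an API Gateway REST API (the caching half of option A) is a patch operation like the sketch below; the API ID is a hypothetical placeholder.

```python
import boto3

apigw = boto3.client("apigateway", region_name="us-east-1")

# Turn on stage caching so repeated GET requests are served from the
# cache instead of invoking Lambda and hitting the database.
apigw.update_stage(
    restApiId="a1b2c3d4e5",
    stageName="production",
    patchOperations=[
        {"op": "replace", "path": "/cacheClusterEnabled", "value": "true"},
        {"op": "replace", "path": "/cacheClusterSize", "value": "0.5"},  # GB
    ],
)
```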



Question # 12

A company is migrating its infrastructure to the AWS Cloud. The company must comply with a variety of regulatory standards for different projects. The company needs a multi-account environment. A solutions architect needs to prepare the baseline infrastructure. The solution must provide a consistent baseline of management and security, but it must allow flexibility for different compliance requirements within various AWS accounts. The solution also needs to integrate with the existing on-premises Active Directory Federation Services (AD FS) server. Which solution meets these requirements with the LEAST amount of operational overhead?

A. Create an organization in AWS Organizations. Create a single SCP for least privilege access across all accounts. Create a single OU for all accounts. Configure an IAM identity provider for federation with the on-premises AD FS server. Configure a central logging account with a defined process for log-generating services to send log events to the central account. Enable AWS Config in the central account with conformance packs for all accounts.
B. Create an organization in AWS Organizations. Enable AWS Control Tower on the organization. Review included guardrails for SCPs. Check AWS Config for areas that require additions. Add OUs as necessary. Connect AWS Single Sign-On to the on-premises AD FS server.
C. Create an organization in AWS Organizations. Create SCPs for least privilege access. Create an OU structure, and use it to group AWS accounts. Connect AWS Single Sign-On to the on-premises AD FS server. Configure a central logging account with a defined process for log-generating services to send log events to the central account. Enable AWS Config in the central account with aggregators and conformance packs.
D. Create an organization in AWS Organizations. Enable AWS Control Tower on the organization. Review included guardrails for SCPs. Check AWS Config for areas that require additions. Configure an IAM identity provider for federation with the on-premises AD FS server.



Question # 13

A company's solutions architect is designing a disaster recovery (DR) solution for an application that runs on AWS. The application uses PostgreSQL 11.7 as its database. The company has an RPO of 30 seconds. The solutions architect must design a DR solution with the primary database in the us-east-1 Region and the DR database in the us-west-2 Region. What should the solutions architect do to meet these requirements with minimum application change?

A. Migrate the database to Amazon RDS for PostgreSQL in us-east-1. Set up a read replica in us-west-2. Set the managed RPO for the RDS database to 30 seconds.
B. Migrate the database to Amazon RDS for PostgreSQL in us-east-1. Set up a standby replica in an Availability Zone in us-west-2. Set the managed RPO for the RDS database to 30 seconds.
C. Migrate the database to an Amazon Aurora PostgreSQL global database with the primary Region as us-east-1 and the secondary Region as us-west-2. Set the managed RPO for the Aurora database to 30 seconds.
D. Migrate the database to Amazon DynamoDB in us-east-1. Set up global tables with replica tables that are created in us-west-2.
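
For reference, an Aurora global database (option C) replicates across Regions with typical lag well under a second, which comfortably fits a 30-second RPO. A minimal sketch of wiring one up around an existing primary cluster, with hypothetical identifiers:

```python
import boto3

# Create the global cluster around the existing primary in us-east-1.
rds_east = boto3.client("rds", region_name="us-east-1")
rds_east.create_global_cluster(
    GlobalClusterIdentifier="app-global-db",
    SourceDBClusterIdentifier="arn:aws:rds:us-east-1:111111111111:cluster:app-primary",
)

# Add a secondary cluster in us-west-2 that joins the global cluster.
rds_west = boto3.client("rds", region_name="us-west-2")
rds_west.create_db_cluster(
    DBClusterIdentifier="app-secondary",
    Engine="aurora-postgresql",
    GlobalClusterIdentifier="app-global-db",
)
```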



Question # 14

A solutions architect has been assigned to migrate a 50 TB Oracle data warehouse that contains sales data from on premises to Amazon Redshift. Major updates to the sales data occur on the final calendar day of the month. For the remainder of the month, the data warehouse only receives minor daily updates and is primarily used for reading and reporting. Because of this, the migration process must start on the first day of the month and must be complete before the next set of updates occurs. This provides approximately 30 days to complete the migration and ensure that the minor daily changes have been synchronized with the Amazon Redshift data warehouse. Because the migration cannot impact normal business network operations, the bandwidth allocated to the migration for moving data over the internet is 50 Mbps. The company wants to keep data migration costs low. Which steps will allow the solutions architect to perform the migration within the specified timeline?

A. Install Oracle database software on an Amazon EC2 instance. Configure VPN connectivity between AWS and the company's data center. Configure the Oracle database running on Amazon EC2 to join the Oracle Real Application Clusters (RAC). When the Oracle database on Amazon EC2 finishes synchronizing, create an AWS DMS ongoing replication task to migrate the data from the Oracle database on Amazon EC2 to Amazon Redshift. Verify the data migration is complete and perform the cutover to Amazon Redshift.
B. Create an AWS Snowball import job. Export a backup of the Oracle data warehouse. Copy the exported data to the Snowball device. Return the Snowball device to AWS. Create an Amazon RDS for Oracle database and restore the backup file to that RDS instance. Create an AWS DMS task to migrate the data from the RDS for Oracle database to Amazon Redshift. Copy daily incremental backups from Oracle in the data center to the RDS for Oracle database over the internet. Verify the data migration is complete and perform the cutover to Amazon Redshift.
C. Install Oracle database software on an Amazon EC2 instance. To minimize the migration time, configure VPN connectivity between AWS and the company's data center by provisioning a 1 Gbps AWS Direct Connect connection. Configure the Oracle database running on Amazon EC2 to be a read replica of the data center Oracle database. Start the synchronization process between the company's on-premises data center and the Oracle database on Amazon EC2. When the Oracle database on Amazon EC2 is synchronized with the on-premises database, create an AWS DMS ongoing replication task from the Oracle database read replica that is running on Amazon EC2 to Amazon Redshift. Verify the data migration is complete and perform the cutover to Amazon Redshift.
D. Create an AWS Snowball import job. Configure a server in the company's data center with an extraction agent. Use AWS SCT to manage the extraction agent and convert the Oracle schema to an Amazon Redshift schema. Create a new project in AWS SCT using the registered data extraction agent. Create a local task and an AWS DMS task in AWS SCT with replication of ongoing changes. Copy data to the Snowball device and return the Snowball device to AWS. Allow AWS DMS to copy data from Amazon S3 to Amazon Redshift. Verify that the data migration is complete and perform the cutover to Amazon Redshift.



Question # 15

A financial services company loaded millions of historical stock trades into an Amazon DynamoDB table. The table uses on-demand capacity mode. Once each day at midnight, a few million new records are loaded into the table. Application read activity against the table happens in bursts throughout the day, and a limited set of keys are repeatedly looked up. The company needs to reduce costs associated with DynamoDB. Which strategy should a solutions architect recommend to meet this requirement?

A. Deploy an Amazon ElastiCache cluster in front of the DynamoDB table.
B. Deploy DynamoDB Accelerator (DAX). Configure DynamoDB auto scaling. Purchase Savings Plans in Cost Explorer.
C. Use provisioned capacity mode. Purchase Savings Plans in Cost Explorer.
D. Deploy DynamoDB Accelerator (DAX). Use provisioned capacity mode. Configure DynamoDB auto scaling.



Question # 16

A company is launching a web-based application in multiple Regions around the world. The application consists of both static content stored in a private Amazon S3 bucket and dynamic content running in Amazon ECS containers behind an Application Load Balancer (ALB). The company requires that the static and dynamic application content be accessible through Amazon CloudFront only. Which combination of steps should a solutions architect recommend to restrict direct content access to CloudFront? (Select THREE.)

A. Create a web ACL in AWS WAF with a rule to validate the presence of a custom header, and associate the web ACL with the ALB.
B. Create a web ACL in AWS WAF with a rule to validate the presence of a custom header, and associate the web ACL with the CloudFront distribution.
C. Configure CloudFront to add a custom header to origin requests.
D. Configure the ALB to add a custom header to HTTP requests.
E. Update the S3 bucket ACL to allow access from the CloudFront distribution only.
F. Create a CloudFront Origin Access Identity (OAI) and add it to the CloudFront distribution. Update the S3 bucket policy to allow access to the OAI only.
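
The origin access identity pattern in option F is worth knowing cold. Below is a minimal sketch of creating an OAI and locking the bucket policy to it; the caller reference and bucket name are hypothetical.

```python
import json

import boto3

cf = boto3.client("cloudfront")
s3 = boto3.client("s3")

# Create an origin access identity (OAI) for the distribution.
oai = cf.create_cloud_front_origin_access_identity(
    CloudFrontOriginAccessIdentityConfig={
        "CallerReference": "static-site-oai-1",  # hypothetical
        "Comment": "OAI for private static content bucket",
    }
)
oai_id = oai["CloudFrontOriginAccessIdentity"]["Id"]

# Lock the bucket down so only the OAI can read objects.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {
            "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {oai_id}"
        },
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::static-content-bucket/*",  # hypothetical
    }],
}
s3.put_bucket_policy(Bucket="static-content-bucket", Policy=json.dumps(policy))
```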



Question # 17

A company recently started hosting new application workloads in the AWS Cloud. The company is using Amazon EC2 instances, Amazon Elastic File System (Amazon EFS) file systems, and Amazon RDS DB instances. To meet regulatory and business requirements, the company must make the following changes for data backups:
• Backups must be retained based on custom daily, weekly, and monthly requirements.
• Backups must be replicated to at least one other AWS Region immediately after capture.
• The backup solution must provide a single source of backup status across the AWS environment.
• The backup solution must send immediate notifications upon failure of any resource backup.
Which combination of steps will meet these requirements with the LEAST amount of operational overhead? (Select THREE.)

A. Create an AWS Backup plan with a backup rule for each of the retention requirements.
B. Configure an AWS Backup plan to copy backups to another Region.
C. Create an AWS Lambda function to replicate backups to another Region and send notification if a failure occurs.
D. Add an Amazon Simple Notification Service (Amazon SNS) topic to the backup plan to send a notification for finished jobs that have any status except BACKUP_JOB_COMPLETED.
E. Create an Amazon Data Lifecycle Manager (Amazon DLM) snapshot lifecycle policy for each of the retention requirements.
F. Set up RDS snapshots on each database.
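
Options A and B map directly onto AWS Backup plan rules. As a minimal sketch, a plan with one rule (the daily tier) that also copies each recovery point to a vault in another Region could look like this; the vault names, schedule, and ARN are hypothetical.

```python
import boto3

backup = boto3.client("backup", region_name="us-east-1")

# One rule per retention requirement; each rule can also copy the
# recovery point to a vault in another Region.
backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "regulatory-backups",
        "Rules": [{
            "RuleName": "daily",
            "TargetBackupVaultName": "primary-vault",
            "ScheduleExpression": "cron(0 5 * * ? *)",
            "Lifecycle": {"DeleteAfterDays": 35},
            "CopyActions": [{
                "DestinationBackupVaultArn": (
                    "arn:aws:backup:us-west-2:111111111111:backup-vault:dr-vault"
                ),
            }],
        }],
    }
)
```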



Question # 18

An online magazine will launch its latest edition this month. This edition will be the first to be distributed globally. The magazine's dynamic website currently uses an Application Load Balancer in front of the web tier, a fleet of Amazon EC2 instances for web and application servers, and Amazon Aurora MySQL. Portions of the website include static content, and almost all traffic is read-only. The magazine is expecting a significant spike in internet traffic when the new edition is launched. Optimal performance is a top priority for the week following the launch. Which combination of steps should a solutions architect take to reduce system response times for a global audience? (Select TWO.)

A. Use logical cross-Region replication to replicate the Aurora MySQL database to a secondary Region. Replace the web servers with Amazon S3. Deploy S3 buckets in cross-Region replication mode.
B. Ensure the web and application tiers are each in Auto Scaling groups. Introduce an AWS Direct Connect connection. Deploy the web and application tiers in Regions across the world.
C. Migrate the database from Amazon Aurora to Amazon RDS for MySQL. Ensure all three of the application tiers (web, application, and database) are in private subnets.
D. Use an Aurora global database for physical cross-Region replication. Use Amazon S3 with cross-Region replication for static content and resources. Deploy the web and application tiers in Regions across the world.
E. Introduce Amazon Route 53 with latency-based routing and Amazon CloudFront distributions. Ensure the web and application tiers are each in Auto Scaling groups.



Question # 19

A company wants to migrate its workloads from on premises to AWS. The workloads run on Linux and Windows. The company has a large on-premises infrastructure that consists of physical machines and VMs that host numerous applications. The company must capture details about the system configuration, system performance, running processes, and network connections of its on-premises workloads. The company also must divide the on-premises applications into groups for AWS migrations. The company needs recommendations for Amazon EC2 instance types so that the company can run its workloads on AWS in the most cost-effective manner. Which combination of steps should a solutions architect take to meet these requirements? (Select THREE.)

A. Assess the existing applications by installing AWS Application Discovery Agent on the physical machines and VMs. 
B. Assess the existing applications by installing AWS Systems Manager Agent on the physical machines and VMs.
C. Group servers into applications for migration by using AWS Systems Manager Application Manager. 
D. Group servers into applications for migration by using AWS Migration Hub. 
E. Generate recommended instance types and associated costs by using AWS Migration Hub. 
F. Import data about server sizes into AWS Trusted Advisor. Follow the recommendations for cost optimization. 



Question # 20

A company has multiple business units. Each business unit has its own AWS account and runs a single website within that account. The company also has a single logging account. Logs from each business unit website are aggregated into a single Amazon S3 bucket in the logging account. The S3 bucket policy provides each business unit with access to write data into the bucket and requires data to be encrypted. The company needs to encrypt logs uploaded into the bucket by using a single AWS Key Management Service (AWS KMS) CMK. The CMK that protects the data must be rotated once every 365 days. Which strategy is the MOST operationally efficient for the company to use to meet these requirements?

A. Create a customer managed CMK in the logging account. Update the CMK key policy to provide access to the logging account only. Manually rotate the CMK every 365 days.
B. Create a customer managed CMK in the logging account. Update the CMK key policy to provide access to the logging account and business unit accounts. Enable automatic rotation of the CMK.
C. Use an AWS managed CMK in the logging account. Update the CMK key policy to provide access to the logging account and business unit accounts. Manually rotate the CMK every 365 days.
D. Use an AWS managed CMK in the logging account. Update the CMK key policy to provide access to the logging account only. Enable automatic rotation of the CMK.
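
Two facts drive this question: the key policy of an AWS managed CMK cannot be edited to grant other accounts access, and automatic rotation of a customer managed CMK rotates the key material once every 365 days without manual work. A minimal sketch of option B's key setup; the cross-account key policy is omitted for brevity and would be supplied through the Policy parameter.

```python
import boto3

kms = boto3.client("kms", region_name="us-east-1")

# Customer managed key in the logging account. A key policy granting the
# business unit accounts kms:GenerateDataKey/kms:Encrypt would be passed
# via the Policy parameter (omitted here).
key = kms.create_key(
    Description="Log bucket encryption key shared with business units",
)
key_id = key["KeyMetadata"]["KeyId"]

# Automatic rotation: new key material every 365 days.
kms.enable_key_rotation(KeyId=key_id)
```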



Question # 21

A company is migrating an on-premises application and a MySQL database to AWS. The application processes highly sensitive data, and new data is constantly updated in the database. The data must not be transferred over the internet. The company also must encrypt the data in transit and at rest. The database is 5 TB in size. The company already has created the database schema in an Amazon RDS for MySQL DB instance. The company has set up a 1 Gbps AWS Direct Connect connection to AWS. The company also has set up a public VIF and a private VIF. A solutions architect needs to design a solution that will migrate the data to AWS with the least possible downtime. Which solution will meet these requirements?

A. Perform a database backup. Copy the backup files to an AWS Snowball Edge Storage Optimized device. Import the backup to Amazon S3. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3) for encryption at rest. Use TLS for encryption in transit. Import the data from Amazon S3 to the DB instance.
B. Use AWS Database Migration Service (AWS DMS) to migrate the data to AWS. Create a DMS replication instance in a private subnet. Create VPC endpoints for AWS DMS. Configure a DMS task to copy data from the on-premises database to the DB instance by using full load plus change data capture (CDC). Use the AWS Key Management Service (AWS KMS) default key for encryption at rest. Use TLS for encryption in transit.
C. Perform a database backup. Use AWS DataSync to transfer the backup files to Amazon S3. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3) for encryption at rest. Use TLS for encryption in transit. Import the data from Amazon S3 to the DB instance.
D. Use Amazon S3 File Gateway. Set up a private connection to Amazon S3 by using AWS PrivateLink. Perform a database backup. Copy the backup files to Amazon S3. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3) for encryption at rest. Use TLS for encryption in transit. Import the data from Amazon S3 to the DB instance.
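
Option B's "full load plus CDC" is an AWS DMS migration type that copies the existing data and then keeps applying ongoing changes until cutover, which is what minimizes downtime for a constantly updated database. A minimal task-creation sketch with hypothetical ARNs:

```python
import json

import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Full load plus change data capture (CDC) keeps the target in sync
# while the source stays live. All ARNs are hypothetical placeholders.
dms.create_replication_task(
    ReplicationTaskIdentifier="mysql-migration",
    SourceEndpointArn="arn:aws:dms:us-east-1:111111111111:endpoint:SRC",
    TargetEndpointArn="arn:aws:dms:us-east-1:111111111111:endpoint:TGT",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111111111111:rep:INSTANCE",
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    }),
)
```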



Question # 22

A solutions architect is importing a VM from an on-premises environment by using the Amazon EC2 VM Import feature of AWS Import/Export. The solutions architect has created an AMI and has provisioned an Amazon EC2 instance that is based on that AMI. The EC2 instance runs inside a public subnet in a VPC and has a public IP address assigned. The EC2 instance does not appear as a managed instance in the AWS Systems Manager console. Which combination of steps should the solutions architect take to troubleshoot this issue? (Select TWO.)

A. Verify that Systems Manager Agent is installed on the instance and is running.
B. Verify that the instance is assigned an appropriate IAM role for Systems Manager.
C. Verify the existence of a VPC endpoint on the VPC.
D. Verify that the AWS Application Discovery Agent is configured.
E. Verify the correct configuration of service-linked roles for Systems Manager.
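
A quick way to check whether an instance has registered with Systems Manager (and thus whether options A and B are worth pursuing) is to query its ping status, as in this sketch; the instance ID is hypothetical.

```python
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

# A healthy managed instance shows up here with PingStatus "Online".
# An empty result usually means the agent is not running, the instance
# profile lacks Systems Manager permissions, or the instance cannot
# reach the Systems Manager endpoints.
resp = ssm.describe_instance_information(
    Filters=[{"Key": "InstanceIds", "Values": ["i-0123456789abcdef0"]}]
)
for info in resp["InstanceInformationList"]:
    print(info["InstanceId"], info["PingStatus"], info.get("AgentVersion"))
```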



