
SCS-C01 Exam Questions - Online Test



It is almost impossible to pass the Amazon-Web-Services SCS-C01 exam in the short term without any help. Come to Examcollection soon and find the most advanced, accurate and guaranteed Amazon-Web-Services SCS-C01 practice questions. You will get a surprising result with our updated AWS Certified Security - Specialty practice guides.

We also have free SCS-C01 dumps questions for you:

NEW QUESTION 1
You are planning to use AWS Config to check the configuration of the resources in your AWS account. You are planning on using an existing IAM role and using it for the AWS Config resource. Which of the following is required to ensure the AWS Config service can work as required?
Please select:

  • A. Ensure that there is a trust policy in place for the AWS Config service within the role
  • B. Ensure that there is a grant policy in place for the AWS Config service within the role
  • C. Ensure that there is a user policy in place for the AWS Config service within the role
  • D. Ensure that there is a group policy in place for the AWS Config service within the role

Answer: A

Explanation:
SCS-C01 dumps exhibit
Options B, C and D are invalid because you need to ensure a trust policy is in place, not a grant, user or group policy. For more information on the IAM role permissions, please visit the below link:
https://docs.aws.amazon.com/config/latest/developerguide/iamrole-permissions.html
The correct answer is: Ensure that there is a trust policy in place for the AWS Config service within the role. Submit your Feedback/Queries to our Experts
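For illustration only (not part of the original explanation), the sketch below shows the kind of trust policy the answer refers to, attached to an existing role with boto3; the role name is a placeholder.

```python
import json
import boto3

iam = boto3.client("iam")

# Trust policy allowing the AWS Config service to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "config.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Attach the trust policy to the existing role (role name is hypothetical).
iam.update_assume_role_policy(
    RoleName="existing-config-role",
    PolicyDocument=json.dumps(trust_policy),
)
```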

NEW QUESTION 2
A company has set up EC2 instances on the AWS Cloud. There is a need to see all the IP addresses which are accessing the EC2 Instances. Which service can help achieve this?
Please select:

  • A. Use the AWS Inspector service
  • B. Use AWS VPC Flow Logs
  • C. Use Network ACLs
  • D. Use Security Groups

Answer: B

Explanation:
The AWS Documentation mentions the following:
A flow log record represents a network flow in your flow log. Each record captures the network flow for a specific 5-tuple, for a specific capture window. A 5-tuple is a set of five different values that specify the source, destination, and protocol for an internet protocol (IP) flow.
Options A, C and D are all invalid because these services/tools cannot be used to get the IP addresses which are accessing the EC2 Instances.
For more information on VPC Flow Logs please visit the URL https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/flow-logs.html
The correct answer is: Use AWS VPC Flow Logs. Submit your Feedback/Queries to our Experts
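As a rough, illustrative sketch (not part of the original answer), VPC Flow Logs can be enabled with boto3 as shown below; the VPC ID, log group name and IAM role ARN are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Enable flow logs for a VPC so that source/destination IP addresses of
# traffic reaching the EC2 instances are captured in CloudWatch Logs.
ec2.create_flow_logs(
    ResourceIds=["vpc-0123456789abcdef0"],   # placeholder VPC ID
    ResourceType="VPC",
    TrafficType="ALL",
    LogGroupName="my-vpc-flow-logs",          # placeholder log group
    DeliverLogsPermissionArn="arn:aws:iam::111111111111:role/flow-logs-role",  # placeholder role
)
```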

NEW QUESTION 3
A Security Engineer discovers that developers have been adding rules to security groups that allow SSH and RDP traffic from 0.0.0.0/0 instead of the organization firewall IP.
What is the most efficient way to remediate the risk of this activity?

  • A. Delete the internet gateway associated with the VPC.
  • B. Use network access control lists to block source IP addresses matching 0.0.0.0/0.
  • C. Use a host-based firewall to prevent access from all but the organization’s firewall IP.
  • D. Use AWS Config rules to detect 0.0.0.0/0 and invoke an AWS Lambda function to update the security group with the organization's firewall IP.

Answer: D
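A minimal sketch of the remediation half of option D, assuming the Lambda function receives the security group ID from the triggering AWS Config event; the firewall CIDR, event shape and function wiring are assumptions for illustration.

```python
import boto3
from botocore.exceptions import ClientError

ec2 = boto3.client("ec2")
ORG_FIREWALL_CIDR = "203.0.113.10/32"  # placeholder for the organization firewall IP

def lambda_handler(event, context):
    # Assumes the triggering event carries the noncompliant security group ID
    # at this location (illustrative only).
    group_id = event["detail"]["resourceId"]
    for port in (22, 3389):  # SSH and RDP
        try:
            ec2.revoke_security_group_ingress(
                GroupId=group_id, IpProtocol="tcp",
                FromPort=port, ToPort=port, CidrIp="0.0.0.0/0",
            )
        except ClientError:
            continue  # no open rule for this port
        ec2.authorize_security_group_ingress(
            GroupId=group_id, IpProtocol="tcp",
            FromPort=port, ToPort=port, CidrIp=ORG_FIREWALL_CIDR,
        )
```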

NEW QUESTION 4
A Security Administrator at a university is configuring a fleet of Amazon EC2 instances. The EC2 instances are shared among students, and non-root SSH access is allowed. The Administrator is concerned about students attacking other AWS account resources by using the EC2 instance metadata service.
What can the Administrator do to protect against this potential attack?

  • A. Disable the EC2 instance metadata service.
  • B. Log all student SSH interactive session activity.
  • C. Implement iptables-based restrictions on the instances.
  • D. Install the Amazon Inspector agent on the instances.

Answer: C

NEW QUESTION 5
A Security Engineer is trying to determine whether the encryption keys used in an AWS service are in compliance with certain regulatory standards.
Which of the following actions should the Engineer perform to get further guidance?

  • A. Read the AWS Customer Agreement.
  • B. Use AWS Artifact to access AWS compliance reports.
  • C. Post the question on the AWS Discussion Forums.
  • D. Run AWS Config and evaluate the configuration outputs.

Answer: B

NEW QUESTION 6
A company has an encrypted Amazon S3 bucket. An Application Developer has an IAM policy that allows access to the S3 bucket, but the Application Developer is unable to access objects within the bucket.
What is a possible cause of the issue?

  • A. The S3 ACL for the S3 bucket fails to explicitly grant access to the Application Developer
  • B. The AWS KMS key for the S3 bucket fails to list the Application Developer as an administrator
  • C. The S3 bucket policy fails to explicitly grant access to the Application Developer
  • D. The S3 bucket policy explicitly denies access to the Application Developer

Answer: C

NEW QUESTION 7
Your team is experimenting with the API gateway service for an application. There is a need to implement a custom module which can be used for authentication/authorization for calls made to the API gateway. How can this be achieved?
Please select:

  • A. Use the request parameters for authorization
  • B. Use a Lambda authorizer
  • C. Use the gateway authorizer
  • D. Use CORS on the API gateway

Answer: B

Explanation:
The AWS Documentation mentions the following
An Amazon API Gateway Lambda authorizer (formerly known as a custom authorizer) is a Lambda function that you provide to control access to your API methods. A Lambda authorizer uses bearer token authentication strategies, such as OAuth or SAML. It can also use information described by headers, paths, query strings, stage variables, or context variables as request parameters.
Options A, C and D are invalid because these cannot be used if you need a custom authentication/authorization mechanism for calls made to the API gateway.
For more information on using the API gateway Lambda authorizer please visit the URL: https://docs.aws.amazon.com/apigateway/latest/developerguide/apigateway-use-lambda-authorizer.html
The correct answer is: Use a Lambda authorizer
Submit your Feedback/Queries to our Experts
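To make the idea concrete, here is a minimal, illustrative TOKEN-type Lambda authorizer handler; the token check is a stub standing in for real OAuth/SAML validation, and the principal ID is a placeholder.

```python
def lambda_handler(event, context):
    # 'authorizationToken' and 'methodArn' are supplied by API Gateway for
    # TOKEN authorizers; the equality check below is a stand-in for real
    # token validation.
    token = event.get("authorizationToken", "")
    effect = "Allow" if token == "allow-me" else "Deny"
    return {
        "principalId": "example-user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event["methodArn"],
            }],
        },
    }
```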

NEW QUESTION 8
The Development team receives an error message each time the team members attempt to encrypt or decrypt a Secure String parameter from the SSM Parameter Store by using an AWS KMS customer managed key (CMK).
Which CMK-related issues could be responsible? (Choose two.)

  • A. The CMK specified in the application does not exist.
  • B. The CMK specified in the application is currently in use.
  • C. The CMK specified in the application is using the CMK KeyID instead of CMK Amazon Resource Name.
  • D. The CMK specified in the application is not enabled.
  • E. The CMK specified in the application is using an alias.

Answer: AD

NEW QUESTION 9
A company stores data on an Amazon EBS volume attached to an Amazon EC2 instance. The data is asynchronously replicated to an Amazon S3 bucket. Both the EBS volume and the S3 bucket are encrypted
with the same AWS KMS Customer Master Key (CMK). A former employee scheduled a deletion of that CMK before leaving the company.
The company’s Developer Operations department learns about this only after the CMK has been deleted. Which steps must be taken to address this situation?

  • A. Copy the data directly from the EBS encrypted volume before the volume is detached from the EC2 instance.
  • B. Recover the data from the EBS encrypted volume using an earlier version of the KMS backing key.
  • C. Make a request to AWS Support to recover the S3 encrypted data.
  • D. Make a request to AWS Support to restore the deleted CMK, and use it to recover the data.

Answer: A

NEW QUESTION 10
Which of the following minimizes the potential attack surface for applications?

  • A. Use security groups to provide stateful firewalls for Amazon EC2 instances at the hypervisor level.
  • B. Use network ACLs to provide stateful firewalls at the VPC level to prevent access to any specific AWS resource.
  • C. Use AWS Direct Connect for secure trusted connections between EC2 instances within private subnets.
  • D. Design network security in a single layer within the perimeter network (also known as DMZ, demilitarized zone, and screened subnet) to facilitate quicker responses to threats.

Answer: A

NEW QUESTION 11
A Security Engineer has created an Amazon CloudWatch event that invokes an AWS Lambda function daily. The Lambda function runs an Amazon Athena query that checks AWS CloudTrail logs in Amazon S3 to detect whether any IAM user accounts or credentials have been created in the past 30 days. The results of the Athena query are created in the same S3 bucket. The Engineer runs a test execution of the Lambda function via the AWS Console, and the function runs successfully.
After several minutes, the Engineer finds that his Athena query has failed with the error message: “Insufficient Permissions”. The IAM permissions of the Security Engineer and the Lambda function are shown below:
Security Engineer
SCS-C01 dumps exhibit
Lambda function execution role
SCS-C01 dumps exhibit
What is causing the error?

  • A. The Lambda function does not have permissions to start the Athena query execution.
  • B. The Security Engineer does not have permissions to start the Athena query execution.
  • C. The Athena service does not support invocation through Lambda.
  • D. The Lambda function does not have permissions to access the CloudTrail S3 bucket.

Answer: D

NEW QUESTION 12
A Security Engineer received an AWS Abuse Notice listing EC2 instance IDs that are reportedly abusing other hosts.
Which actions should the Engineer take based on this situation? (Choose three.)

  • A. Use AWS Artifact to capture an exact image of the state of each instance.
  • B. Create EBS Snapshots of each of the volumes attached to the compromised instances.
  • C. Capture a memory dump.
  • D. Log in to each instance with administrative credentials to restart the instance.
  • E. Revoke all network ingress and egress except for to/from a forensics workstation.
  • F. Run Auto Recovery for Amazon EC2.

Answer: BCE

NEW QUESTION 13
A new application will be deployed on EC2 instances in private subnets. The application will transfer sensitive data to and from an S3 bucket. Compliance requirements state that the data must not traverse the public internet. Which solution meets the compliance requirement?
Please select:

  • A. Access the S3 bucket through a proxy server
  • B. Access the S3 bucket through a NAT gateway.
  • C. Access the S3 bucket through a VPC endpoint for S3
  • D. Access the S3 bucket through the SSL protected S3 endpoint

Answer: C

Explanation:
The AWS Documentation mentions the following
A VPC endpoint enables you to privately connect your VPC to supported AWS services and VPC endpoint services powered by PrivateLink without requiring an internet gateway, NAT device, VPN connection, or AWS Direct Connect connection. Instances in your VPC do not require public IP addresses to communicate with resources in the service. Traffic between your VPC and the other service does not leave the Amazon network.
Option A is invalid because using a proxy server is not sufficient.
Options B and D are invalid because you need secure communication which should not traverse the internet. For more information on VPC endpoints please see the below link: https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/vpc-endpoints.html
The correct answer is: Access the S3 bucket through a VPC endpoint for S3 Submit your Feedback/Queries to our Experts
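For illustration (not part of the original explanation), a gateway VPC endpoint for S3 can be created with boto3 roughly as follows; the VPC ID, route table ID and region are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Gateway endpoint for S3: traffic from the private subnets to the bucket
# stays on the Amazon network instead of traversing the public internet.
ec2.create_vpc_endpoint(
    VpcId="vpc-0123456789abcdef0",              # placeholder VPC ID
    ServiceName="com.amazonaws.us-east-1.s3",   # region is an example
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0123456789abcdef0"],    # placeholder route table
)
```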

NEW QUESTION 14
What is the result of the following bucket policy?
SCS-C01 dumps exhibit
Choose the correct answer
Please select:

  • A. It will allow all access to the bucket mybucket
  • B. It will allow the user mark from AWS account number 111111111 all access to the bucket but deny everyone else all access to the bucket
  • C. It will deny all access to the bucket mybucket
  • D. None of these

Answer: C

Explanation:
The policy consists of 2 statements: one is the allow for the user mark to the bucket, and the next is the deny policy for all other users. The deny permission will override the allow, and hence all users will not have access to the bucket.
Options A, B and D are all invalid because this policy is used to deny all access to the bucket mybucket. For examples of S3 bucket policies, please refer to the below link: http://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html
The correct answer is: It will deny all access to the bucket mybucket. Submit your Feedback/Queries to our Experts
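The exhibit is not reproduced here; the snippet below only illustrates the shape of policy the explanation describes, where an explicit Deny for everyone overrides the Allow for the user mark (account ID and ARNs are placeholders).

```python
# Illustrative reconstruction of the policy's structure, not the exhibit itself.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Allow for the user mark
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111111111111:user/mark"},
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::mybucket/*",
        },
        {   # Explicit Deny for everyone; this overrides the Allow above,
            # so all access to mybucket is denied.
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::mybucket/*",
        },
    ],
}
```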

NEW QUESTION 15
You have an S3 bucket defined in AWS. You want to ensure that you encrypt the data before sending it across the wire. What is the best way to achieve this?
Please select:

  • A. Enable server side encryption for the S3 bucket. This request will ensure that the data is encrypted first.
  • B. Use the AWS Encryption CLI to encrypt the data first
  • C. Use a Lambda function to encrypt the data before sending it to the S3 bucket.
  • D. Enable client encryption for the bucket

Answer: B

Explanation:
One can use the AWS Encryption CLI to encrypt the data before sending it across to the S3 bucket. Options A and C are invalid because this would still mean that data is transferred in plain text. Option D is invalid because you cannot just enable client side encryption for the S3 bucket. For more information on encrypting and decrypting data, please visit the below URL:
https://aws.amazon.com/blogs/security/how-to-encrypt-and-decrypt-your-data-with-the-aws-encryption-cli/
The correct answer is: Use the AWS Encryption CLI to encrypt the data first. Submit your Feedback/Queries to our Experts
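The answer names the AWS Encryption CLI; as a rough boto3 illustration of the same idea (encrypting client-side before the object goes on the wire), one could call AWS KMS directly as below. The key alias, bucket and object key are placeholders, and KMS Encrypt only handles small payloads, so this is a sketch rather than the CLI itself.

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")

plaintext = b"sensitive payload"  # example data (KMS Encrypt caps out at 4 KB)

# Encrypt on the client before upload, so only ciphertext crosses the wire.
ciphertext = kms.encrypt(KeyId="alias/my-data-key", Plaintext=plaintext)["CiphertextBlob"]
s3.put_object(Bucket="mybucket", Key="reports/data.enc", Body=ciphertext)
```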

NEW QUESTION 16
Your company has been using AWS for the past 2 years. They have separate S3 buckets for logging the various AWS services that have been used. They have hired an external vendor for analyzing their log files, and the vendor has their own AWS account. What is the best way to ensure that the partner account can access the log files in the company account for analysis? Choose 2 answers from the options given below.
Please select:

  • A. Create an IAM user in the company account
  • B. Create an IAM Role in the company account
  • C. Ensure the IAM user has access for read-only to the S3 buckets
  • D. Ensure the IAM Role has access for read-only to the S3 buckets

Answer: BD

Explanation:
The AWS Documentation mentions the following
To share log files between multiple AWS accounts, you must perform the following general steps. These steps are explained in detail later in this section.
Create an IAM role for each account that you want to share log files with.
For each of these IAM roles, create an access policy that grants read-only access to the account you want to share the log files with.
Have an IAM user in each account programmatically assume the appropriate role and retrieve the log files. Options A and C are invalid because creating an IAM user and then sharing the IAM user credentials with the vendor is a clear 'no' from a security perspective.
For more information on sharing CloudTrail log files, please visit the following URL: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-sharing-logs.html
The correct answers are: Create an IAM Role in the company account. Ensure the IAM Role has access for read-only to the S3 buckets.
Submit your Feedback/Queries to our Experts
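As an illustrative sketch of the two chosen answers (all names and the account ID are assumptions), the company account could create a role the vendor account is trusted to assume, with a read-only inline policy scoped to the logging bucket:

```python
import json
import boto3

iam = boto3.client("iam")

VENDOR_ACCOUNT_ID = "222222222222"       # placeholder vendor account
ROLE_NAME = "log-analysis-vendor-role"   # placeholder role name

# Trust policy: only the vendor account may assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{VENDOR_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
    }],
}
iam.create_role(RoleName=ROLE_NAME, AssumeRolePolicyDocument=json.dumps(trust_policy))

# Read-only access to the logging bucket only (bucket name is a placeholder).
read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::company-log-bucket", "arn:aws:s3:::company-log-bucket/*"],
    }],
}
iam.put_role_policy(RoleName=ROLE_NAME, PolicyName="read-only-log-access",
                    PolicyDocument=json.dumps(read_only_policy))
```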

NEW QUESTION 17
An organization wants to deploy a three-tier web application whereby the application servers run on Amazon EC2 instances. These EC2 instances need access to credentials that they will use to authenticate their SQL connections to an Amazon RDS DB instance. Also, AWS Lambda functions must issue queries to the RDS database by using the same database credentials.
The credentials must be stored so that the EC2 instances and the Lambda functions can access them. No other access is allowed. The access logs must record when the credentials were accessed and by whom.
What should the Security Engineer do to meet these requirements?

  • A. Store the database credentials in AWS Key Management Service (AWS KMS). Create an IAM role with access to AWS KMS by using the EC2 and Lambda service principals in the role's trust policy. Add the role to an EC2 instance profile. Attach the instance profile to the EC2 instances. Set up Lambda to use the new role for execution.
  • B. Store the database credentials in AWS KMS. Create an IAM role with access to KMS by using the EC2 and Lambda service principals in the role's trust policy. Add the role to an EC2 instance profile. Attach the instance profile to the EC2 instances and the Lambda function.
  • C. Store the database credentials in AWS Secrets Manager. Create an IAM role with access to Secrets Manager by using the EC2 and Lambda service principals in the role's trust policy. Add the role to an EC2 instance profile. Attach the instance profile to the EC2 instances and the Lambda function.
  • D. Store the database credentials in AWS Secrets Manager. Create an IAM role with access to Secrets Manager by using the EC2 and Lambda service principals in the role's trust policy. Add the role to an EC2 instance profile. Attach the instance profile to the EC2 instances. Set up Lambda to use the new role for execution.

Answer: D

NEW QUESTION 18
Your company has an EC2 Instance that is hosted in an AWS VPC. There is a requirement to ensure that log files from the EC2 Instance are stored accordingly. The access should also be limited for the destination of the log files. How can this be accomplished? Choose 2 answers from the options given below. Each answer forms part of the solution.
Please select:

  • A. Stream the log files to a separate Cloudtrail trail
  • B. Stream the log files to a separate Cloudwatch Log group
  • C. Create an IAM policy that gives the desired level of access to the Cloudtrail trail
  • D. Create an IAM policy that gives the desired level of access to the Cloudwatch Log group

Answer: BD

Explanation:
You can create a Log group and send all logs from the EC2 Instance to that group. You can then limit the access to the Log groups via an IAM policy.
Option A is invalid because Cloudtrail is used to record API activity and not for storing log files. Option C is invalid because Cloudtrail is the wrong service to be used for this requirement.
For more information on Log Groups and Log Streams, please visit the following URL:
* https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/Workinj
For more information on Access to Cloudwatch logs, please visit the following URL:
* https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/auth-and-access-control-cwl.html
The correct answers are: Stream the log files to a separate Cloudwatch Log group. Create an IAM policy that gives the desired level of access to the Cloudwatch Log group.
Submit your Feedback/Queries to our Experts
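For illustration only, an IAM policy limiting access to a single CloudWatch Logs log group might look like the snippet below; the region, account ID and log group name are placeholders.

```python
# Illustrative IAM policy scoped to one log group (names are placeholders).
log_group_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "logs:GetLogEvents",
            "logs:FilterLogEvents",
            "logs:DescribeLogStreams",
        ],
        "Resource": "arn:aws:logs:us-east-1:111111111111:log-group:ec2-app-logs:*",
    }],
}
```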

NEW QUESTION 19
You need to inspect the running processes on an EC2 Instance that may have a security issue. How can you achieve this in the easiest way possible? You also need to ensure that the process does not interfere with the continuous running of the instance.
Please select:

  • A. Use AWS Cloudtrail to record the processes running on the server to an S3 bucket.
  • B. Use AWS Cloudwatch to record the processes running on the server
  • C. Use the SSM Run command to send the list of running processes information to an S3 bucket.
  • D. Use AWS Config to see the changed process information on the server

Answer: C

Explanation:
The SSM Run command can be used to send OS specific commands to an Instance. Here you can check and see the running processes on an instance and then send the output to an S3 bucket.
Option A is invalid because this is used to record API activity and cannot be used to record running processes. Option B is invalid because Cloudwatch is a logging and metric service and cannot be used to record running processes.
Option D is invalid because AWS Config is a configuration service and cannot be used to record running processes.
For more information on the Systems Manager Run command, please visit the following URL:
https://docs.aws.amazon.com/systems-manager/latest/userguide/execute-remote-commands.html
The correct answer is: Use the SSM Run command to send the list of running processes information to an S3 bucket. Submit your Feedback/Queries to our Experts
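A small illustrative boto3 sketch of the chosen approach; the instance ID and bucket name are placeholders, and AWS-RunShellScript assumes a Linux instance.

```python
import boto3

ssm = boto3.client("ssm")

# Run Command lists the running processes on the instance and writes the
# output to an S3 bucket, without interrupting the instance.
ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],        # placeholder instance ID
    DocumentName="AWS-RunShellScript",
    Parameters={"commands": ["ps aux"]},
    OutputS3BucketName="process-audit-bucket",  # placeholder bucket
    OutputS3KeyPrefix="running-processes/",
)
```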

NEW QUESTION 20
Your IT Security team has advised to carry out a penetration test on the resources in their company's AWS Account. This is as part of their capability to analyze the security of the Infrastructure. What should be done first in this regard?
Please select:

  • A. Turn on Cloudtrail and carry out the penetration test
  • B. Turn on VPC Flow Logs and carry out the penetration test
  • C. Submit a request to AWS Support
  • D. Use a custom AWS Marketplace solution for conducting the penetration test

Answer: C

Explanation:
This concept is given in the AWS Documentation
How do I submit a penetration testing request for my AWS resources? Issue
I want to run a penetration test or other simulated event on my AWS architecture. How do I get permission from AWS to do that?
Resolution
Before performing security testing on AWS resources, you must obtain approval from AWS. After you submit your request, AWS will reply in about two business days.
AWS might have additional questions about your test which can extend the approval process, so plan accordingly and be sure that your initial request is as detailed as possible.
If your request is approved, you'll receive an authorization number.
Options A, B and D are all invalid because the first step is to get prior authorization from AWS for penetration tests.
For more information on penetration testing, please visit the below URL
* https://aws.amazon.com/security/penetration-testing/
* https://aws.amazon.com/premiumsupport/knowledge-center/penetration-testing/
The correct answer is: Submit a request to AWS Support Submit your Feedback/Queries to our Experts

NEW QUESTION 21
An enterprise wants to use a third-party SaaS application. The SaaS application needs to have access to issue several API commands to discover Amazon EC2 resources running within the enterprise's account. The enterprise has internal security policies that require that any outside access to their environment must conform to the principles of least privilege, and there must be controls in place to ensure that the credentials used by the SaaS vendor cannot be used by any other third party. Which of the following would meet all of these conditions?
Please select:

  • A. From the AWS Management Console, navigate to the Security Credentials page and retrieve the access and secret key for your account.
  • B. Create an IAM user within the enterprise account, assign a user policy to the IAM user that allows only the actions required by the SaaS application, then create a new access and secret key for the user and provide these credentials to the SaaS provider.
  • C. Create an IAM role for cross-account access that allows the SaaS provider's account to assume the role, and assign it a policy that allows only the actions required by the SaaS application.
  • D. Create an IAM role for EC2 instances, assign it a policy that allows only the actions required for the SaaS application to work, and provide the role ARN to the SaaS provider to use when launching their application instances.

Answer: C

Explanation:
The below diagram from an AWS blog shows how access is given to other accounts for the services in your own account
SCS-C01 dumps exhibit
Options A and B are invalid because you should not use IAM users or IAM access keys. Option D is invalid because you need to create a role for cross-account access.
For more information on allowing access to external accounts, please visit the below URL:
https://aws.amazon.com/blogs/apn/how-to-best-architect-your-aws-marketplace-saas-subscription-across-multip
The correct answer is: Create an IAM role for cross-account access that allows the SaaS provider's account to assume the role and assign it a policy that allows only the actions required by the SaaS application.
Submit your Feedback/Queries to our Experts
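As a hedged sketch of the correct option (all names, the account ID and the external ID are placeholders), such a cross-account role would typically also carry an external ID condition so the credentials cannot be reused by another third party:

```python
import json
import boto3

iam = boto3.client("iam")

SAAS_ACCOUNT_ID = "333333333333"      # placeholder SaaS provider account
EXTERNAL_ID = "example-external-id"   # value agreed with the SaaS provider
ROLE_NAME = "saas-discovery-role"     # placeholder role name

# Trust policy: only the SaaS provider's account, presenting the agreed
# external ID, may assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{SAAS_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
        "Condition": {"StringEquals": {"sts:ExternalId": EXTERNAL_ID}},
    }],
}
iam.create_role(RoleName=ROLE_NAME, AssumeRolePolicyDocument=json.dumps(trust_policy))

# Least privilege: only the EC2 discovery calls the SaaS application needs.
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="ec2-discovery-read-only",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Allow", "Action": "ec2:Describe*", "Resource": "*"}],
    }),
)
```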

NEW QUESTION 22
A Security Engineer discovered a vulnerability in an application running on Amazon ECS. The vulnerability allowed attackers to install malicious code. Analysis of the code shows it exfiltrates data on port 5353 in batches at random time intervals.
While the code of the containers is being patched, how can Engineers quickly identify all compromised hosts and stop the egress of data on port 5353?

  • A. Enable AWS Shield Advanced and AWS WAF. Configure an AWS WAF custom filter for egress traffic on port 5353.
  • B. Enable Amazon Inspector on Amazon ECS and configure a custom assessment to evaluate containers that have port 5353 open. Update the NACLs to block port 5353 outbound.
  • C. Create an Amazon CloudWatch custom metric on the VPC Flow Logs identifying egress traffic on port 5353. Update the NACLs to block port 5353 outbound.
  • D. Use Amazon Athena to query AWS CloudTrail logs in Amazon S3 and look for any traffic on port 5353. Update the security groups to block port 5353 outbound.

Answer: C
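The CloudWatch metric-filter side of the answer is not shown here; as a rough boto3 illustration of the NACL side (the NACL ID, rule number and protocol are assumptions), the egress block on port 5353 could look like:

```python
import boto3

ec2 = boto3.client("ec2")

# Deny outbound TCP on port 5353 at the subnet level while the containers
# are patched; repeat with Protocol="17" if the exfiltration uses UDP.
ec2.create_network_acl_entry(
    NetworkAclId="acl-0123456789abcdef0",   # placeholder NACL ID
    RuleNumber=90,                          # must evaluate before any allow rules
    Protocol="6",                           # TCP
    RuleAction="deny",
    Egress=True,
    CidrBlock="0.0.0.0/0",
    PortRange={"From": 5353, "To": 5353},
)
```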

NEW QUESTION 23
A Security Engineer must enforce the use of only Amazon EC2, Amazon S3, Amazon RDS, Amazon DynamoDB, and AWS STS in specific accounts.
What is a scalable and efficient approach to meet this requirement?
SCS-C01 dumps exhibit
SCS-C01 dumps exhibit
SCS-C01 dumps exhibit
SCS-C01 dumps exhibit

  • A. Option A
  • B. Option B
  • C. Option C
  • D. Option D

Answer: A

NEW QUESTION 24
You need to have a cloud security device which would allow you to generate encryption keys based on FIPS 140-2 Level 3. Which of the following can be used for this purpose?
Please select:

  • A. AWS KMS
  • B. AWS Customer Keys
  • C. AWS managed keys
  • D. AWS Cloud HSM

Answer: AD

Explanation:
AWS Key Management Service (KMS) now uses FIPS 140-2 validated hardware security modules (HSM) and supports FIPS 140-2 validated endpoints, which provide independent assurances about the confidentiality and integrity of your keys.
All master keys in AWS KMS, regardless of their creation date or origin, are automatically protected using FIPS 140-2 validated HSMs. FIPS 140-2 defines four levels of security, simply named "Level 1" to "Level 4". It does not specify in detail what level of security is required by any particular application.
• FIPS 140-2 Level 1, the lowest, imposes very limited requirements; loosely, all components must be "production-grade" and various egregious kinds of insecurity must be absent.
• FIPS 140-2 Level 2 adds requirements for physical tamper-evidence and role-based authentication.
• FIPS 140-2 Level 3 adds requirements for physical tamper-resistance (making it difficult for attackers to gain access to sensitive information contained in the module) and identity-based authentication, and for a physical or logical separation between the interfaces by which "critical security parameters" enter and leave the module, and its other interfaces.
• FIPS 140-2 Level 4 makes the physical security requirements more stringent and requires robustness against environmental attacks.
AWS CloudHSM provides you with a FIPS 140-2 Level 3 validated single-tenant HSM cluster in your Amazon Virtual Private Cloud (VPC) to store and use your keys. You have exclusive control over how your keys are used via an authentication mechanism independent from AWS. You interact with keys in your AWS CloudHSM cluster similar to the way you interact with your applications running in Amazon EC2.
AWS KMS allows you to create and control the encryption keys used by your applications and supported AWS services in multiple regions around the world from a single console. The service uses a FIPS 140-2 validated HSM to protect the security of your keys. Centralized management of all your keys in AWS KMS lets you enforce who can use your keys under which conditions, when they get rotated, and who can manage them.
AWS KMS HSMs are validated at level 2 overall and at level 3 in the following areas:
• Cryptographic Module Specification
• Roles, Services, and Authentication
• Physical Security
• Design Assurance
So there can be two answers for this question: both A and D.
• https://aws.amazon.com/blogs/security/aws-key-management-service-now-offers-fips-140-2-validated-cryptographic-modules-enabling-easier-adoption-of-the-service-for-regulated-workloads/
• https://aws.amazon.com/cloudhsm/faqs/
• https://aws.amazon.com/kms/faqs/
• https://en.wikipedia.org/wiki/FIPS_140-2
The AWS Documentation mentions the following
AWS CloudHSM is a cloud-based hardware security module (HSM) that enables you to easily generate and use your own encryption keys on the AWS Cloud. With CloudHSM, you can manage your own encryption keys using FIPS 140-2 Level 3 validated HSMs. CloudHSM offers you the flexibility to integrate with your applications using industry-standard APIs, such as PKCS#11, Java Cryptography Extensions (JCE), and Microsoft CryptoNG (CNG) libraries. CloudHSM is also standards-compliant and enables you to export all of your keys to most other commercially-available HSMs. It is a fully-managed service that automates
time-consuming administrative tasks for you, such as hardware provisioning, software patching,
high-availability, and backups. CloudHSM also enables you to scale quickly by adding and removing HSM capacity on-demand, with no up-front costs.
All other options are invalid since AWS Cloud HSM is the prime service that offers FIPS 140-2 Level 3 compliance
For more information on CloudHSM, please visit the following URL: https://aws.amazon.com/cloudhsm/
The correct answers are: AWS KMS, AWS Cloud HSM. Submit your Feedback/Queries to our Experts

NEW QUESTION 25
A Software Engineer wrote a customized reporting service that will run on a fleet of Amazon EC2 instances. The company security policy states that application logs for the reporting service must be centrally collected.
What is the MOST efficient way to meet these requirements?

  • A. Write an AWS Lambda function that logs into the EC2 instance to pull the application logs from the EC2 instance and persists them into an Amazon S3 bucket.
  • B. Enable AWS CloudTrail logging for the AWS account, create a new Amazon S3 bucket, and then configure Amazon CloudWatch Logs to receive the application logs from CloudTrail.
  • C. Create a simple cron job on the EC2 instances that synchronizes the application logs to an Amazon S3 bucket by using rsync.
  • D. Install the Amazon CloudWatch Logs Agent on the EC2 instances, and configure it to send the application logs to CloudWatch Logs.

Answer: D

NEW QUESTION 26
Your company has a requirement to be notified of all root user activity. How can this best be achieved? Choose 2 answers from the options given below. Each answer forms part of the solution.
Please select:

  • A. Create a Cloudwatch Events Rule
  • B. Create a Cloudwatch Logs Rule
  • C. Use a Lambda function
  • D. Use Cloudtrail API call

Answer: AC

Explanation:
Below is a snippet from the AWS blogs on a solution
SCS-C01 dumps exhibit
Option B is invalid because you need to create a Cloudwatch Events Rule and there is no such thing as a Cloudwatch Logs Rule. Option D is invalid because CloudTrail API calls can be recorded but cannot be used to send notifications. For more information, please visit the following URL:
https://aws.amazon.com/blogs/mt/monitor-and-notify-on-aws-account-root-user-activity/
The correct answers are: Create a Cloudwatch Events Rule, Use a Lambda function. Submit your Feedback/Queries to our Experts
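For illustration (the blog snippet/exhibit is not reproduced), a Cloudwatch Events rule that matches root user activity recorded by CloudTrail and targets a notification Lambda might be set up roughly as follows; the rule name, region, account ID and function ARN are placeholders.

```python
import json
import boto3

events = boto3.client("events")

# Rule that fires on API calls and console sign-ins made by the root user.
events.put_rule(
    Name="root-user-activity",
    EventPattern=json.dumps({
        "detail-type": ["AWS API Call via CloudTrail", "AWS Console Sign In via CloudTrail"],
        "detail": {"userIdentity": {"type": ["Root"]}},
    }),
)

# Target a Lambda function that formats and sends the notification.
events.put_targets(
    Rule="root-user-activity",
    Targets=[{
        "Id": "notify-lambda",
        "Arn": "arn:aws:lambda:us-east-1:111111111111:function:notify-root-activity",
    }],
)
```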

NEW QUESTION 27
In your LAMP application, you have some developers that say they would like access to your logs. However, since you are using an AWS Auto Scaling group, your instances are constantly being re-created. What would you do to make sure that these developers can access these log files? Choose the correct answer from the options below
Please select:

  • A. Give only the necessary access to the Apache servers so that the developers can gain access to the log files.
  • B. Give root access to your Apache servers to the developers.
  • C. Give read-only access to your developers to the Apache servers.
  • D. Set up a central logging server that you can use to archive your logs; archive these logs to an S3 bucket for developer-access.

Answer: D

Explanation:
One important security aspect is to never give access to actual servers; hence Options A, B and C are just totally wrong from a security perspective.
The best option is to have a central logging server that can be used to archive logs. These logs can then be stored in S3.
Options A, B and C are all invalid because you should not give the developers access to the Apache servers. For more information on S3, please refer to the below link:
https://aws.amazon.com/documentation/s3/
The correct answer is: Set up a central logging server that you can use to archive your logs; archive these logs to an S3 bucket for developer-access.
Submit your Feedback/Queries to our Experts

NEW QUESTION 28
A company has contracted with a third party to audit several AWS accounts. To enable the audit,
cross-account IAM roles have been created in each account targeted for audit. The Auditor is having trouble accessing some of the accounts.
Which of the following may be causing this problem? (Choose three.)

  • A. The external ID used by the Auditor is missing or incorrect.
  • B. The Auditor is using the incorrect password.
  • C. The Auditor has not been granted sts:AssumeRole for the role in the destination account.
  • D. The Amazon EC2 role used by the Auditor must be set to the destination account role.
  • E. The secret key used by the Auditor is missing or incorrect.
  • F. The role ARN used by the Auditor is missing or incorrect.

Answer: ACF

NEW QUESTION 29
A security alert has been raised for an Amazon EC2 instance in a customer account that is exhibiting strange behavior. The Security Engineer must first isolate the EC2 instance and then use tools for further investigation.
What should the Security Engineer use to isolate and research this event? (Choose three.)

  • A. AWS CloudTrail
  • B. Amazon Athena
  • C. AWS Key Management Service (AWS KMS)
  • D. VPC Flow Logs
  • E. AWS Firewall Manager
  • F. Security groups

Answer: ADF

NEW QUESTION 30
An organization is using AWS CloudTrail, Amazon CloudWatch Logs, and Amazon CloudWatch to send alerts when new access keys are created. However, the alerts are no longer appearing in the Security Operations mailbox.
Which of the following actions would resolve this issue?

  • A. In CloudTrail, verify that the trail logging bucket has a log prefix configured.
  • B. In Amazon SNS, determine whether the “Account spend limit” has been reached for this alert.
  • C. In SNS, ensure that the subscription used by these alerts has not been deleted.
  • D. In CloudWatch, verify that the alarm threshold “consecutive periods” value is equal to, or greater than 1.

Answer: C

NEW QUESTION 31
......

100% Valid and Newest Version SCS-C01 Questions & Answers shared by Surepassexam, Get Full Dumps HERE: https://www.surepassexam.com/SCS-C01-exam-dumps.html (New 330 Q&As)