2017 February Amazon Official New Released AWS-SysOps Dumps in Lead2pass.com!
100% Free Download! 100% Pass Guaranteed!
As a professional IT exam study material provider, Lead2pass gives you more than just AWS-SysOps exam questions and answers. We provide our customers with the most accurate study material about the AWS-SysOps exam and a pass guarantee. We assist you in preparing for the AWS-SysOps certification, which is regarded as valuable in the IT sector.
Following questions and answers are all new published by Amazon Official Exam Center: http://www.lead2pass.com/aws-sysops.html
QUESTION 21
Which of the following requires a custom CloudWatch metric to monitor?
A. Data transfer of an EC2 instance
B. Disk usage activity of an EC2 instance
C. Memory Utilization of an EC2 instance
D. CPU Utilization of an EC2 instance
Answer: C
Explanation:
http://docs.aws.amazon.com/AmazonCloudWatch/latest/DeveloperGuide/ec2-metricscollected.html
CPU utilization, disk I/O, and data transfer are default EC2 metrics; memory utilization is not collected by default, so it requires a custom metric.
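As a rough illustration of answer C, memory utilization has to be pushed to CloudWatch by the instance itself. The following is a minimal sketch using the AWS SDK for Python (boto3); the namespace, instance ID, and value are hypothetical, and configured credentials/region are assumed.

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Publish a memory reading measured by an agent or cron job on the instance.
cloudwatch.put_metric_data(
    Namespace="Custom/EC2",                                                       # hypothetical namespace
    MetricData=[{
        "MetricName": "MemoryUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],   # hypothetical instance
        "Value": 63.5,
        "Unit": "Percent",
    }],
)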
QUESTION 22
Which two AWS services provide out-of-the-box, user-configurable automatic backup-as-a-service and backup rotation options? (Choose two.)
A. Amazon S3
B. Amazon RDS
C. Amazon EBS
D. Amazon Redshift
Answer: BD
Explanation:
By default and at no additional charge, Amazon RDS enables automated backups of your DB Instance with a 1 day retention period.
By default, Amazon Redshift enables automated backups of your data warehouse cluster with a 1-day retention period.
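For reference, both services expose the retention (rotation) window through a single API parameter. A minimal boto3 sketch, using hypothetical identifiers:

import boto3

# Extend automated backup retention on an RDS instance (identifier is hypothetical).
boto3.client("rds").modify_db_instance(
    DBInstanceIdentifier="mydbinstance",
    BackupRetentionPeriod=7,          # days of automated backups to keep
    ApplyImmediately=True,
)

# Same idea for a Redshift cluster (identifier is hypothetical).
boto3.client("redshift").modify_cluster(
    ClusterIdentifier="mycluster",
    AutomatedSnapshotRetentionPeriod=7,
)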
QUESTION 23
Which of the following statements about this S3 bucket policy is true?
A. Denies the server with the IP address 192.168.100.0 full access to the “mybucket” bucket
B. Denies the server with the IP address 192.168.100.188 full access to the “mybucket” bucket
C. Grants all the servers within the 192.168.100.0/24 subnet full access to the “mybucket” bucket
D. Grants all the servers within the 192.168.100.188/32 subnet full access to the “mybucket” bucket
Answer: B
Explanation:
http://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements.html
http://docs.aws.amazon.com/AmazonS3/latest/dev/amazon-s3-policy-keys.html
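The policy the question refers to is not reproduced here. A policy matching answer B would deny all S3 actions on the bucket when the request comes from that single host; the boto3 sketch below is a hypothetical reconstruction of what such a policy could look like.

import json
import boto3

# Hypothetical policy that denies one host (192.168.100.188) full access to "mybucket".
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenySingleHost",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": ["arn:aws:s3:::mybucket", "arn:aws:s3:::mybucket/*"],
        "Condition": {"IpAddress": {"aws:SourceIp": "192.168.100.188/32"}},
    }],
}

boto3.client("s3").put_bucket_policy(Bucket="mybucket", Policy=json.dumps(policy))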
QUESTION 24
You are tasked with the migration of a highly trafficked Node.js application to AWS.
In order to comply with organizational standards, Chef recipes must be used to configure the application servers that host this application and to support application lifecycle events.
Which deployment option meets these requirements while minimizing administrative burden?
A. Create a new stack within OpsWorks, add the appropriate layers to the stack and deploy the application
B. Create a new application within Elastic Beanstalk and deploy this application to a new environment
C. Launch a Node.js server from a community AMI and manually deploy the application to the launched EC2 instance
D. Launch and configure Chef Server on an EC2 instance and leverage the AWS CLI to launch application servers and configure those instances using Chef.
Answer: A
Explanation:
AWS OpsWorks has integrated support for Chef recipes and application lifecycle events, which makes it the option with the least administrative burden here.
http://docs.aws.amazon.com/opsworks/latest/userguide/workingcookbook.html
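A rough sketch of answer A with boto3: create an OpsWorks stack that uses Chef, point it at custom cookbooks, and map Chef recipes to lifecycle events on a layer. The role ARNs, cookbook repository, and recipe names are hypothetical.

import boto3

opsworks = boto3.client("opsworks", region_name="us-east-1")

stack = opsworks.create_stack(
    Name="nodejs-app-stack",
    Region="us-east-1",
    ServiceRoleArn="arn:aws:iam::123456789012:role/aws-opsworks-service-role",                        # hypothetical
    DefaultInstanceProfileArn="arn:aws:iam::123456789012:instance-profile/aws-opsworks-ec2-role",     # hypothetical
    ConfigurationManager={"Name": "Chef", "Version": "12"},
    UseCustomCookbooks=True,
    CustomCookbooksSource={"Type": "git", "Url": "https://github.com/example/cookbooks.git"},         # hypothetical repo
)

# Run the organization's Chef recipes on the Setup and Deploy lifecycle events.
opsworks.create_layer(
    StackId=stack["StackId"],
    Type="custom",
    Name="Node.js App Server",
    Shortname="nodejsapp",
    CustomRecipes={"Setup": ["myapp::setup"], "Deploy": ["myapp::deploy"]},    # hypothetical recipes
)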
QUESTION 25
A user is planning to use AWS CloudFormation.
Which of the below mentioned functionalities does not help him to correctly understand CloudFormation?
A. CloudFormation follows the DevOps model for the creation of Dev & Test
B. AWS CloudFormation does not charge the user for its service but only charges for the AWS resources created with it
C. CloudFormation works with a wide variety of AWS services, such as EC2, EBS, VPC, IAM, S3, RDS, ELB, etc.
D. CloudFormation provides a set of application bootstrapping scripts which enables the user to install software
Answer: A
Explanation:
AWS CloudFormation is an application management tool which provides application modelling, deployment, configuration, management and related activities. It supports a wide variety of AWS services, such as EC2, EBS, Auto Scaling, ELB, RDS, VPC, etc. It also provides application bootstrapping scripts which enable the user to install software packages or create folders. It is free of cost and only charges the user for the AWS resources created with it. The statement that does not apply is option A: CloudFormation does not follow any particular model, such as DevOps; instead, customers define templates and use them to provision and manage AWS resources in an orderly way.
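To make the template idea concrete, here is a minimal sketch that defines a single EC2 instance in a template and creates a stack from it with boto3; the stack itself is free, and only the EC2 resource it creates is billed. The AMI ID and stack name are placeholders.

import json
import boto3

# Minimal CloudFormation template describing one EC2 instance.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "WebServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-12345678",      # placeholder AMI ID
                "InstanceType": "t2.micro",
            },
        }
    },
}

boto3.client("cloudformation", region_name="us-east-1").create_stack(
    StackName="demo-stack",
    TemplateBody=json.dumps(template),
)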
QUESTION 26
A user has created a subnet with VPC and launched an EC2 instance in that subnet with only default settings.
Which of the below mentioned options is ready to use on the EC2 instance as soon as it is launched?
A. Elastic IP
B. Private IP
C. Public IP
D. Internet gateway
Answer: B
Explanation:
A Virtual Private Cloud (VPC) is a virtual network dedicated to a user’s AWS account.
A subnet is a range of IP addresses in the VPC. The user can launch AWS resources into a subnet. There are two supported platforms into which a user can launch instances: EC2-Classic and EC2-VPC. When the user launches an instance in a non-default subnet with only default settings, it will only have a private IP assigned to it. Instances in the subnet can communicate with each other, but they cannot communicate over the internet or reach AWS services such as RDS or S3 until an Internet gateway and a public or Elastic IP are configured.
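One way to verify this behaviour is to describe the instance right after launch; only the private IP field is populated when no public or Elastic IP has been assigned. A small boto3 sketch with a hypothetical instance ID:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
instance = ec2.describe_instances(InstanceIds=["i-0123456789abcdef0"])["Reservations"][0]["Instances"][0]

print("Private IP:", instance.get("PrivateIpAddress"))   # always present for an instance in a VPC subnet
print("Public IP :", instance.get("PublicIpAddress"))    # None until a public or Elastic IP is assigned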
QUESTION 27
A user is accessing RDS from an application.
The user has enabled the Multi AZ feature with the MS SQL RDS DB.
During a planned outage, how will AWS ensure that a switch from the primary DB to a standby replica will not affect access to the application?
A. RDS will have an internal IP which will redirect all requests to the new DB
B. RDS uses DNS to switch over to the standby replica for a seamless transition
C. The switch over changes Hardware so RDS does not need to worry about access
D. RDS will have both the DBs running independently and the user has to manually switch over
Answer: B
Explanation:
In the event of a planned or unplanned outage of a DB instance, Amazon RDS automatically switches to a standby replica in another Availability Zone if the user has enabled Multi AZ.
The automatic failover mechanism simply changes the DNS record of the DB instance to point to the standby DB instance. As a result, the user will need to re-establish any existing connections to the DB instance. However, as the DNS endpoint stays the same, the application can access the DB seamlessly.
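The practical consequence is that the application should always connect using the RDS DNS endpoint rather than a resolved IP address, so a Multi-AZ failover is picked up automatically. A short boto3 sketch; the DB instance identifier is hypothetical.

import boto3

rds = boto3.client("rds", region_name="us-east-1")
db = rds.describe_db_instances(DBInstanceIdentifier="mydbinstance")["DBInstances"][0]

# Use this DNS name in the connection string; during failover RDS repoints it at the standby.
print(db["Endpoint"]["Address"], db["Endpoint"]["Port"])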
QUESTION 28
A user has created a queue named “myqueue” in US-East region with AWS SQS.
The user’s AWS account ID is 123456789012.
If the user wants to perform some action on this queue, which of the below mentioned queue URLs should he use?
A. http://sqs.us-east-1.amazonaws.com/123456789012/myqueue
B. http://sqs.amazonaws.com/123456789012/myqueue
C. http://sqs.123456789012.us-east-1.amazonaws.com/myqueue
D. http://123456789012.sqs.us-east-1.amazonaws.com/myqueue
Answer: A
Explanation:
http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-queue-message-identifiers.html
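Rather than building the URL by hand, the SDK can look it up; the account ID and queue name below are the ones from the question, and current endpoints are returned with an https scheme. A minimal boto3 sketch:

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
resp = sqs.get_queue_url(QueueName="myqueue", QueueOwnerAWSAccountId="123456789012")

# e.g. https://sqs.us-east-1.amazonaws.com/123456789012/myqueue
print(resp["QueueUrl"])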
QUESTION 29
A user has configured ELB with two EBS backed EC2 instances.
The user is trying to understand the DNS access and IP support for ELB.
Which of the below mentioned statements may not help the user understand the IP mechanism supported by ELB?
A. The client can connect over IPv4 or IPv6 using Dualstack
B. ELB DNS supports both IPv4 and IPv6
C. Communication between the load balancer and back-end instances is always through IPv4
D. The ELB supports either IPv4 or IPv6 but not both
Answer: D
Explanation:
Elastic Load Balancing supports both Internet Protocol version 6 (IPv6) and Internet Protocol version 4 (IPv4). Clients can connect to the user’s load balancer using either IPv4 or IPv6 (in EC2-Classic) DNS. However, communication between the load balancer and its back-end instances uses only IPv4. The user can use the Dualstack-prefixed DNS name to enable IPv6 support for communications between the client and the load balancers. Thus, the clients are able to access the load balancer using either IPv4 or IPv6 as their individual connectivity needs dictate.
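The dualstack behaviour can be observed by resolving the Dualstack-prefixed DNS name, which returns both A (IPv4) and AAAA (IPv6) records. A minimal sketch using only the Python standard library; the load balancer DNS name is hypothetical.

import socket

host = "dualstack.my-loadbalancer-1234567890.us-east-1.elb.amazonaws.com"  # hypothetical ELB DNS name

# Print every address the dualstack name resolves to, labelled by IP version.
for family, _, _, _, sockaddr in socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP):
    print("IPv6" if family == socket.AF_INET6 else "IPv4", sockaddr[0])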
QUESTION 30
A user has setup a web application on EC2.
The user is generating a log of the application performance at every second.
There are multiple entries for each second.
If the user wants to send that data to CloudWatch every minute, what should he do?
A. The user should send only the data of the 60th second, as CloudWatch will map the received data's time zone with the sent data's time zone
B. It is not possible to send the custom metric to CloudWatch every minute
C. Give CloudWatch the Min, Max, Sum, and SampleCount of the data points for every minute
D. Calculate the average of one minute and send the data to CloudWatch
Answer: C
Explanation:
Amazon CloudWatch aggregates statistics according to the period length that the user has specified while getting data from CloudWatch. The user can publish as many data points as he wants with the same or similar time stamps. CloudWatch aggregates them by the period length when the user requests statistics about those data points. CloudWatch records the average (the sum of all items divided by the number of items) of the values received for every 1-minute period, as well as the number of samples, maximum value, and minimum value for the same time period. CloudWatch will aggregate all the data which have time stamps within a one-minute period.
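Answer C corresponds to publishing a pre-aggregated statistic set once per minute instead of sixty raw data points. A boto3 sketch with illustrative numbers and a hypothetical namespace:

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# One call per minute summarising the 60 per-second samples collected locally.
cloudwatch.put_metric_data(
    Namespace="Custom/WebApp",                    # hypothetical namespace
    MetricData=[{
        "MetricName": "ResponseTime",
        "Unit": "Milliseconds",
        "StatisticValues": {
            "SampleCount": 60,
            "Sum": 7320.0,
            "Minimum": 85.0,
            "Maximum": 310.0,
        },
    }],
)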
QUESTION 31
A user is publishing custom metrics to CloudWatch.
Which of the below mentioned statements will help the user understand the functionality better?
A. The user can use the CloudWatch Import tool
B. The user should be able to see the data in the console after around 15 minutes
C. If the user is uploading the custom data, the user must supply the namespace, timezone, and metric name as part of the command
D. The user can view as well as upload data using the console, CLI and APIs
Answer: B
Explanation:
AWS CloudWatch supports custom metrics. The user can always capture custom data and upload it to CloudWatch using the CLI or APIs. The user always has to include the namespace as a part of the request; the other parameters are optional. If the user has uploaded data using the CLI, he can view it as a graph inside the console. The data takes around 2 minutes to upload but can be viewed only after around 15 minutes.
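Reading the data back outside the console goes through the GetMetricStatistics API; a boto3 sketch that aggregates a hypothetical custom metric into 1-minute averages:

from datetime import datetime, timedelta
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

stats = cloudwatch.get_metric_statistics(
    Namespace="Custom/WebApp",                    # hypothetical namespace
    MetricName="ResponseTime",
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=60,
    Statistics=["Average", "Maximum"],
)

for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Average"], point["Maximum"])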
QUESTION 32
An organization has created 50 IAM users.
The organization has introduced a new policy which will change the access of an IAM user.
How can the organization implement this effectively so that there is no need to apply the policy at the individual user level?
A. Use IAM groups, add users to different groups as per their role, and apply the policy to the group
B. The user can create a policy and apply it to multiple users in a single go with the AWS CLI
C. Add each user to the IAM role as per their organization role to achieve effective policy setup
D. Use the IAM role and implement access at the role level
Answer: A
Explanation:
With AWS IAM, a group is a collection of IAM users.
A group allows the user to specify permissions for a collection of users, which can make it easier to manage the permissions for those users.
A group helps an organization manage access in a better way; instead of applying at the individual level, the organization can apply at the group level which is applicable to all the users who are a part of that group.
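As a sketch of answer A with boto3: create the group once, attach the policy at the group level, and add users to it, so the policy change never has to be applied per user. The group name, policy ARN, and user names are hypothetical.

import boto3

iam = boto3.client("iam")

iam.create_group(GroupName="Developers")                        # hypothetical group

# Attach the new policy once, at the group level.
iam.attach_group_policy(
    GroupName="Developers",
    PolicyArn="arn:aws:iam::aws:policy/PowerUserAccess",        # illustrative managed policy
)

for user in ("alice", "bob"):                                   # hypothetical users
    iam.add_user_to_group(GroupName="Developers", UserName=user)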
QUESTION 33
A user has setup a billing alarm using CloudWatch for $200.
The usage of AWS exceeded $200 after some days.
The user wants to increase the limit from $200 to $400.
What should the user do?
A. Create a new alarm of $400 and link it with the first alarm
B. It is not possible to modify the alarm once it has crossed the usage limit
C. Update the alarm to set the limit at $400 instead of $200
D. Create a new alarm for the additional $200 amount
Answer: C
Explanation:
AWS CloudWatch supports enabling the billing alarm on the total AWS charges. The estimated charges are calculated and sent several times daily to CloudWatch in the form of metric data. This data will be stored for 14 days. This data also includes the estimated charges for every service in AWS used by the user, as well as the estimated overall AWS charges. If the user wants to increase the limit, the user can modify the alarm and specify a new threshold.
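Calling PutMetricAlarm again with the same alarm name overwrites the existing definition, which is how the threshold is raised from $200 to $400. A boto3 sketch; the alarm name and SNS topic ARN are hypothetical, and billing metrics live in us-east-1.

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")   # billing metrics are published to us-east-1

cloudwatch.put_metric_alarm(
    AlarmName="total-charges-alarm",              # reusing the existing alarm name updates it in place
    Namespace="AWS/Billing",
    MetricName="EstimatedCharges",
    Dimensions=[{"Name": "Currency", "Value": "USD"}],
    Statistic="Maximum",
    Period=21600,
    EvaluationPeriods=1,
    Threshold=400.0,                              # raised from 200 to 400
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],   # hypothetical SNS topic
)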
QUESTION 34
A user has launched an ELB which has 5 instances registered with it.
The user deletes the ELB by mistake.
What will happen to the instances?
A. ELB will ask the user whether to delete the instances or not
B. Instances will be terminated
C. ELB cannot be deleted if it has running instances registered with it
D. Instances will keep running
Answer: D
Explanation:
When the user deletes the Elastic Load Balancer, all the registered instances will be deregistered. However, they will continue to run. The user will incur charges if he does not take any action on those instances.
QUESTION 35
A user has setup connection draining with ELB to allow in-flight requests to continue while the instance is being deregistered through Auto Scaling.
If the user has not specified the draining time, how long will ELB allow in-flight request traffic to continue?
A. 600 seconds
B. 3600 seconds
C. 300 seconds
D. 0 seconds
Answer: C
Explanation:
The Elastic Load Balancer connection draining feature causes the load balancer to stop sending new requests to the back-end instances when the instances are deregistering or become unhealthy, while ensuring that in-flight requests continue to be served. The user can specify a maximum time (up to 3600 seconds) for the load balancer to keep the connections alive before reporting the instance as deregistered. If the user does not specify the maximum timeout period, by default, the load balancer will close the connections to the deregistering instance after 300 seconds.
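The effective timeout can be checked on the load balancer's attributes; when draining was enabled without an explicit value, the timeout reported is 300. A boto3 sketch for a classic ELB with a hypothetical name:

import boto3

elb = boto3.client("elb", region_name="us-east-1")

attrs = elb.describe_load_balancer_attributes(LoadBalancerName="my-load-balancer")   # hypothetical name
print(attrs["LoadBalancerAttributes"]["ConnectionDraining"])
# e.g. {'Enabled': True, 'Timeout': 300} -> 300 seconds is used when no timeout was specified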
QUESTION 36
A user wants to disable connection draining on an existing ELB.
Which of the below mentioned statements helps the user disable connection draining on the ELB?
A. The user can only disable connection draining from CLI
B. It is not possible to disable the connection draining feature once enabled
C. The user can disable the connection draining feature from EC2 -> ELB console or from CLI
D. The user needs to stop all instances before disabling connection draining
Answer: C
Explanation:
The Elastic Load Balancer connection draining feature causes the load balancer to stop sending new requests to the back-end instances when the instances are deregistering or become unhealthy, while ensuring that in-flight requests continue to be served. The user can enable or disable connection draining from the AWS EC2 console -> ELB or using the CLI.
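The CLI/API equivalent of the console action in answer C is a single ModifyLoadBalancerAttributes call; a boto3 sketch with a hypothetical load balancer name:

import boto3

boto3.client("elb", region_name="us-east-1").modify_load_balancer_attributes(
    LoadBalancerName="my-load-balancer",                       # hypothetical name
    LoadBalancerAttributes={"ConnectionDraining": {"Enabled": False}},   # turn draining off
)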
QUESTION 37
A user has configured CloudWatch monitoring on an EBS backed EC2 instance.
If the user has not attached any additional device, which of the below mentioned metrics will always show a 0 value?
A. DiskReadBytes
B. NetworkIn
C. NetworkOut
D. CPUUtilization
Answer: A
Explanation:
CloudWatch is used to monitor AWS services as well as custom services. When the user is monitoring EC2 instances, it captures 7 instance-level metrics and 3 status-check metrics for the EC2 instance. Since this is an EBS-backed instance, it will not have ephemeral storage attached to it. Out of the 7 EC2 metrics, the 4 metrics DiskReadOps, DiskWriteOps, DiskReadBytes and DiskWriteBytes are disk-related data and are populated only when there is ephemeral storage attached to an instance. For an EBS-backed instance without any additional device, this data will always be 0.
QUESTION 38
A user has created a photo editing software and hosted it on EC2.
The software accepts requests from the user about the photo format and resolution and sends a message to S3 to enhance the picture accordingly.
Which of the below mentioned AWS services will help make the software scalable on the AWS infrastructure in this scenario?
A. AWS Glacier
B. AWS Elastic Transcoder
C. AWS Simple Notification Service
D. AWS Simple Queue Service
Answer: D
Explanation:
Amazon Simple Queue Service (SQS) is a fast, reliable, scalable, and fully managed message queuing service. SQS provides a simple and cost-effective way to decouple the components of an application. The user can configure SQS to decouple the calls between the EC2 application and S3. Thus, the application does not keep waiting for S3 to provide the data.
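On the producing side, the EC2 application would enqueue the requested format and resolution and return immediately, leaving a worker fleet to process the job. A boto3 sketch with a hypothetical queue and payload:

import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.get_queue_url(QueueName="photo-jobs")["QueueUrl"]        # hypothetical queue

# Enqueue the job instead of processing it synchronously.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({
        "object_key": "uploads/photo.jpg",      # hypothetical S3 object
        "format": "png",
        "resolution": "1920x1080",
    }),
)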
QUESTION 39
A user has developed an application which is required to send data to a NoSQL database. The user wants to decouple the data sending such that the application keeps processing and sending data without waiting for an acknowledgement from the DB.
Which of the below mentioned applications helps in this scenario?
A. AWS Simple Notification Service
B. AWS Simple Workflow
C. AWS Simple Queue Service
D. AWS Simple Query Service
Answer: C
Explanation:
Amazon Simple Queue Service (SQS) is a fast, reliable, scalable, and fully managed message queuing service. SQS provides a simple and cost-effective way to decouple the components of an application. In this case, the user can use AWS SQS to send messages which are received from the application and later written to the DB. The application can continue processing data without waiting for any acknowledgement from the DB. The user can use SQS to transmit any volume of data without losing messages or requiring other services to be always available.
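The consuming side of the same pattern polls the queue and writes to the database at its own pace; messages are deleted only after a successful write, so nothing is lost if the DB is slow or briefly unavailable. A boto3 sketch with a hypothetical queue name:

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.get_queue_url(QueueName="db-writes")["QueueUrl"]         # hypothetical queue

messages = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=20,              # long polling
).get("Messages", [])

for msg in messages:
    # ... write msg["Body"] to the NoSQL database here ...
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])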
QUESTION 40
A user has a refrigerator plant.
The user is measuring the temperature of the plant every 15 minutes.
If the user wants to send the data to CloudWatch to view the data visually, which of the below mentioned statements is true with respect to the information given above?
A. The user needs to use AWS CLI or API to upload the data
B. The user can use the AWS Import Export facility to import data to CloudWatch
C. The user will upload data from the AWS console
D. The user cannot upload data to CloudWatch since it is not an AWS service metric
Answer: A
Explanation:
AWS CloudWatch supports custom metrics. The user can always capture custom data and upload it to CloudWatch using the CLI or APIs. While sending the data, the user has to include the metric name, namespace, and value as part of the request; a timestamp for each data point is optional.
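Because the readings arrive only every 15 minutes, each data point can carry its own timestamp when it is uploaded. A short boto3 sketch; the namespace and values are hypothetical.

from datetime import datetime, timezone
import boto3

boto3.client("cloudwatch", region_name="us-east-1").put_metric_data(
    Namespace="Plant/Refrigeration",              # hypothetical namespace
    MetricData=[{
        "MetricName": "Temperature",
        "Timestamp": datetime.now(timezone.utc),  # timestamp of the 15-minute reading
        "Value": -18.4,
        "Unit": "None",
    }],
)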
Lead2pass.com has been the world leader in providing online training solutions for the AWS-SysOps certification. Our training materials have been rigorously tested by international experts.
AWS-SysOps new questions on Google Drive: https://drive.google.com/open?id=0B3Syig5i8gpDekE1aUpSVGNHbWM
2017 Amazon AWS-SysOps exam dumps (All 332 Q&As) from Lead2pass:
http://www.lead2pass.com/aws-sysops.html [100% Exam Pass Guaranteed]