Amazon AWS Certified Big Data Specialty (BDS-C00) Practice Questions

Question: 75
An organization currently runs a large Hadoop environment in its data center and is in the process of creating an alternative Hadoop environment on AWS, using Amazon EMR. The organization generates around 20 TB of data on a monthly basis. Also on a monthly basis, files need to be grouped and copied to Amazon S3 to be used by the Amazon EMR environment. There are multiple S3 buckets across AWS accounts to which the data needs to be copied. There is a 10 Gbps AWS Direct Connect setup between the data center and AWS, and the network team has agreed to allocate 50% of the AWS Direct Connect bandwidth to the data transfer. The transfer cannot take more than two days. What would be the MOST efficient approach to transfer the data to AWS on a monthly basis?

A. Use an offline copy method, such as an AWS Snowball device, to copy and transfer the data to Amazon S3.
B. Configure multipart uploads to Amazon S3 using the AWS SDK for Java to transfer the data over AWS Direct Connect.
C. Use the Amazon S3 Transfer Acceleration capability to transfer the data to Amazon S3 over AWS Direct Connect.
D. Set up the S3DistCp tool on the on-premises Hadoop environment to transfer the data to Amazon S3 over AWS Direct Connect.

Answer: B
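As context for why an online transfer is feasible here: 50% of a 10 Gbps Direct Connect link is 5 Gbps, while moving 20 TB in two days requires only about 20 x 8 x 10^12 bits / 172,800 s, roughly 0.93 Gbps, well under the allocated bandwidth. Below is a minimal sketch of a multipart upload using the TransferManager from the AWS SDK for Java (v1), which splits a large file into parts and uploads them in parallel; the bucket name, key, and file path are hypothetical placeholders, not values from the question.

import java.io.File;

import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;

public class MultipartUploadExample {
    public static void main(String[] args) throws InterruptedException {
        // Hypothetical bucket, key, and local file for illustration.
        String bucketName = "example-emr-input-bucket";
        String keyName = "monthly/part-00000.gz";
        File file = new File("/data/exports/part-00000.gz");

        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion(Regions.US_EAST_1)
                .build();

        // TransferManager performs a multipart upload for files above the
        // threshold, uploading parts in parallel to use the available
        // Direct Connect bandwidth.
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(s3)
                .withMultipartUploadThreshold(100L * 1024 * 1024) // 100 MB
                .build();

        Upload upload = tm.upload(bucketName, keyName, file);
        upload.waitForCompletion(); // blocks until the transfer finishes
        tm.shutdownNow();
    }
}

Because each part is an independent PUT, a failed part can be retried on its own without restarting the whole transfer, which matters at the 20 TB scale in this question.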
Question: 76
An organization is developing a mobile social application and needs to collect logs from all devices on which it is installed. The organization is evaluating Amazon Kinesis Data Streams for pushing the logs and Amazon EMR for processing the data. They want to store the data on HDFS using the default replication factor to replicate data across the cluster, but they are concerned about the durability of the data. Currently, they are producing 300 GB of raw data daily, with additional spikes during special events. They will need to scale out the Amazon EMR cluster to match the increase in streamed data. Which solution prevents data loss and matches compute demand?
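As background for the ingestion side of this scenario, here is a minimal sketch of a device-side producer pushing one log record into Amazon Kinesis Data Streams with the AWS SDK for Java (v1); the stream name, partition key, and JSON payload are hypothetical placeholders. Partitioning by device ID spreads records across shards, and adding shards is how the stream absorbs the event-driven spikes the question describes.

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

import com.amazonaws.services.kinesis.AmazonKinesis;
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
import com.amazonaws.services.kinesis.model.PutRecordRequest;
import com.amazonaws.services.kinesis.model.PutRecordResult;

public class DeviceLogProducer {
    public static void main(String[] args) {
        AmazonKinesis kinesis = AmazonKinesisClientBuilder.standard().build();

        // Hypothetical log payload for illustration.
        String logLine = "{\"deviceId\":\"abc-123\",\"event\":\"app_open\"}";

        PutRecordRequest request = new PutRecordRequest()
                // Hypothetical stream name.
                .withStreamName("mobile-app-logs")
                // Records with the same partition key land on the same shard,
                // so keying by device ID distributes load across shards.
                .withPartitionKey("abc-123")
                .withData(ByteBuffer.wrap(logLine.getBytes(StandardCharsets.UTF_8)));

        PutRecordResult result = kinesis.putRecord(request);
        System.out.println("Stored in shard: " + result.getShardId());
    }
}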
