CCD-410 Cloudera Certified Developer for Apache Hadoop (CCDH)

Exam Code:  CCD-410
Exam Name: Cloudera Certified Developer for Apache Hadoop (CCDH)

 


QUESTION 1 When is the earliest point at which the reduce method of a given Reducer can be called?

A. As soon as at least one mapper has finished processing its input split.
B. As soon as a mapper has emitted at least one record.
C. Not until all mappers have finished processing all records.
D. It depends on the InputFormat used for the job.

Correct Answer: C
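
The framework cannot call reduce() until the shuffle has gathered every value for a key, which in turn requires all mappers to have finished; reducers may start copying map output earlier, but that copying is part of the shuffle, not reduce() itself. As a minimal sketch using the org.apache.hadoop.mapreduce API, just to show where reduce() fits:

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // Minimal Reducer sketch. The framework invokes reduce() once per key,
    // but only after every mapper has finished and all map output for that
    // key has been shuffled and merged -- hence answer C.
    public class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
      @Override
      protected void reduce(Text key, Iterable<IntWritable> values, Context context)
          throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable v : values) {
          sum += v.get();
        }
        context.write(key, new IntWritable(sum));
      }
    }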

 

QUESTION 2 Which describes how a client reads a file from HDFS?

A. The client queries the NameNode for the block location(s). The NameNode returns the block location(s) to the client. The client reads the data directly off the DataNode(s).

B. The client queries all DataNodes in parallel. The DataNode that contains the requested data responds directly to the client. The client reads the data directly off the DataNode.

C. The client contacts the NameNode for the block location(s). The NameNode then queries the DataNodes for block locations. The DataNodes respond to the NameNode, and the NameNode redirects the client to the DataNode that holds the requested data block(s). The client then reads the data directly off the DataNode.

D. The client contacts the NameNode for the block location(s). The NameNode contacts the DataNode that holds the requested data block. Data is transferred from the DataNode to the NameNode, and then from the NameNode to the client.

Correct Answer: A
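
From application code this exchange is hidden behind the FileSystem API: the client library asks the NameNode for block locations and then streams the bytes directly from the DataNodes, so the NameNode never sits in the data path. A minimal read sketch (the file path comes from the command line; cluster settings are assumed to be picked up from the usual *-site.xml files):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Sketch of an HDFS read. Under the hood, the client asks the NameNode
    // for block locations and then reads each block straight from a DataNode.
    public class HdfsRead {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();   // loads core-site.xml / hdfs-site.xml
        FileSystem fs = FileSystem.get(conf);
        try (FSDataInputStream in = fs.open(new Path(args[0]));
             BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
          String line;
          while ((line = reader.readLine()) != null) {
            System.out.println(line);
          }
        }
      }
    }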

 

QUESTION 3 You are developing a combiner that takes Text keys and IntWritable values as input and emits Text keys and IntWritable values. Which interface should your class implement?

A. Combiner <Text, IntWritable, Text, IntWritable>
B. Mapper <Text, IntWritable, Text, IntWritable>
C. Reducer <Text, Text, IntWritable, IntWritable>
D. Reducer <Text, IntWritable, Text, IntWritable>
E. Combiner <Text, Text, IntWritable, IntWritable>

Correct Answer: D
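
For context, here is a driver sketch using the newer org.apache.hadoop.mapreduce API, in which Reducer is a base class you extend rather than an interface, but the type parameters work the same way: a combiner's input and output types must both match the mapper's output types. SumReducer is assumed to be a Reducer<Text, IntWritable, Text, IntWritable> such as the one sketched under Question 1; the TokenMapper inner class below is illustrative only.

    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Word-count driver sketch. The combiner is registered with setCombinerClass()
    // and must be a Reducer whose input and output types both match the mapper's
    // output types: <Text, IntWritable> in, <Text, IntWritable> out.
    public class WordCountDriver {

      // Mapper emitting one <Text, IntWritable> pair per token.
      public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, ONE);
          }
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);  // Reducer<Text, IntWritable, Text, IntWritable>
        job.setReducerClass(SumReducer.class);   // the same class typically doubles as the reducer
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }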

QUESTION 4 Identify the utility that allows you to create and run MapReduce jobs with any executable or script as the mapper and/or the reducer.

A. Oozie
B. Sqoop
C. Flume
D. Hadoop Streaming
E. mapred

Correct Answer: D

Explanation/Reference: Hadoop streaming is a utility that comes with the Hadoop distribution. The utility allows you to create and run Map/Reduce jobs with any executable or script as the mapper and/or the reducer.

Reference: http://hadoop.apache.org/common/docs/r0.20.1/streaming.html (Hadoop Streaming, second sentence)
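
As an illustration only (the jar location varies by distribution and the HDFS paths below are placeholders), a streaming job can use ordinary shell commands as the mapper and reducer:

    hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -input /user/alice/input \
        -output /user/alice/output \
        -mapper /bin/cat \
        -reducer /usr/bin/wc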

QUESTION 5 How are keys and values presented and passed to the reducers during a standard sort and shuffle phase of MapReduce?

A. Keys are presented to a reducer in sorted order; values for a given key are not sorted.
B. Keys are presented to a reducer in sorted order; values for a given key are sorted in ascending order.
C. Keys are presented to a reducer in random order; values for a given key are not sorted.
D. Keys are presented to a reducer in random order; values for a given key are sorted in ascending order.

Correct Answer: A
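
Because the values attached to a key arrive in no guaranteed order, a job that needs them ordered must either buffer and sort them inside the reducer (workable only when the value set per key is small) or use a secondary-sort design. A small illustrative reducer, assuming the values for each key fit in memory:

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // The values handed to reduce() for a key carry no ordering guarantee.
    // If order matters and the value set per key is small, one option is to
    // buffer and sort in the reducer; large value sets call for a secondary sort.
    public class SortedValuesReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
      @Override
      protected void reduce(Text key, Iterable<IntWritable> values, Context context)
          throws IOException, InterruptedException {
        List<Integer> buffered = new ArrayList<>();
        for (IntWritable v : values) {
          buffered.add(v.get());   // copy the int: Hadoop reuses the IntWritable object
        }
        Collections.sort(buffered);
        for (int v : buffered) {
          context.write(key, new IntWritable(v));
        }
      }
    }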

