
Installing Spark


Workshop

The workshop contains quiz questions and exercises to help you solidify your understanding of the material covered. Try to answer all questions before looking at the “Answers” section that follows.

Quiz

  1. True or false: A Spark Standalone cluster consists of a single node.

  2. Which component is not a prerequisite for installing Spark?

    A. Scala

    B. Python

    C. Java

  3. Which of the following subdirectories of the Spark installation contains scripts to start and stop master and slave node Spark services?

    A. bin

    B. sbin

    C. lib

  4. Which of the following environment variables is required to run Spark on Hadoop/YARN?

    A. HADOOP_CONF_DIR

    B. YARN_CONF_DIR

    C. Either HADOOP_CONF_DIR or YARN_CONF_DIR will work.

Answers

  1. False. Standalone refers to Spark's built-in independent process scheduler, which can be deployed on a cluster of one or more nodes.

  2. A. The Scala assembly is included with Spark; however, Java and Python must already be installed on the system before you install Spark.

  3. B. sbin contains administrative scripts to start and stop Spark services.
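To make answer 3 concrete, here is a sketch of how the sbin scripts are typically used to bring a Standalone master and slave up and down on a single host. The install path /opt/spark and the master URL spark://localhost:7077 are assumptions for illustration, not values from this chapter; substitute your own.

```shell
# Assumed install location; adjust to where you extracted Spark.
export SPARK_HOME=/opt/spark

# Start the Standalone master, then a slave (worker) that registers with it:
$SPARK_HOME/sbin/start-master.sh
$SPARK_HOME/sbin/start-slave.sh spark://localhost:7077

# Stop the services again:
$SPARK_HOME/sbin/stop-slave.sh
$SPARK_HOME/sbin/stop-master.sh
```

By contrast, the bin subdirectory holds the user-facing programs (spark-shell, spark-submit, pyspark), which is why it is not the correct answer here.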

  4. C. Either the HADOOP_CONF_DIR or YARN_CONF_DIR environment variable must be set for Spark to use YARN.
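Answer 4 in practice: before launching Spark against YARN, point it at the directory containing the Hadoop client configuration files. The path /etc/hadoop/conf below is a common default, not a value from this chapter, and only one of the two exports is needed:

```shell
# Directory holding core-site.xml, yarn-site.xml, etc.
export HADOOP_CONF_DIR=/etc/hadoop/conf
# export YARN_CONF_DIR=/etc/hadoop/conf   # equivalent alternative

# With the variable set, Spark applications can target YARN:
$SPARK_HOME/bin/spark-shell --master yarn
```

If neither variable is set, Spark cannot locate the YARN ResourceManager and launching with --master yarn fails.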
