SPARK-1619 Launch spark-shell with spark-submit (commit dc3b640a)
    This simplifies the shell a bunch and passes all arguments through to spark-submit.
    
There is a tiny incompatibility from 0.9.1: you can no longer pass `-c` as a shorthand for `--cores`; only `--cores` is accepted. However, spark-submit will give a good error message in this case; I don't think many people used this option, and it's a trivial change for users.
    
    Author: Patrick Wendell <pwendell@gmail.com>
    
    Closes #542 from pwendell/spark-shell and squashes the following commits:
    
    9eb3e6f [Patrick Wendell] Updating Spark docs
    b552459 [Patrick Wendell] Andrew's feedback
    97720fa [Patrick Wendell] Review feedback
    aa2900b [Patrick Wendell] SPARK-1619 Launch spark-shell with spark-submit
spark-standalone.md
---
layout: global
title: Spark Standalone Mode
---

* This will become a table of contents (this text will be scraped).
{:toc}

In addition to running on the Mesos or YARN cluster managers, Spark also provides a simple standalone deploy mode. You can launch a standalone cluster either manually, by starting a master and workers by hand, or by using our provided launch scripts. It is also possible to run these daemons on a single machine for testing.

# Installing Spark Standalone to a Cluster

To install Spark Standalone mode, you simply place a compiled version of Spark on each node of the cluster. You can obtain pre-built versions of Spark with each release, or build it yourself.

# Starting a Cluster Manually

You can start a standalone master server by executing:

```
./sbin/start-master.sh
```

Once started, the master will print out a spark://HOST:PORT URL for itself, which you can use to connect workers to it, or pass as the "master" argument to SparkContext. You can also find this URL on the master's web UI, which is http://localhost:8080 by default.
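For example, a driver program can attach to the standalone cluster by setting this URL as the master when constructing its SparkContext. A minimal Scala sketch, assuming the master printed `spark://host:7077` (the host, port, and application name below are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Point the driver at the standalone master; replace spark://host:7077
// with the spark://HOST:PORT URL your master actually printed.
val conf = new SparkConf()
  .setMaster("spark://host:7077")   // placeholder master URL
  .setAppName("StandaloneExample")  // hypothetical application name
val sc = new SparkContext(conf)
```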

Similarly, you can start one or more workers and connect them to the master via: