---
layout: global
title: Spark Standalone Mode
---
{% comment %}
TODO(andyk):
  - Add a table of contents
  - Move configuration towards the end so that it doesn't come first
  - Say the scripts will guess the resource amounts (i.e. # cores) automatically
{% endcomment %}
In addition to running on top of Mesos, Spark also supports a standalone mode, consisting of one Spark master and several Spark worker processes. You can run the Spark standalone mode either locally (for testing) or on a cluster. If you wish to run on a cluster, we have provided a set of deploy scripts to launch a whole cluster.
# Getting Started
Compile Spark with `sbt package`, as described in the Getting Started Guide. You do not need to install Mesos on your machine if you are using the standalone mode.
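For example (a minimal sketch, assuming you run it from the top-level Spark directory and have `sbt` on your `PATH`):

    # Build Spark and its dependencies
    sbt package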
# Starting a Cluster Manually
You can start a standalone master server by executing:
    ./run spark.deploy.master.Master
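Similarly, you can start one or more workers and connect them to the master (a sketch, assuming the worker class mirrors the master's naming; the `spark://IP:PORT` argument is the URL the master prints when it starts, as described below):

    # Start a worker and register it with the master at the given URL
    ./run spark.deploy.worker.Worker spark://IP:PORT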
Once started, the master will print out a `spark://IP:PORT` URL for itself, which you can use to connect workers to it, or pass as the "master" argument to `SparkContext` to connect a job to the cluster. You can also find this URL on the master's web UI, which is [http://localhost:8080](http://localhost:8080) by default.
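For example, a job can connect to the cluster by passing this URL to `SparkContext` (a minimal Scala sketch; the URL and job name here are placeholders, and the constructor may take additional arguments such as your Spark home directory and jars):

    import spark.SparkContext

    // Connect to a standalone cluster; replace the URL with the one
    // printed by your master (or shown on its web UI)
    val sc = new SparkContext("spark://IP:PORT", "MyJob")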