Commit c708e817 authored by Patrick Wendell

Merge pull request #341 from ash211/patch-5

Clarify spark.cores.max in docs

It controls the total number of cores requested across the cluster, not on a per-machine basis.
parents 33fcb91e 2dd4fb56
@@ -81,7 +81,8 @@ there are at least five properties that you will commonly want to control:
   <td>
     When running on a <a href="spark-standalone.html">standalone deploy cluster</a> or a
     <a href="running-on-mesos.html#mesos-run-modes">Mesos cluster in "coarse-grained"
-    sharing mode</a>, how many CPU cores to request at most. The default will use all available cores
+    sharing mode</a>, the maximum amount of CPU cores to request for the application from
+    across the cluster (not from each machine). The default will use all available cores
     offered by the cluster manager.
   </td>
 </tr>
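For context, a minimal sketch of how this property is typically set programmatically. This is not part of the patch itself; it assumes a Spark version that provides SparkConf, and the master URL and application name below are hypothetical placeholders.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Cap the application's total core usage with spark.cores.max.
// The limit applies across the whole cluster, not per machine.
val conf = new SparkConf()
  .setMaster("spark://master:7077")  // hypothetical standalone master URL
  .setAppName("CoresMaxExample")     // hypothetical application name
  .set("spark.cores.max", "8")       // request at most 8 cores cluster-wide

val sc = new SparkContext(conf)
```

With this setting, an application on a standalone or coarse-grained Mesos cluster will be offered at most 8 cores in total, regardless of how those cores are distributed across worker machines; leaving the property unset falls back to all cores the cluster manager offers.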