Add a way to limit the default number of cores used by applications in standalone mode.
Also document the spark.deploy.spreadOut option.
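Based on the commit title, here is a minimal sketch of how an application might cap the cores it takes from a standalone cluster via spark.cores.max (one of the settings touched by this change); the master URL, application name, and the value 4 are illustrative assumptions, and the cluster-wide default introduced here would only apply when an application sets no cap of its own.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CoreCapExample {
  def main(args: Array[String]): Unit = {
    // Without spark.cores.max (and without a cluster-wide default on the
    // master), a standalone-mode application takes all available cores.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077")  // placeholder master URL
      .setAppName("CoreCapExample")
      .set("spark.cores.max", "4")            // cap this application at 4 cores

    val sc = new SparkContext(conf)
    try {
      println(sc.parallelize(1 to 100).sum())
    } finally {
      sc.stop()
    }
  }
}
```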
Showing 8 changed files with 60 additions and 14 deletions
- core/src/main/scala/org/apache/spark/SparkConf.scala (6 additions, 1 deletion)
- core/src/main/scala/org/apache/spark/SparkContext.scala (1 addition, 1 deletion)
- core/src/main/scala/org/apache/spark/deploy/master/ApplicationInfo.scala (5 additions, 2 deletions)
- core/src/main/scala/org/apache/spark/deploy/master/Master.scala (6 additions, 2 deletions)
- docs/configuration.md (29 additions, 4 deletions)
- docs/css/bootstrap.min.css (1 addition, 1 deletion)
- docs/job-scheduling.md (2 additions, 3 deletions)
- docs/spark-standalone.md (10 additions, 0 deletions)
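Since docs/configuration.md and docs/spark-standalone.md are updated to document spark.deploy.spreadOut, the sketch below illustrates the two allocation strategies that option chooses between when the master hands an application's cores to workers. This is a conceptual illustration only, not the actual Master.scala code; the function name, signature, and worker counts are assumptions.

```scala
object SpreadOutSketch {
  // Conceptual illustration of how a standalone master could assign an
  // application's cores to workers under the two spark.deploy.spreadOut modes.
  def assignCores(coresWanted: Int, freeCores: Array[Int], spreadOut: Boolean): Array[Int] = {
    val assigned = Array.fill(freeCores.length)(0)
    var remaining = coresWanted
    if (spreadOut) {
      // spreadOut = true: hand out one core at a time, round-robin, so the
      // application is spread across as many workers as possible.
      var i = 0
      while (remaining > 0 && freeCores.exists(_ > 0)) {
        if (freeCores(i) > 0) {
          freeCores(i) -= 1
          assigned(i) += 1
          remaining -= 1
        }
        i = (i + 1) % freeCores.length
      }
    } else {
      // spreadOut = false: pack the application onto as few workers as possible.
      for (i <- freeCores.indices if remaining > 0) {
        val take = math.min(freeCores(i), remaining)
        freeCores(i) -= take
        assigned(i) += take
        remaining -= take
      }
    }
    assigned
  }

  def main(args: Array[String]): Unit = {
    // Three workers with 4 free cores each, application asks for 6 cores.
    println(assignCores(6, Array(4, 4, 4), spreadOut = true).mkString(","))   // 2,2,2
    println(assignCores(6, Array(4, 4, 4), spreadOut = false).mkString(","))  // 4,2,0
  }
}
```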