Commit aeb2ecc0 authored by Xianyang Liu's avatar Xianyang Liu Committed by Marcelo Vanzin

[SPARK-20621][DEPLOY] Delete deprecated config parameter in 'spark-env.sh'

## What changes were proposed in this pull request?

Currently, the `SPARK_EXECUTOR_INSTANCES` environment variable is deprecated in `spark-env.sh`, because we recommend setting `spark.executor.instances` in `spark-defaults.conf` or another config file instead. Moreover, this parameter has no effect even if you set it in `spark-env.sh`, so this patch removes it.
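
For reference, a minimal sketch of the recommended ways to set the executor count going forward (the app name and the value `4` here are illustrative, not from this patch):

```scala
import org.apache.spark.SparkConf

// Set the executor count through SparkConf rather than spark-env.sh.
// This is equivalent to spark.executor.instances=4 in spark-defaults.conf,
// or to passing --num-executors 4 to spark-submit on YARN.
val conf = new SparkConf()
  .setAppName("example-app") // illustrative name
  .set("spark.executor.instances", "4")
```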

## How was this patch tested?

Existing tests.


Author: Xianyang Liu <xianyang.liu@intel.com>

Closes #17881 from ConeyLiu/deprecatedParam.
parent 58518d07
@@ -34,7 +34,6 @@
 # Options read in YARN client mode
 # - HADOOP_CONF_DIR, to point Spark towards Hadoop configuration files
-# - SPARK_EXECUTOR_INSTANCES, Number of executors to start (Default: 2)
 # - SPARK_EXECUTOR_CORES, Number of cores for the executors (Default: 1).
 # - SPARK_EXECUTOR_MEMORY, Memory per Executor (e.g. 1000M, 2G) (Default: 1G)
 # - SPARK_DRIVER_MEMORY, Memory for Driver (e.g. 1000M, 2G) (Default: 1G)
@@ -280,10 +280,7 @@ object YarnSparkHadoopUtil {
       initialNumExecutors
     } else {
-      val targetNumExecutors =
-        sys.env.get("SPARK_EXECUTOR_INSTANCES").map(_.toInt).getOrElse(numExecutors)
-      // System property can override environment variable.
-      conf.get(EXECUTOR_INSTANCES).getOrElse(targetNumExecutors)
+      conf.get(EXECUTOR_INSTANCES).getOrElse(numExecutors)
     }
   }
 }
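
To make the simplified lookup concrete, here is a self-contained Scala sketch of the `Option`-based fallback the new line relies on. `initialTarget` and its arguments are stand-ins for Spark's internals (`conf.get(EXECUTOR_INSTANCES)` and the `numExecutors` default), not actual Spark API:

```scala
// Stand-in for conf.get(EXECUTOR_INSTANCES).getOrElse(numExecutors):
// a single optional config value with a default, and no environment
// variable consulted anywhere.
object ExecutorCountDemo {
  val DefaultNumExecutors = 2 // mirrors Spark's DEFAULT_NUMBER_EXECUTORS

  def initialTarget(
      executorInstances: Option[Int], // spark.executor.instances, if set
      numExecutors: Int = DefaultNumExecutors): Int =
    executorInstances.getOrElse(numExecutors)

  def main(args: Array[String]): Unit = {
    println(initialTarget(Some(4))) // config set to 4 -> 4
    println(initialTarget(None))    // config unset    -> default of 2
  }
}
```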