Commit 457dc9cc authored by jerryshao's avatar jerryshao Committed by Wenchen Fan

[MINOR][DOC] Improve the docs about how to correctly set configurations

## What changes were proposed in this pull request?

Spark provides several ways to set configurations: from a configuration file, from `spark-submit` command line options, or programmatically through the `SparkConf` class. It may confuse beginners why some configurations set through `SparkConf` do not take effect. This patch adds some docs to address the problem and let beginners know how to correctly set configurations.
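For reference, all three mechanisms can carry the same property; values set on `SparkConf` take the highest precedence, then `spark-submit` flags, then entries in `spark-defaults.conf`. A minimal Scala sketch (the property values are illustrative):

```scala
import org.apache.spark.SparkConf

// 1. conf/spark-defaults.conf (lowest precedence):
//      spark.task.maxFailures  4
//
// 2. spark-submit command line:
//      spark-submit --conf spark.task.maxFailures=4 ...
//
// 3. Programmatically; this takes precedence over both of the above:
val conf = new SparkConf().set("spark.task.maxFailures", "8")
```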

## How was this patch tested?

N/A

Author: jerryshao <sshao@hortonworks.com>

Closes #18552 from jerryshao/improve-doc.
parent 680b33f1
@@ -95,6 +95,13 @@ in the `spark-defaults.conf` file. A few configuration keys have been renamed since earlier
versions of Spark; in such cases, the older key names are still accepted, but take lower
precedence than any instance of the newer key.
Spark properties can mainly be divided into two kinds: one kind is related to deploy, like
"spark.driver.memory" and "spark.executor.instances"; these properties may not take effect when
set programmatically through `SparkConf` at runtime, or the behavior may depend on which
cluster manager and deploy mode you choose, so it is suggested to set them through the
configuration file or `spark-submit` command line options. The other kind is mainly related to
Spark runtime control, like "spark.task.maxFailures"; these properties can be set in either way.
## Viewing Spark Properties
The application web UI at `http://<driver>:4040` lists Spark properties in the "Environment" tab.
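The paragraph added by the diff above separates deploy-related from runtime-control properties. A minimal Scala sketch of that distinction, assuming client deploy mode (the app name and memory value are illustrative, not part of the patch):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("conf-example")

// Runtime-control property: safe to set programmatically.
conf.set("spark.task.maxFailures", "8")

// Deploy-related property: in client mode the driver JVM has already
// started by the time this runs, so the value may silently be ignored.
// Prefer spark-defaults.conf or `spark-submit --driver-memory 4g`.
conf.set("spark.driver.memory", "4g")

val sc = new SparkContext(conf)
```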