Commit 94c6c06e authored by Xiangrui Meng, committed by Reynold Xin

[FIX] do not load defaults when testing SparkConf in pyspark

The default constructor loads default properties, which can fail the test.

Author: Xiangrui Meng <meng@databricks.com>

Closes #775 from mengxr/pyspark-conf-fix and squashes the following commits:

83ef6c4 [Xiangrui Meng] do not load defaults when testing SparkConf in pyspark
parent 65533c7e
@@ -33,7 +33,7 @@ u'My app'
 >>> sc.sparkHome == None
 True
->>> conf = SparkConf()
+>>> conf = SparkConf(loadDefaults=False)
 >>> conf.setSparkHome("/path")
 <pyspark.conf.SparkConf object at ...>
 >>> conf.get("spark.home")
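To illustrate why the doctest needed `loadDefaults=False`: PySpark's real `SparkConf()` pulls default `spark.*` properties from the JVM system properties, so ambient settings on the test machine can leak into the doctest's expected output. The following is a minimal self-contained sketch of that behavior, not pyspark's actual implementation; `SparkConfSketch` and `_SYSTEM_DEFAULTS` are hypothetical stand-ins.

```python
# Stand-in for JVM system properties that the real SparkConf would read.
_SYSTEM_DEFAULTS = {"spark.app.name": "ambient-app"}

class SparkConfSketch:
    """Hypothetical sketch of SparkConf's loadDefaults behavior."""

    def __init__(self, loadDefaults=True):
        # With loadDefaults=True, ambient properties leak into the conf;
        # with loadDefaults=False, the conf starts empty and deterministic,
        # which is what a doctest needs.
        self._props = dict(_SYSTEM_DEFAULTS) if loadDefaults else {}

    def set(self, key, value):
        self._props[key] = value
        return self  # chainable, like the real SparkConf setters

    def get(self, key, default=None):
        return self._props.get(key, default)

conf = SparkConfSketch(loadDefaults=False)
conf.set("spark.home", "/path")
print(conf.get("spark.app.name"))  # no ambient defaults loaded
print(conf.get("spark.home"))
```

With `loadDefaults=True`, `get("spark.app.name")` would instead return whatever the environment supplied, making test output machine-dependent.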