Commit 0588dc7c authored by Hossein, committed by Xiangrui Meng

[SPARK-20088] Do not create new SparkContext in SparkR createSparkContext

## What changes were proposed in this pull request?
Instead of constructing a new `SparkContext` directly, we use `SparkContext.getOrCreate`, which reuses an already-running context if one exists, and wrap the result in a `JavaSparkContext`.

## How was this patch tested?
Existing tests

Author: Hossein <hossein@databricks.com>

Closes #17423 from falaki/SPARK-20088.
parent 89049345
@@ -136,7 +136,7 @@ private[r] object RRDD {
         .mkString(File.separator))
     }
-    val jsc = new JavaSparkContext(sparkConf)
+    val jsc = new JavaSparkContext(SparkContext.getOrCreate(sparkConf))
     jars.foreach { jar =>
       jsc.addJar(jar)
     }
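For context, `SparkContext.getOrCreate(conf)` returns the active `SparkContext` if one is already running and only constructs a new one otherwise, so the changed line reuses any existing context instead of failing when a second one is created. A minimal sketch of the pattern (standalone, outside SparkR; the `local[*]` master and app name are illustrative, not from the patch):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.api.java.JavaSparkContext

object GetOrCreateSketch {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
      .setMaster("local[*]")        // illustrative master
      .setAppName("getOrCreate-sketch") // illustrative app name

    // Reuse an existing SparkContext if one is active; otherwise create one.
    val sc = SparkContext.getOrCreate(sparkConf)

    // Wrap it for the Java-side bridge instead of constructing a second context.
    val jsc = new JavaSparkContext(sc)

    // A repeated call returns the same underlying context instance.
    assert(SparkContext.getOrCreate(sparkConf) eq sc)

    jsc.stop()
  }
}
```

This mirrors the one-line change in `RRDD.scala`: the `JavaSparkContext` becomes a thin wrapper around whatever `SparkContext` is (or becomes) active, rather than a constructor call that can conflict with an existing context.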