[SPARK-17577][SPARKR][CORE] SparkR support add files to Spark job and get by executors
## What changes were proposed in this pull request?

Scala and Python users can add files to a Spark job via the ```--files``` submit option or ```SparkContext.addFile()```, and retrieve an added file with ```SparkFiles.get(filename)```. We should support the same functionality for SparkR users, since they also need to ship shared dependency files. For example, SparkR users can first download third-party R packages to the driver, add those files to the Spark job as dependencies through this API, and then have each executor install the packages with ```install.packages```.

## How was this patch tested?

Added a unit test.

Author: Yanbo Liang <ybliang8@gmail.com>

Closes #15131 from yanboliang/spark-17577.
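A minimal SparkR sketch of the workflow described above. It assumes the API names introduced by this change (```spark.addFile``` and ```spark.getSparkFiles```, per the additions to ```R/pkg/NAMESPACE``` and ```R/pkg/R/context.R```); the file path and contents are illustrative, and running it requires a Spark installation.

```r
library(SparkR)
sparkR.session()

# Create a small file on the driver and add it to the Spark job;
# Spark ships a copy of the file to every executor.
path <- file.path(tempdir(), "pkgs.txt")
writeLines("ggplot2", path)
spark.addFile(path)

# On each executor, resolve the shipped copy by its file name and read it.
spark.lapply(seq_len(2), function(i) {
  localPath <- spark.getSparkFiles("pkgs.txt")
  readLines(localPath)
})

sparkR.stop()
```

In the motivating use case, the added file would be a downloaded R package archive, and the function passed to ```spark.lapply``` would call ```install.packages``` on the resolved path instead of reading it.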
- R/pkg/NAMESPACE (3 additions, 0 deletions)
- R/pkg/R/context.R (48 additions, 0 deletions)
- R/pkg/inst/tests/testthat/test_context.R (13 additions, 0 deletions)
- core/src/main/scala/org/apache/spark/SparkContext.scala (3 additions, 3 deletions)