Commit 9cfa9a51 authored by Sun Rui, committed by Shivaram Venkataraman

[SPARK-6812] [SPARKR] filter() on DataFrame does not work as expected.

According to the R manual (https://stat.ethz.ch/R-manual/R-devel/library/base/html/Startup.html):
"if a function .First is found on the search path, it is executed as .First(). Finally, function .First.sys() in the base package is run. This calls require to attach the default packages specified by options("defaultPackages")."
In .First() in profile/shell.R we load the SparkR package, which means SparkR is attached before the default packages. If a default package exports a function with the same name, it masks the SparkR version; this is why filter() in SparkR is masked by filter() in stats, which is usually on the default package list.
We need to make sure SparkR is loaded after the default packages. The fix is to append SparkR to options("defaultPackages") instead of calling library(SparkR) in .First().
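
To make the masking concrete, here is a small standalone illustration (not part of the patch, no SparkR required): whatever is attached to the search path later sits closer to the global environment and shadows same-named objects attached earlier, which is what the default packages did to SparkR under the old .First().

# Two throwaway environments stand in for SparkR and a default package:
e1 <- new.env(); assign("filter", function(...) "attached first (plays SparkR)", envir = e1)
e2 <- new.env(); assign("filter", function(...) "attached second (plays stats)", envir = e2)
attach(e1, name = "demo:first")    # like library(SparkR) inside the old .First()
attach(e2, name = "demo:second")   # like a default package attached later by .First.sys()
filter()                           # "attached second (plays stats)" -- the later attach masks the earlier one
detach("demo:second"); detach("demo:first")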

BTW, I'd like to discuss our policy for resolving name conflicts. Previously, we renamed APIs taken from the Scala API when they conflicted with base or other commonly used packages. In the long term this hurts API stability, because we cannot predict name conflicts: what if a name added to the base package in the future conflicts with an existing SparkR API? The better policy is to keep API names the same as Scala's without worrying about name conflicts. Users should load SparkR as the last package so that all of its API names take effect, and they can explicitly use :: to refer to names masked by other packages. If we agree on this, I can submit a JIRA issue to change back some renamed API methods, for example DataFrame.sortDF().
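
For reference, a hedged example of the proposed :: convention (the DataFrame calls are illustrative and assume the SparkR API of this release):

library(SparkR)                          # attach SparkR last so its names take precedence
sc <- sparkR.init()
sqlCtx <- sparkRSQL.init(sc)
df <- createDataFrame(sqlCtx, faithful)  # faithful is a built-in R data set
head(filter(df, df$waiting > 50))        # plain filter() now resolves to SparkR::filter
stats::filter(1:10, rep(1/3, 3))         # the masked stats filter is still reachable via ::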

Author: Sun Rui <rui.sun@intel.com>

Closes #5938 from sun-rui/SPARK-6812 and squashes the following commits:

b569145 [Sun Rui] [SPARK-6812][SparkR] filter() on DataFrame does not work as expected.
parent 773aa252
@@ -20,11 +20,13 @@
 .libPaths(c(file.path(home, "R", "lib"), .libPaths()))
 Sys.setenv(NOAWT=1)
 library(utils)
-library(SparkR)
-sc <- sparkR.init(Sys.getenv("MASTER", unset = ""))
+# Make sure SparkR package is the last loaded one
+old <- getOption("defaultPackages")
+options(defaultPackages = c(old, "SparkR"))
+sc <- SparkR::sparkR.init(Sys.getenv("MASTER", unset = ""))
 assign("sc", sc, envir=.GlobalEnv)
-sqlCtx <- sparkRSQL.init(sc)
+sqlCtx <- SparkR::sparkRSQL.init(sc)
 assign("sqlCtx", sqlCtx, envir=.GlobalEnv)
 cat("\n Welcome to SparkR!")
 cat("\n Spark context is available as sc, SQL context is available as sqlCtx\n")