Commit f719cccd authored by Jeff Zhang, committed by Felix Cheung

[SPARK-19572][SPARKR] Allow to disable hive in sparkR shell


## What changes were proposed in this pull request?
SPARK-15236 did this for the Scala shell; this ticket does the same for the sparkR shell. It benefits not only sparkR itself but also downstream projects such as Livy, which uses shell.R for its interactive sessions. Currently, Livy has no way to control whether Hive is enabled.

## How was this patch tested?

Tested manually: ran `bin/sparkR --master local --conf spark.sql.catalogImplementation=in-memory` and verified that Hive is not enabled.

Author: Jeff Zhang <zjffdu@apache.org>

Closes #16907 from zjffdu/SPARK-19572.

(cherry picked from commit 73158805)
Signed-off-by: Felix Cheung <felixcheung@apache.org>
parent d887f758
```diff
@@ -47,12 +47,14 @@ private[sql] object SQLUtils extends Logging {
       jsc: JavaSparkContext,
       sparkConfigMap: JMap[Object, Object],
       enableHiveSupport: Boolean): SparkSession = {
-    val spark = if (SparkSession.hiveClassesArePresent && enableHiveSupport) {
+    val spark = if (SparkSession.hiveClassesArePresent && enableHiveSupport
+        && jsc.sc.conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase == "hive") {
       SparkSession.builder().sparkContext(withHiveExternalCatalog(jsc.sc)).getOrCreate()
     } else {
       if (enableHiveSupport) {
         logWarning("SparkR: enableHiveSupport is requested for SparkSession but " +
-          "Spark is not built with Hive; falling back to without Hive support.")
+          s"Spark is not built with Hive or ${CATALOG_IMPLEMENTATION.key} is not set to 'hive', " +
+          "falling back to without Hive support.")
       }
       SparkSession.builder().sparkContext(jsc.sc).getOrCreate()
     }
```
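The gating condition in the diff can be read as a small pure predicate: use the Hive catalog only when the Hive classes are on the classpath, Hive support was requested, *and* `spark.sql.catalogImplementation` (which defaults to `"hive"` when unset) is set to `hive`. Below is a minimal, Spark-free sketch of that logic; `HiveGate`, `useHiveCatalog`, and the `catalogImpl` parameter are illustrative names, not part of the Spark codebase:

```scala
// Minimal sketch of the Hive-enablement check, independent of Spark.
// `hiveClassesPresent` models SparkSession.hiveClassesArePresent, and
// `catalogImpl` models the value of spark.sql.catalogImplementation
// (None means the conf is unset, which Spark treats as "hive").
object HiveGate {
  def useHiveCatalog(
      hiveClassesPresent: Boolean,
      enableHiveSupport: Boolean,
      catalogImpl: Option[String]): Boolean = {
    hiveClassesPresent && enableHiveSupport &&
      catalogImpl.getOrElse("hive").toLowerCase == "hive"
  }
}
```

With this shape, `HiveGate.useHiveCatalog(true, true, Some("in-memory"))` is `false`, which is exactly the case this patch adds: a sparkR shell launched with `--conf spark.sql.catalogImplementation=in-memory` no longer gets a Hive-backed session even though Hive classes are present.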