Wenchen Fan authored
## What changes were proposed in this pull request?

See https://github.com/apache/spark/pull/12873#discussion_r61993910. The problem: if a `SparkContext` is created first and `SparkSession.builder.enableHiveSupport().getOrCreate()` is called afterwards, the builder reuses the existing `SparkContext`, and the Hive support flag is never set on it.
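The failure mode can be modeled without Spark itself. Below is a minimal, self-contained Python sketch (the `Context` and `SessionBuilder` classes are hypothetical stand-ins, not the real Spark API) of a getOrCreate-style builder: if the context already exists, options set on the builder are silently lost unless they are explicitly propagated onto the reused context, which is what this patch does conceptually.

```python
class Context:
    """Stand-in for SparkContext; conf is a plain dict of options."""
    def __init__(self, conf=None):
        self.conf = dict(conf or {})

_active_context = None

def get_or_create_context(conf=None):
    """Reuses the active context if one exists; conf is only applied on creation."""
    global _active_context
    if _active_context is None:
        _active_context = Context(conf)
    return _active_context

class SessionBuilder:
    def __init__(self):
        self._options = {}

    def enable_hive_support(self):
        self._options["catalogImplementation"] = "hive"
        return self

    def get_or_create(self):
        ctx = get_or_create_context(self._options)
        # Without the next line, a pre-existing context keeps its old conf
        # and the hive flag set on the builder is silently dropped (the bug).
        # The fix: also copy the builder's options onto the reused context.
        ctx.conf.update(self._options)
        return ctx

# Reproduce the report: create the context first, then enable Hive support.
get_or_create_context({})  # user creates the context up front
ctx = SessionBuilder().enable_hive_support().get_or_create()
print(ctx.conf.get("catalogImplementation"))
```

With the `ctx.conf.update(...)` line removed, the final lookup returns `None`, mirroring the reported behavior where the Hive flag never reaches the reused `SparkContext`.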

## How was this patch tested?

Verified locally.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #12890 from cloud-fan/repl.
a432a2b8