Commit bf222173 authored by tone-zhang, committed by Sean Owen

[SPARK-17330][SPARK UT] Clean up spark-warehouse in UT

## What changes were proposed in this pull request?

Check the database warehouse directory used by the Spark unit tests, and remove any existing warehouse files before running the tests (SPARK-8368).
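
As a rough sketch of the cleanup pattern (the trait name, package, and mix-in structure are invented for illustration; the actual patch inlines the deletion in DDLSuite's afterEach, shown in the diff below), the idea is to delete the stale spark-warehouse directory after each test:

```scala
package org.apache.spark.sql.test // Utils is private[spark], so callers must live under org.apache.spark

import java.io.File

import org.scalatest.{BeforeAndAfterEach, Suite}

import org.apache.spark.util.Utils

// Minimal sketch: after each test, delete the spark-warehouse directory that an
// embedded Hive/Derby metastore leaves under the working directory, so that a
// second test run does not trip over stale database files.
trait WarehouseCleanup extends BeforeAndAfterEach { self: Suite =>
  override def afterEach(): Unit = {
    try {
      super.afterEach()
    } finally {
      Utils.deleteRecursively(new File(System.getProperty("user.dir"), "spark-warehouse"))
    }
  }
}
```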

## How was this patch tested?

Ran the Spark unit tests several times with the following command:
./build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver "test-only *HiveSparkSubmitSuit*"
Without the patch, the test case passes only on the first run and always fails on subsequent runs. With the patch, the test case passes every time.

Author: tone-zhang <tone.zhang@linaro.org>

Closes #14894 from tone-zhang/issue1.
parent 180796ec
@@ -43,6 +43,8 @@ class DDLSuite extends QueryTest with SharedSQLContext with BeforeAndAfterEach {
       // drop all databases, tables and functions after each test
       spark.sessionState.catalog.reset()
     } finally {
+      val path = System.getProperty("user.dir") + "/spark-warehouse"
+      Utils.deleteRecursively(new File(path))
       super.afterEach()
     }
   }
@@ -590,7 +590,9 @@ object SparkSubmitClassLoaderTest extends Logging {
   def main(args: Array[String]) {
     Utils.configTestLog4j("INFO")
     val conf = new SparkConf()
+    val hiveWarehouseLocation = Utils.createTempDir()
     conf.set("spark.ui.enabled", "false")
+    conf.set("spark.sql.warehouse.dir", hiveWarehouseLocation.toString)
     val sc = new SparkContext(conf)
     val hiveContext = new TestHiveContext(sc)
     val df = hiveContext.createDataFrame((1 to 100).map(i => (i, i))).toDF("i", "j")
@@ -699,11 +701,13 @@ object SPARK_9757 extends QueryTest {
   def main(args: Array[String]): Unit = {
     Utils.configTestLog4j("INFO")
 
+    val hiveWarehouseLocation = Utils.createTempDir()
     val sparkContext = new SparkContext(
       new SparkConf()
         .set("spark.sql.hive.metastore.version", "0.13.1")
         .set("spark.sql.hive.metastore.jars", "maven")
-        .set("spark.ui.enabled", "false"))
+        .set("spark.ui.enabled", "false")
+        .set("spark.sql.warehouse.dir", hiveWarehouseLocation.toString))
 
     val hiveContext = new TestHiveContext(sparkContext)
     spark = hiveContext.sparkSession
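
For the standalone objects exercised by HiveSparkSubmitSuite, the patch takes the complementary approach of pointing the warehouse at a throwaway temporary directory rather than cleaning up afterwards. A minimal, self-contained sketch of that pattern (the object name and local master are invented for illustration, and java.nio.file is used here in place of Spark's internal Utils.createTempDir()):

```scala
import java.nio.file.Files

import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: give each run its own warehouse directory so that repeated
// runs never collide on leftover warehouse files from an earlier run.
object TempWarehouseExample {
  def main(args: Array[String]): Unit = {
    val hiveWarehouseLocation = Files.createTempDirectory("spark-warehouse-")
    val conf = new SparkConf()
      .setMaster("local[2]")           // local master for illustration only
      .setAppName("temp-warehouse-example")
      .set("spark.ui.enabled", "false")
      .set("spark.sql.warehouse.dir", hiveWarehouseLocation.toString)
    val sc = new SparkContext(conf)
    // ... run Hive-backed test logic against this context ...
    sc.stop()
  }
}
```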