Commit 431a3d04 authored by Dongjoon Hyun, committed by Sean Owen

[SPARK-12653][SQL] Re-enable test "SPARK-8489: MissingRequirementError during reflection"

## What changes were proposed in this pull request?

The purpose of [SPARK-12653](https://issues.apache.org/jira/browse/SPARK-12653) is to re-enable a regression test.
Historically, the target regression test was added by [SPARK-8489](https://github.com/apache/spark/commit/093c34838d1db7a9375f36a9a2ab5d96a23ae683), but was temporarily disabled by [SPARK-12615](https://github.com/apache/spark/commit/8ce645d4eeda203cf5e100c4bdba2d71edd44e6a) due to a binary compatibility error.

The following is the current error message when submitting a Spark job with the pre-built `test.jar` file used by the target regression test:
```
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$6()Lscala/collection/Map;
```

Simply rebuilding `test.jar` cannot restore the purpose of the test case, since we need to support both Scala 2.10 and 2.11 for a while. For example, we face the following error on Scala 2.11 if we use a `test.jar` built with Scala 2.10:
```
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
```
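For context, here is a minimal sketch (not part of the PR) of the kind of runtime-reflection call that hits this incompatibility, assuming, as the error message above indicates, that the bytecode signature of `JavaUniverse.runtimeMirror` changed between Scala 2.10 and 2.11. The object name is illustrative; the real pre-built test source lives under `sql/hive/src/test/resources/regression-test-SPARK-8489/`.

```scala
import scala.reflect.runtime.{universe => ru}

// Hypothetical probe: compiled against Scala 2.10, this call fails on a
// Scala 2.11 runtime with the NoSuchMethodError shown above, because the
// return-type signature of runtimeMirror differs between the two versions.
object ReflectionProbe {
  def main(args: Array[String]): Unit = {
    val mirror = ru.runtimeMirror(getClass.getClassLoader)
    println(mirror)
  }
}
```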

This PR replaces the existing `test.jar` with `test-2.10.jar` and `test-2.11.jar`, and improves the regression test to pick the suitable jar file for the running Scala version.
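As a self-contained sketch of the selection logic (it mirrors the diff below; `scala.util.Properties` is used here so the snippet stands alone, while the test itself imports `scala.tools.nsc.Properties`, which exposes the same `versionNumberString`):

```scala
import scala.util.Properties

// Trim the running Scala version (e.g. "2.10.5" or "2.11.8") to the jar suffix.
val version = Properties.versionNumberString match {
  case v if v.startsWith("2.10") || v.startsWith("2.11") => v.substring(0, 4)
  case x => throw new Exception(s"Unsupported Scala Version: $x")
}
val testJar = s"sql/hive/src/test/resources/regression-test-SPARK-8489/test-$version.jar"
```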

## How was this patch tested?

Pass the existing Jenkins test.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11744 from dongjoon-hyun/SPARK-12653.
The change to `HiveSparkSubmitSuite` (excerpt):

```
@@ -22,6 +22,7 @@ import java.sql.Timestamp
 import java.util.Date
 import scala.collection.mutable.ArrayBuffer
+import scala.tools.nsc.Properties
 import org.scalatest.{BeforeAndAfterEach, Matchers}
 import org.scalatest.concurrent.Timeouts
@@ -87,13 +88,17 @@ class HiveSparkSubmitSuite
     runSparkSubmit(args)
   }
-  ignore("SPARK-8489: MissingRequirementError during reflection") {
+  test("SPARK-8489: MissingRequirementError during reflection") {
     // This test uses a pre-built jar to test SPARK-8489. In a nutshell, this test creates
     // a HiveContext and uses it to create a data frame from an RDD using reflection.
     // Before the fix in SPARK-8470, this results in a MissingRequirementError because
     // the HiveContext code mistakenly overrides the class loader that contains user classes.
     // For more detail, see sql/hive/src/test/resources/regression-test-SPARK-8489/*scala.
-    val testJar = "sql/hive/src/test/resources/regression-test-SPARK-8489/test.jar"
+    val version = Properties.versionNumberString match {
+      case v if v.startsWith("2.10") || v.startsWith("2.11") => v.substring(0, 4)
+      case x => throw new Exception(s"Unsupported Scala Version: $x")
+    }
+    val testJar = s"sql/hive/src/test/resources/regression-test-SPARK-8489/test-$version.jar"
     val args = Seq(
       "--conf", "spark.ui.enabled=false",
       "--conf", "spark.master.rest.enabled=false",
```