Commit 664c9795 authored by Shixiong Zhu's avatar Shixiong Zhu Committed by Yin Huai
[SPARK-19816][SQL][TESTS] Fix an issue that DataFrameCallbackSuite doesn't recover the log level


## What changes were proposed in this pull request?

The test "DataFrameCallbackSuite.execute callback functions when a DataFrame action failed" sets the log level to FATAL but never restores it. As a result, any tests that run after it emit no log output except fatal-level logs.

This PR uses `testQuietly` instead to avoid changing the log level.
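The point of `testQuietly` is that it suppresses logging only for the duration of the test body and restores the previous level afterwards, even when the body throws. A minimal, dependency-free sketch of that pattern (the names `QuietlySketch`, `quietly`, and the `logLevel` field are illustrative stand-ins, not Spark's actual implementation):

```scala
// Sketch of the save-and-restore pattern behind a testQuietly-style helper.
// A try/finally guarantees the old level comes back even if `body` throws,
// so the change cannot leak into tests that run later.
object QuietlySketch {
  var logLevel: String = "INFO" // stands in for the root logger's level

  def quietly[T](body: => T): T = {
    val saved = logLevel
    logLevel = "FATAL" // silence everything below fatal while body runs
    try body
    finally logLevel = saved // restore unconditionally
  }
}
```

Calling `sparkContext.setLogLevel("FATAL")` directly, as the old test did, performs only the first half of this pattern, which is exactly the leak this PR fixes.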

## How was this patch tested?

Jenkins

Author: Shixiong Zhu <shixiong@databricks.com>

Closes #17156 from zsxwing/SPARK-19816.

(cherry picked from commit fbc40580)
Signed-off-by: default avatarYin Huai <yhuai@databricks.com>
parent da04d45c
```diff
@@ -58,7 +58,7 @@ class DataFrameCallbackSuite extends QueryTest with SharedSQLContext {
     spark.listenerManager.unregister(listener)
   }
 
-  test("execute callback functions when a DataFrame action failed") {
+  testQuietly("execute callback functions when a DataFrame action failed") {
     val metrics = ArrayBuffer.empty[(String, QueryExecution, Exception)]
     val listener = new QueryExecutionListener {
       override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit = {
@@ -73,8 +73,6 @@ class DataFrameCallbackSuite extends QueryTest with SharedSQLContext {
     val errorUdf = udf[Int, Int] { _ => throw new RuntimeException("udf error") }
     val df = sparkContext.makeRDD(Seq(1 -> "a")).toDF("i", "j")
-    // Ignore the log when we are expecting an exception.
-    sparkContext.setLogLevel("FATAL")
     val e = intercept[SparkException](df.select(errorUdf($"i")).collect())
     assert(metrics.length == 1)
```