Commit d2d438d1 authored by Eric Liang, committed by Reynold Xin

[SPARK-18167][SQL] Add debug code for SQLQuerySuite flakiness when metastore partition pruning is enabled

## What changes were proposed in this pull request?

org.apache.spark.sql.hive.execution.SQLQuerySuite is flaky when Hive metastore partition pruning is enabled.
Based on the stack traces, it seems to be an old issue where Hive fails to cast a numeric partition column ("Invalid character string format for type DECIMAL"). There are two possibilities here: either we are somehow corrupting the partition table to have non-decimal values in that column, or there is a transient issue with Derby.

When this exception is encountered, this PR retries the filtered lookup (and also attempts a full, unfiltered partition fetch) and logs whether each succeeds, so we can confirm what is going on.
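For readers skimming the diff below, here is a minimal, self-contained sketch of the retry-and-log pattern this change adds. The names `fetchPartitions`, `fetchAllPartitions`, and `getPartitionsWithDebug` are hypothetical stand-ins for the reflective Hive metastore calls in `Shim_v0_13`; they are not part of the actual patch. The idea is that if an immediate retry succeeds, the failure looks like a transient Derby issue, while if both filtered calls fail but the full fetch succeeds, the partition metadata itself is suspect.

```scala
import scala.util.Try

object RetryDebugSketch {
  // Hypothetical stand-in for the reflective getPartitionsByFilter call;
  // fails on an empty filter to simulate the flaky metastore error.
  def fetchPartitions(filter: String): Seq[String] =
    if (filter.isEmpty) {
      throw new RuntimeException("Invalid character string format for type DECIMAL")
    } else {
      Seq("ds=2016-10-01", "ds=2016-10-02")
    }

  // Hypothetical stand-in for the unfiltered getAllPartitions call.
  def fetchAllPartitions(): Seq[String] =
    Seq("ds=2016-10-01", "ds=2016-10-02", "ds=2016-10-03")

  def getPartitionsWithDebug(filter: String): Seq[String] =
    try {
      fetchPartitions(filter)
    } catch {
      case e: Exception =>
        // Probe whether an immediate retry or a full fetch would have succeeded,
        // log the outcome, then rethrow so callers still see the original failure.
        val retry = Try(fetchPartitions(filter))
        val full = Try(fetchAllPartitions())
        println(s"getPartitionsByFilter failed, retry success = ${retry.isSuccess}")
        println(s"getPartitionsByFilter failed, full fetch success = ${full.isSuccess}")
        throw e
    }

  def main(args: Array[String]): Unit = {
    println(getPartitionsWithDebug("ds='2016-10-01'"))  // succeeds
    println(Try(getPartitionsWithDebug("")).isFailure)  // logs debug info, then rethrows
  }
}
```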

## How was this patch tested?

n/a

cc yhuai

Author: Eric Liang <ekl@databricks.com>

Closes #15676 from ericl/spark-18167.
parent 59cccbda
@@ -24,6 +24,7 @@ import java.util.{ArrayList => JArrayList, List => JList, Map => JMap, Set => JSet}
 import java.util.concurrent.TimeUnit
 
 import scala.collection.JavaConverters._
 import scala.util.Try
+import scala.util.control.NonFatal
 
 import org.apache.hadoop.fs.{FileSystem, Path}
@@ -585,7 +586,19 @@ private[client] class Shim_v0_13 extends Shim_v0_12 {
         getAllPartitionsMethod.invoke(hive, table).asInstanceOf[JSet[Partition]]
       } else {
         logDebug(s"Hive metastore filter is '$filter'.")
-        getPartitionsByFilterMethod.invoke(hive, table, filter).asInstanceOf[JArrayList[Partition]]
+        try {
+          getPartitionsByFilterMethod.invoke(hive, table, filter)
+            .asInstanceOf[JArrayList[Partition]]
+        } catch {
+          case e: InvocationTargetException =>
+            // SPARK-18167 retry to investigate the flaky test. This should be reverted before
+            // the release is cut.
+            val retry = Try(getPartitionsByFilterMethod.invoke(hive, table, filter))
+            val full = Try(getAllPartitionsMethod.invoke(hive, table))
+            logError("getPartitionsByFilter failed, retry success = " + retry.isSuccess)
+            logError("getPartitionsByFilter failed, full fetch success = " + full.isSuccess)
+            throw e
+        }
       }
 
     partitions.asScala.toSeq