[SPARK-21936][SQL] backward compatibility test framework for HiveExternalCatalog
## What changes were proposed in this pull request?

`HiveExternalCatalog` is a semi-public interface. When creating tables, `HiveExternalCatalog` converts the table metadata to the Hive table format and saves it into the Hive metastore. It is very important to guarantee backward compatibility here, i.e., tables created by previous Spark versions should still be readable by newer Spark versions. Previously we found backward compatibility issues manually, which made it easy to miss bugs. This PR introduces a test framework that automatically tests `HiveExternalCatalog` backward compatibility by downloading Spark binaries of different versions, creating tables with those Spark versions, and reading those tables with the current Spark version.

## How was this patch tested?

test-only change

Author: Wenchen Fan <wenchen@databricks.com>

Closes #19148 from cloud-fan/test.
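The flow the framework automates can be sketched as: for each older version, produce table metadata with that version, then verify the current version can read it. The sketch below is purely illustrative; `testingVersions`, `createTableWith`, and `readableByCurrent` are hypothetical names standing in for the real suite's logic (which actually downloads Spark binaries and runs `spark-submit`), and the version list is an assumption.

```scala
// Hypothetical sketch of the cross-version compatibility check described
// above; it does not use Spark, it only models the shape of the test loop.
object CrossVersionSketch {
  // Illustrative set of past versions whose tables must stay readable.
  val testingVersions: Seq[String] = Seq("2.0.2", "2.1.1", "2.2.0")

  // Stand-in for "create a table with an old Spark version": record which
  // version produced the metadata.
  def createTableWith(version: String, table: String): Map[String, String] =
    Map("table" -> table, "createdBy" -> version)

  // Stand-in for "read the table with the current Spark version": succeed
  // as long as the expected metadata fields are present.
  def readableByCurrent(meta: Map[String, String]): Boolean =
    meta.contains("table") && meta.contains("createdBy")

  def main(args: Array[String]): Unit = {
    val results = testingVersions.map { v =>
      v -> readableByCurrent(createTableWith(v, s"tbl_$v"))
    }
    results.foreach { case (v, ok) => println(s"$v readable: $ok") }
    assert(results.forall(_._2), "backward compatibility broken")
  }
}
```

The real suite gains coverage precisely because the "create" step runs an actual downloaded release rather than a stub, so regressions in the metadata format surface without anyone remembering to check by hand.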
Showing 6 changed files:

- sql/hive/pom.xml (4 additions, 0 deletions)
- sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogBackwardCompatibilitySuite.scala (0 additions, 260 deletions)
- sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala (194 additions, 0 deletions)
- sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala (2 additions, 75 deletions)
- sql/hive/src/test/scala/org/apache/spark/sql/hive/MetastoreDataSourcesSuite.scala (0 additions, 27 deletions)
- sql/hive/src/test/scala/org/apache/spark/sql/hive/SparkSubmitTestUtils.scala (101 additions, 0 deletions)