Commit d1eac3ef authored by Weiqing Yang, committed by Herman van Hovell

[SPARK-17108][SQL] Fix BIGINT and INT comparison failure in spark sql


## What changes were proposed in this pull request?

Add a function to check whether two integral types are compatible when invoking `acceptsType()` in `DataType`.
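
For illustration only, here is a minimal standalone Scala sketch of that idea (hypothetical names, not Spark's actual `DataType` code): a wider integral type accepts a narrower integral value, so an INT literal key can index a `map<bigint, ...>` column.

```
// Hypothetical, self-contained sketch of the compatibility check; it does not
// use Spark's real DataType hierarchy.
object IntegralCompatibilitySketch extends App {
  sealed trait SqlType
  case object IntType    extends SqlType
  case object LongType   extends SqlType
  case object StringType extends SqlType

  // A wider integral type accepts a narrower integral value (INT -> BIGINT);
  // everything else must match exactly.
  def acceptsType(expected: SqlType, actual: SqlType): Boolean =
    (expected, actual) match {
      case (e, a) if e == a    => true
      case (LongType, IntType) => true   // the SPARK-17108 case: a[1] on map<bigint, ...>
      case _                   => false
    }

  assert(acceptsType(LongType, IntType))      // INT key against a BIGINT map key: accepted
  assert(!acceptsType(LongType, StringType))  // still rejected as a type mismatch
  println("compatibility sketch holds")
}
```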
## How was this patch tested?

Manually. For example:

```
    spark.sql("create table t3(a map<bigint, array<string>>)")
    spark.sql("select * from t3 where a[1] is not null")
```

Before:

```
cannot resolve 't.`a`[1]' due to data type mismatch: argument 2 requires bigint type, however, '1' is of int type.; line 1 pos 22
org.apache.spark.sql.AnalysisException: cannot resolve 't.`a`[1]' due to data type mismatch: argument 2 requires bigint type, however, '1' is of int type.; line 1 pos 22
    at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:82)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:74)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:307)
```

After:
Run the SQL queries above. No errors.
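
As an extra illustration (not part of the patch), the same scenario can be reproduced through the public DataFrame API, assuming a spark-shell style `SparkSession` named `spark`; the view name `t_bigint_map` is just an example:

```
import org.apache.spark.sql.functions._

// Build a table with a map<bigint, string> column: spark.range produces a
// bigint `id` column, which becomes the map key type.
val df = spark.range(3).select(map(col("id"), col("id").cast("string")).as("a"))
df.createOrReplaceTempView("t_bigint_map")   // hypothetical view name

// The key literal `1` is an INT; with this fix it is implicitly cast to BIGINT
// instead of failing analysis with a data type mismatch.
spark.sql("select * from t_bigint_map where a[1] is not null").show()
```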

Author: Weiqing Yang <yangweiqing001@gmail.com>

Closes #15448 from weiqingy/SPARK_17108.

(cherry picked from commit 0d95662e)
Signed-off-by: Herman van Hovell <hvanhovell@databricks.com>
parent 7a84edb2
```
@@ -260,7 +260,7 @@ case class GetArrayItem(child: Expression, ordinal: Expression)
  * We need to do type checking here as `key` expression maybe unresolved.
  */
 case class GetMapValue(child: Expression, key: Expression)
-  extends BinaryExpression with ExpectsInputTypes with ExtractValue {
+  extends BinaryExpression with ImplicitCastInputTypes with ExtractValue {
   private def keyType = child.dataType.asInstanceOf[MapType].keyType
```
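
A quick way to see the effect of this trait change (again assuming the `spark` session and the `t_bigint_map` view from the reproduction above): the analyzed plan now casts the INT key literal to BIGINT rather than failing analysis.

```
// Inspect the analyzed/optimized plans; with the fix the key literal appears
// as something like `a[cast(1 as bigint)]` instead of raising an
// AnalysisException about an int key.
spark.sql("select * from t_bigint_map where a[1] is not null").explain(true)
```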
```
@@ -1939,6 +1939,18 @@ class SQLQuerySuite extends QueryTest with SQLTestUtils with TestHiveSingleton {
     }
   }
 
+  test("SPARK-17108: Fix BIGINT and INT comparison failure in spark sql") {
+    sql("create table t1(a map<bigint, array<string>>)")
+    sql("select * from t1 where a[1] is not null")
+
+    sql("create table t2(a map<int, array<string>>)")
+    sql("select * from t2 where a[1] is not null")
+
+    sql("create table t3(a map<bigint, array<string>>)")
+    sql("select * from t3 where a[1L] is not null")
+  }
+
   test("SPARK-17796 Support wildcard character in filename for LOAD DATA LOCAL INPATH") {
     withTempDir { dir =>
       for (i <- 1 to 3) {
```
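
For completeness, a small hedged sketch of the key-literal variants (assuming the `t3` table created by the test above and a `spark` session): explicitly typed keys worked before this change, while the plain INT key only resolves with it.

```
// BIGINT literal and explicit cast: accepted both before and after the fix.
spark.sql("select * from t3 where a[1L] is not null")
spark.sql("select * from t3 where a[cast(1 as bigint)] is not null")

// Plain INT literal key: only resolves once GetMapValue allows implicit casts.
spark.sql("select * from t3 where a[1] is not null")
```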