# [SPARK-17996][SQL] Fix unqualified catalog.getFunction(...)
## What changes were proposed in this pull request?

Currently an unqualified `getFunction(..)` call returns a wrong result; the returned function is shown as a temporary function without a database. For example:

```
scala> sql("create function fn1 as 'org.apache.hadoop.hive.ql.udf.generic.GenericUDFAbs'")
res0: org.apache.spark.sql.DataFrame = []

scala> spark.catalog.getFunction("fn1")
res1: org.apache.spark.sql.catalog.Function = Function[name='fn1', className='org.apache.hadoop.hive.ql.udf.generic.GenericUDFAbs', isTemporary='true']
```

This PR fixes this by adding database information to `ExpressionInfo` (which is used to store the function information).

## How was this patch tested?

Added more thorough tests to `CatalogSuite`.

Author: Herman van Hovell <hvanhovell@databricks.com>

Closes #15542 from hvanhovell/SPARK-17996.
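For context, a rough sketch of the result one would expect from the same unqualified lookup once the database is threaded through. The exact field layout and the `default` database name here are illustrative assumptions, not output copied from the patch:

```
// Illustrative only: after the fix the function carries its database and is no
// longer reported as temporary; 'default' stands in for the session's current database.
scala> spark.catalog.getFunction("fn1")
res1: org.apache.spark.sql.catalog.Function = Function[name='fn1', database='default', className='org.apache.hadoop.hive.ql.udf.generic.GenericUDFAbs', isTemporary='false']
```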
Showing 6 changed files with 39 additions and 13 deletions
- `sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionInfo.java` (12 additions, 2 deletions)
- `sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala` (1 addition, 1 deletion)
- `sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala` (8 additions, 2 deletions)
- `sql/core/src/main/scala/org/apache/spark/sql/execution/command/functions.scala` (3 additions, 2 deletions)
- `sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala` (3 additions, 3 deletions)
- `sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala` (12 additions, 3 deletions)
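The touched files suggest the database is threaded from `SessionCatalog` through `ExpressionInfo` and surfaced by `CatalogImpl`. As a hedged sketch only, a regression test for the reported behavior could look like the following; the test name and assertions are hypothetical, assume a Spark SQL test harness that provides `spark` and `sql`, and are not copied from `CatalogSuite`:

```scala
// Hypothetical regression-test sketch; not the actual CatalogSuite code.
test("SPARK-17996: unqualified getFunction resolves a persistent function") {
  sql("CREATE FUNCTION fn1 AS 'org.apache.hadoop.hive.ql.udf.generic.GenericUDFAbs'")
  val fn = spark.catalog.getFunction("fn1")
  assert(!fn.isTemporary)      // previously reported as a temporary function
  assert(fn.database != null)  // the database is now populated
  assert(fn.className === "org.apache.hadoop.hive.ql.udf.generic.GenericUDFAbs")
}
```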