[SPARK-12495][SQL] use true as default value for propagateNull in NewInstance
In most cases we should propagate null when calling `NewInstance`, and so far there is only one case where we should stop null propagation: creating a product/Java bean. So I think it makes more sense to propagate null by default. This also fixes a bug when encoding a null array/map, which was first discovered in https://github.com/apache/spark/pull/10401

Author: Wenchen Fan <wenchen@databricks.com>

Closes #10443 from cloud-fan/encoder.
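To illustrate the semantics, here is a minimal, self-contained sketch of the null-propagation behavior described above. It is hypothetical and not the actual catalyst code: the real `NewInstance` in `objects.scala` operates on catalyst expressions and generated code, whereas the `newInstance` helper below works on plain Scala values purely to show when a null argument short-circuits construction.

```scala
// Hypothetical sketch of the null-propagation semantics, not Spark's NewInstance API.
object NullPropagationSketch {
  // With propagateNull = true (the new default), any null argument short-circuits
  // to null instead of invoking the constructor; with propagateNull = false
  // (used when creating a product/Java bean), the constructor always runs.
  def newInstance[T](args: Seq[Any], propagateNull: Boolean = true)(construct: Seq[Any] => T): Any =
    if (propagateNull && args.exists(_ == null)) null
    else construct(args)

  def main(cmdArgs: Array[String]): Unit = {
    // Encoding a null array: with propagation, the converter is never invoked on null.
    println(newInstance(Seq(null))(a => a.head.asInstanceOf[Array[Int]].toSeq)) // null

    // Creating a product (case class): keep nulls as field values instead.
    case class Person(name: String, age: Integer)
    println(newInstance(Seq(null, Int.box(30)), propagateNull = false)(a =>
      Person(a(0).asInstanceOf[String], a(1).asInstanceOf[Integer])))           // Person(null,30)
  }
}
```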
Showing 7 changed files
- sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala: 8 additions, 8 deletions
- sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala: 8 additions, 8 deletions
- sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.scala: 1 addition, 1 deletion
- sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/RowEncoder.scala: 0 additions, 2 deletions
- sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects.scala: 6 additions, 6 deletions
- sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/encoders/EncoderResolutionSuite.scala: 12 additions, 12 deletions
- sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/encoders/ExpressionEncoderSuite.scala: 3 additions, 0 deletions