[SPARK-15392][SQL] fix default value of size estimation of logical plan
## What changes were proposed in this pull request?

We use autoBroadcastJoinThreshold + 1L as the default value of size estimation. That is not good in 2.0, because we now calculate the size based on the size of the schema, so the estimation could be less than autoBroadcastJoinThreshold if you have a SELECT on top of a DataFrame created from an RDD. This PR changes the default value to Long.MaxValue.

## How was this patch tested?

Added regression tests.

Author: Davies Liu <davies@databricks.com>

Closes #13183 from davies/fix_default_size.
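Conceptually, the fix is a change to the fallback size that SQLConf hands out when a logical plan's statistics cannot be estimated. A minimal sketch of the before/after, assuming the accessor is the `defaultSizeInBytes` method in SQLConf.scala (surrounding code elided):

```scala
// Before (sketch): one byte above the broadcast threshold. This is only safe
// if nothing downstream ever recomputes the estimate to something smaller.
// def defaultSizeInBytes: Long =
//   getConf(DEFAULT_SIZE_IN_BYTES, autoBroadcastJoinThreshold + 1L)

// After: Long.MaxValue, so a plan whose size is unknown can never slip under
// autoBroadcastJoinThreshold and be chosen for a broadcast join by accident.
def defaultSizeInBytes: Long =
  getConf(DEFAULT_SIZE_IN_BYTES, Long.MaxValue)
```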
Files changed:
- python/pyspark/sql/dataframe.py (1 addition, 1 deletion)
- sql/core/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala (4 additions, 5 deletions)
- sql/core/src/test/scala/org/apache/spark/sql/JoinSuite.scala (1 addition, 1 deletion)
- sql/core/src/test/scala/org/apache/spark/sql/StatisticsSuite.scala (34 additions, 0 deletions)
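Most of the new test coverage lands in StatisticsSuite.scala. A sketch of what such a regression test could look like; the test name, schema, and the exact way the threshold is read are illustrative assumptions, not necessarily the code merged in this PR:

```scala
import org.apache.spark.sql.{QueryTest, Row}
import org.apache.spark.sql.test.SharedSQLContext
import org.apache.spark.sql.types.{LongType, StructType}

class StatisticsRegressionSuite extends QueryTest with SharedSQLContext {

  test("SPARK-15392: DataFrame created from RDD should not be broadcast") {
    // An RDD-backed DataFrame has no known statistics, so its size estimate
    // falls back to the default, which is now Long.MaxValue and therefore
    // always above autoBroadcastJoinThreshold.
    val rdd = sparkContext.range(1, 100).map(i => Row(i, i))
    val schema = new StructType().add("a", LongType).add("b", LongType)
    val df = spark.createDataFrame(rdd, schema)

    assert(df.queryExecution.analyzed.statistics.sizeInBytes >
      spark.sessionState.conf.autoBroadcastJoinThreshold)

    // The regression case: a SELECT (projection) on top must not shrink the
    // schema-based estimate below the broadcast threshold.
    assert(df.selectExpr("a").queryExecution.analyzed.statistics.sizeInBytes >
      spark.sessionState.conf.autoBroadcastJoinThreshold)
  }
}
```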