[SPARK-14189][SQL] JSON data sources find compatible types even if inferred decimal type is not capable of the others

## What changes were proposed in this pull request?

https://issues.apache.org/jira/browse/SPARK-14189

When the types inferred for the same field are `IntegralType` and `DecimalType`, but the `DecimalType` is not capable of holding the given `IntegralType`, the JSON data source fails to find a compatible `DataType` and falls back to `StringType`. This can be observed when `prefersDecimal` is enabled.

```scala
def mixedIntegerAndDoubleRecords: RDD[String] =
  sqlContext.sparkContext.parallelize(
    """{"a": 3, "b": 1.1}""" ::
    """{"a": 3.1, "b": 1}""" :: Nil)

val jsonDF = sqlContext.read
  .option("prefersDecimal", "true")
  .json(mixedIntegerAndDoubleRecords)
  .printSchema()
```

- **Before**

```
root
 |-- a: string (nullable = true)
 |-- b: string (nullable = true)
```

- **After**

```
root
 |-- a: decimal(21, 1) (nullable = true)
 |-- b: decimal(21, 1) (nullable = true)
```

(Note that the integer is inferred as `LongType`, which becomes `DecimalType(20, 0)`.)

## How was this patch tested?

Unit tests were added, and coding style checks were run with `dev/run_tests`.

Author: hyukjinkwon <gurwls223@gmail.com>

Closes #11993 from HyukjinKwon/SPARK-14189.
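To see how `decimal(21, 1)` arises in the example above: the long value is treated as `DecimalType(20, 0)` and the literal `1.1` as `DecimalType(2, 1)`; keeping enough integer digits (max of 20 and 1) plus enough fractional digits (max of 0 and 1) gives precision 21 and scale 1. The sketch below is only an illustration of that widening idea, not Spark's actual `compatibleType` implementation; the object and method names (`DecimalCompat`, `compatibleDecimal`, `asDecimal`) are made up for this example.

```scala
import org.apache.spark.sql.types._

// Illustrative helper (not Spark's internal API): widen an integral type to a
// decimal wide enough for its full range, then merge two decimals by keeping
// enough digits on both sides of the decimal point for either input.
object DecimalCompat {
  // Smallest decimals that can represent each integral type exactly.
  private def asDecimal(dt: DataType): Option[DecimalType] = dt match {
    case d: DecimalType => Some(d)
    case ByteType       => Some(DecimalType(3, 0))
    case ShortType      => Some(DecimalType(5, 0))
    case IntegerType    => Some(DecimalType(10, 0))
    case LongType       => Some(DecimalType(20, 0))
    case _              => None
  }

  // A decimal capable of holding values of both input types, if one exists.
  def compatibleDecimal(t1: DataType, t2: DataType): Option[DecimalType] =
    for (d1 <- asDecimal(t1); d2 <- asDecimal(t2)) yield {
      val scale     = math.max(d1.scale, d2.scale)
      val intDigits = math.max(d1.precision - d1.scale, d2.precision - d2.scale)
      DecimalType(intDigits + scale, scale)
    }
}

// Example matching the schema above:
// DecimalCompat.compatibleDecimal(LongType, DecimalType(2, 1))
//   == Some(DecimalType(21, 1))
```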