Commit 37526aca authored by Davies Liu, committed by Davies Liu

[SPARK-10980] [SQL] fix bug in create Decimal

The created decimal is wrong when using `Decimal(unscaled, precision, scale)` with unscaled > 1e18, precision > 18, and scale > 0.

This bug has existed since the beginning.

Author: Davies Liu <davies@databricks.com>

Closes #9014 from davies/fix_decimal.
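
For context, here is a minimal sketch of the underlying mismatch in plain Scala (scala.math.BigDecimal), using the same values as the test added in this commit: `BigDecimal(unscaled)` drops the scale, while `BigDecimal(unscaled, scale)` represents unscaled * 10^(-scale).

// An unscaled value of 10^18 is too large for Decimal's compact long
// representation, so the BigDecimal branch of setOrNull is taken.
val unscaled = 1000000000000000000L   // 10^18
val scale = 2

// Old code: the scale argument was silently dropped.
val before = BigDecimal(unscaled)          // 1000000000000000000
// Fixed code: the value is unscaled * 10^(-scale).
val after  = BigDecimal(unscaled, scale)   // 10000000000000000.00

println(before)
println(after)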
parent 7bf07faa
@@ -88,7 +88,7 @@ final class Decimal extends Ordered[Decimal] with Serializable {
       if (precision < 19) {
         return null  // Requested precision is too low to represent this value
       }
-      this.decimalVal = BigDecimal(unscaled)
+      this.decimalVal = BigDecimal(unscaled, scale)
       this.longVal = 0L
     } else {
       val p = POW_10(math.min(precision, MAX_LONG_DIGITS))
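
With this change, the large-unscaled branch preserves the requested scale. A rough usage sketch of the corrected behavior (assuming the public org.apache.spark.sql.types.Decimal API, mirroring the test case added below):

import org.apache.spark.sql.types.Decimal

// 10^18 does not fit the compact long path, so decimalVal is populated.
val d = Decimal(1000000000000000000L, 20, 2)

// With BigDecimal(unscaled, scale) the scale is applied as intended;
// previously the string would have been "1000000000000000000".
assert(d.toString == "10000000000000000.00")
assert(d.precision == 20 && d.scale == 2)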
@@ -44,6 +44,7 @@ class DecimalSuite extends SparkFunSuite with PrivateMethodTester {
     checkDecimal(Decimal(170L, 4, 2), "1.70", 4, 2)
     checkDecimal(Decimal(17L, 24, 1), "1.7", 24, 1)
     checkDecimal(Decimal(1e17.toLong, 18, 0), 1e17.toLong.toString, 18, 0)
+    checkDecimal(Decimal(1000000000000000000L, 20, 2), "10000000000000000.00", 20, 2)
     checkDecimal(Decimal(Long.MaxValue), Long.MaxValue.toString, 20, 0)
     checkDecimal(Decimal(Long.MinValue), Long.MinValue.toString, 20, 0)
     intercept[IllegalArgumentException](Decimal(170L, 2, 1))