Trey Yi03/29/2023, 11:16 AM
I'm getting this error during a Delta merge: `Failed to merge decimal types with incompatible scale 4 to 8`
Omkar03/29/2023, 3:06 PM
Say you have a `delta_table` with `score` = [20.38121234, 80.12341234] --> `DecimalType(30,8)`. Now, when you try to perform a merge operation with `new_data` with `score` = [30.2455, 90.1234] --> `DecimalType(26,4)`, you may encounter the error `Failed to merge decimal types with incompatible scale`, since their scales (number of digits on the right side of the decimal point) are 8 and 4 respectively. So the workaround would be converting all values to a single decimal type, e.g. casting the `score` column of `new_data` to `DecimalType(30,8)`, and then retrying the merge, which should work.
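As a sanity check that widening the scale is lossless, here is a sketch using Python's stdlib `decimal` module (not Spark) with the example values above. In PySpark the actual cast would use `DecimalType` from `pyspark.sql.types` with `Column.cast`; the values below are just the ones quoted in this thread.

```python
from decimal import Decimal

# Example values from the thread: new_data's `score` column at scale 4.
scores_scale4 = [Decimal("30.2455"), Decimal("90.1234")]

# Re-quantize to scale 8, the table's DecimalType(30,8) scale.
# Widening the scale only appends trailing zeros; no value changes.
scores_scale8 = [v.quantize(Decimal("0.00000001")) for v in scores_scale4]

print(scores_scale8)
```

Since no digits are dropped, the rescaled values compare numerically equal to the originals.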
Trey Yi03/29/2023, 3:48 PM
• I created a delta table called `test_table` with a schema that contains a column with decimal(17,4)
• Now I have new delta data with a column with decimal(17,8)
• How could I merge the two datasets together and save to `test_table`?
Omkar03/29/2023, 3:55 PM
You can cast your new data's column to `DecimalType(17,4)` using Spark's `cast` function. This will avoid any decimal type conflicts with your existing delta data when you perform the merge operation. Regarding how to merge the two delta tables, you can refer to this doc: https://docs.delta.io/latest/delta-update.html#language-python
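One caveat with casting the new data down to decimal(17,4): any digits past the fourth decimal place are lost. A stdlib `decimal` sketch of that loss (the value is hypothetical, and half-up rounding is an assumption about how the cast rounds):

```python
from decimal import Decimal, ROUND_HALF_UP

# Hypothetical new-data value at scale 8, i.e. decimal(17,8).
v = Decimal("30.24551234")

# Casting down to the table's decimal(17,4) keeps only 4 fractional digits.
down = v.quantize(Decimal("0.0001"), rounding=ROUND_HALF_UP)
print(down)  # the trailing ...1234 is rounded away
```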
Trey Yi03/29/2023, 3:58 PM
I'd rather keep the column as Decimal(17,8). Could I cast the `test_table` column to that instead?
Omkar03/29/2023, 4:01 PM
Decimal(17,8) holds fewer digits (17-8 = 9) on the left side of the number as compared to Decimal(17,4) (17-4 = 13), so there's a chance that Spark might throw a casting error.
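The integer-digit arithmetic above can be sketched with a small helper. `fits` is a hypothetical illustration, not a Spark API, and it is simplified to check only the digits left of the decimal point; the example value is also hypothetical:

```python
from decimal import Decimal

def fits(value: Decimal, precision: int, scale: int) -> bool:
    # DecimalType(precision, scale) allows precision - scale digits
    # to the left of the decimal point. (Simplified: integer digits only.)
    t = value.as_tuple()
    integer_digits = max(len(t.digits) + t.exponent, 0)
    return integer_digits <= precision - scale

big = Decimal("1234567890123.5")      # 13 integer digits (hypothetical)
print(fits(big, 17, 4))  # decimal(17,4) allows 13 integer digits
print(fits(big, 17, 8))  # decimal(17,8) allows only 9
```

A value like this fits decimal(17,4) but overflows decimal(17,8), which is the casting risk being described.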
Trey Yi03/29/2023, 4:03 PM
Omkar03/29/2023, 4:08 PM
You could cast both columns to `DecimalType(30,8)` and then merge them. Since `DecimalType(30,8)` will be able to store a larger number of digits on both the left (30-8 = 22) as well as the right (8) side of the decimal point, it won't throw any casting errors and you'll also get the data in a single datatype. Try it out maybe!
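Continuing the stdlib `decimal` sketch: with 30-8 = 22 integer digits, DecimalType(30,8) has headroom for values from either source column. Both values below are hypothetical:

```python
from decimal import Decimal, getcontext

getcontext().prec = 30  # mirror DecimalType(30, _) total precision

a = Decimal("1234567890123.2455")  # hypothetical value that was decimal(17,4)
b = Decimal("30.24551234")         # hypothetical value that was decimal(17,8)

# Rescale both to scale 8; 22 integer digits leaves ample headroom,
# so neither value overflows or loses digits.
target = Decimal("0.00000001")
print(a.quantize(target))
print(b.quantize(target))
```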
Trey Yi03/29/2023, 6:22 PM
The data becomes null when changing Decimal(17,4) to Decimal(17,8). I should find out something else.