
Marius Grama

01/10/2023, 6:27 AM
I’m experimenting with CDF - specifically with table schema evolution - and I created a scenario where I add a column `x` of type `int`, add data into the column, perform modifications, drop the column, and subsequently add a new column `x` of type `string`, add data into the newly added column, and perform modifications. When calling the `table_changes` function I get the message:

Error in SQL statement: DeltaColumnMappingUnsupportedSchemaIncompatibleException: Change Data Feed (CDF) read is not supported on tables with column mapping schema changes (e.g. rename or drop).
…
You may force enable streaming read at your own risk by turning on spark.databricks.delta.changeDataFeed.unsafeBatchReadOnIncompatibleSchemaChanges.enabled.

I did add the property `spark.databricks.delta.changeDataFeed.unsafeBatchReadOnIncompatibleSchemaChanges.enabled` to my test table (in `TBLPROPERTIES` while creating the table), but I still receive the above-mentioned exception. Can somebody give me a hint?
t

Tony S.

01/10/2023, 10:15 AM
The doc (bottom of the page, under "Known Limitations") states why CDF reads don't work in your case. I wouldn't rely on unsafe ops on CDF, which is still experimental at this time.
šŸ‘ 1

Scott Sandre (Delta Lake)

01/11/2023, 6:31 PM
Hi @Marius Grama. `spark.databricks.delta.changeDataFeed.unsafeBatchReadOnIncompatibleSchemaChanges.enabled` is a SQL configuration, not a Delta table property. Can you try adding it as a SQL conf to your Spark session instead?
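Concretely, the difference would look like this (a sketch; `my_table` is a placeholder, and the conf enables an unsafe read, as the exception message warns):

```sql
-- Does NOT work: this is not a table property, so putting it in
-- TBLPROPERTIES has no effect on the reader.
-- CREATE TABLE my_table (...) TBLPROPERTIES (
--   'spark.databricks.delta.changeDataFeed.unsafeBatchReadOnIncompatibleSchemaChanges.enabled' = 'true');

-- Works: set it as a session-level SQL configuration instead.
SET spark.databricks.delta.changeDataFeed.unsafeBatchReadOnIncompatibleSchemaChanges.enabled = true;

-- The CDF read can then proceed, at your own risk.
SELECT * FROM table_changes('my_table', 0);
```

In PySpark the equivalent would be `spark.conf.set(...)` on the active session before running the query.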
šŸ‘ 1