
Pablo Flores

01/12/2023, 8:02 PM
Hello folks! We are trying to migrate a job from Spark to Flink using the Flink Delta connector. One issue we are facing is that our data is partitioned by a date field, so when we try to write from a Flink app partitioning by the date column, we receive an "unsupported type" error. In Spark we encoded the field as java.sql.Date. Any suggestions on how to handle this scenario, or any known workaround?
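For context, a minimal sketch of the Spark-side job being migrated, where the DATE-typed partition column works natively. The table paths and the event_date column name are assumptions for illustration, not details from the thread:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkDeltaWriteSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("delta-date-partition")
                .getOrCreate();

        // Hypothetical source: "event_date" is a DATE column,
        // surfaced as java.sql.Date in the Java/Scala API.
        Dataset<Row> events = spark.read().parquet("/tmp/raw/events");

        // Spark accepts a DATE column as a partition column when writing Delta.
        events.write()
                .format("delta")
                .partitionBy("event_date")
                .save("/tmp/delta/events");
    }
}
```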

Nick Karpov

01/12/2023, 9:29 PM
Hi @Pablo Flores, could you please share more detail in a GitHub issue? cc @Scott Sandre (Delta Lake)
👍 1

Pablo Flores

01/12/2023, 9:31 PM
Sure thing! Thanks!

Maks Lyzhkov

01/12/2023, 11:19 PM
Try casting the partition column to STRING; as far as I know, Spark does that under the hood.
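A minimal sketch of what that workaround could look like with the Flink Delta connector's RowData sink. The table path, schema, and event_date column name are assumptions; the key idea is declaring the partition column as VARCHAR instead of DATE and formatting the date to a string when building rows:

```java
import java.time.LocalDate;
import java.util.Arrays;

import org.apache.flink.core.fs.Path;
import org.apache.flink.table.data.GenericRowData;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.data.StringData;
import org.apache.flink.table.types.logical.IntType;
import org.apache.flink.table.types.logical.RowType;
import org.apache.flink.table.types.logical.VarCharType;
import org.apache.hadoop.conf.Configuration;

import io.delta.flink.sink.DeltaSink;

public class DeltaDatePartitionSketch {

    // Declare the partition column as STRING (VARCHAR) rather than DATE,
    // sidestepping the "unsupported type" error on DATE partition columns.
    static final RowType ROW_TYPE = new RowType(Arrays.asList(
            new RowType.RowField("id", new IntType()),
            new RowType.RowField("event_date",
                    new VarCharType(VarCharType.MAX_LENGTH))));

    static DeltaSink<RowData> buildSink() {
        return DeltaSink
                .forRowData(
                        new Path("/tmp/delta/events"), // hypothetical table path
                        new Configuration(),
                        ROW_TYPE)
                .withPartitionColumns("event_date")    // partition on the string column
                .build();
    }

    // Encode the date as an ISO-8601 string (e.g. "2023-01-12") when building rows.
    static RowData toRow(int id, LocalDate date) {
        return GenericRowData.of(id, StringData.fromString(date.toString()));
    }
}
```

Since ISO-8601 date strings sort lexicographically in the same order as the dates they encode, range filters on the string partition column should behave like the original date range queries, though that is worth verifying against real query patterns.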

Pablo Flores

01/12/2023, 11:22 PM
Thanks, Maks! We’ll try that and check whether the date range queries still work.