
Rudhra Raveendran

06/22/2023, 10:55 PM
I recently enabled column mapping on my Delta table and I'm having an issue where a bunch of columns now display their physical names instead of their logical names, e.g. col-04ee4877-ee53-4cb9-b1fb-1a4eb74b508c. The underlying parquet files still have the right column names (which I think is the whole point of column mapping, right?), but is there a way to force Delta table reads to use the logical names?

Gerhard Brueckl

06/23/2023, 7:17 AM
Are you using the same tool to read and write the table? If yes, then it's indeed weird and should not happen.

Rudhra Raveendran

06/23/2023, 7:54 PM
I think I figured out the issue: the underlying parquet files don't actually have the right column names (I was misinterpreting the data I was seeing). I was reading the files with spark.read.parquet(), but changing it to spark.read.format("delta").load() now shows the correct column names!
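
A small sketch of the difference described above, using a hypothetical table path and the session assumed earlier: reading the raw parquet files bypasses the Delta log, so with column mapping in 'name' mode you see the physical col-<uuid> names, while reading through the Delta format resolves them back to the logical names in the table schema.

```python
# Hypothetical storage location of the Delta table.
table_path = "/data/my_table"

# Raw parquet read: ignores the Delta log, so columns show their
# physical names (e.g. col-04ee4877-...).
raw_df = spark.read.parquet(table_path)
print(raw_df.columns)

# Delta read: uses the column mapping metadata in the Delta log,
# so columns show their logical names.
delta_df = spark.read.format("delta").load(table_path)
print(delta_df.columns)
```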
👍 1