Alex
02/06/2023, 9:08 AM
"It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved."
How can it be possible for a file name to be cached in Spark? We are getting a strange error:
java.io.FileNotFoundException: No such file or directory: s3a://.....snappy.parquet
I restarted the app, but I still get the same issue.
I checked the Delta logs and the file is not present in _delta_log. How is Spark able to remember this file when it is not even in the Delta log?
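For reference, the fix the error message suggests maps to roughly the following in PySpark; this is just a minimal sketch, and the table name and s3a path below are placeholders, not the real ones from our job:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Placeholder table/path names for illustration only.
spark.sql("REFRESH TABLE my_delta_table")              # invalidate Spark's cached file listing for the table
spark.catalog.refreshByPath("s3a://my-bucket/table")   # same idea, for a path-based read
spark.catalog.clearCache()                             # drop all cached DataFrames/tables in this session

Recreating the DataFrame (re-running spark.read...) after the refresh should then pick up the current set of files.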