Hi All, I have a use case where I need to write data into the same Delta Lake location in parallel. When I try to do this using PySpark, I get a concurrency error. Is there any way to handle this issue?
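(For anyone hitting the same error: Delta Lake uses optimistic concurrency control, so conflicting commits fail and the losing writer is expected to retry. Below is a minimal sketch of that retry-with-backoff pattern. The exception class here is a stand-in; in a real PySpark job you would catch Delta's own concurrent-modification exception, and `write_fn` would wrap your actual `df.write...save(path)` call.)

```python
import random
import time

# Hypothetical stand-in for Delta's concurrent-modification exception;
# in a real job, catch the exception raised by your Delta write instead.
class ConcurrentAppendException(Exception):
    pass

def write_with_retry(write_fn, max_retries=5, base_delay=0.1):
    """Retry a write on concurrency conflicts, with exponential backoff.

    write_fn: a zero-argument callable performing the Delta write,
    e.g. lambda: df.write.format("delta").mode("append").save(path)
    """
    for attempt in range(max_retries):
        try:
            return write_fn()
        except ConcurrentAppendException:
            if attempt == max_retries - 1:
                raise  # give up after the last attempt
            # Back off with jitter so parallel writers don't collide again.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

This doesn't remove the conflict; it just makes each writer re-attempt its commit, which is the behaviour Delta's concurrency model expects.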
I am not sure whether I need to post this in #random or here. Apologies if I posted in the wrong place.
It's because the certificate expired, I guess. But it's the official site.
sabari dass
04/06/2023, 9:48 PM
Ok, now I am able to access it. Also, I think I found the solution on Stack Overflow for my exact use case. The concurrency control page I read suggests creating partitions for the Delta table, which will not help for my use case 🙂
Thanks for your help!