Does anyone by any chance know of a Slack/chat where Spark/PySpark/JDBC developers hang around? I'm playing around with a small framework, and something seems to have changed between Databricks LTS 10.4 and LTS 11.3 that causes reads via PySpark's JDBC connector against SQL Server to generate SQL ending in an invalid 'WHERE (1)' when, as far as I can tell, there should not be any WHERE clause at all. It seems to be triggered by joining/comparing the incoming DataFrame from SQL Server against an empty Delta table before writing. Once the table has some data, the exact same query runs fine.