I'm writing some Delta Lake streaming code with PySpark (Delta tables etc.; in production it runs on Databricks with Unity Catalog) and I want to unit test some of the streaming functionality locally, using a SparkSession configured with `spark.sql.warehouse.dir`, the `DeltaSparkSessionExtension`, `spark.sql.catalog.spark_catalog` pointed at the `DeltaCatalog`, the `delta-core` package, and so on.
Is this considered OK to do? I'm just after some input on what to think about: whether this is actually a sound approach, what the dangers are, etc.
Here's some mock code that runs just fine.
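For reference, here is a minimal sketch of the kind of local setup I mean. It assumes the `delta-spark` pip package (which provides the real helper `configure_spark_with_delta_pip`) and pytest; the fixture name, table counts, and temp paths are just illustrative, not my production code.

```python
import shutil
import tempfile

import pytest
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # Local session with the Delta extension and catalog wired in,
    # mirroring what Databricks configures for you.
    builder = (
        SparkSession.builder.master("local[2]")
        .appName("delta-streaming-tests")
        .config("spark.sql.extensions",
                "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    )
    session = configure_spark_with_delta_pip(builder).getOrCreate()
    yield session
    session.stop()


def test_stream_reads_delta_table(spark):
    src = tempfile.mkdtemp()   # Delta table location
    ckpt = tempfile.mkdtemp()  # streaming checkpoint location
    try:
        # Seed a small Delta table, then stream it into an
        # in-memory sink and assert on the result.
        spark.range(5).write.format("delta").save(src)
        query = (
            spark.readStream.format("delta").load(src)
            .writeStream.format("memory")
            .queryName("out")
            .option("checkpointLocation", ckpt)
            .trigger(availableNow=True)  # process everything, then stop
            .start()
        )
        query.awaitTermination()
        assert spark.table("out").count() == 5
    finally:
        shutil.rmtree(src, ignore_errors=True)
        shutil.rmtree(ckpt, ignore_errors=True)
```

The `availableNow` trigger (Spark 3.3+) makes the stream drain all available data and terminate, which keeps the test deterministic; on older Spark you'd use `trigger(once=True)` instead.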