Rahul Sharma
04/25/2023, 5:27 PM
from pyspark.sql import SparkSession
spark = (
    SparkSession.builder
    .appName("Delta Lake Auto-Optimization")
    # Coalesce output into fewer, larger files at write time.
    .config("spark.databricks.delta.optimizeWrite.enabled", "true")
    # Compact small files automatically after writes complete.
    .config("spark.databricks.delta.autoCompact.enabled", "true")
    # Note: maxFileSize is specified in bytes, so 512 MB is 536870912, not "512".
    .config("spark.databricks.delta.autoCompact.maxFileSize", "536870912")
    .getOrCreate()
)
# Your code to read and write Delta Lake files here
spark.stop()
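A minimal sketch of what the read/write placeholder above could look like, assuming the delta-spark package is available and using an illustrative local path (`/tmp/delta/demo`) and a toy DataFrame; on a plain (non-Databricks) Spark install you would also need to register the Delta extensions, e.g. via `delta.configure_spark_with_delta_pip`:

```python
from pyspark.sql import SparkSession

# Illustrative session; the Delta extension/catalog configs are required
# outside Databricks for the "delta" format to resolve.
spark = (
    SparkSession.builder
    .appName("Delta Lake Auto-Optimization")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .config("spark.databricks.delta.optimizeWrite.enabled", "true")
    .config("spark.databricks.delta.autoCompact.enabled", "true")
    .getOrCreate()
)

# Write a small DataFrame as a Delta table; optimizeWrite coalesces
# the output files during this write.
df = spark.range(0, 1000).withColumnRenamed("id", "value")
df.write.format("delta").mode("overwrite").save("/tmp/delta/demo")

# Read the table back to confirm the round trip.
spark.read.format("delta").load("/tmp/delta/demo").count()

spark.stop()
```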
Scott Sandre (Delta Lake)
04/25/2023, 6:13 PM