Rahul Madnawat
08/14/2023, 5:50 PM
spark.conf.set("spark.delta.logStore.s3.impl", "io.delta.storage.S3DynamoDBLogStore")
spark.conf.set("spark.delta.DynamoDBLogStore.ddb.region", "us-west-2")
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName", "ddb_test")
df = spark.createDataFrame(
    [
        (1, "foo"),
        (2, "bar"),
    ],
    ["id", "label"],
)
df.write.mode("append").format("delta").save(<S3 path>)
and didn't see a DynamoDB table created. Would appreciate any pointers, thanks!

Nick Karpov
08/14/2023, 6:56 PM

Rahul Madnawat
08/14/2023, 8:06 PM

Nick Karpov
08/14/2023, 8:18 PM
spark.delta.logStore.class io.delta.storage.S3DynamoDBLogStore
spark.databricks.tahoe.logStore.allowOSSLogStores true
spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName <dynamodb_table_name>
spark.io.delta.storage.S3DynamoDBLogStore.ddb.region <dynamodb_region>
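A minimal sketch of the likely mismatch in the original question: the region key there (`spark.delta.DynamoDBLogStore.ddb.region`) does not use the `spark.io.delta.storage.S3DynamoDBLogStore.*` prefix shown in the reply above, and per the Delta Lake docs these LogStore settings are normally supplied at session startup (e.g. via `SparkSession.builder.config(...)` or `--conf`) rather than with `spark.conf.set()` after the session exists. The dict and helper below are purely illustrative, not part of any library; the `ddb_test`/`us-west-2` values come from the question.

```python
# Sketch: the LogStore settings from the reply, as a dict you could pass to
# SparkSession.builder.config(...) at startup (assumption: these are the
# documented key names for io.delta.storage.S3DynamoDBLogStore).
EXPECTED_LOGSTORE_CONF = {
    "spark.delta.logStore.s3.impl": "io.delta.storage.S3DynamoDBLogStore",
    "spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName": "ddb_test",
    "spark.io.delta.storage.S3DynamoDBLogStore.ddb.region": "us-west-2",
}

def find_misnamed_keys(conf: dict) -> list:
    """Illustrative helper: return DynamoDBLogStore-looking keys that don't
    match the documented spark.io.delta.storage.S3DynamoDBLogStore.* prefix."""
    return [
        key
        for key in conf
        if "DynamoDBLogStore" in key
        and not key.startswith("spark.io.delta.storage.S3DynamoDBLogStore.")
    ]

# The configs as written in the original question: the region key is missing
# the "io.delta.storage.S3" portion, so the LogStore never sees it.
question_conf = {
    "spark.delta.logStore.s3.impl": "io.delta.storage.S3DynamoDBLogStore",
    "spark.delta.DynamoDBLogStore.ddb.region": "us-west-2",
    "spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName": "ddb_test",
}

print(find_misnamed_keys(question_conf))
# → ['spark.delta.DynamoDBLogStore.ddb.region']
```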