[Earlier messages in this thread from Matthew Powers, Joydeep Banik Roy, Abidi Gassen, Nick Karpov, shingo, Michael Shtelma, and Rahul Madnawat (05/03/2023 through 06/22/2023); the message bodies were not captured in this export.]

Rahul Madnawat
08/14/2023, 5:50 PM

```python
spark.conf.set("spark.delta.logStore.s3.impl", "io.delta.storage.S3DynamoDBLogStore")
spark.conf.set("spark.delta.DynamoDBLogStore.ddb.region", "us-west-2")
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName", "ddb_test")

df = spark.createDataFrame(
    [
        (1, "foo"),
        (2, "bar"),
    ],
    ["id", "label"],
)
df.write.mode("append").format("delta").save(<S3 path>)
```

and didn't see a DynamoDB table created. Would appreciate any pointers, thanks!

Abidi Gassen
08/21/2023, 11:20 AM
[Message body not captured in this export.]
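One detail worth checking against the docs in the snippet above: the region key (`spark.delta.DynamoDBLogStore.ddb.region`) uses a different prefix than the table-name key, whereas the documented multi-cluster S3 options all share the `spark.io.delta.storage.S3DynamoDBLogStore.` prefix, and they are typically passed when the session is launched (e.g. via `--conf` or the session builder) rather than via `spark.conf.set` afterward. A minimal sketch of the documented key names; the table name and region here are illustrative placeholders, not values from the thread:

```python
# Sketch of the documented Delta Lake multi-cluster S3 configuration.
# Assumptions: placeholder table name/region, and the
# io.delta:delta-storage-s3-dynamodb artifact on the classpath.
S3_DYNAMODB_CONFS = {
    # LogStore implementation; note this one key lives under spark.delta.*
    "spark.delta.logStore.s3.impl": "io.delta.storage.S3DynamoDBLogStore",
    # The DynamoDB options share the spark.io.delta.storage.S3DynamoDBLogStore. prefix.
    # The coordination table is created automatically on the first commit if absent.
    "spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName": "delta_log",  # placeholder
    "spark.io.delta.storage.S3DynamoDBLogStore.ddb.region": "us-west-2",     # placeholder
}


def apply_confs(builder):
    """Apply the confs to a SparkSession.Builder *before* getOrCreate(),
    so the log store is configured when the session starts."""
    for key, value in S3_DYNAMODB_CONFS.items():
        builder = builder.config(key, value)
    return builder
```

With that, `spark = apply_confs(SparkSession.builder).getOrCreate()` would start a session whose S3 commits coordinate through the DynamoDB table.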