Hanan Shteingart
04/19/2023, 8:56 AM

Hanan Shteingart
04/19/2023, 8:57 AM

Martin Beaussart
04/19/2023, 9:01 AM

Vincent Chee
04/19/2023, 3:24 PM

Beni
04/20/2023, 8:44 AM

Vincent Chee
04/20/2023, 9:11 AM

Harsh
04/20/2023, 12:04 PM

Jeremy Jordan
04/20/2023, 12:53 PM
I'm hitting a `ConcurrentAppendException` error. Is this expected? Do I need to implement retries in my write logic? I figured that after enabling the multi-cluster setup a cluster would wait for the lock to be released and then write a new transaction, so I didn't think I would have to worry about retries. Do I have something misconfigured?
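For context: in my understanding the multi-cluster LogStore only makes individual commits atomic; conflict resolution between transactions is still optimistic, so `ConcurrentAppendException` is expected under concurrent writes and retrying is the usual pattern. A minimal retry sketch, assuming delta-spark's Python exception mapping (`delta.exceptions`) and placeholder `df`/`path` values:

```python
# Hedged sketch: Delta's optimistic concurrency raises ConcurrentAppendException
# when another commit added files this transaction may have needed to read;
# the usual remedy is to retry the write against the new table state.
import time
from delta.exceptions import ConcurrentAppendException

def append_with_retries(df, path, max_retries=5):
    for attempt in range(max_retries):
        try:
            df.write.format("delta").mode("append").save(path)
            return
        except ConcurrentAppendException:
            time.sleep(2 ** attempt)  # back off, then retry
    raise RuntimeError(f"Append still conflicting after {max_retries} retries")
```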
Starkadur Hrobjartsson
04/20/2023, 1:56 PM
I have a column `enqueuedTime` that I want to rename to `enqueued_time`.
The column `enqueuedTime` is used to define three generated columns: `year`, `month`, `day`.
The table itself is then partitioned on the `year`, `month`, `day` generated columns.
When I do:
    ALTER TABLE <table> RENAME COLUMN enqueuedTime TO enqueued_time;
I get the following error:
"Cannot rename column enqueuedTime because this column is referenced by the following generated columns:"
• year
• month
• day
How can I perform this renaming?
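Since the rename is blocked while generated columns reference the source column, one workaround is to rewrite the table under the new column name with freshly declared generated columns. A hedged sketch using the `DeltaTableBuilder` API; the table names `events`/`events_renamed` and the TIMESTAMP type are assumptions, not from the question:

```python
# Hedged sketch: recreate the table with the renamed source column, re-declare
# the generated columns, and copy the data across. Names are hypothetical.
from delta.tables import DeltaTable

(DeltaTable.create(spark)
    .tableName("events_renamed")
    .addColumn("enqueued_time", "TIMESTAMP")
    .addColumn("year", "INT", generatedAlwaysAs="YEAR(enqueued_time)")
    .addColumn("month", "INT", generatedAlwaysAs="MONTH(enqueued_time)")
    .addColumn("day", "INT", generatedAlwaysAs="DAY(enqueued_time)")
    .partitionedBy("year", "month", "day")
    .execute())

(spark.table("events")
    .drop("year", "month", "day")  # let Delta regenerate these on write
    .withColumnRenamed("enqueuedTime", "enqueued_time")
    .write.format("delta").mode("append")
    .saveAsTable("events_renamed"))
```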
Yuya Ebihara
04/20/2023, 10:46 PM
> Rows in a table must satisfy CHECK constraints. In other words, evaluating the SQL expressions of CHECK constraints must return true for each row in a table.
Evaluating a CHECK constraint can also return `unknown` (e.g. `1 > null`) in addition to `true` from a SQL standard perspective, in my understanding. Can we update the protocol?
cc: @Slawomir Pajak
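For illustration of the three-valued-logic point (assuming an active `spark` session): a comparison against NULL evaluates to NULL, i.e. unknown, which the SQL standard counts as satisfying a CHECK constraint.

```python
# A NULL comparison yields NULL ("unknown") rather than false; the SQL standard
# treats a CHECK constraint as satisfied unless its predicate is false.
spark.sql("SELECT 1 > NULL AS cmp").show()  # the single row shows null, i.e. unknown
```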
SharathEndluri
04/21/2023, 1:50 PM

Vincent Chee
04/21/2023, 2:20 PM

Jack
04/22/2023, 4:49 AM

Vinay K L
04/22/2023, 1:14 PM
The `DEEP CLONE` & `CLONE` syntax/feature isn't yet supported as of OSS delta-lake 2.3.0, right? It's only available in DBR, I suppose. Can someone please confirm whether this is indeed the case?
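For reference: as far as I can tell, OSS Delta 2.3.0 added SHALLOW CLONE support, while DEEP CLONE remained Databricks-only at that point. A minimal sketch with hypothetical table names:

```python
# Hedged sketch: a shallow clone copies only metadata and references the source
# table's data files. Table names "events_clone" and "events" are hypothetical.
spark.sql("CREATE TABLE events_clone SHALLOW CLONE events")
```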
吕
04/23/2023, 8:49 AM
    spark.read.format("xml").option("rowTag","Documents").load("/data/pat/66086B.XML").write.format("delta").mode("append").save("s3a://delta-lake/demo1")
    org.apache.spark.sql.AnalysisException: Failed to merge fields 'PatentDocument' and 'PatentDocument'. Failed to merge fields 'BibliographicData' and 'BibliographicData'. Failed to merge fields 'Parties' and 'Parties'. Failed to merge fields 'Agents' and 'Agents'. Failed to merge fields 'Agent' and 'Agent'. Failed to merge incompatible data types StructType(StructField(Name,StructType(StructField(_VALUE,StringType,true),StructField(_lang,StringType,true)),true),StructField(_format,StringType,true),StructField(_sequence,LongType,true)) and ArrayType(StructType(StructField(Name,StructType(StructField(_VALUE,StringType,true),StructField(_lang,StringType,true)),true),StructField(_format,StringType,true),StructField(_sequence,LongType,true)),true)
Why do the above errors occur?
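The merge failure suggests the XML reader inferred `Agent` as a single struct for documents with one agent and as an array for documents with several, and the append then couldn't reconcile the two types. One hedged fix is to supply an explicit read schema so `Agent` is always an array; the fragment below is hypothetical and trimmed, with field names taken from the error message:

```python
# Hedged sketch: declare Agent as an array up front so single-agent and
# multi-agent documents read with the same type.
from pyspark.sql.types import (ArrayType, LongType, StringType,
                               StructField, StructType)

name_type = StructType([
    StructField("_VALUE", StringType()),
    StructField("_lang", StringType()),
])
agent_type = StructType([
    StructField("Name", name_type),
    StructField("_format", StringType()),
    StructField("_sequence", LongType()),
])
agents_type = StructType([StructField("Agent", ArrayType(agent_type))])
# Embed agents_type at its place (Parties -> Agents) in the full document
# schema, then read with it:
# spark.read.format("xml").option("rowTag", "Documents").schema(full_schema).load(...)
```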
udit ujagar
04/24/2023, 10:04 AM

Dhruvil Shah
04/24/2023, 2:41 PM

Dhruvil Shah
04/24/2023, 2:41 PM

Dhruvil Shah
04/24/2023, 2:41 PM

Rahul Sharma
04/24/2023, 3:46 PM

Lucas Zago
04/24/2023, 11:35 PM
When I use `transform`, it returns that "it is not possible to groupBy with .transform method".
Here is my code:
    def qtd_faturada(col1, col2):
        return (sum(when(col1 == 'S', col2))
                .otherwise(-col2))

    def with_qtd_faturada(df, col1, col2):
        return df.withColumn("qtd_faturada", with_qtd_faturada(col(col1), col(col2)))

    df = (df
          .select(A, B, C)
          .groupBy(C)
          .transform(with_qtd_faturada, col1="A", col2="B"))
Am I missing something?
Some advice would be really appreciated, 🙂
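A minimal corrected sketch, assuming the intent is a per-group sum that negates quantities where the flag column is not 'S'. Two things stand out in the snippet above: `.otherwise()` chains onto `when()`, not onto `sum()`, and `GroupedData` has no `.transform` (it is a DataFrame method), so the aggregation goes through `groupBy().agg()`. The helper as written also calls itself instead of `qtd_faturada`. Column names A, B, C are kept from the question:

```python
# Hedged sketch: aggregate with groupBy().agg(); when().otherwise() builds the
# signed quantity, and sum() aggregates it per group.
import pyspark.sql.functions as F

def qtd_faturada(flag_col, qty_col):
    # Sum the quantity as-is when the flag is 'S', negated otherwise.
    return F.sum(F.when(F.col(flag_col) == "S", F.col(qty_col))
                  .otherwise(-F.col(qty_col))).alias("qtd_faturada")

result = (df
          .select("A", "B", "C")
          .groupBy("C")
          .agg(qtd_faturada("A", "B")))
```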
Robin Moffatt
04/25/2023, 8:34 AM

keshav
04/25/2023, 12:27 PM
How do I rename `xyz_delta_table` to something else?
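Assuming the question is about renaming a Delta table registered in a metastore, a minimal sketch (a path-based table has no in-place rename; it would need to be rewritten or re-registered). The new name `abc_delta_table` is hypothetical:

```python
# Hedged sketch: for a metastore-registered Delta table, a plain SQL rename works.
spark.sql("ALTER TABLE xyz_delta_table RENAME TO abc_delta_table")
```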
Rahul Sharma
04/25/2023, 5:27 PM
    from pyspark.sql import SparkSession

    spark = SparkSession.builder \
        .appName("Delta Lake Auto-Optimization") \
        .config("spark.databricks.delta.optimizeWrite.enabled", "true") \
        .config("spark.databricks.delta.autoCompact.enabled", "true") \
        .config("spark.databricks.delta.autoCompact.maxFileSize", "512") \
        .getOrCreate()

    # Your code to read and write Delta Lake files here

    spark.stop()
Jack
04/26/2023, 5:06 AM

Rajat Mishra
04/26/2023, 6:38 AM

Alber Tadrous
04/26/2023, 5:26 PM

Lucas Zago
04/26/2023, 6:17 PM
    result.write \
        .format("delta") \
        .mode("overwrite") \
        .option("overwriteSchema", "true") \
        .saveAsTable('schema.table')
Georg Heiler
04/26/2023, 9:18 PM

吕
04/27/2023, 9:37 AM