
Itai Yaffe

08/23/2023, 6:37 AM
Building Delta Lake and running tests locally
Hey folks, I've followed the IntelliJ Setup guidelines (based on the master branch), and when running `DeltaLogSuite` as part of the Setup Verification, I encountered 2 issues:
1. The `paths should be canonicalized` test seems to hang forever.
2. The `delete and re-add the same file in different transactions` test fails consistently:
Some(AddFile("foo", Map(), 1, 1692771567496, false, "{"numRecords": 1}", null, null, None, None)) did not equal Some(AddFile("foo", Map(), 1, 1692771567496, false, "{"numRecords": 1}", null, null, Some(1), Some(2)))
ScalaTestFailureLocation: org.apache.spark.sql.delta.DeltaLogSuite at (DeltaLogSuite.scala:315)
Expected :Some(AddFile("foo", Map(), 1, 1692771567496, false, "{"numRecords": 1}", null, null, Some(1), Some(2)))
Actual   :Some(AddFile("foo", Map(), 1, 1692771567496, false, "{"numRecords": 1}", null, null, None, None))
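To reproduce issue 2 in isolation (rather than running the whole suite from IntelliJ), a minimal sketch of invoking just that one test programmatically through ScalaTest's `execute` could look like the following. This assumes the delta-spark test classes and their Spark test dependencies are on the classpath (e.g. from sbt's `Test / console`); it is not an official Delta workflow, just a convenience:
```scala
// Hedged sketch: run a single test from DeltaLogSuite via ScalaTest's Suite.execute.
// Assumes DeltaLogSuite and its Spark test dependencies are already on the classpath.
import org.apache.spark.sql.delta.DeltaLogSuite

object RunSingleDeltaLogTest {
  def main(args: Array[String]): Unit = {
    // execute(testName) runs only the named test and prints the result to stdout
    new DeltaLogSuite().execute(
      "delete and re-add the same file in different transactions")
  }
}
```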
The `createTestAddFile` method creates a new `AddFile` with `baseRowId` and `defaultRowCommitVersion` set to `None` (which matches the Actual result), but I'm assuming one of the Delta infra methods is expected to populate both values (the same way the `dataChange` field is initialized to `true` but changed to `false` at a later stage). Any insights? Do I need to run the tests with any specific flags?
Also, is the Java 1.8 JDK requirement (here) still valid, or can we use Java 11? Thanks!