
Jacek

07/03/2023, 10:39 AM
I think I should ask this question here instead. So here goes.

Shivam Panicker

07/04/2023, 5:30 AM
Hey @Jacek, I've built the sbt package delta-iceberg_2.12-3.0.0rc1.jar, which works fine in my local environment with Java 11, Scala 2.12.17 and Spark 3.4.1. Can you try with 2.12 instead?

Jacek

07/04/2023, 8:53 AM
What was the command to build delta-iceberg? I used
sbt '++ 2.13.5 -v' clean compile publishLocal
which seems to give me an empty delta-storage jar (for some reason).
And BTW, if 2.13 is the issue, we should fix that (even if it means documenting it in the README).

Shivam Panicker

07/05/2023, 12:26 AM
The sbt file has
scalaVersion := default_scala_version
which is set to "2.12". I ran
build/sbt clean
build/sbt compile
build/sbt package
Let me have a go at 2.13 today.

Jacek

07/05/2023, 8:35 PM
Can you remove all the jars with
$ rm -rf ~/.ivy2/local ~/.ivy2/cache
and then run
sbt publishLocal
?
clean, compile and then package only build DL locally; they do not publish it. In other words, when you fire up a Spark app with DL, the jars are not in the local Ivy repo, so all you end up with are the jars from the central repo (i.e. downloaded, not yours). That's why it could work for you (as it does for me in this setup).
This is what I managed to narrow it down to…
[error] /Users/jacek/dev/oss/delta/kernel/kernel-api/src/main/java/io/delta/kernel/internal/deletionvectors/RoaringBitmapArray.java:214:1: as of release 9, '_' is a keyword, and may not be used as an identifier
This is with Java 11 when running
sbt kernelApi/Compile/doc
(while building with Scala 2.13 and the target JVM set to 11).
It works! I had to manually change a few lines and the Java class (to avoid _ as identifiers). I do wonder how it works for others on Java 11?!
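To illustrate, a minimal, hypothetical sketch of the kind of rename that diagnostic forces (this is not the actual RoaringBitmapArray source, just a reproduction of the error and its fix):

// Hypothetical example, not the real RoaringBitmapArray code.
public class UnderscoreIdentifier {
    public static void main(String[] args) {
        // With -source 1.8 this was only a warning; with -source 9 or later
        // (so also the JDK 11 javac/javadoc used here) it is a hard error:
        //     int _ = 42;   // error: as of release 9, '_' is a keyword
        int unused = 42;     // renaming the identifier is the whole fix
        System.out.println(unused);
    }
}

The build.sbt side of the change is in the diff below.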
diff --git a/build.sbt b/build.sbt
index 85b61fe5..d7ade06c 100644
--- a/build.sbt
+++ b/build.sbt
@@ -21,7 +21,7 @@ import java.nio.file.Files
 // Scala versions
 val scala212 = "2.12.15"
 val scala213 = "2.13.5"
-val default_scala_version = scala212
+val default_scala_version = scala213
 val all_scala_versions = Seq(scala212, scala213)

 // Dependent library versions
@@ -47,15 +47,16 @@ scalaVersion := default_scala_version
 // crossScalaVersions must be set to Nil on the root project
 crossScalaVersions := Nil

+val targetJvm = "11"
 lazy val commonSettings = Seq(
   organization := "io.delta",
   scalaVersion := default_scala_version,
   crossScalaVersions := all_scala_versions,
   fork := true,
-  scalacOptions ++= Seq("-target:jvm-1.8", "-Ywarn-unused:imports"),
-  javacOptions ++= Seq("-source", "1.8"),
+  scalacOptions ++= Seq(s"-target:jvm-$targetJvm", "-Ywarn-unused:imports"),
+  javacOptions ++= Seq("-source", targetJvm),
   // -target cannot be passed as a parameter to javadoc. See <https://github.com/sbt/sbt/issues/355>
-  Compile / compile / javacOptions ++= Seq("-target", "1.8"),
+  Compile / compile / javacOptions ++= Seq("-target", targetJvm),

   // Make sure any tests in any project that uses Spark is configured for running well locally
   Test / javaOptions ++= Seq(
One of a few changes needed to fix it once and for all 😉
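For context, a hedged sketch of what bumping the target from 1.8 to 11 implies (assuming the jars previously had to run on a Java 8 runtime, which is not confirmed here): code like the following compiles with javac --release 11 but produces class files (major version 55) that a Java 8 JVM rejects with UnsupportedClassVersionError.

import java.util.List;

// Illustrative only: uses language features and APIs newer than Java 8.
public class TargetJvmDemo {
    public static void main(String[] args) {
        var names = List.of("delta", "iceberg");               // var: Java 10+, List.of: Java 9+
        names.forEach(n ->
            System.out.println(n.isBlank() ? "<blank>" : n));  // String.isBlank: Java 11
    }
}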

Shivam Panicker

07/05/2023, 11:18 PM
😄 Legit stuff. Thanks!