Jacek
07/03/2023, 10:39 AM
Shivam Panicker
07/04/2023, 5:30 AM
Jacek
07/04/2023, 8:53 AM
sbt '++ 2.13.5 -v' clean compile publishLocal
that seems to give me an empty delta-storage jar (for some reason).
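(Editorial aside, a minimal way to check the "empty jar" claim; the path follows sbt's standard publishLocal layout under ~/.ivy2/local, and the * stands in for whatever version the build reports:)
$ jar tf ~/.ivy2/local/io.delta/delta-storage/*/jars/delta-storage.jar
An effectively empty jar lists only META-INF entries here; a good one also shows the io/delta/storage classes.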
Shivam Panicker
07/05/2023, 12:26 AM
scalaVersion := default_scala_version
which is set to "2.12".
Ran:
build/sbt clean
build/sbt compile
build/sbt package
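(Editorial aside: to produce 2.13 artifacts without editing build.sbt, sbt's ++ switch, which Jacek used above, works with the wrapper too:)
$ build/sbt '++ 2.13.5' clean compile package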
Jacek
07/05/2023, 8:35 PM
$ rm -rf ~/.ivy2/local ~/.ivy2/cache
and sbt publishLocal?
clean, compile and then package do build DL but do not publish it locally. In other words, when you fire up a Spark app with DL, the jars are not in the local ivy repo, so all you end up with are jars from the central repo (aka downloaded, not yours). That's why it could work for you (so does it for me in this setup).
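(Editorial aside, a sketch of the distinction Jacek describes, under the assumption that Spark's --packages Ivy chain is in play; it consults ~/.ivy2/local before Maven Central, and the Delta version below is a placeholder for whatever publishLocal reports:)
$ build/sbt publishLocal
$ spark-shell --packages io.delta:delta-core_2.13:<locally-published-version>
After publishLocal the locally built jars are found in ~/.ivy2/local and win; after package alone nothing is there, so the Central jars are downloaded and used.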
[error] /Users/jacek/dev/oss/delta/kernel/kernel-api/src/main/java/io/delta/kernel/internal/deletionvectors/RoaringBitmapArray.java:214:1: as of release 9, '_' is a keyword, and may not be used as an identifier
sbt kernelApi/Compile/doc
(while building with Scala 2.13 and the target JVM as 11; as of Java 9, '_' is a keyword and may not be used as identifiers). I do wonder how it works for others on Java 11?!

diff --git a/build.sbt b/build.sbt
index 85b61fe5..d7ade06c 100644
--- a/build.sbt
+++ b/build.sbt
@@ -21,7 +21,7 @@ import java.nio.file.Files
 // Scala versions
 val scala212 = "2.12.15"
 val scala213 = "2.13.5"
-val default_scala_version = scala212
+val default_scala_version = scala213
 val all_scala_versions = Seq(scala212, scala213)

 // Dependent library versions
@@ -47,15 +47,16 @@ scalaVersion := default_scala_version
 // crossScalaVersions must be set to Nil on the root project
 crossScalaVersions := Nil

+val targetJvm = "11"
 lazy val commonSettings = Seq(
   organization := "io.delta",
   scalaVersion := default_scala_version,
   crossScalaVersions := all_scala_versions,
   fork := true,
-  scalacOptions ++= Seq("-target:jvm-1.8", "-Ywarn-unused:imports"),
-  javacOptions ++= Seq("-source", "1.8"),
+  scalacOptions ++= Seq(s"-target:jvm-$targetJvm", "-Ywarn-unused:imports"),
+  javacOptions ++= Seq("-source", targetJvm),
   // -target cannot be passed as a parameter to javadoc. See <https://github.com/sbt/sbt/issues/355>
-  Compile / compile / javacOptions ++= Seq("-target", "1.8"),
+  Compile / compile / javacOptions ++= Seq("-target", targetJvm),
   // Make sure any tests in any project that uses Spark is configured for running well locally
   Test / javaOptions ++= Seq(
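(Editorial aside, a sketch assuming a JDK 9+ toolchain, not something from the thread: both javac and javadoc accept --release there, which pins -source, -target and the platform API in one flag, so the separate Compile / compile / javacOptions workaround in the diff above would not be needed:)
  // Assumes JDK 9+; --release replaces the -source/-target pair and is accepted by javadoc
  javacOptions ++= Seq("--release", targetJvm)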
Shivam Panicker
07/05/2023, 11:18 PM