Anita A
03/10/2023, 4:49 PM
DeltaTable.isDeltaTable
I get this error:
object is not callable
Thanks!

Kashyap Bhatt
03/10/2023, 5:37 PM

Anita A
03/10/2023, 6:24 PM
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable

spark_jars_packages = ["io.delta:delta-core_2.12:2.0.2"]
builder = SparkSession.builder.appName("MyApp") \
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
spark = configure_spark_with_delta_pip(builder, extra_packages=spark_jars_packages).getOrCreate()

def test_table():
    assert DeltaTable.isDeltaTable(spark, "/some/path") == 1
And I have tried many different variations of injecting the delta-core package.
This is the error that I get:
return jvm.io.delta.tables.DeltaTable.isDeltaTable(jsparkSession, identifier)
TypeError: 'JavaPackage' object is not callable
If I invoke that same code as regular Python code inside Docker I still get the error, unless I invoke it like this:
spark-submit --packages io.delta:delta-core_2.12:2.0.2
So basically I need a way to load this delta-core package as part of the pytest invocation.
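One way to do this (a sketch, not something confirmed in this thread; it assumes the `delta-spark` pip package is installed and that no Spark session exists yet in the process) is to build the session in a `conftest.py` fixture, so pytest creates it with the package on the classpath before any test runs:

```python
# conftest.py -- a sketch, assuming the `delta-spark` pip package is installed.
# configure_spark_with_delta_pip adds the matching io.delta:delta-core
# coordinates to spark.jars.packages before the JVM starts, which is the
# same effect as `spark-submit --packages ...` on the command line.
import pytest
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip


@pytest.fixture(scope="session")
def spark():
    builder = (
        SparkSession.builder.appName("MyApp")
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    )
    # Must run before the first getOrCreate() in this Python process:
    # once a session exists, the packages setting is ignored and the
    # 'JavaPackage' object is not callable error appears.
    spark = configure_spark_with_delta_pip(builder).getOrCreate()
    yield spark
    spark.stop()
```

Tests then take `spark` as a fixture argument instead of building their own session. An alternative (also an assumption, not from this thread) is to export `PYSPARK_SUBMIT_ARGS="--packages io.delta:delta-core_2.12:2.0.2 pyspark-shell"` in the environment before launching pytest, which mirrors the working spark-submit invocation.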