I have outlined an SBT multi-project setup at
https://github.com/geoHeil/sf-sbt-multiproject-dependency-problem
and want to be able to execute
sbt console
in the root project.
When executing:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().master("local[*]").enableHiveSupport.getOrCreate
spark.sql("CREATE database foo")
in the root console, the error is:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.EmbeddedDriver
Strangely, it works just fine in the sub-project:
sbt
project common
console
and now pasting the same code works.
Questions
- How can I fix the sbt console so that it loads the correct dependencies directly?
- How can I start the console directly for the sub-project? sbt common/console does not seem to do it.
Details
The most important settings are outlined below:
lazy val global = project
  .in(file("."))
  .settings(
    settings,
    libraryDependencies ++= commonDependencies
  )
  .aggregate(
    common
  )
  .dependsOn(
    common
  )

lazy val common = project
  .settings(
    name := "common",
    settings,
    libraryDependencies ++= commonDependencies
  )

lazy val dependencies =
  new {
    val sparkV = "2.3.0"

    val sparkBase = "org.apache.spark" %% "spark-core" % sparkV % "provided"
    val sparkSql  = "org.apache.spark" %% "spark-sql"  % sparkV % "provided"
    val sparkHive = "org.apache.spark" %% "spark-hive" % sparkV % "provided"
  }

lazy val commonDependencies = Seq(
  dependencies.sparkBase,
  dependencies.sparkHive,
  dependencies.sparkSql
)

lazy val settings = commonSettings

lazy val commonSettings = Seq(
  fork := true,
  run in Compile := Defaults
    .runTask(fullClasspath in Compile, mainClass.in(Compile, run), runner.in(Compile, run))
    .evaluated
)
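
For debugging, it may help to compare what classpath sbt actually resolves for the root project (global) and for common. This is just a diagnostic sketch using the sbt 0.13-style shell syntax used above, not part of the setup itself:

sbt
show global/compile:fullClasspath
show common/compile:fullClasspath

Since global both aggregates and depends on common with the same commonDependencies, I would expect both listings to be identical; any difference would hint at a dependency resolution issue rather than at Spark or Derby itself.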
Edit
Strangely, with Spark version 2.2.0 this setup works just fine. Only 2.2.1 / 2.3.0 cause these problems, and even then the code works in a single-project setup or when the console is started in the right sub-project.
Also,
java.security.AccessControlException: access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )
is mentioned in the stack trace.
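
For completeness, one workaround I have seen suggested for this kind of usederbyinternals denial is to clear the security manager in the REPL before Hive (and hence Derby) is touched. This is only a sketch based on the assumption that a security manager installed in the non-forked sbt console JVM is what makes Derby's static initialization fail:

import org.apache.spark.sql.SparkSession

// Workaround sketch, not verified for this setup: remove the security manager
// so that Derby's SystemPermission("engine", "usederbyinternals") check is skipped,
// then build the Hive-enabled session as before.
System.setSecurityManager(null)

val spark = SparkSession.builder().master("local[*]").enableHiveSupport().getOrCreate()
spark.sql("CREATE database foo")

If this makes the root console behave like the common console, the difference would seem to lie in how sbt sets up the console JVM (security manager / classloader) rather than in the declared dependencies.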