How do I resolve these Scala compilation errors in IntelliJ?

Aaditya Ura

I am trying to run this project. I have added the dependencies to my sbt file, which looks like this:

name := "HelloScala"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"
resolvers += Resolver.bintrayRepo("salesforce", "maven")
libraryDependencies += "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"

Then I copied the helloworld folder from their repository, but I get a lot of errors:

Information:10/09/18, 12:01 PM - Compilation completed with 88 errors and 0 warnings in 15 s 624 ms
Error:scalac: missing or invalid dependency detected while loading class file 'package.class'.
Could not access type Vector in value org.apache.spark.ml.linalg,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'package.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.
Error:scalac: missing or invalid dependency detected while loading class file 'OPVector.class'.
Could not access type Vector in value org.apache.spark.ml.linalg,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OPVector.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.
Error:scalac: missing or invalid dependency detected while loading class file 'OpEvaluatorBase.class'.
Could not access type Evaluator in value org.apache.spark.ml.evaluation,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpEvaluatorBase.class' was compiled against an incompatible version of org.apache.spark.ml.evaluation.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasLabelCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasLabelCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasFullPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasFullPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasRawPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasRawPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasProbabilityCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasProbabilityCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'ClassificationModelSelector.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ClassificationModelSelector.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'InputParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'InputParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpPipelineStageBase.class'.
Could not access type MLWritable in value org.apache.spark.ml.util,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpPipelineStageBase.class' was compiled against an incompatible version of org.apache.spark.ml.util.
Error:scalac: missing or invalid dependency detected while loading class file 'HasLogisticRegression.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasLogisticRegression.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasRandomForestBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasRandomForestBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasDecisionTreeBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasDecisionTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasNaiveBayes.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasNaiveBayes.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'DataReaders.class'.
Could not access type Encoder in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'DataReaders.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'OpWorkflow.class'.
Could not access type SparkSession in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpWorkflow.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'SplitterParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'SplitterParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'ModelSelectorBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ModelSelectorBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'HasLinearRegression.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasLinearRegression.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasGradientBoostedTreeBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasGradientBoostedTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasRandomForestBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasRandomForestBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'DataCutterParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'DataCutterParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasDecisionTreeBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasDecisionTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'FeatureBuilder.class'.
Could not access term package in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'FeatureBuilder.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'FeatureBuilder.class'.
Could not access type DataFrame in value org.apache.spark.sql.package,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'FeatureBuilder.class' was compiled against an incompatible version of org.apache.spark.sql.package.
Error:scalac: missing or invalid dependency detected while loading class file 'OpWorkflowCore.class'.
Could not access type Dataset in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpWorkflowCore.class' was compiled against an incompatible version of org.apache.spark.sql.
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/OpTitanicSimple.scala
Error:(42, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(95, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
Error:(143, 8) overloaded method value setLabelCol with alternatives:
  (value: com.salesforce.op.features.FeatureLike[T])OpHasLabelCol.this.type <and>
  (value: String)OpHasLabelCol.this.type
 cannot be applied to (com.salesforce.op.features.Feature[com.salesforce.op.features.types.RealNN])
      .setLabelCol(survived)
Error:(154, 64) could not find implicit value for evidence parameter of type org.apache.spark.sql.Encoder[com.salesforce.hw.Passenger]
    val trainDataReader = DataReaders.Simple.csvCase[Passenger](
Error:(166, 40) could not find implicit value for parameter spark: org.apache.spark.sql.SparkSession
    val fittedWorkflow = workflow.train()
Error:(174, 15) value columns is not a member of Any
    dataframe.columns.foreach(println)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/boston/OpBoston.scala
Error:(41, 8) object Dataset is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(41, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(56, 47) not found: type SparkSession
  def customRead(path: Option[String], spark: SparkSession): RDD[BostonHouse] = {
Error:(69, 90) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(69, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(77, 90) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(77, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(94, 6) value setGradientBoostedTreeSeed is not a member of com.salesforce.op.stages.impl.selector.HasRandomForestBase[E,MS]
possible cause: maybe a semicolon is missing before `value setGradientBoostedTreeSeed'?
    .setGradientBoostedTreeSeed(randomSeed)
Error:(100, 43) overloaded method value setLabelCol with alternatives:
  (value: com.salesforce.op.features.FeatureLike[T])OpHasLabelCol.this.type <and>
  (value: String)OpHasLabelCol.this.type
 cannot be applied to (com.salesforce.op.features.Feature[com.salesforce.op.features.types.RealNN])
  val evaluator = Evaluators.Regression().setLabelCol(medv).setPredictionCol(prediction)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/dataprep/ConditionalAggregation.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(69, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/dataprep/JoinsAndAggregates.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(74, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/IrisFeatures.scala
Error:(38, 36) not found: type Iris
  val id = FeatureBuilder.Integral[Iris].extract(_.getID.toIntegral).asPredictor
Error:(39, 41) not found: type Iris
  val sepalLength = FeatureBuilder.Real[Iris].extract(_.getSepalLength.toReal).asPredictor
Error:(40, 40) not found: type Iris
  val sepalWidth = FeatureBuilder.Real[Iris].extract(_.getSepalWidth.toReal).asPredictor
Error:(41, 41) not found: type Iris
  val petalLength = FeatureBuilder.Real[Iris].extract(_.getPetalLength.toReal).asPredictor
Error:(42, 40) not found: type Iris
  val petalWidth = FeatureBuilder.Real[Iris].extract(_.getPetalWidth.toReal).asPredictor
Error:(43, 39) not found: type Iris
  val irisClass = FeatureBuilder.Text[Iris].extract(_.getClass$.toText).asResponse
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/IrisKryoRegistrator.scala
Error:(40, 47) type Iris is not a member of package com.salesforce.hw.iris
    doAvroRegistration[com.salesforce.hw.iris.Iris](kryo)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/OpIris.scala
Error:(41, 8) object Dataset is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(41, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(56, 37) not found: type Iris
  val irisReader = new CustomReader[Iris](key = _.getID.toString){
Error:(57, 76) not found: type Iris
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(57, 83) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(57, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(79, 6) value setInput is not a member of com.salesforce.op.stages.impl.selector.HasDecisionTreeBase[E,MS]
possible cause: maybe a semicolon is missing before `value setInput'?
    .setInput(labels, features).getOutput()
Error:(87, 53) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
Error:(87, 59) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
Error:(87, 64) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/OpTitanic.scala
Error:(54, 45) not found: type Passenger
  val simpleReader = DataReaders.Simple.csv[Passenger](
Error:(55, 5) not found: value schema
    schema = Passenger.getClassSchema.toString, key = _.getPassengerId.toString
Error:(55, 49) not found: value key
    schema = Passenger.getClassSchema.toString, key = _.getPassengerId.toString
Error:(79, 6) value setModelsToTry is not a member of com.salesforce.op.stages.impl.selector.HasRandomForestBase[E,MS]
possible cause: maybe a semicolon is missing before `value setModelsToTry'?
    .setModelsToTry(LogisticRegression, RandomForest)
Error:(83, 53) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw)
Error:(83, 59) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/TitanicFeatures.scala
Error:(41, 40) not found: type Passenger
  val pClass = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getPclass).map(_.toString).toPickList).asPredictor // scalastyle:off
Error:(43, 34) not found: type Passenger
  val name = FeatureBuilder.Text[Passenger].extract(d => Option(d.getName).toText).asPredictor
Error:(45, 37) not found: type Passenger
  val sex = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getSex).toPickList).asPredictor
Error:(47, 33) not found: type Passenger
  val age = FeatureBuilder.Real[Passenger].extract(d => Option(Double.unbox(d.getAge)).toReal).asPredictor
Error:(49, 39) not found: type Passenger
  val sibSp = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getSibSp).map(_.toString).toPickList).asPredictor
Error:(51, 39) not found: type Passenger
  val parch = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getParch).map(_.toString).toPickList).asPredictor
Error:(53, 40) not found: type Passenger
  val ticket = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getTicket).toPickList).asPredictor
Error:(57, 39) not found: type Passenger
  val cabin = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getCabin).toPickList).asPredictor
Error:(59, 42) not found: type Passenger
  val embarked = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getEmbarked).toPickList).asPredictor
Error:(39, 40) not found: type Passenger
  val survived = FeatureBuilder.RealNN[Passenger].extract(_.getSurvived.toDouble.toRealNN).asResponse
Error:(55, 34) not found: type Passenger
  val fare = FeatureBuilder.Real[Passenger].extract(d => Option(Double.unbox(d.getFare)).toReal).asPredictor
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/OpTitanicMini.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(66, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(new SparkConf()).getOrCreate()
Error:(75, 34) value transmogrify is not a member of Any
    val featureVector = features.transmogrify()
Error:(78, 36) value sanityCheck is not a member of Any
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(78, 63) not found: value checkSample
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(78, 82) not found: value removeBadFeatures
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(81, 73) too many arguments for method setInput: (features: (com.salesforce.op.features.FeatureLike[com.salesforce.op.features.types.RealNN], com.salesforce.op.features.FeatureLike[com.salesforce.op.features.types.OPVector]))com.salesforce.op.stages.impl.classification.BinaryClassificationModelSelector
    val (pred, raw, prob) = BinaryClassificationModelSelector().setInput(survived, checkedFeatures).getOutput()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/TitanicKryoRegistrator.scala
Error:(41, 50) type Passenger is not a member of package com.salesforce.hw.titanic
    doAvroRegistration[com.salesforce.hw.titanic.Passenger](kryo)

I searched for these errors and found that it might be a version problem, but if so, I could not find out which versions I should be using. However, if I run the project from the command line, it works:

cd helloworld
./gradlew compileTestScala installDist
./gradlew -q sparkSubmit -Dmain=com.salesforce.hw.OpTitanicSimple -Dargs="\
`pwd`/src/main/resources/TitanicDataset/TitanicPassengersTrainData.csv"

It just does not work in IntelliJ. How can I fix this?

Luca T.

Two dependencies are missing from your build.sbt: spark-mllib and spark-sql:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)

This gets rid of the first block of errors.
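Putting the answer together with the question's original settings, the complete build.sbt might look like the sketch below. This is an assembled example, not a verified build: it keeps the bintray resolver from the question so that transmogrifai-core can still be resolved.

```scala
name := "HelloScala"

version := "0.1"

scalaVersion := "2.11.8"

// Resolver for the TransmogrifAI artifacts (from the original build file)
resolvers += Resolver.bintrayRepo("salesforce", "maven")

// spark-core alone is not enough: the compiler log shows missing types from
// org.apache.spark.ml.* (provided by spark-mllib) and org.apache.spark.sql.*
// (provided by spark-sql), so both must be on the classpath, pinned to the
// same Spark version as spark-core.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)
```

After editing build.sbt, re-import the sbt project in IntelliJ (refresh the project in the sbt tool window) and rebuild, so the new dependencies are actually picked up by the IDE's classpath.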

