On Spark 1.6.2 (Scala 2.10.5), the following code runs fine in the shell:
import org.apache.spark.mllib.linalg.Vector
case class DataPoint(vid: String, label: Double, features: Vector)
The mllib Vector correctly shadows the Scala Vector.
However, on Spark 2.0 (Scala 2.11.8), the same code raises the following error in the shell:
<console>:11: error: type Vector takes type parameters
case class DataPoint(vid: String, label: Double, features: Vector)
To make it work, I now have to fully qualify the class:
case class DataPoint(vid: String, label: Double,
features: org.apache.spark.mllib.linalg.Vector)
Can someone tell me what changed, and whether this is an issue in Spark or in Scala? Thanks!
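Besides fully qualifying the type everywhere, a rename-on-import avoids both the long name and the shadowing ambiguity. A minimal sketch, assuming Spark's mllib is on the classpath (the alias name `MLVector` is my own choice, not from the original post):

```scala
// Rename the mllib Vector on import so it can never collide with
// scala.collection.immutable.Vector, which is always in scope via Predef.
import org.apache.spark.mllib.linalg.{Vector => MLVector}

case class DataPoint(vid: String, label: Double, features: MLVector)
```

The renamed import makes the intent explicit at the use site, which also helps readers who would otherwise have to guess which `Vector` is meant.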
The simplest solution is to use :paste, which compiles the import and the case class together as a single compilation unit, so the imported Vector is not shadowed:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0-SNAPSHOT
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_102)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.mllib.linalg.Vector
scala> case class DataPoint(vid: String, label: Double, features: Vector)
<console>:11: error: type Vector takes type parameters
case class DataPoint(vid: String, label: Double, features: Vector)
^
scala> :paste
// Entering paste mode (ctrl-D to finish)
import org.apache.spark.mllib.linalg.Vector
case class DataPoint(vid: String, label: Double, features: Vector)
// Exiting paste mode, now interpreting.
import org.apache.spark.mllib.linalg.Vector
defined class DataPoint
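Once the class compiles, instances can be built with mllib's `Vectors` factory. A short sketch (the sample values are illustrative only; `Vectors.dense` is part of `org.apache.spark.mllib.linalg`):

```scala
import org.apache.spark.mllib.linalg.{Vector, Vectors}

case class DataPoint(vid: String, label: Double, features: Vector)

// Vectors.dense builds an mllib DenseVector from raw doubles.
val p = DataPoint("v1", 1.0, Vectors.dense(0.5, 1.5, 2.5))
```

Note that in the Spark 2.0 shell this snippet would also need to go through :paste (or use a fully qualified or renamed type) for the same shadowing reason discussed above.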