I have two case classes that extend an abstract class Base, and a list of each (listA and listB). When I merge these two lists, I cannot convert the resulting list into an Apache Spark 1.6.1 Dataset.
import org.apache.spark.sql.Dataset
import sqlContext.implicits._ // sqlContext as provided by spark-shell; needed for toDS()

abstract class Base
case class A(name: String) extends Base
case class B(age: Int) extends Base

val listA: List[A] = A("foo") :: A("bar") :: Nil
val listB: List[B] = B(10) :: B(20) :: Nil
val list: List[Base with Product with Serializable] = listA ++ listB
val result: Dataset[Base with Product with Serializable] = sc.parallelize(list).toDS()
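The static element type of list seems to be the crux: Scala infers the least upper bound of A and B, and since every case class automatically mixes in Product and Serializable, that bound is the compound type in the annotation above, for which Scala's runtime reflection has no corresponding Java class. A minimal sketch of the inference, assuming a plain REPL session:

// Scala infers the least upper bound of A and B for the merged list.
// Both case classes automatically extend Product and Serializable,
// so the inferred type is the compound type, not plain Base:
val merged = listA ++ listB
// merged: List[Base with Product with Serializable]

// Upcasting removes the compound type, but Base alone does not extend
// Product, so newProductEncoder no longer applies either:
val upcast: List[Base] = listA ++ listB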
Apache Spark throws this exception:
A needed class was not found. This could be due to an error in your runpath. Missing class: no Java class corresponding to Base with Product with Serializable found
java.lang.NoClassDefFoundError: no Java class corresponding to Base with Product with Serializable found
at scala.reflect.runtime.JavaMirrors$JavaMirror.typeToJavaClass(JavaMirrors.scala:1299)
at scala.reflect.runtime.JavaMirrors$JavaMirror.runtimeClass(JavaMirrors.scala:192)
at scala.reflect.runtime.JavaMirrors$JavaMirror.runtimeClass(JavaMirrors.scala:54)
at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:50)
at org.apache.spark.sql.SQLImplicits.newProductEncoder(SQLImplicits.scala:41)
Spark does not throw any exception when I create an RDD from list, but when I convert the RDD to a Dataset with the toDS() method, it throws the exception above.
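For what it is worth, a Kryo-based encoder looks like it would sidestep the Product encoder entirely, since Encoders.kryo is available from Spark 1.6 onward. This is only a sketch of the idea, assuming sqlContext.implicits._ is already in scope; the resulting Dataset stores opaque binary values rather than named columns:

import org.apache.spark.sql.{Dataset, Encoders}

// Sketch: serialize Base (and its subclasses) with Kryo so that no
// Java class for the compound type is ever required.
implicit val baseEncoder = Encoders.kryo[Base]

// Annotating the list as List[Base] keeps the compound LUB type
// from appearing as the element type in the first place.
val plainList: List[Base] = listA ++ listB
val ds: Dataset[Base] = sc.parallelize(plainList).toDS()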