
Does anyone know why we are getting these errors?

ubuntu@group-3-vm1:~/software/sbt/bin$ ./sbt package
[info] Set current project to hello (in build file:/home/ubuntu/software/sbt/bin/)
[info] Compiling 1 Scala source to /home/ubuntu/software/sbt/bin/target/scala-2.11/classes...
[error] /home/ubuntu/software/sbt/bin/hi.scala:1: object apache is not a member of package org
[error] import org.apache.spark.SparkContext
[error]            ^
[error] /home/ubuntu/software/sbt/bin/hi.scala:2: object apache is not a member of package org
[error] import org.apache.spark.SparkContext._
[error]            ^

The code is:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.api.java._
import org.apache.spark.api.java.function._
import org.apache.spark.graphx._
import org.apache.spark.graphx.lib._
import org.apache.spark.graphx.PartitionStrategy._
import org.apache.spark.sql.SQLContext
//class PartBQ1{

object PartBQ1{
val conf = new SparkConf().setMaster("spark://10.0.1.31:7077")
             .setAppName("CS-838-Assignment2-Question2")
             .set("spark.driver.memory", "1g")
             .set("spark.eventLog.enabled", "true")
             .set("spark.eventLog.dir", "/home/ubuntu/storage/logs")
             .set("spark.executor.memory", "21g")
             .set("spark.executor.cores", "4")
             .set("spark.cores.max", "4")
             .set("spark.task.cpus", "1")

val sc = new SparkContext(conf)
val sql_ctx = new SQLContext(sc)
// load the edge list as a GraphX graph
val graph = GraphLoader.edgeListFile(sc, "data2.txt")
}

1 Answer


It looks like an sbt build file is missing. Something like:

simple.sbt

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
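
Since the code above also imports org.apache.spark.graphx and constructs a SQLContext, spark-core alone probably won't be enough once compilation gets past the first errors. A minimal sketch of the dependency list, assuming Spark 1.5.1 for all modules:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"   % "1.5.1",
  "org.apache.spark" %% "spark-sql"    % "1.5.1",
  "org.apache.spark" %% "spark-graphx" % "1.5.1"
)

Save the file in the directory you run ./sbt package from (here /home/ubuntu/software/sbt/bin, next to hi.scala), since sbt only reads .sbt build definitions from the project's base directory. Also note that %% appends the Scala binary version to the artifact name, so scalaVersion should match the Scala build of your Spark distribution (the compile log above targets scala-2.11).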
Answered 2015-10-26T01:55:22.933