
Answers

3

It looks like you are trying to call read on a SparkContext, rather than on a SQLContext or SparkSession.

// New 2.0.+ API: create SparkSession and use it for all purposes: 
import org.apache.spark.sql.SparkSession 

val session = SparkSession.builder().appName("test").master("local").getOrCreate() 
session.read.load("/file") // OK 

// Old <= 1.6.* API: create SparkContext, then create a SQLContext for DataFrame API usage: 
import org.apache.spark.SparkContext 
import org.apache.spark.sql.SQLContext 

val sc = new SparkContext("local", "test") // used for RDD operations only 
val sqlContext = new SQLContext(sc)        // used for DataFrame/Dataset APIs 

sqlContext.read.load("/file") // OK: read is defined on SQLContext 
sc.read.load("/file")         // NOT OK: SparkContext has no read member 
0

The full syntax for reading through sqlContext is as follows:

val df = sqlContext 
       .read 
       .format("com.databricks.spark.csv") 
       .option("inferSchema", "true") 
       .option("header", "true") 
       .load("path to/data.csv") 

This applies when you are reading or writing CSV files.
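For reference, on Spark 2.0+ the built-in CSV source can be used directly through SparkSession, without the external com.databricks.spark.csv package. A minimal sketch, assuming a hypothetical local file path:

import org.apache.spark.sql.SparkSession 

val spark = SparkSession.builder().appName("csv-example").master("local").getOrCreate() 

// Read a CSV with the built-in source (Spark 2.0+) 
val df = spark.read 
  .option("header", "true")      // first line contains column names 
  .option("inferSchema", "true") // infer column types from the data 
  .csv("/tmp/data.csv")          // hypothetical path, adjust to your file 

// Writing back out as CSV works symmetrically through df.write 
df.write.option("header", "true").csv("/tmp/data_out") 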
