
I have a logistic regression model in Spark.
I want to extract the probability of label = 1 from the output vector and compute areaUnderROC.

import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.ml.classification.LogisticRegression

val assembler = new VectorAssembler()
  .setInputCols(Array("A", "B", "C", "D", "E")) // example column names
  .setOutputCol("features")

val data = assembler.transform(logregdata) 

val Array(training,test) = data.randomSplit(Array(0.7,0.3),seed=12345) 
val training1 = training.select("label", "features") 
val test1 = test.select("label", "features") 

val lr = new LogisticRegression() 
val model = lr.fit(training1) 
val results = model.transform(test1) 
results.show() 

+-----+--------------------+--------------------+--------------------+----------+
|label|            features|       rawPrediction|         probability|prediction|
+-----+--------------------+--------------------+--------------------+----------+
|  0.0|(54,[13,31,34,35,...|[2.44227333947447...|[0.91999457581425...|       0.0|
+-----+--------------------+--------------------+--------------------+----------+

import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics

// this cast fails: "probability" is a Vector column, not a Double
val predictionAndLabels = results.select($"probability", $"label").as[(Double, Double)].rdd
val metrics = new BinaryClassificationMetrics(predictionAndLabels)
val auROC = metrics.areaUnderROC()

The probability column looks like this: [0.9199945758142595, 0.0800054241857405]
How can I extract the probability of label = 1 from that vector and compute the AUC?
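
For reference, a minimal sketch of the built-in route, assuming the results DataFrame shown above: Spark ML's BinaryClassificationEvaluator computes areaUnderROC directly from the rawPrediction column, without extracting anything by hand.

import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator

// evaluate AUC directly on the rawPrediction column produced by the model
val evaluator = new BinaryClassificationEvaluator()
  .setLabelCol("label")
  .setRawPredictionCol("rawPrediction")
  .setMetricName("areaUnderROC")

val auc = evaluator.evaluate(results)
println(s"areaUnderROC = $auc")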


I don't understand the question. Isn't that the area that areaUnderROC computes by default? – jamborta


It is supposed to be. In Python the same model returns AUC = 91%, while Spark gives AUC = 73%. I want to check it manually. How can I extract the probability values from the vector? – Liron

Answer


You can get the values from the underlying RDD. This returns a tuple of your original label and the predicted P(label = 1):

import org.apache.spark.ml.linalg.Vector

// element 1 of the probability vector is P(label = 1)
val predictions = results.map(row => (row.getAs[Double]("label"), row.getAs[Vector]("probability")(1)))
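
From there, a sketch of the AUC computation itself, assuming the predictions Dataset above: BinaryClassificationMetrics expects (score, label) pairs, so the tuple is swapped before constructing it.

import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics

// reorder to (score, label) as expected by BinaryClassificationMetrics
val scoreAndLabels = predictions.rdd.map { case (label, p1) => (p1, label) }
val binaryMetrics = new BinaryClassificationMetrics(scoreAndLabels)
println(s"areaUnderROC = ${binaryMetrics.areaUnderROC()}")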

I tried it and it doesn't work... I get this warning: org.apache.spark.sql.AnalysisException: Can't extract value from probability#5477; – Liron


Thanks. It seems to work. predictions: org.apache.spark.sql.Dataset[(Double, Double)] = [_1: double, _2: double] But I can't show the results; I get this error: org.apache.spark.ml.linalg.DenseVector cannot be cast to org.apache.spark.mllib.linalg.Vector. How can I see the predictions I got? – Liron


I can't reproduce your error, but you could try specifying the exact type: 'import org.apache.spark.ml.linalg.DenseVector', then 'val predictions = results.map(row => (row.getAs[Double]("label"), row.getAs[DenseVector]("probability")(1)))' – jamborta
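
If the typed map keeps hitting encoder or cast issues, an alternative sketch (assuming the probability column holds an ml DenseVector, and using a hypothetical column name p1) pulls the value out with a UDF so everything stays in the DataFrame API:

import org.apache.spark.ml.linalg.DenseVector
import org.apache.spark.sql.functions.udf

// UDF returning element 1 of the probability vector, i.e. P(label = 1)
val getP1 = udf((v: DenseVector) => v(1))
val withScore = results.withColumn("p1", getP1($"probability"))
withScore.select("label", "p1").show(5)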