Strange INT/LONG conversion error when using Spark GraphX

New Scala developer here, and also a new user of Spark GraphX. So far I have really been enjoying it, but I just ran into a very strange bug. I have isolated the problem to a Long-to-Int conversion, but it is really odd. Another strange thing is that it works fine on Windows but not on Linux (it creates an infinite loop). I found where the problem comes from on Linux, but I do not understand why it happens: I have to put the random number into a variable first before it works.

You should be able to copy/paste and run the whole thing.

Scala 2.10.6, Spark 2.1.0, Ubuntu 16.04.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._
import scala.util.Random

object Main extends App {

  // Template function to print any graph
  def printGraph[VD, ED](g: Graph[VD, ED]): Unit = {
    g.vertices.collect.foreach(println)
  }

  // Returns a random Int between 1 and limit (inclusive)
  def randomNumber(limit: Int) = {
    val start = 1
    val end = limit
    val rnd = new Random
    start + rnd.nextInt((end - start) + 1)
  }

  val conf = new SparkConf()
    .setAppName("Simple Application")
    .setMaster("local[*]")

  val sc = new SparkContext(conf)
  sc.setLogLevel("ERROR")

  val myVertices = sc.makeRDD(Array((1L, "A"), (2L, "B"), (3L, "C"), (4L, "D"), (5L, "E"), (6L, "F")))

  val myEdges = sc.makeRDD(Array(Edge(1L, 2L, ""),
    Edge(1L, 3L, ""), Edge(1L, 6L, ""), Edge(2L, 3L, ""),
    Edge(2L, 4L, ""), Edge(2L, 5L, ""), Edge(3L, 5L, ""),
    Edge(4L, 6L, ""), Edge(5L, 6L, "")))

  val myGraph = Graph(myVertices, myEdges)

  // Add a random color to each vertex. The color is chosen from the total number of vertices.
  // Transform the vertex attribute to the color only.

  val bug = myVertices.count()
  println("Long : " + bug)
  val bugInt = bug.toInt
  println("Int : " + bugInt)

  // Problem is here, when passing myGraph.vertices.count().toInt to randomNumber.
  // Works on Windows, infinite loop on Linux.
  val g2 = myGraph.mapVertices((id, name) => randomNumber(myGraph.vertices.count().toInt))

  // Rest of code removed

}

Answer


Not sure whether you are looking for a workaround or for the underlying cause. I think the mapVertices method is interfering with count (one is a transformation, the other is an action): Spark does not support triggering an action on an RDD from inside another transformation's closure, so the count has to be done on the driver, before mapVertices.

The solution would be:

val lim = myGraph.vertices.count().toInt
val g2 = myGraph.mapVertices((id, name) => randomNumber(lim))
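
For reference, a minimal sketch of the fixed section in context (assuming the rest of the program from the question stays as-is, and reusing the question's printGraph helper to check the result). The count() action then runs exactly once, on the driver, and the closure passed to mapVertices captures only the resulting Int:

// Run the count() action once, on the driver, outside of any transformation.
val lim = myGraph.vertices.count().toInt

// The closure now captures only the plain Int `lim`, not the graph or its RDDs.
val g2 = myGraph.mapVertices((id, name) => randomNumber(lim))

// Each vertex attribute is now a random value between 1 and lim.
printGraph(g2)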