
How to join two DataFrames in Scala and Apache Spark?

I have two DataFrames (Scala, Apache Spark 1.6.1):

1) Matches

MatchID | Player1    | Player2
--------------------------------
      1 | John Wayne | John Doe
      2 | Ive Fish   | San Simon

2) Personal

Player     | BirthYear
--------------------------------
John Wayne | 1986
Ive Fish   | 1990
San Simon  | 1974
John Doe   | 1995

How can I get a new DataFrame with the 'BirthYear' for both players?

MatchID | Player1    | Player2   | BYear_P1 | BYear_P2 | Diff
-------------------------------------------------------------
      1 | John Wayne | John Doe  |     1986 |     1995 |    9
      2 | Ive Fish   | San Simon |     1990 |     1974 |   16

I tried

val df = MatchesDF.join(PersonalDF, MatchesDF("Player1") === PersonalDF("Player")) 

and then joined again for the second player:

val resDf = df.join(PersonalDF, df("Player2") === PersonalDF("Player")) 

but this is a very time-consuming operation.

Is there another way to do this in Scala and Apache Spark?

Answers

Answer (7 votes):

This should perform better:

case class Match(matchId: Int, player1: String, player2: String) 
case class Player(name: String, birthYear: Int) 

val matches = Seq(
    Match(1, "John Wayne", "John Doe"), 
    Match(2, "Ive Fish", "San Simon") 
) 

val players = Seq(
    Player("John Wayne", 1986), 
    Player("Ive Fish", 1990), 
    Player("San Simon", 1974), 
    Player("John Doe", 1995) 
) 

val matchesDf = sqlContext.createDataFrame(matches) 
val playersDf = sqlContext.createDataFrame(players) 

matchesDf.registerTempTable("matches") 
playersDf.registerTempTable("players") 

sqlContext.sql(
    "select matchId, player1, player2, p1.birthYear, p2.birthYear, abs(p1.birthYear-p2.birthYear) " + 
    "from matches m inner join players p1 inner join players p2 " + 
    "where m.player1 = p1.name and m.player2 = p2.name").show() 

+-------+----------+---------+---------+---------+---+
|matchId|   player1|  player2|birthYear|birthYear|_c5|
+-------+----------+---------+---------+---------+---+
|      1|John Wayne| John Doe|     1986|     1995|  9|
|      2|  Ive Fish|San Simon|     1990|     1974| 16|
+-------+----------+---------+---------+---------+---+
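
To get the column names from the desired output (instead of the duplicated birthYear header and the auto-generated _c5), the projections can be aliased. A minimal variant of the query above, assuming the same temp tables:

sqlContext.sql(
    """select m.matchId as MatchID, m.player1 as Player1, m.player2 as Player2, 
      |  p1.birthYear as BYear_P1, p2.birthYear as BYear_P2, 
      |  abs(p1.birthYear - p2.birthYear) as Diff 
      |from matches m inner join players p1 inner join players p2 
      |where m.player1 = p1.name and m.player2 = p2.name""".stripMargin).show()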

I did not find a way to express the join of three tables in the Scala DSL.
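
For reference, one way the three-table join might be written in the DataFrame DSL is to alias the players DataFrame once per side; a minimal sketch, assuming the matchesDf and playersDf created above:

import sqlContext.implicits._ 
import org.apache.spark.sql.functions.abs 

// Alias the players DataFrame so it can be joined twice without 
// ambiguous column references. 
val p1 = playersDf.as("p1") 
val p2 = playersDf.as("p2") 

matchesDf.as("m") 
    .join(p1, $"m.player1" === $"p1.name") 
    .join(p2, $"m.player2" === $"p2.name") 
    .select($"m.matchId", $"m.player1", $"m.player2", 
        $"p1.birthYear".as("BYear_P1"), $"p2.birthYear".as("BYear_P2"), 
        abs($"p1.birthYear" - $"p2.birthYear").as("Diff")) 
    .show()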

It still does two joins; how is this any better? – void

This runs for about 2 minutes on a matches table of about 10,000 rows and a players table of about 700 records. – gmlvsv

Use DataFrames for your joins instead of plain SQL for better performance. – dheee

Answer (4 votes):
val df = left.join(right, Seq("name")) 
display(df) 
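
For context, this one-liner is an equi-join on a column both frames share; passing Seq("name") joins on that column and keeps a single copy of it in the result. A minimal sketch with hypothetical left and right DataFrames (note that display() is Databricks-specific; plain Spark uses show()):

// Hypothetical setup: two DataFrames sharing a "name" column. 
val left = sqlContext.createDataFrame(Seq( 
    ("John Wayne", 1986), ("Ive Fish", 1990))).toDF("name", "birthYear") 
val right = sqlContext.createDataFrame(Seq( 
    ("John Wayne", 1), ("Ive Fish", 2))).toDF("name", "matchId") 

// Seq("name") joins on the shared column without duplicating it. 
val df = left.join(right, Seq("name")) 
df.show() // show() works in plain Spark; display(df) only in Databricks notebooks
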
Hello and welcome to StackOverflow. Please add some explanation to your answer so that it is more valuable for other users. See http://stackoverflow.com/help/how-to-answer – wmk

This information is not enough to provide any kind of help. What is 'left'? What is 'right'? Please revise your answer. –

There is no display function in Spark DataFrames (the Scala implementation). –

Answer (6 votes):

Here is a solution using Spark's DataFrame functions:

import sqlContext.implicits._ 
import org.apache.spark.sql.Row 
import org.apache.spark.sql.functions.abs 
import org.apache.spark.sql.types.{StructType, StructField, IntegerType, StringType} 

val matches = sqlContext.sparkContext.parallelize(Seq( 
    Row(1, "John Wayne", "John Doe"), 
    Row(2, "Ive Fish", "San Simon") 
))

val players = sqlContext.sparkContext.parallelize(Seq(
    Row("John Wayne", 1986), 
    Row("Ive Fish", 1990), 
    Row("San Simon", 1974), 
    Row("John Doe", 1995) 
)) 

val matchesDf = sqlContext.createDataFrame(matches, StructType(Seq(
    StructField("matchId", IntegerType, nullable = false), 
    StructField("player1", StringType, nullable = false), 
    StructField("player2", StringType, nullable = false))) 
).as('matches) 

val playersDf = sqlContext.createDataFrame(players, StructType(Seq(
    StructField("player", StringType, nullable = false), 
    StructField("birthYear", IntegerType, nullable = false) 
))).as('players) 

matchesDf 
    .join(playersDf, $"matches.player1" === $"players.player") 
    .select($"matches.matchId" as "matchId", $"matches.player1" as "player1", $"matches.player2" as "player2", $"players.birthYear" as "player1BirthYear") 
    .join(playersDf, $"player2" === $"players.player") 
    .select($"matchId" as "MatchID", $"player1" as "Player1", $"player2" as "Player2", $"player1BirthYear" as "BYear_P1", $"players.birthYear" as "BYear_P2") 
    .withColumn("Diff", abs('BYear_P2.minus('BYear_P1))) 
    .show() 

+-------+----------+---------+--------+--------+----+
|MatchID|   Player1|  Player2|BYear_P1|BYear_P2|Diff|
+-------+----------+---------+--------+--------+----+
|      1|John Wayne| John Doe|    1986|    1995|   9|
|      2|  Ive Fish|San Simon|    1990|    1974|  16|
+-------+----------+---------+--------+--------+----+
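
Given the sizes mentioned in the comments (roughly 10,000 matches against 700 players), a broadcast hint may speed this up further: it ships the small players table to every executor so each join runs map-side instead of shuffling the matches table. A sketch, assuming the DataFrames above (broadcast has been in org.apache.spark.sql.functions since Spark 1.5):

import org.apache.spark.sql.functions.broadcast 

// Broadcasting the small players table avoids a shuffle of the 
// larger matches table on each join. 
matchesDf 
    .join(broadcast(playersDf), $"matches.player1" === $"players.player") 

The same hint can be applied to the second join in the chain above.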