I am connecting my Spark application to DashDB (DB2) over JDBC. Currently I can load my data just fine, but I cannot save a DataFrame back to DashDB: the write fails with a CLOB error. Any insight would be helpful.
var jdbcSets = sqlContext.read.format("jdbc").options(Map("url" -> url, "driver" -> driver, "dbtable" -> "setsrankval")).load()
jdbcSets.registerTempTable("setsOpponentRanked")
jdbcSets = jdbcSets.coalesce(10)
sqlContext.cacheTable("setsOpponentRanked")
However, when I try to save large DataFrames, I get the error:
DB2 SQL Error: SQLCODE=-1666, SQLSTATE=42613, SQLERRMC=CLOB, DRIVER=4.19.26
The code I use to save the data is as follows:
val writeproperties = new Properties()
writeproperties.setProperty("user", "dashXXXX")
writeproperties.setProperty("password", "XXXXXX")
writeproperties.setProperty("rowId", "false")
writeproperties.setProperty("driver", "com.ibm.db2.jcc.DB2Driver")
results.write.mode(SaveMode.Overwrite).jdbc(writeurl, "players_stat_temp", writeproperties)
A sample test record can be seen here:
println("Test set: "+results.first())
Test set: ['Damir DZUMHUR','test','test','test','test','test','test','test','test','test','test','test','test','test','test','test','test','test','test','test','test','test',null,null,null,null,null,null,null]
The DataFrame schema is as follows:
root
|-- PLAYER: string (nullable = true)
|-- set01: string (nullable = true)
|-- set02: string (nullable = true)
|-- set12: string (nullable = true)
|-- set01weakseed: string (nullable = true)
|-- set01medseed: string (nullable = true)
|-- set01strongseed: string (nullable = true)
|-- set02weakseed: string (nullable = true)
|-- set02medseed: string (nullable = true)
|-- set02strongseed: string (nullable = true)
|-- set12weakseed: string (nullable = true)
|-- set12medseed: string (nullable = true)
|-- set12strongseed: string (nullable = true)
|-- set01weakrank: string (nullable = true)
|-- set01medrank: string (nullable = true)
|-- set01strongrank: string (nullable = true)
|-- set02weakrank: string (nullable = true)
|-- set02medrank: string (nullable = true)
|-- set02strongrank: string (nullable = true)
|-- set12weakrank: string (nullable = true)
|-- set12medrank: string (nullable = true)
|-- set12strongrank: string (nullable = true)
|-- minibreak: string (nullable = true)
|-- minibreakweakseed: string (nullable = true)
|-- minibreakmedseed: string (nullable = true)
|-- minibreakstrongseed: string (nullable = true)
|-- minibreakweakrank: string (nullable = true)
|-- minibreakmedrank: string (nullable = true)
|-- minibreakstrongrank: string (nullable = true)
I have looked at the JDBC DB2Dialect and seen that StringType is mapped to CLOB. I am wondering whether something like the following would help:
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcType}
import org.apache.spark.sql.types._

private object DB2CustomDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:db2")
  override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
    // Map strings to VARCHAR instead of the default CLOB
    case StringType  => Option(JdbcType("VARCHAR(10000)", java.sql.Types.VARCHAR))
    case BooleanType => Option(JdbcType("CHAR(1)", java.sql.Types.CHAR))
    case _           => None
  }
}
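For the dialect to take effect, it also has to be registered with Spark's JdbcDialects registry before the write, so it takes precedence over the built-in DB2Dialect. Below is a minimal self-contained sketch (it repeats the dialect for completeness; the commented-out save call reuses the variable names from the question, and the VARCHAR length of 10000 is an arbitrary choice):

```scala
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
import org.apache.spark.sql.types._

object DB2CustomDialect extends JdbcDialect {
  // Claim any jdbc:db2 URL so this dialect wins over the built-in one.
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:db2")
  override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
    // Write strings as VARCHAR rather than CLOB, which DashDB's
    // column-organized tables reject (hence SQLCODE -1666).
    case StringType  => Option(JdbcType("VARCHAR(10000)", java.sql.Types.VARCHAR))
    case BooleanType => Option(JdbcType("CHAR(1)", java.sql.Types.CHAR))
    case _           => None
  }
}

// Register once, before any JDBC writes in the application.
JdbcDialects.registerDialect(DB2CustomDialect)

// The existing save call should then pick up the VARCHAR mapping:
// results.write.mode(SaveMode.Overwrite).jdbc(writeurl, "players_stat_temp", writeproperties)
```

Registration is global for the Spark application, so doing it once at startup is enough.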
I have exactly the same problem, but I am using PySpark. How can I solve it? –
You can apply this fix in a PySpark notebook using PixieDust's Scala bridge feature. I wrote a blog post about the whole problem and the solution, with a link to a sample notebook: http://datascience.ibm.com/blog/working-with-dashdb-in-data-science-experience/ –
I had seen that post before; I am actually using IBM's spark-submit rather than Notebooks/DSX. Are you saying I need to apply the fix in my script locally and then submit it to the Spark cluster? Since the Spark cluster is a managed service, are all of these dependencies installed? –