I want to insert some data into a table that will have 1500 dynamic partitions, but I cannot change hive.exec.max.dynamic.partitions from Spark and I keep getting this error:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
Number of dynamic partitions created is 1500, which is more than 1000.
To solve this try to set hive.exec.max.dynamic.partitions to at least 1500.
So I tried: SET hive.exec.max.dynamic.partitions=2048
But I still get the same error.
How can I change this value from Spark?
Code:
this.spark.sql("SET hive.exec.dynamic.partition=true")
this.spark.sql("set hive.exec.dynamic.partition.mode=nonstrict")
this.spark.sql("SET hive.exec.max.dynamic.partitions=2048")
this.spark.sql(
"""
|INSERT INTO processed_data
|PARTITION(event, date)
|SELECT c1,c2,c3,c4,c5,c6,c7,c8,c9,c10,event,date FROM csv_data DISTRIBUTE BY event, date
""".stripMargin
).show()
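For completeness, here is a minimal sketch of the only alternative I can think of: passing the property when the SparkSession is built instead of at query time. builder(), config(), enableHiveSupport() and getOrCreate() are standard SparkSession API calls; whether this particular Hive property is actually honored when set this way is part of what I'm asking.
import org.apache.spark.sql.SparkSession
// Sketch only: set the Hive properties at session-creation time rather than via SET.
// I have not confirmed that hive.exec.max.dynamic.partitions takes effect this way.
val spark = SparkSession.builder()
  .appName("insert-dynamic-partitions")
  .enableHiveSupport()
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .config("hive.exec.max.dynamic.partitions", "2048")
  .getOrCreate()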
I am using Spark 2.0.0 in standalone mode. Thanks!