
Moving data from Google Cloud SQL to Cloud Datastore

I am trying to move my data from Cloud SQL to Cloud Datastore.

There are a bit under 5 million entries in the SQL database.

It seems I can only move a little over 100,000 entities per day before I get a quota error.

I can't figure out exactly which quota I am exceeding, but I do use exponential backoff to make sure I'm not sending too fast.
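(For anyone curious what that retry loop looks like, here is a minimal sketch. `write_batch` is a hypothetical zero-argument callable standing in for a single Datastore commit, not a function from my code; the actual writes are in the gists linked below.)

```python
import random
import time


def write_with_backoff(write_batch, max_retries=5, base_delay=1.0, max_delay=300.0):
    """Retry a Datastore write with exponential backoff plus jitter.

    `write_batch` is a hypothetical zero-argument callable that performs one
    commit and raises an exception whose message contains RESOURCE_EXHAUSTED
    when a quota is hit.
    """
    for attempt in range(max_retries):
        try:
            return write_batch()
        except Exception as e:
            if "RESOURCE_EXHAUSTED" not in str(e):
                raise  # not a quota error; don't retry
            if attempt == max_retries - 1:
                raise  # out of retries; surface the quota error
            # Delay doubles each attempt, capped at max_delay (5 minutes here),
            # with proportional jitter so parallel workers don't retry in lockstep.
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, delay))
```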

Eventually the backoff reaches five minutes and the connection to the SQL server dies, but I don't think the per-second write quota is the problem, and I can't see any other quota in my APIs page or the App Engine APIs page.

I have tried two different APIs to write the records.

The GCP Datastore API

import googledatastore

Here is the code:
https://gist.github.com/nburn42/d8b488da1d2dc53df63f4c4a32b95def

The Dataflow API

from apache_beam.io.gcp.datastore.v1.datastoreio import WriteToDatastore

Here is the code:
https://gist.github.com/nburn42/2c2a06e383aa6b04f84ed31548f1cb09

This is the error I see after one or two hundred thousand good writes:

RPCError: datastore call Commit [while writing to Datastore / Write Mutation to Datastore] failed: Error code: RESOURCE_EXHAUSTED. Message: Quota exceeded.
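(One detail worth noting for anyone adapting this: a single Datastore commit accepts at most 500 mutations, so a bulk load has to chunk its entities. A minimal sketch of that chunking, where `commit` is a hypothetical callable that issues one commit RPC for a batch:)

```python
def write_all(entities, commit, batch_size=500):
    """Write `entities` in fixed-size batches via the hypothetical
    `commit(batch)` callable.

    Datastore allows at most 500 mutations per commit, hence the default
    batch size. Returns the number of commits issued.
    """
    commits = 0
    for i in range(0, len(entities), batch_size):
        commit(entities[i:i + batch_size])  # one RPC per batch of <= 500
        commits += 1
    return commits
```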

I am running this on Compute Engine.

Any help is much appreciated!

Thanks,
Nathan

Answer


I asked for a quota increase, and someone at Google checked my account and found the problem.

Here is their reply:

I understand that you want to know what specific quota you are reaching whenever you try to backup your Cloud SQL to Cloud Datastore.

Upon checking your project, it seems that the problem is that your App Engine application is at or near its spending limit. As of this time of writing, the Datastore Write Operations you have executed costed you 1.10$, which will be refreshed after 5 hours. It can definitely cause your resources to become unavailable until the daily spending limit is replenished. Kindly try to increase your spending limit as soon as possible to avoid service interruption and then run or execute your datastore write operations.

Give this a shot and let me know what happens. I will be looking forward to your reply.

This solved the problem. I just needed to go into App Engine and set a higher daily spending limit.

Hopefully the code I included above can help someone else.
