Permission denied for the BigQuery Storage API on Apache Beam with the Dataflow runner

I have the following error for one of my Dataflow jobs:
2022-06-15T16:12:27.365182607Z Error message from worker: java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: com.google.api.gax.rpc.PermissionDeniedException: io.grpc.StatusRuntimeException: PERMISSION_DENIED: BigQuery Storage API has not been used in project 770406736630 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/bigquerystorage.googleapis.com/overview?project=770406736630 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
The same code works fine with Apache Beam 2.38.0. I tested multiple times, and this is not a temporary issue. The project number mentioned in the error (770406736630) is not mine.
Any idea why I get this error?

3 Answers

I just ran into this, and simply needed to re-authenticate with the gcloud CLI by running:

gcloud auth application-default login
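
A possible extra step (my assumption, not something the asker confirmed): an error naming a project number that is not yours can also come from the quota project stored in your Application Default Credentials. You can set it explicitly; my-project-id below is a placeholder:

gcloud auth application-default set-quota-project my-project-id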

I had the same issue. I'm using Spring Cloud GCP and hadn't set the spring.cloud.gcp.project-id property, which I'm guessing makes the SDK or API use some default value. I don't know how you've set up your environment, because you haven't specified, but look into how you can explicitly set the project ID. You can get it from the dialog for selecting a project in the GCP Console.
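
For illustration (the project ID and class name below are placeholders, not from the original answer): the usual route is a line such as spring.cloud.gcp.project-id=my-project-id in application.properties. A minimal sketch of doing the same thing programmatically, assuming a recent Spring Cloud GCP release where GcpProjectIdProvider lives in com.google.cloud.spring.core:

import com.google.cloud.spring.core.GcpProjectIdProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class GcpProjectConfig {

  // Pin the project ID explicitly instead of letting the starter
  // resolve a default from the environment (env vars, gcloud config,
  // or the metadata server).
  @Bean
  public GcpProjectIdProvider gcpProjectIdProvider() {
    return () -> "my-project-id"; // hypothetical project ID
  }
}

Without one of these, the starter falls back to whatever default project it can resolve from the environment, which may not be the project you expect.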

The error happens with the latest Apache Beam SDK (2.41.0) when BigQueryIO.Write.Method.STORAGE_WRITE_API is used and the destination does not specify the project name, for example dataset.table instead of project-id:dataset.table.

The solution that worked for me was to fully qualify the destination with the project ID (a sketch follows below).

For some reason the Apache Beam implementation of the BigQuery Storage Write API does not handle this situation, even though it works fine for the FILE_LOADS method. You may also receive a slightly different error with the latest Beam SDK.
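
As a rough illustration of that fix (all project, dataset, table, and field names below are placeholders, not from the original post), a batch write using STORAGE_WRITE_API with a fully qualified destination might look like this:

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import java.util.Collections;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class StorageWriteApiExample {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline pipeline = Pipeline.create(options);

    // A one-column schema for the example row below.
    TableSchema schema = new TableSchema().setFields(Collections.singletonList(
        new TableFieldSchema().setName("name").setType("STRING")));

    pipeline
        .apply(Create.of(new TableRow().set("name", "example"))
            .withCoder(TableRowJsonCoder.of()))
        .apply(BigQueryIO.writeTableRows()
            // Fully qualified "project-id:dataset.table"; leaving the
            // project out is what triggers the PERMISSION_DENIED above.
            .to("my-project:my_dataset.my_table")
            .withSchema(schema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
            .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API));

    pipeline.run().waitUntilFinish();
  }
}

The only detail that matters for this issue is the .to() string: with FILE_LOADS the unqualified dataset.table form happens to work, but with STORAGE_WRITE_API on 2.41.0 it fails as described above.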