How to pass hbase-site.xml to a Google Cloud Dataflow template
We have a setup where an HBase cluster is running on Google Cloud, and I want to write into HBase tables using Dataflow. For this, I want to pass one hbase-site.xml file in staging and a different hbase-site.xml in the production environment. However, I cannot find an option for passing a resource file to a Dataflow template. Is there any option in Dataflow similar to --files in Spark or --classpath in Flink for adding this?

I can certainly add hbase-site.xml to src/main/resources, which helps, but I want a different hbase-site.xml for each of the two environments, so having an option like this would be very beneficial.
Comments (1)
Are you using Beam HBaseIO, and is it possible to pass these parameters as part of the Configuration provided to it? If so, you could probably update your template to accept this config (or the values needed to create it) as PipelineOptions and parse them in the main class. If you want the file itself to be available locally (in the VM), you probably need to set up a custom container to be used by your template.
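To make the PipelineOptions suggestion concrete, here is a minimal sketch, assuming a Flex Template (so the main class runs at launch time and ordinary options can be read while building the graph) and assuming the only per-environment differences are ZooKeeper settings. The option names (hbaseZookeeperQuorum, hbaseZookeeperPort) and the table id are hypothetical placeholders; expose whichever hbase-site.xml keys actually differ between staging and production.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.Validation;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class HBaseWritePipeline {

  // Hypothetical options: one flag per hbase-site.xml key that varies by environment.
  public interface HBaseOptions extends PipelineOptions {
    @Description("Comma-separated HBase ZooKeeper quorum, e.g. zk1,zk2,zk3")
    @Validation.Required
    String getHbaseZookeeperQuorum();
    void setHbaseZookeeperQuorum(String value);

    @Description("ZooKeeper client port")
    @Default.Integer(2181)
    int getHbaseZookeeperPort();
    void setHbaseZookeeperPort(int value);
  }

  public static void main(String[] args) {
    HBaseOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(HBaseOptions.class);

    // Build the HBase Configuration from the options instead of from a
    // hbase-site.xml bundled under src/main/resources.
    Configuration conf = HBaseConfiguration.create();
    conf.set("hbase.zookeeper.quorum", options.getHbaseZookeeperQuorum());
    conf.setInt("hbase.zookeeper.property.clientPort", options.getHbaseZookeeperPort());

    Pipeline p = Pipeline.create(options);
    // Build a PCollection<Mutation> here, then hand the per-environment
    // Configuration to HBaseIO, e.g.:
    //   mutations.apply(HBaseIO.write().withConfiguration(conf).withTableId("my-table"));
    p.run().waitUntilFinish();
  }
}
```

Launching in staging versus production then only changes the launch flags, e.g. --hbaseZookeeperQuorum=zk-staging-1 (value hypothetical). Note that a classic template evaluates the main class only once, at template-creation time, so plain (non-ValueProvider) option values would be baked into the template rather than supplied per launch.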