Is there an example of PyFlink SQL unit testing in a self-contained repository?
Is there an example of a self-contained repository showing how to perform SQL unit testing of PyFlink (specifically 1.13.x if possible)?
There is a related SO question here, where it is suggested to use some of the tests from PyFlink itself. The issue I'm running into is that the PyFlink repo assumes that a bunch of things are on the Java classpath and that some Python utility classes are available (they're not distributed via the PyPI apache-flink package).
I have done the following:
- Copied test_case_utils.py and source_sink_utils.py from PyFlink into my project.
- Copied an example unit test (this one, as suggested by the related SO question).
When I try to run the test, I get an error because the test case cannot determine what version of the Avro jars to download: download_apache_avro() fails, because this code tries to evaluate the value of avro.version by running mvn help:evaluate -Dexpression=avro.version.
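As far as I can tell, the copied helper does something roughly equivalent to the following (my paraphrase for illustration, not the actual PyFlink source), which is why it needs a pom.xml in the working directory:

```python
# Rough paraphrase of the avro.version lookup in the copied test utilities
# (illustrative only -- not the actual PyFlink code).
import subprocess


def resolve_avro_version() -> str:
    # mvn help:evaluate resolves a Maven property from the pom.xml in the
    # current directory; with a recent maven-help-plugin, -q -DforceStdout
    # prints only the resolved value. Without any pom.xml the call fails,
    # which is what breaks download_apache_avro().
    output = subprocess.check_output(
        ["mvn", "help:evaluate", "-Dexpression=avro.version",
         "-q", "-DforceStdout"],
        text=True,
    )
    return output.strip()
```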
I then added a dummy pom.xml defining a Maven property of avro.version (with a value of 1.10.0), and my unit test case is now loaded.
I now get a new error and my test is skipped:
'flink-table-planner*-tests.jar' is not available. Will skip the related tests.
I don't know how to fix this. I've tried adding flink-table-planner and flink-table-planner-blink dependencies with <type>test-jar</type> to my dummy pom.xml, but it still fails.
This is starting to feel like a real pain to do something that should be trivial: basic TDD of a PyFlink project. Is there a real-world example of a Python project that shows how to set up a testing environment for unit testing SQL with PyFlink?
1 Answer
You can refer to https://github.com/dianfu/pyflink-faq/tree/main/testing, which gives an example of how to write unit tests in an external project.