PySpark boolean config values and capitalization
Probably a silly little question, but I don't want to keep doing trial and error... When you set a PySpark config on a cluster, which of the following two is correct?
spark.hadoop.validateOutputSpecs: true
or
spark.hadoop.validateOutputSpecs: True
I know Scala has lowercase boolean literals, and so does Spark. But I'm a bit confused because Python writes its booleans as 'True/False', so does PySpark also capitalize the first letter? Or are Spark config values case-insensitive?
Thanks!
Comments (2)
It depends entirely on how you write it. That said, both of the implementations below work just fine -
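The original snippets were not preserved in this copy of the answer; as a sketch of what "both implementations" refers to, either spelling of the value is accepted in a cluster config entry (e.g. a `spark-defaults.conf`-style setting):

```
spark.hadoop.validateOutputSpecs  true
spark.hadoop.validateOutputSpecs  True
```

The same applies if you pass the value as a string from code or via `spark-submit --conf`; the lowercase form is what the Spark docs use, so it is the safer convention to standardize on.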
PySpark also uses lowercase for the config. And I understand your frustration, lol. If you look at the PySpark docs, they show it in lowercase. Also, all of my PySpark scripts use lowercase for config values, and they display as lowercase when I view them on the cluster.
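The reason both spellings work is that Spark parses boolean config strings case-insensitively (Scala's `String.toBoolean` semantics on the JVM side), so `"True"` from Python's `str(True)` is still accepted. A rough pure-Python sketch of that parsing behavior (`parse_spark_boolean` is a hypothetical helper, not a Spark API):

```python
def parse_spark_boolean(value: str) -> bool:
    # Mirrors Scala's String.toBoolean, which Spark uses for boolean
    # configs: comparison is case-insensitive, so "true", "True", and
    # "TRUE" all parse to True; anything else raises an error.
    v = value.strip().lower()
    if v == "true":
        return True
    if v == "false":
        return False
    raise ValueError(f"not a boolean config value: {value!r}")

# Python's str(True) produces "True", which still parses fine:
assert parse_spark_boolean(str(True)) is True
assert parse_spark_boolean("true") is True
```

So capitalization won't break anything, but matching the docs' lowercase style keeps your configs consistent with what the cluster UI displays.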