How can I delete records from a Hive table via spark-sql?
I tried to delete records from a Hive table using spark-sql, but it failed. Here is the output:
spark-sql> delete from jgdy
> ;
2022-03-17 04:13:13,585 WARN conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
2022-03-17 04:13:13,585 WARN conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
2022-03-17 04:13:13,585 WARN conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
Error in query: DELETE is only supported with v2 tables.
Can anyone show me how to delete records, or is there some configuration I need to set? Thanks.
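For context, the error means Spark SQL only supports `DELETE FROM` against DataSource v2 tables (for example Delta Lake, Apache Iceberg, or Apache Hudi); it cannot delete rows from a plain Hive table. A common workaround sketch, assuming `jgdy` is a regular (non-transactional) Hive table — the `id <> 42` predicate is a hypothetical example, not from the original question:

```sql
-- Delete ALL rows (what the DELETE FROM jgdy above attempted):
TRUNCATE TABLE jgdy;

-- Delete only rows matching a condition by rewriting the table with
-- the rows you want to KEEP (hypothetical predicate):
INSERT OVERWRITE TABLE jgdy
SELECT * FROM jgdy WHERE id <> 42;
```

Alternatively, if the table is stored in a v2 format such as Delta Lake or Iceberg, `DELETE FROM jgdy WHERE ...` works directly without the overwrite trick.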