Using WHERE NOT EXISTS with a dummy table does not work on an Apache Derby database
So I'm using Java with Apache Derby, and I'm trying to insert a record, but only if a record with the same key does not already exist. Because all the values I want to insert live in my code rather than in the database, I use Derby's dummy table (analogous to DUAL for DB2). This is the query I'm using (md5 is the primary key):
PreparedStatement stmt = conn.prepareStatement("INSERT INTO artwork (md5, image) " +
        "SELECT ?, ? " +
        "FROM SYSIBM.SYSDUMMY1 " +
        "WHERE NOT EXISTS (SELECT 1 FROM artwork WHERE md5 = ?)");
stmt.setString(1, key);
stmt.setBinaryStream(2, new ByteArrayInputStream(imageData), imageData.length);
stmt.setString(3, key);
stmt.executeUpdate();
and it seemed to be working; however, when I multi-threaded the code so that two threads could be trying to insert the same artwork, I got errors about a duplicate value being entered into the index.
If I synchronize the method so that only one thread can call it at a time, then I get no such errors, but that defeats the purpose of adding the WHERE NOT EXISTS clause in the first place.
So is my query not doing what I think it is doing, or am I misunderstanding a general concept here?
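As far as I can tell, what I have written is a check-then-act: under Derby's default READ_COMMITTED isolation, both threads can evaluate the NOT EXISTS before either insert commits, so both pass the check. The fallback I'm considering (a sketch, untested) is to drop the dummy-table trick entirely, insert unconditionally, and treat Derby's duplicate-key SQLState 23505 as "this artwork already exists":

PreparedStatement stmt = conn.prepareStatement(
        "INSERT INTO artwork (md5, image) VALUES (?, ?)");
stmt.setString(1, key);
stmt.setBinaryStream(2, new ByteArrayInputStream(imageData), imageData.length);
try {
    stmt.executeUpdate();
} catch (SQLException e) {
    // 23505 is Derby's SQLState for a duplicate key in a unique or
    // primary key constraint; any other failure is re-thrown.
    if (!"23505".equals(e.getSQLState())) {
        throw e;
    }
}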
1 Answer
INFO: This solution does not work. See the comments for details. I left it here so whoever looks for a solution doesn't have to try this as well.
I haven't had the problem myself yet, but I would guess that you could get around it if you optimize the subquery out of it.
Like this (I haven't tested it, maybe it needs improvement):
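(The answer's original snippet was not preserved in this copy of the page; the following is a reconstruction sketch of one way to read "optimize the subquery out": replace the NOT EXISTS subquery with a left outer join against the dummy table. Untested, and per the INFO above this approach still failed under concurrency.)

PreparedStatement stmt = conn.prepareStatement(
        "INSERT INTO artwork (md5, image) " +
        "SELECT ?, ? " +
        "FROM SYSIBM.SYSDUMMY1 " +
        "LEFT OUTER JOIN artwork ON artwork.md5 = ? " +
        "WHERE artwork.md5 IS NULL");
stmt.setString(1, key);
stmt.setBinaryStream(2, new ByteArrayInputStream(imageData), imageData.length);
stmt.setString(3, key);
stmt.executeUpdate();

Note that the join version has the same non-atomic read-then-insert window as the subquery version, which is presumably why it did not hold up with multiple threads.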