Problem updating a table using an IN clause with a large list of IDs
Hi, I am having a problem when trying to update a table using an IN clause. I have a big list of clients that should be updated, 4500+.

Update table
set columnA = 'value'
where ID in ( biglistofids ) -- biglistofids > 4500 ids

I am getting this error:

"String or binary data would be truncated."

I tried the same script with fewer ids, let's say 2000, and it worked fine.

I have also tried using a temporary table, but I got the same error.

SELECT Id INTO tmpTable FROM dbo.table WHERE id IN (biglistofids) -- creates the temporary table successfully
UPDATE table SET columnA = 'value' FROM table INNER JOIN tmpTable ON table.ID = tmpTable.ID

Is there any way to handle this without repeating the code for each 2000 records?

Thanks in advance
Comments (2)
The "String or binary data would be truncated." error has nothing to do with the IN clause. It means that in this line:

set columnA = 'value'

you are setting columnA to something that is too long to be held in columnA. Maybe certain ids have corresponding data that is too long, and those rows are not among the first 2000 you tried.
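One way to act on this (a sketch only: `dbo.SourceTable` and `NewValue` are illustrative names, since the real update presumably pulls per-row data rather than a constant, and it assumes columnA is a varchar) is to compare the length of the incoming data against the column's defined size before updating:

```sql
-- Sketch: find the rows whose new value would not fit in columnA.
-- dbo.SourceTable / NewValue are hypothetical names for wherever the
-- new data comes from; COL_LENGTH returns the column's defined size
-- in bytes (equal to characters for varchar).
SELECT s.ID, LEN(s.NewValue) AS NewLength
FROM dbo.SourceTable AS s
WHERE LEN(s.NewValue) > COL_LENGTH('dbo.table', 'columnA');
```

Any rows this returns are the ones that would trigger the truncation error.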
It looks to me, based on your error, that the actual problem is with one or more of the values you're updating. I'd try validating the input first. I've done this many ways based on the number of records I had, the size of the value, the type of the value, etc., so that will depend on your specific scenario.

The most straightforward one (not necessarily the best) is the one you describe: try 2000; if that works, try the next 2000, and so on. That is time-intensive and clunky and may not be the best for your situation, but I've never seen it fail to identify my problem.