Cannot update a SQL Server table using the BCP utility
We have a database table that we pre-populate with data as part of our deployment procedure. Since one of the columns is binary (it's a binary serialized object) we use BCP
to copy the data into the table.
So far this has worked very well; however, today we tried this technique on a Windows Server 2008
machine for the first time and noticed that not all of the columns were being updated. Out of the 31 rows
that are normally inserted as part of this operation, only 2 rows
actually had their binary columns
populated correctly. The other 29 rows simply had null values for their binary column
. This is the first situation where we've seen an issue like this and this is the same .dat file that we use for all of our deployments.
Has anyone else ever encountered this issue before or have any insight as to what the issue could be?
Thanks in advance,
Jeremy
Comments (1)
My guess is that you're using -c or -w to dump as text, and it's choking on a particular combination of characters it doesn't like and subbing in a NULL. This can also happen in Native mode if there's no format file. Try the following and see if it helps. (Obviously, you'll need to add the server and login switches yourself.)
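Something along these lines for the export side (the database, table, file, and server names below are placeholders, and Windows authentication via `-T` is an assumption — substitute your own server and login switches):

```shell
# Generate a format file recording the table's native column layout
# (-n = native data types, -f = format file to write)
bcp MyDb.dbo.MyTable format nul -n -f MyTable.fmt -S MyServer -T

# Export the table in native (binary) mode, driven by that format file.
# Note: -f cannot be combined with -c/-w/-n on the out command itself;
# the format file already specifies the native layout.
bcp MyDb.dbo.MyTable out MyTable.dat -f MyTable.fmt -S MyServer -T
```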
This'll dump the table data as straight binary with a format file, NULLs, and identity values to make absolutely sure everything lines up. In addition, it'll use batches of 1000 to optimize the data dump. Then, to insert it back:
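A sketch of the corresponding import, again with placeholder names: `-k` preserves NULLs instead of applying column defaults, `-E` keeps the identity values stored in the file, and `-b 1000` commits in 1,000-row batches (the `-b` switch applies on the import side):

```shell
# Re-load the data through the same format file,
# keeping NULLs (-k) and identity values (-E), 1000 rows per batch (-b)
bcp MyDb.dbo.MyTable in MyTable.dat -f MyTable.fmt -k -E -b 1000 -S MyServer -T
```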
...which will use the format file, data file, and set batching to increase the speed a little. If you need more speed than that, you'll want to look at BULK INSERT, FirstRow/LastRow, and loading in parallel, but that's a bit beyond the scope of this question. :)