How can I update one table from another without specifying column names?

Published 2024-09-26 12:41:18


I have two tables with identical structure and a VERY LARGE number of fields (about 1000). I need to perform two operations:

1) Insert all rows from the second table into the first. Example:

INSERT INTO [1607348182]
SELECT * 
FROM _tmp_1607348182;

2) Update the first table from the second table, but I can't find the proper SQL syntax for the update.

Queries like:

Update [1607348182]
set [1607348182].* = tmp.*
from [1607348182]
inner join _tmp_1607348182 as tmp on tmp.recordid = [1607348182].recordid

or

Update [1607348182]
from [1607348182]
inner join _tmp_1607348182 as tmp on tmp.recordid = [1607348182].recordid

are invalid.


我一向站在原地 2024-10-03 12:41:18


Would it work to delete everything from the master table that had an ID in temp, then do an insert with the new data?
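A minimal sketch of that approach, assuming `recordid` is the key (as in the question's attempted join):

```sql
-- Replace matching rows wholesale: delete them, then re-insert from the temp table.
BEGIN TRANSACTION;

DELETE FROM [1607348182]
WHERE recordid IN (SELECT recordid FROM _tmp_1607348182);

INSERT INTO [1607348182]
SELECT * FROM _tmp_1607348182;

COMMIT TRANSACTION;
```

Wrapping both statements in one transaction keeps the table consistent if the insert fails; `SELECT *` only stays safe while the two tables' column order remains identical.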

深爱成瘾 2024-10-03 12:41:18


Not sure if you'll be able to accomplish this without using dynamic SQL to build out the update statement in a variable.

This statement will return a list of columns based on the table name you put in:

select name from syscolumns
where [id] = (select [id] from sysobjects where name = 'tablename')
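(On more recent SQL Server versions, the ANSI catalog view `INFORMATION_SCHEMA.COLUMNS` returns the same list and is the documented alternative to querying `syscolumns` directly:)

```sql
SELECT COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'tablename';
```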

Not sure if I can avoid a loop here... you'll need to load the results from above into a cursor and then build a query from it. Pseudo-code:

set @query = 'update [1607348182] set '
load cursor --(we will use @name to hold the column name)
while stillrecordsincursor
set @query = @query + @name + ' = tmp_[1607348182]. ' +@name + ','
load next value from cursor
loop!

When the query is done being built in the loop, use exec sp_executesql @query.
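Fleshing the pseudo-code out into runnable T-SQL might look like the sketch below. It is an illustration only: the `t`/`tmp` aliases, the `recordid` join key, and the `FROM ... INNER JOIN` tail are assumptions carried over from the question's attempted query.

```sql
DECLARE @query nvarchar(max) = N'UPDATE t SET ';
DECLARE @name sysname;

DECLARE col_cur CURSOR FOR
    SELECT name FROM syscolumns
    WHERE [id] = (SELECT [id] FROM sysobjects WHERE name = '1607348182')
      AND name <> 'recordid';   -- don't overwrite the join key

OPEN col_cur;
FETCH NEXT FROM col_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @query = @query + QUOTENAME(@name) + N' = tmp.' + QUOTENAME(@name) + N', ';
    FETCH NEXT FROM col_cur INTO @name;
END
CLOSE col_cur;
DEALLOCATE col_cur;

-- Trim the trailing comma and append the join.
SET @query = LEFT(@query, LEN(@query) - 1)
           + N' FROM [1607348182] AS t'
           + N' INNER JOIN _tmp_1607348182 AS tmp ON tmp.recordid = t.recordid';

EXEC sp_executesql @query;
```

On SQL Server 2005 and later, declaring `@query` as `nvarchar(max)` sidesteps the varchar(8000) size limit worried about further down.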

Just a little warning... building dynamic SQL in a loop like this can get a bit confusing. For troubleshooting, put a SELECT @query in the loop and watch the @query get built.

edit:
Not sure if you'll be able to do all 1000 columns in one update at once... there are logical limits (varchar(8000)?) on the size that @query can grow to. You may have to divide the code so it handles 50 columns at a time. Put the columns from the syscolumns select statement into a temp table with an id, and build your dynamic SQL so it updates 20 (or 50?) columns at a time.

Another alternative would be to use Excel to mass-build this. Do the column select and copy the results into column A of a spreadsheet. Put '= in column B, tmp.[12331312] in column C, copy column A into column D, and a comma into column E. Copy the entire spreadsheet into Notepad, and you should have the column list of the update statement built out for you. Not a bad solution if this is a one-shot event; not sure I'd rely on it as an ongoing solution.
