Removing duplicate data

Published 2024-09-24 14:58:51

The code below best explains what I'm trying to accomplish. I know that I could use a cursor or other looping routine to walk the records, find the duplicates, and create my notes records from what is found. I'm trying to avoid that unless there's no better option.

-- Drop the temp tables only if they exist, so the script can be rerun
IF OBJECT_ID('tempdb..#orig') IS NOT NULL DROP TABLE #orig;
IF OBJECT_ID('tempdb..#parts') IS NOT NULL DROP TABLE #parts;
IF OBJECT_ID('tempdb..#part_notes') IS NOT NULL DROP TABLE #part_notes;

CREATE TABLE #orig(partnum VARCHAR(20), notes VARCHAR(100));
INSERT INTO #orig VALUES ('A123', 'To be used on Hyster models only')
INSERT INTO #orig VALUES ('A123', 'Right Hand model only')
INSERT INTO #orig VALUES ('A125', 'Not to be used by Jerry')
INSERT INTO #orig VALUES ('A125', NULL)
INSERT INTO #orig VALUES ('A125', 'asdfasdlfj;lsdf')
INSERT INTO #orig VALUES ('A128', 'David test')
INSERT INTO #orig VALUES ('A129', 'Fake part')

SELECT COUNT(*) FROM #orig

-- SHOW ME UNIQUE PARTS, MY PARTS TABLE SHOULD BE UNIQUE!
SELECT DISTINCT partnum FROM #orig


CREATE TABLE #parts(id INT IDENTITY(1,1), partnum VARCHAR(20));
INSERT INTO #parts (partnum)
SELECT DISTINCT partnum FROM #orig

SELECT * FROM #parts

CREATE TABLE #part_notes(id INT IDENTITY(1,1), part_id INT, line_number INT, notes VARCHAR(100));
/*
    HOW DO I AT THIS POINT POPULATE the #part_notes table so that it looks like this:
    (note: any NULL or empty note strings should be ignored)

    id  part_id line_number notes
    1   1       1           To be used on Hyster models only    
    2   1       2           Right Hand model only
    3   2       1           Not to be used by Jerry
    4   2       2           asdfasdlfj;lsdf
    6   3       1           David test
    7   4       1           Fake part

*/

甜妞爱困 2024-10-01 14:58:51

The query below just assigns line_numbers arbitrarily, since there doesn't seem to be anything in the data suitable to order by.

SELECT   p.id AS part_id,
         p.partnum,
         -- (SELECT 0) is a constant, so rows within each part get an arbitrary order
         ROW_NUMBER() OVER (PARTITION BY p.id ORDER BY (SELECT 0)) AS line_number,
         o.notes
FROM     #parts p
         JOIN #orig o ON o.partnum = p.partnum
WHERE    o.notes IS NOT NULL
AND      o.notes <> ''
ORDER BY part_id
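
To actually populate #part_notes, the same query can drive the INSERT. A minimal sketch, reusing the columns from the CREATE TABLE above and letting the identity id column fill itself:

INSERT INTO #part_notes (part_id, line_number, notes)
SELECT   p.id,
         ROW_NUMBER() OVER (PARTITION BY p.id ORDER BY (SELECT 0)),
         o.notes
FROM     #parts p
         JOIN #orig o ON o.partnum = p.partnum
WHERE    o.notes IS NOT NULL
AND      o.notes <> '';

SELECT * FROM #part_notes;

Because the ordering is arbitrary, the line_number assigned to each note can differ between runs; if #orig ever gains a column that reflects entry order (an identity or a timestamp), put it in the ORDER BY of the OVER clause instead.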
