Why do I get a duplicate record inserted only on the very last record?
I get a duplicate record from my procedure, which inserts 330+ records, but only on the very last record. In other words, the last two records are not distinct; they are identical. What is it about this procedure that allows the last record to get duplicated?
DELIMITER $$
DROP PROCEDURE IF EXISTS `zzExclude_Products` $$
CREATE DEFINER=`root`@`%` PROCEDURE `zzExclude_Products`()
BEGIN
DECLARE done INT DEFAULT 0;
DECLARE VAR_ENTITY_ID VARCHAR(50);
DECLARE CUR_NO CURSOR FOR
SELECT DISTINCT NO
FROM stage_product_data.ITEMMAST AS IM
JOIN stage_product_data.zzLive_Products AS LIVE ON IM.NO = LIVE.SKU
WHERE DIVISION = '30' AND STATUS NOT IN ('XX','YY','ZZ');
DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
OPEN CUR_NO;
REPEAT
FETCH CUR_NO INTO VAR_ENTITY_ID;
INSERT INTO zz_CATALOG (TYPE, ENTITY_ID, ESTRICTION_TYPE, RESTRICTION_VALUE)
VALUES ('Product', VAR_ENTITY_ID, 'Country', 'ALL');
UNTIL done END REPEAT;
CLOSE CUR_NO;
END $$
DELIMITER ;
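(For context: MySQL evaluates a REPEAT ... UNTIL condition only after the loop body has run. When the final FETCH raises NOT FOUND, the CONTINUE handler sets done = 1, but the INSERT on that same pass still executes with the previous row's VAR_ENTITY_ID before UNTIL is checked. A minimal sketch of the usual guard, keeping the question's names but otherwise illustrative, not the asker's code:)

-- Sketch only: the same loop with the fetch result checked before inserting,
-- so the stale value is never written on the final (failed) FETCH.
REPEAT
    FETCH CUR_NO INTO VAR_ENTITY_ID;
    IF NOT done THEN
        INSERT INTO zz_CATALOG (TYPE, ENTITY_ID, ESTRICTION_TYPE, RESTRICTION_VALUE)
        VALUES ('Product', VAR_ENTITY_ID, 'Country', 'ALL');
    END IF;
UNTIL done END REPEAT;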
Comments (2)
There's really no need for a cursor here. This can be done in a single INSERT statement.
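A sketch of that single-statement version, assuming the same tables and columns as the procedure in the question (illustrative only, adjust to your actual schema):

-- Sketch: the cursor loop collapsed into one set-based INSERT ... SELECT.
-- Table and column names are taken from the question, not verified.
INSERT INTO zz_CATALOG (TYPE, ENTITY_ID, ESTRICTION_TYPE, RESTRICTION_VALUE)
SELECT DISTINCT 'Product', IM.NO, 'Country', 'ALL'
FROM stage_product_data.ITEMMAST AS IM
JOIN stage_product_data.zzLive_Products AS LIVE ON IM.NO = LIVE.SKU
WHERE DIVISION = '30' AND STATUS NOT IN ('XX','YY','ZZ');

Besides sidestepping the loop entirely, a set-based INSERT ... SELECT avoids the stale-fetch pitfall by construction: there is no per-row state to re-insert.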
Are you sure it is the procedure and not duplicates in the data you are inserting?
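One way to check, sketched against the question's destination table (a hypothetical diagnostic query, not from the original thread):

-- Sketch: count how often each ENTITY_ID landed in zz_CATALOG.
-- If only the final SKU shows cnt = 2, the loop is the culprit;
-- if other SKUs repeat too, the source data has duplicates.
SELECT ENTITY_ID, COUNT(*) AS cnt
FROM zz_CATALOG
WHERE TYPE = 'Product'
GROUP BY ENTITY_ID
HAVING COUNT(*) > 1;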