I'm trying to remove duplicates from a MySQL table using ALTER IGNORE TABLE + a UNIQUE KEY. The MySQL documentation says:
IGNORE is a MySQL extension to standard SQL. It controls how ALTER TABLE works if there are duplicates on unique keys in the new table or if warnings occur when strict mode is enabled. If IGNORE is not specified, the copy is aborted and rolled back if duplicate-key errors occur. If IGNORE is specified, only the first row is used of rows with duplicates on a unique key. The other conflicting rows are deleted. Incorrect values are truncated to the closest matching acceptable value.
When I run the query ...
ALTER IGNORE TABLE table ADD UNIQUE INDEX dupidx (field)
... I still get the error #1062 - Duplicate entry 'blabla' for key 'dupidx'.
3 Answers
The IGNORE keyword extension to MySQL seems to have a bug in the InnoDB engine on some versions of MySQL. You could always convert the table to MyISAM, add the index with ALTER IGNORE, and then convert it back to InnoDB.
Note that if you have foreign key constraints, this will not work; you will have to remove them first and add them back later.
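A sketch of that sequence, assuming a table named mytable with the duplicated column field (both names are placeholders):

    -- switch to MyISAM so ALTER IGNORE actually drops duplicate rows
    ALTER TABLE mytable ENGINE = MyISAM;
    -- add the unique index; duplicates beyond the first row are deleted
    ALTER IGNORE TABLE mytable ADD UNIQUE INDEX dupidx (field);
    -- switch back to InnoDB
    ALTER TABLE mytable ENGINE = InnoDB;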
Or try set session old_alter_table=1 (Don't forget to set it back!)
See: http://mysqlolyk.wordpress.com/2012/02/18/alter-ignore-table-add-index-always-give-errors/
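A minimal sketch of that workaround, again with placeholder table and column names; enabling old_alter_table reverts ALTER TABLE to the old copy-based algorithm, where IGNORE can discard the duplicate rows:

    -- force the copy-based ALTER TABLE algorithm for this session
    SET SESSION old_alter_table = 1;
    ALTER IGNORE TABLE mytable ADD UNIQUE INDEX dupidx (field);
    -- restore the default behaviour
    SET SESSION old_alter_table = 0;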
The problem is that you have duplicate data in the field you're trying to index. You'll need to remove the offending duplicates before you can add a unique index.
One way is to do the following:
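A sketch of one common approach, assuming a source table named mytable and the duplicated column field (both names are placeholders): copy the rows into a new table that already has the unique index, using INSERT IGNORE so the duplicate rows are silently skipped, then swap the tables.

    -- create an empty copy of the table and add the unique index to it
    CREATE TABLE mytable_dedup LIKE mytable;
    ALTER TABLE mytable_dedup ADD UNIQUE INDEX dupidx (field);
    -- copy the data; rows that would violate the unique index are skipped
    INSERT IGNORE INTO mytable_dedup SELECT * FROM mytable;
    -- swap the deduplicated table into place and drop the old one
    RENAME TABLE mytable TO mytable_old, mytable_dedup TO mytable;
    DROP TABLE mytable_old;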
This allows you to insert only the unique data into the new table.