I think you can use this query to delete the duplicate records from the table:

ALTER IGNORE TABLE table_name ADD UNIQUE (location_id, datetime);

Before doing this, test it with some sample data first, and then try it on the real table.
This looks promising, I hadn't heard about this feature before. Trying it now, I'll let you know how it turns out. And welcome to SO :) – Tatu Ulmanen Mar 5 '10 at 11:19 This worked, thank you.
Took 31 minutes to go through 16 982 040 rows with 1 589 908 duplicates. I can't believe it could be this simple, with no additional tables or complex queries. :) – Tatu Ulmanen Mar 5 '10 at 12:10.
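For illustration, here is a minimal, hedged sketch of the approach above. The table name readings and the extra columns (id, temperature) are assumptions for the example, not from the question, and note that the IGNORE modifier on ALTER TABLE only exists in MySQL 5.6 and earlier; it was removed in MySQL 5.7.

-- Hypothetical table; only location_id and datetime come from the question.
CREATE TABLE readings (
    id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    location_id INT NOT NULL,
    datetime DATETIME NOT NULL,
    temperature DECIMAL(5,2)
);

-- Adding a unique index with IGNORE silently drops the rows that would violate it,
-- keeping the first row MySQL encounters for each (location_id, datetime) pair.
ALTER IGNORE TABLE readings ADD UNIQUE INDEX uniq_location_datetime (location_id, datetime);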
To find the duplicates first:

SELECT *, COUNT(*) AS Count FROM table GROUP BY location_id, datetime HAVING Count > 1;
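If you want to remove the extra rows directly rather than just list them, one common pattern is a self-join delete. This is only a sketch, assuming the same hypothetical readings table with an id primary key; it keeps the row with the smallest id for each (location_id, datetime) pair.

-- Delete every row that shares (location_id, datetime) with a row having a smaller id.
DELETE t1
FROM readings AS t1
JOIN readings AS t2
  ON  t1.location_id = t2.location_id
  AND t1.datetime    = t2.datetime
  AND t1.id          > t2.id;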
UPDATE table SET datetime = NULL
WHERE location_id IN (
    SELECT location_id FROM table AS tableBis
    WHERE tableBis.location_id = table.location_id
      AND table.datetime > tableBis.datetime);

SELECT * INTO tableCopyWithNoDuplicate FROM table WHERE datetime IS NOT NULL;

DROP TABLE table;
RENAME tableCopyWithNoDuplicate TO table;

So you keep the row with the lower datetime. I'm not sure about performance; it depends on your table, your columns, your server, etc.
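One caveat: SELECT * INTO new_table is SQL Server syntax, and MySQL does not support SELECT ... INTO a new table. A rough MySQL equivalent of the copy-and-swap step, again using the hypothetical readings table from the earlier sketch, might look like this:

-- Copy the surviving rows (those whose datetime was not set to NULL) into a new table.
CREATE TABLE readings_dedup AS
    SELECT * FROM readings WHERE datetime IS NOT NULL;

-- Swap the copy in place of the original.
DROP TABLE readings;
RENAME TABLE readings_dedup TO readings;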