How do I write a MySQL 5 query to find duplicate rows in a table?

Since the ID is not unique, it is not really logical to take an aggregate of unique UserName values from the table. If the ID is not required, you can get the result from a single query:

SELECT UserName, COUNT(UserName) AS Count
FROM TableName
GROUP BY UserName
HAVING COUNT(UserName) > 1;

But if the ID must appear in the result, the query becomes more complicated, involving a sub-query used as an inner (derived) table.
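For completeness, here is one way that sub-query approach could look. This is only a sketch, assuming the same TableName, ID, and UserName columns as above; it returns every row (with its ID) whose UserName appears more than once:

SELECT t.ID, t.UserName
FROM TableName AS t
INNER JOIN (
    -- inner table: only the duplicated names
    SELECT UserName
    FROM TableName
    GROUP BY UserName
    HAVING COUNT(UserName) > 1
) AS dup ON dup.UserName = t.UserName;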

You are using SUM; it will not work. I tried this and it works: SELECT UserName, COUNT(UserName) AS Count FROM TableName GROUP BY UserName HAVING COUNT(UserName) > 1; – learner Jul 20 at 10:58

Thanks for correcting me. – Talha Ahmed Khan Jul 20 at 11:01

It's not logical to SUM usernames. You can COUNT them though. – ypercube Jul 20 at 11:01

@ypercube I have made that change already, thanks. – Talha Ahmed Khan Jul 20 at 11:03

I had the same problem some time ago and solved it like this (as far as I remember):

SELECT *
FROM tableA
INNER JOIN (
    SELECT DISTINCT MAX(id) AS id, type_id, temp
    FROM tableA
    GROUP BY type_id, temp
) AS t ON tableA.id = t.id
      AND tableA.type_id = t.type_id
      AND tableA.temp = t.temp

You join the table with itself, selecting the ids that are duplicated. The fields that should be tested against duplicate values are, in this case, type_id and temp. If more or fewer fields should be considered as duplicates, you can adjust the field list.

I don't know if this helps in your case or if it can be done in a simpler way, so I'm prepared for downvotes ;-)

Edit: removed the last condition AND tableA.id.

Your query will show exactly 0 rows. You have both tableA.id = t.id AND tableA.id – ypercube Jul 20 at 11:06

If you remove the last condition, it will work. But there is no reason for such complexity; a simple GROUP BY can solve most such problems. – ypercube Jul 20 at 11:07

Ok, thanks. Yeah, as I said, I doubt it was the simplest way, but at least I wanted to share this solution. – acme Jul 20 at 11:37

Welcome. You can also remove the DISTINCT from your subquery; it's redundant there because of the GROUP BY. – ypercube Jul 20 at 11:42

And what you propose can be helpful in situations where one wants to show more rows (than the grouped ones) from the table. – ypercube Jul 20 at 11:43
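Putting those comment fixes together (drop the broken last condition and the redundant DISTINCT), the corrected self-join would presumably look like this, assuming the same tableA schema; it keeps one row per (type_id, temp) group, the one with the highest id:

SELECT tableA.*
FROM tableA
INNER JOIN (
    -- one representative id per group: the highest
    SELECT MAX(id) AS id, type_id, temp
    FROM tableA
    GROUP BY type_id, temp
) AS t ON tableA.id = t.id
      AND tableA.type_id = t.type_id
      AND tableA.temp = t.temp;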

It looks like you're trying to pull the following data:

- the first ID for a given UserName
- the UserName itself
- the total number of IDs for that UserName

This query should do the trick:

SELECT MIN(id), UserName, COUNT(id)
FROM users
GROUP BY UserName
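A quick, self-contained illustration of what that returns. The table and rows here are hypothetical, and the HAVING clause is an addition (not part of the answer above) that limits the output to duplicated names, matching the original question:

-- hypothetical demo table and data
CREATE TABLE users (id INT, UserName VARCHAR(50));
INSERT INTO users VALUES (1, 'john'), (2, 'john'), (3, 'sam');

-- first id, name, and occurrence count for duplicated names only
SELECT MIN(id), UserName, COUNT(id)
FROM users
GROUP BY UserName
HAVING COUNT(id) > 1;   -- returns: 1, john, 2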

SELECT UserName, COUNT(*) AS `Count`
FROM tableX
GROUP BY UserName
HAVING COUNT(*) > 1

This is the right answer:

SELECT UserName, COUNT(UserName) AS Count
FROM TableName
GROUP BY UserName
HAVING COUNT(UserName) > 1

Then accept Talha's answer that has this. You don't need to post it as an answer yourself. – ypercube Jul 20 at 11:17

Also note that COUNT(*) is faster in MySQL than COUNT(a_field), and in this case will give the same results. – ypercube Jul 20 at 11:18
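For context on that last comment: COUNT(*) counts rows, while COUNT(a_field) counts only rows where a_field is not NULL, so the two agree whenever the counted column cannot be NULL. A hypothetical illustration:

-- hypothetical table where UserName may be NULL
CREATE TABLE tableX (UserName VARCHAR(50) NULL);
INSERT INTO tableX VALUES ('john'), ('john'), (NULL);

SELECT COUNT(*) AS all_rows,           -- 3
       COUNT(UserName) AS named_rows   -- 2, the NULL row is skipped
FROM tableX;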
