Finding all duplicate records in a table with SQL
1. Suppose the table xi has two fields, id and username; the following query returns every row whose username value is duplicated:
select * from xi a where (a.username) in (select username from xi group by username having count(*) > 1)
2. Group the rows and list each duplicated value together with its repetition count:
select count(username) as 'multiple counts', username from xi group by username having count(*)>1 order by username desc
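Depending on the MySQL version, the in (subquery) form can be slow on large tables; an equivalent join-based sketch, using the same xi table and username column (the derived-table alias d is ours):

select a.*
from xi a
join (
    select username
    from xi
    group by username
    having count(*) > 1
) d on a.username = d.username
order by a.username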
3. For reference, here are some other approaches to querying and deleting duplicate records.
1. Find redundant duplicate records in the table, where duplicates are judged by a single field (peopleId):
select * from people where peopleId in (select peopleId from people group by peopleId having count(peopleId) > 1)
2. Delete the redundant duplicates, judged by the single field (peopleId), keeping only the record with the smallest rowid (rowid is Oracle-specific; a MySQL adaptation follows after item 5):
delete from people where peopleId in (select peopleId from people group by peopleId having count(peopleId) > 1) and rowid not in (select min(rowid) from people group by peopleId having count(peopleId) > 1)
3. Find redundant duplicate records in the table (multiple fields)
select * from vitae a where (a.peopleId,a.seq) in (select peopleId,seq from vitae group by peopleId,seq having count(*) > 1)
4. Delete redundant duplicate records (multiple fields) in the table, leaving only the record with the smallest rowid:
delete from vitae a where (a.peopleId,a.seq) in (select peopleId,seq from vitae group by peopleId,seq having count(*) > 1) and rowid not in (select min(rowid) from vitae group by peopleId,seq having count(*) > 1)
5. Find redundant duplicate records (multiple fields) in the table, excluding the record with the smallest rowid
select * from vitae a where (a.peopleId,a.seq) in (select peopleId,seq from vitae group by peopleId,seq having count(*) > 1) and rowid not in (select min(rowid) from vitae group by peopleId,seq having count(*) > 1)
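As noted above, rowid is an Oracle pseudo-column and MySQL has none. A hedged MySQL adaptation of the deletes in items 2 and 4, assuming each table has a hypothetical auto-increment primary key id standing in for rowid; the self-join form also sidesteps MySQL's restriction on deleting from a table that the same statement selects from:

-- Single-field case (item 2): keep the smallest-id row per peopleId.
-- `id` is an assumed auto-increment primary key, not in the original.
delete a
from people a
join people b
  on a.peopleId = b.peopleId
 and a.id > b.id;

-- Multi-field case (item 4): keep the smallest-id row per (peopleId, seq).
delete a
from vitae a
join vitae b
  on a.peopleId = b.peopleId
 and a.seq = b.seq
 and a.id > b.id;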
(ii)
For example, table A has a field "name", and different records may share the same "name" value; we want to find which "name" values are duplicated among the records in this table:
Select Name,Count(*) From A Group By Name Having Count(*) > 1
If the sex field must also match, group by both columns:
Select Name,sex,Count(*) From A Group By Name,sex Having Count(*) > 1
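The two grouped queries above return only the duplicated values and their counts. To pull back the full rows behind each duplicated (Name, sex) pair, a minimal sketch against the same table A (the alias d is ours):

select a.*
from A a
join (
    select Name, sex
    from A
    group by Name, sex
    having count(*) > 1
) d on a.Name = d.Name and a.sex = d.sex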
(iii)
Method 1
declare @max integer, @id integer
declare cur_rows cursor local for
    select main_field, count(*) from table_name group by main_field having count(*) > 1
open cur_rows
fetch cur_rows into @id, @max
while @@fetch_status = 0
begin
    select @max = @max - 1
    set rowcount @max          -- limit the next delete to all-but-one of the duplicates
    delete from table_name where main_field = @id
    fetch cur_rows into @id, @max
end
close cur_rows
deallocate cur_rows
set rowcount 0

(main_field and table_name are placeholders for your key column and table.)
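On SQL Server 2005 and later, the same cleanup can be done without a cursor; a minimal sketch using ROW_NUMBER(), with the same main_field/table_name placeholders:

with numbered as (
    select row_number() over (partition by main_field order by main_field) as rn
    from table_name
)
delete from numbered where rn > 1  -- keeps exactly one row per main_field value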
Method 2. "Duplicate records" can mean two things: records that are completely duplicated (every field matches), or records whose key fields (for example, a Name field) are duplicated while the other fields may or may not differ and can be ignored.
1. The first kind of duplication is easy to resolve:

select distinct * from tableName

gives a result set without duplicate records.
If the duplicates need to be deleted from the table itself (keeping one copy of each), you can do it via a temporary table:
select distinct * into #Tmp from tableName
drop table tableName
select * into tableName from #Tmp
drop table #Tmp
Note that select ... into does not carry over the original table's indexes or constraints, so re-create them after swapping the tables. This kind of duplication generally stems from poor table design and can be prevented by adding a unique index, as sketched below.
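A hedged sketch of that fix (the constraint name and column list are assumptions; use whichever columns should be unique in your table):

-- Prevent future duplicates at the schema level.
-- uq_tableName and the (Name, Address) column list are hypothetical.
alter table tableName add constraint uq_tableName unique (Name, Address);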
2. The second kind usually requires keeping the first record of each group of duplicates. Suppose the duplicated fields are Name and Address, and we want the result set to be unique on those two fields. The approach:
select identity(int,1,1) as autoID, * into #Tmp from tableName
select min(autoID) as autoID into #Tmp2 from #Tmp group by Name, Address  -- group by the duplicated fields, keep the smallest autoID per group
select * from #Tmp where autoID in (select autoID from #Tmp2)
The last select returns a result set in which (Name, Address) is unique (it carries an extra autoID column, which you can drop by listing the desired columns explicitly in the select clause).
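For example, to omit the helper column, list the columns explicitly (here just the two key fields; add whatever other columns your table actually has):

select Name, Address from #Tmp where autoID in (select autoID from #Tmp2)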
(iv) Duplicate queries
select * from tablename where id in (select id from tablename group by id having count(id) > 1)
Find duplicate records by a single field. Here duplicates are judged by the sample_code field:
SELECT * FROM tb_table WHERE sample_code IN( SELECT sample_code FROM tb_table GROUP BY sample_code HAVING COUNT(sample_code) > 1 );
Find duplicate records by multiple fields (two in this example). Here duplicates are judged by the name and code fields together:
SELECT * FROM (SELECT *, CONCAT(name,code) AS nameAndCode FROM tb_table) t
WHERE t.nameAndCode IN (
    SELECT nameAndCode
    FROM (SELECT CONCAT(name,code) AS nameAndCode FROM tb_table) tt
    GROUP BY nameAndCode
    HAVING COUNT(nameAndCode) > 1
)
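Two caveats with the CONCAT approach: values like ('ab','c') and ('a','bc') concatenate to the same string and so produce false matches, and MySQL's CONCAT returns NULL when any argument is NULL. A sketch that sidesteps the false-match problem by comparing the columns directly with a multi-column IN, as in the (peopleId, seq) example earlier:

SELECT *
FROM tb_table t
WHERE (t.name, t.code) IN (
    SELECT name, code
    FROM tb_table
    GROUP BY name, code
    HAVING COUNT(*) > 1
);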
Summary

The above MySQL statements show how to find duplicate data by one or more fields. I hope they help; if you have any questions, leave a comment and I will reply promptly. Thank you for your support!