Remove or Highlight Duplicates in Excel

October 20, 2005 · 13 comments

in Tips & Pointers, Work

Not sure who reads this, or who knows of my sick little Microsoft Excel fetish. I love the program, I know like 1% of its features and am amazed at all the complex things it can do.

Lately I have been using it one way or another for about 4-5 hours a day, and I occasionally get a “woodrow” when a new time-saving feature pops up.

I have been having to prune large sets of lists and have been able to use Excel's remove duplicates feature, but found it lacking. After some research for an easy solution / Excel macro, I found this excellent free / donation-ware utility. It is pretty powerful stuff and I thought I would share:

It is called the Duplicate Master and can be found here:
http://members.iinet.net.au/~brettdj/

Here are some of its features:
Works with cells or entire rows!
Runs over ranges, multiple sheets or the entire Workbook!
Highlight duplicate cells or rows
Delete duplicate cells or rows
Select duplicate cells or rows
Extract unique cells or rows to a master list
Extract and count the number of duplicate cells or rows to a master list

It is pretty awesome, and it is free! Just follow the install instructions on that page and raise hell with removing duplicates or highlighting duplicates in Excel, without having to write a macro yourself.
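
If you outgrow the add-in, or just want to see what those features amount to, the same operations are easy to sketch with the pandas library in Python. This is only a rough sketch, not the Duplicate Master itself, and it assumes a file called keywords.csv with a column named keyword — rename things to match your own list:

# Rough pandas sketch (not the Duplicate Master add-in itself).
# Assumes a file keywords.csv with a column named "keyword".
import pandas as pd

df = pd.read_csv("keywords.csv")

# Highlight-style check: mark every row whose keyword appears more than once.
df["is_duplicate"] = df.duplicated(subset="keyword", keep=False)

# Delete duplicates: keep the first occurrence of each keyword.
unique_df = df.drop_duplicates(subset="keyword", keep="first")

# Master list with counts: how many times each keyword appeared.
counts = df.groupby("keyword").size().rename("count").reset_index()

unique_df.to_csv("uniqueKeywords.csv", index=False)
counts.to_csv("keywordCounts.csv", index=False)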

Thanks BrettDj for the software!

{ 13 comments… read them below or add one }

1 eWhisper 10.23.05 at 9:05 am

Nice find, Werty.

Now, how do you handle data sets (e.g. keywords) that are over 65k rows (the max Excel can handle is 65,536)?

2 wertrose 10.23.05 at 12:49 pm

eWhisper, I do not deal with data sets that large any more… I am much more targeted in the way that we create large lists, so I avoid that problem. The only time I deal with large lists in Excel is if I get combined performance/tracking reports. I have been working with our reporting expert, though, to combine that data so it does not take up 65,536 rows (which I think is the max).

What are you looking at that is over that limit?

3 your_store 10.25.05 at 2:30 pm

IMO, you’re going to want the command line for those larger files. Something like this would clean your file:

sort duplicateKeywords.csv | uniq > uniqueKeywords.csv

I just run a pivot report for smaller files, but that requires working w/ two columns in Excel.
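
And if the file is way past Excel's row limit, a few lines of Python will do the same thing as the shell command; just a rough sketch, assuming a plain one-keyword-per-line file named duplicateKeywords.csv (swap in your own file names):

# Rough sketch: keep the first occurrence of each keyword, drop the rest.
# Assumes duplicateKeywords.csv is a plain one-keyword-per-line file.
seen = set()
with open("duplicateKeywords.csv") as src, open("uniqueKeywords.csv", "w") as dst:
    for line in src:
        keyword = line.strip()
        if keyword and keyword not in seen:
            seen.add(keyword)
            dst.write(line)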

4 wertrose 10.26.05 at 8:07 pm

Your_store, that is SOOOO over my head.

Are you going to be in Vegas? I am thinking nickel slots with the old ladies again.

5 Jonathan Clarke 11.03.05 at 11:10 am

Swee-eet!

6 Dave (brettdj) 03.24.06 at 7:01 pm

Hi werty,

Thanks very much for the wrap, I’ve had over 100 hits from your link to date

And I’m now a frequent visitor to your site as well – good stuff 🙂

Cheers

Dave

7 Ron 02.28.07 at 4:11 pm

The Duplicate Master macro rocks! Thanks for promoting the link, Werty –

Regards, Ron

8 Ravi 11.15.07 at 9:57 am

I was desperately looking for this and I got it from you. But as I have Excel 2007, it hangs Excel.

Do you have any new release for 2007?

If you reply, please send the same to my email too.

regards
Ravi

9 Dup Master new link 08.02.08 at 2:36 pm

See this site for the Duplicate Master: http://xldynamic.com/source/xld.DupMaster.html

10 Safi Ullah 11.06.08 at 12:10 am

Thanks for promoting the link Werty

11 birut 11.24.08 at 12:42 pm

Hi to all, does anyone have a copy of Duplicate Master? I cannot find the link. Can anyone please provide a copy and send it to my email?
tnx a lot,
brut

12 hwpete 10.23.09 at 7:33 am

When I go to this link ‘http://xldynamic.com/source/xld.DupMaster.html’ and click on ‘download’, I get this error: 404 – File or directory not found. Any update?

13 ????? 03.08.10 at 3:58 pm

I need this program badly; I have a large list in Excel and it’s very slow, but your link is not working.
