maybe a function like this:
http://www.voidtools.com/forum/viewtopi ... =10&t=4408
based on md5 checksum would be nice...
Klaus
finding duplicates
Re: finding duplicates
You can always drag a set of files into a "hasher".
(Might not necessarily be "clean", but it's doable.)
Hasher
Or if the hasher has or can have a context menu, that might be better yet.
HashMyFiles
(Maybe there's a 15-item limit on what HashMyFiles can accept?)
Re: finding duplicates
(
HashMyFiles
Note what he says:
In particular, notice: "Static menu items of Explorer do not support multiple file selection."
Explorer Context Menu
HashMyFiles can also be used directly from Windows Explorer. In order to enable this feature, go to the Options menu, and choose the 'Enable Explorer Context Menu' option. After you enable this feature, you can right-click on any file or folder on Windows Explorer, and choose the 'HashMyFiles' item from the menu.
If you run the HashMyFiles option for a folder, it'll display the hashes for all files in the selected folder.
If you run the HashMyFiles option for a single file, it'll display only the hashes for that file.
Notice: Static menu items of Explorer do not support multiple file selection. If you want to get the hash of multiple files from Explorer window, use Copy & Explorer Paste, or drag the files into the HashMyFiles window.
Some time ago (& today too) I reported to him that that is not exactly correct - it does work with (seemingly) < 16 items, though you may see one file duplicated, or not all selected files listed - so you do need to be aware of this.
He mentions copy/paste or dragging, but that is less convenient.
)
Re: finding duplicates / MD5
First, thanks for providing Version 686.
Regarding this discussion: there are lots of "hashers" around (Beyond Compare, FreeCommander,...)
But what is really needed is a way to get the hashes loaded into the Everything Database and make them visible+sortable as a column!
Based on MD5-hash identity, a duplicate function that ignores names would be very easy to implement (in fact, you could just compare items in subsequent rows after sorting on that column; see the sketch below).
Obviously, updating the hashes (MD5) will always be slow (at least compared to reading the MFT). But updating them could be a background task that does not run too often. It would be perfect if it were triggered whenever a file changes, though.
As basic functionality, Beyond Compare can do this MD5 hashing, but only on selected directories (in a flattened view, including the subdirectories). That's what I have to do for the moment...
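For what it's worth, that "sort on the hash column, compare neighbouring rows" idea is easy to prototype outside of Everything. Here is a minimal Python sketch (the C:\temp root is only an example); it groups files by MD5 rather than sorting them, which comes to the same thing:

Code: Select all
import hashlib
import os
from collections import defaultdict

def md5_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so large files don't exhaust memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Group all files under root by MD5; any group with >1 entry is a duplicate set."""
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                by_hash[md5_of(path)].append(path)
            except OSError:
                pass  # skip unreadable/locked files
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    for h, paths in find_duplicates(r"C:\temp").items():  # example root
        print(h)
        for p in paths:
            print("   ", p)

Grouping by file size first and hashing only the size collisions would avoid most of the hashing cost - which is also why the background updater would be the expensive part of a built-in version.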
Re: finding duplicates
This might be what you need:
http://voidtools.com/forum/viewtopic.php?p=7414#p7414
Doesn't Everything do that already?
dupe: searches for duplicated file/folder names. You just add a name term and size: (the file size) to limit the duplicates displayed by name and size.
dupe:.gif size:>1mb
will show only duplicated .gif images (same name, and the name must contain .gif) that are bigger than 1 MB.
Or
dupe: wfn:1.gif size:>100kb
will display duplicated files (for Everything, duplicates mean same name) whose whole name is 1.gif and which are bigger than 100 KB.
Re: finding duplicates
Bumping an old thread.
I would love to see an MD5sum stored with each file, and displayable through the existing GUI, as a way of locating copied/renamed files, especially across drives and directories.
I could retire my messy Bash/Excel mashup!
Re: finding duplicates
Please, no! Only if it's optional. There are a bunch of CLI/GUI utilities for this task.
Re: finding duplicates
FSUM FAST FILE INTEGRITY CHECKER.
(Separate & untested, Fsum Frontend.)
Some duplicate file finders compute & store hashes.
Some file renamers will compute a hash & rename the file to include it.
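That hash-in-the-filename trick is only a few lines as well. A minimal Python sketch (the 8-character digest prefix and the naming scheme are arbitrary choices):

Code: Select all
import hashlib
import os

def rename_with_hash(path, digest_len=8):
    """Rename file.ext to file.<first digest_len hex chars of its MD5>.ext."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MB chunks
            h.update(chunk)
    stem, ext = os.path.splitext(path)
    new_path = f"{stem}.{h.hexdigest()[:digest_len]}{ext}"
    os.rename(path, new_path)
    return new_path

# e.g. rename_with_hash(r"C:\temp\photo.jpg") -> C:\temp\photo.<8 hex chars>.jpg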
Re: finding duplicates
therube wrote: FSUM FAST FILE INTEGRITY CHECKER.
No need for that on a modern Windows system:
Code: Select all
C:\temp>certutil -hashfile c:\Tools\Everything\Everything.exe MD5
MD5 hash of c:\Tools\Everything\Everything.exe:
0a02476bd4a0e3f367a7922a3d456626
CertUtil: -hashfile command completed successfully.
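(On current Windows versions, certutil -hashfile also accepts SHA1, SHA256, SHA384, and SHA512 in place of MD5, e.g.:)

Code: Select all
C:\temp>certutil -hashfile c:\Tools\Everything\Everything.exe SHA256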