Hi Mr Void and others.
What an awesome program. Three cheers, hip, hip hooray for Everything!
Now that I have learnt about filters showing local and network searches, I am in seventh heaven.
However, when any change is made to the search locations in the options, it must go and do a rescan.
That then prevents normal use of the program.
My "folders" added to Everything can take a long time to collect the information initially, and then a reasonable time to update once per day.
Is it possible to make the program usable while doing a background reindex/load of the remote network folders? i.e. maybe just display the old index, or just the local NTFS drives, while indexing those pesky remote folders?
background indexing or loading remote folders?
Re: background indexing or loading remote folders?
The initial scan can take a long time.
Everything will be unusable during the initial scan.
Scheduled rescans should occur in the background.
You can use Everything during a background rescan.
The initial scan is designed to be as memory efficient as possible. Unfortunately, this prevents Everything from performing a search until the initial scan completes.
Everything is designed so you only add your folder indexes once.
If you are adding/removing folder indexes constantly, consider multiple instances.
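For example, a separate named instance keeps its own options and database, so folder indexes can be added and removed there without disturbing the main index (the instance name below is only an example):

Everything.exe -instance FolderTest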
I have it on my TODO list to add an option to allow searching while the index builds, at the cost of extra memory usage.
Thanks for the suggestions.
Re: background indexing or loading remote folders?
I have disabled daily updating due to this long scan time.
Will an option be added in the near future to scan a subset of such large folders?
For example, by right-clicking the folder when it is listed in the search results and pressing Ctrl+F5, or via a 'rescan' menu item or something like that.
On my large shares, usually the same (small) folders contain changes while the other data is quite static.
Re: background indexing or loading remote folders?
A quick rescan option is in development.
Quick rescanning will only rescan folders that have been modified since the last rescan.
This will only work for NTFS or compatible volumes (does not work for FAT/exFAT).
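For reference, NTFS maintains a USN change journal recording which files and folders have changed, which is the kind of volume-level change tracking that makes such a quick rescan feasible. You can inspect a volume's journal from an elevated command prompt (C: below is just an example):

fsutil usn queryjournal C: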
What is the format of the drive?
If your drives are all NTFS, consider disabling the Everything database and reindexing your NTFS volumes on startup:
- In Everything, from the Tools menu, click Options.
- Click the General tab on the left.
- Uncheck Start Everything on system startup.
- Click the NTFS tab on the left.
- For each NTFS volume:
  - Check Include in database.
- Click the Folders tab on the left.
- For each folder:
  - Click Remove.
- Click OK.
Everything.exe -nodb
Consider placing a shortcut to launch Everything.exe -nodb in shell:startup.
This will perform a fresh re-index of all your drives when Everything is launched.
This fresh re-index will likely be quicker for you than attempting to update the existing database.
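As a sketch, instead of a shortcut you could also drop a small batch file into the folder that opens when you run shell:startup (the install path below is an assumption; adjust it to wherever Everything.exe lives):

@echo off
rem Launch Everything with no database; it re-indexes from scratch.
start "" "C:\Program Files\Everything\Everything.exe" -nodb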
Re: background indexing or loading remote folders?
Thank you for this very detailed answer. I will experiment with -nodb and with indexing the NASes from a file list instead of folder indexes. I was wondering if a static index file would work better/faster than the index in the DB with Everything trying to update it (the update fails on the largest share, see viewtopic.php?f=5&t=9369). My DB is 125 MB in size, from a local SSD and a bunch of NAS shared folders.
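(For the file-list experiment: Everything's command line appears to support building an EFU file list directly, which can then be added to the index in the options; the option name and paths below are from memory and worth checking against the command-line options documentation:)

Everything.exe -create-file-list "C:\lists\nas1.efu" "\\nas1\share"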
The local SSD NTFS scan (the initial scan included) is lightning fast, even with a large and heavily loaded SSD here.
The issue is with the remote folders on the NASes. Those shares are huge and slow to do a full scan. Hence, I thought of the possibility of scanning only a (sub-(sub-...))folder of the share and replacing that tree in the database.
The shares are ext3 and ext4, accessed from Windows as SMB shares. They are on a Synology NAS.
Re: background indexing or loading remote folders?
Thanks for the suggestions and hints, all good.
Most of the time, leaving the indexes for the remote/network folders in place works well and does not restrict use. I just had a few changes (due to network changes) that meant I had to go through the whole reindex again, with no access until it completed. (Edit: a multiple-instance configuration might be the answer here, I have learned.) Certainly it's working very well with the single configuration indexing everything: it updates the remote folders once a day on arrival at work, and using the filters I can show only local, only network, or everything quite easily.
i.e. at work, the configuration indexes the local C: drive and some mapped drives, plus network folders that are all accessible. Then at home, with no VPN in operation, the whole lot is still in the index, but there is no communication with the work network or those mapped drives/remote folders. The filters I use are roughly as shown below.
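(These are hypothetical definitions: they assume the network folders are indexed under their \\server\share UNC paths and that Match Path is enabled on each filter.)

Network filter search: \\
Local filter search: !\\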
Maybe the detection (or not) of the available remote network folders could temporarily remove them from the search results? Or an option like the NTFS add/remove behaviour for pluggable drives?
I will investigate multiple configuration instances as you suggest. Again, I appreciate your efforts. What a great program, mate.