Hello all. Could Nautilus search be sped up? Below is the issue I was going to submit, but the issue template says to start a discussion here first.
Motivation
Searching for files in Nautilus is much slower than on the command line with slocate/locate. Times and details:
The command line takes about one hundredth of a second:
$ time locate T-200-4
real 0m0.011s
Files takes 13 seconds (measured with a stopwatch). So Nautilus/Files is over a THOUSAND times slower for a typical global search on my laptop.
Machine:
2TB SSD
8-core (16-thread) AMD Ryzen 7 5700U
32 GB RAM
Proposal
If locate or slocate is installed, search with that.
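A minimal sketch of the idea in shell, assuming a bash-like environment; the function name search_files and the fallback to find are purely illustrative, not anything Nautilus does today:

# Hypothetical sketch of the proposal: prefer locate when it is installed,
# otherwise fall back to a (much slower) recursive find.
search_files() {
    if command -v locate >/dev/null 2>&1; then
        locate "$1"
    else
        find "$HOME" -iname "*$1*" 2>/dev/null
    fi
}

# Example:
search_files T-200-4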
Considerations
The locate/slocate database may not yet contain very new files, and may still list files that were deleted recently.
This suggests that locate can be run and each returned path checked for existence, to avoid false positives. A simple prototype suggests this would take about 0.1 seconds for the example above:
(base) max@pop:~/$ time bash -c 'locate T-200-4 | xargs -d "\n" -I{} stat {}'
real 0m0.105s
user 0m0.055s
sys 0m0.054s
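As a possible refinement of that prototype (a sketch, assuming a GNU/mlocate/plocate-style locate), the existence check can be done in the shell instead of spawning a stat process per hit, and most locate implementations can also do the check natively:

# Filter locate hits through a shell test; only paths that still exist are printed.
locate T-200-4 | while IFS= read -r path; do
    [ -e "$path" ] && printf '%s\n' "$path"
done

# Most locate implementations also offer this check built in:
locate -e T-200-4    # -e / --existing: only print entries whose files still exist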
locate seems to use a database to get search results. Files already has database-backed search via Tracker. I think that, instead of relying on arbitrarily available binaries which the user might or might not otherwise use, the default paths indexed by Tracker should be expanded. That alone should already improve performance significantly.
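For example, here is a sketch of how the indexed locations could be widened today, assuming Tracker 3's filesystem miner and its GSettings schema (the extra path at the end is purely illustrative):

# Show the directories Tracker's filesystem miner currently indexes recursively.
gsettings get org.freedesktop.Tracker3.Miner.Files index-recursive-directories

# Illustrative only: append an extra data directory to the recursively indexed set.
# '&DOCUMENTS' etc. are Tracker's shortcuts for the XDG user directories.
gsettings set org.freedesktop.Tracker3.Miner.Files index-recursive-directories \
  "['&DESKTOP', '&DOCUMENTS', '&DOWNLOAD', '&MUSIC', '&PICTURES', '&VIDEOS', '/home/max/data']"

After changing this, the miner has to (re)index the new locations before they show up in search results.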
Try out Nautilus 46’s global search feature (the one in the sidebar); it will give you instantaneous results, although they may be incomplete due to this Tracker issue or this Tracker issue.
As for regular in-folder recursive search, which uses both Tracker and the fallback “simple” search, and for search performance in general, it would be best to first fix the existing performance issues related to search or the views. If search is still slow after that, we can investigate further.