It wasn't that long ago that you had to combine hard drives into an array to get enough space for 100M+ moderately sized files. Now you can buy a single drive with 20TB+ capacity, enough for about 200M files at an average size of 100KB.
File systems were never designed for fast search, so a number of 'file indexers' exist to speed it up. macOS's Spotlight, Windows Search, and Linux's locate are just a few. Each crawls the file system and stores file metadata in some form of database for faster lookup. Updating that database can be slow when large numbers of files are indexed.
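To make the crawl-then-query idea concrete, here is a minimal sketch of that approach in Python, using `os.walk` to crawl and SQLite to hold the metadata. This is an illustration of the general technique, not how Spotlight, Windows Search, or locate are actually implemented; the table layout and function names are my own invention.

```python
import os
import sqlite3
import tempfile

def build_index(root, db_path=":memory:"):
    """Crawl a directory tree and store file metadata in SQLite."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS files ("
        "  path TEXT PRIMARY KEY,"
        "  name TEXT,"
        "  size INTEGER,"
        "  mtime REAL)"
    )
    # The secondary index on name is what makes lookups fast:
    # queries hit a B-tree instead of re-walking the tree.
    conn.execute("CREATE INDEX IF NOT EXISTS idx_name ON files(name)")
    rows = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for fn in filenames:
            full = os.path.join(dirpath, fn)
            st = os.stat(full)
            rows.append((full, fn, st.st_size, st.st_mtime))
    conn.executemany("INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    return conn

def search(conn, name):
    """Look up files by name via the database, not a filesystem crawl."""
    return [row[0] for row in
            conn.execute("SELECT path FROM files WHERE name = ?", (name,))]

# Demo: index a small temporary tree, then query it.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for rel in ("a.txt", os.path.join("sub", "b.txt")):
    with open(os.path.join(root, rel), "w") as f:
        f.write("hello")
conn = build_index(root)
print(search(conn, "b.txt"))
```

Even this toy version shows where the update cost comes from: refreshing the index means re-walking the tree and re-stating every file, which is exactly the part that gets painful at 100M+ files.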
What is the largest number of files that you have successfully indexed using one of these systems? Has anyone tried it on a volume with over 100M files?