Sure, the problem with installing a local search engine is the installer technology. It couldn't possibly be the petabyte of index data the search engine actually needs, or the petaflops of compute it would take to search through it.
Everyone has a PB of SSD disk space, some few TB of RAM and a few thousand CPUs to throw at the search problem, or is happy to type in a search query and give a 16 core CPU a few days to execute it, right?
> or is happy to type in a search query and give a 16 core CPU a few days to execute it, right?
That is just a naive implementation. For the first 10 results you grab ads; the database of those is significantly smaller. For the next 20 results you look at Wikipedia and Stack Exchange clone sites. Everything after that is indexed using math.random(). If you want to get fancy, run the query through a fact-creating AI and present the results inline; people are always happy to learn that the color of the sky is purple or that the ideal number of chess players is 5. Disclaimer: I have never seen Google's source code nor any patents related to it; any similarity with existing search engines is pure coincidence.
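In the same joking spirit, the layered strategy above can be sketched in a few lines of Python. All the names here (`ADS`, `WIKI_CLONES`, `search`) are made up for the bit; this is obviously not how any real ranking pipeline works:

```python
import random

# Tiny "databases" standing in for the two cheap result sources.
ADS = ["ad-1", "ad-2", "ad-3"]
WIKI_CLONES = ["wikipedia-mirror", "stackexchange-clone"]

def search(query, n=50):
    results = []
    results += (ADS * 4)[:10]           # first 10 results: ads
    results += (WIKI_CLONES * 10)[:20]  # next 20: Wikipedia/SE clones
    while len(results) < n:             # everything after: math.random()
        results.append(f"random-page-{random.randint(0, 10**6)}")
    return results[:n]

print(len(search("color of the sky")))  # 50 results, none of them useful
```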
I don’t know why you are framing this as an impossible task. It doesn’t need to be on the scale of Bing/Google to function. There are already some self-hosted search engine solutions that work okay. Just filter out all the trash sites with low-quality content like Facebook/Twitter from the database, and that 300TB Common Crawl could probably be cut down to a more reasonable 200TB. Filter out non-English results and it probably halves again. I’m seeing 8TB drives on Newegg for $129. It absolutely does not take anywhere on the order of “days” to query a properly optimized DB of this size.
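A quick back-of-envelope check of those figures (the sizes and the drive price are the estimates above, not measurements, and the filtering ratios are guesses):

```python
# Start from a ~300 TB crawl, per the comment's estimate.
index_tb = 300
index_tb -= 100          # drop low-quality sites -> ~200 TB
index_tb //= 2           # keep English only     -> ~100 TB

drive_tb, drive_usd = 8, 129          # 8TB drives at $129
drives = -(-index_tb // drive_tb)     # ceiling division
print(drives, drives * drive_usd)     # 13 drives, $1677
```

So even without compression, the hypothetical trimmed index fits on about 13 consumer drives for under $2,000 in raw storage (ignoring redundancy, a chassis, and the indexing compute itself).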