Part of the issue isn't just that they added garbage; it's that sites got better at targeting the algorithm. It's like recipe websites that started adding stories because the algorithm prioritized that. No one was asking for it, but the algorithm was broken in unexpected ways and got exploited.
So, someone could make a new search engine with a different algorithm. If it becomes successful though, it's only a matter of time before it becomes targeted and manipulated.
I can think of a solution to this. Once your new search engine is starting to get targeted, you find out who is offering these SEO services. Then you hire a gang of thugs to beat the living shit out of them.
I was actually thinking of a real way to fix it, and I think it's partially solvable. First you add noise and length preferences (short or long) to the search ranking. Then you make those customizable by the user, so everyone can have different settings. If everyone is using different settings, SEO can only optimize for what is shared. Obviously the issue is that if a "meta" for search settings emerges, then it doesn't work.
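A minimal sketch of what I mean, assuming some shared base relevance score already exists; the `Result` type, the 1000-word midpoint, and the noise scale are all made-up numbers for illustration, not anything a real engine uses:

```python
import hashlib
import random
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    base_relevance: float  # score from the shared ranking model
    doc_length: int        # document length in words

def _user_noise(user_id: str, url: str, scale: float) -> float:
    # Seed the noise from the (user, url) pair: stable for one user,
    # different across users, so there's no single ordering to optimize for.
    seed = int(hashlib.sha256(f"{user_id}:{url}".encode()).hexdigest(), 16)
    return random.Random(seed).gauss(0, scale)

def personalized_rank(results, user_id, length_pref=0.0, noise_scale=0.1):
    """Re-rank results with a per-user length preference plus seeded noise.

    length_pref: -1.0 strongly favors short documents, +1.0 favors long ones.
    noise_scale: how much per-user randomness to mix into the ordering.
    """
    def score(r: Result) -> float:
        # Squash length into [-1, 1] around an (arbitrary) 1000-word midpoint.
        length_signal = max(-1.0, min(1.0, (r.doc_length - 1000) / 1000))
        return (r.base_relevance
                + length_pref * length_signal
                + _user_noise(user_id, r.url, noise_scale))

    return sorted(results, key=score, reverse=True)

# Two users with opposite settings see different orderings of the same results.
candidates = [
    Result("example.com/short-answer", 0.82, 300),
    Result("example.com/long-guide", 0.80, 4000),
]
print([r.url for r in personalized_rank(candidates, "alice", length_pref=-0.5)])
print([r.url for r in personalized_rank(candidates, "bob", length_pref=0.5)])
```

The point is that only the shared base relevance is a stable target; the length preference and the per-user noise vary from person to person, so an SEO can only chase the part everyone has in common.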