The fact is, though, the average person is starting to replace their search engine with ChatGPT, Gemini, Grok, or whatever other LLM, and I have seen more and more small associations using generative AI to make their posters instead of working with an artist or doing it themselves.
A good blogpost on this: The Enclosure feedback loop
When everybody uses AI to search, it becomes a closed system that holds all the info. It doesn’t even need to be good at it, but it gatekeeps knowledge that used to be free on the internet. It’s a self-reinforcing loop.
Is this because LLMs are getting better, or because search engines are getting worse?
Because they are definitely getting worse. I get redirected to a brand new slopsite daily.
It’s because people think AI is like the ones in the movies (because it’s been advertised that way, too): omniscient and infallible. A short while ago I overheard an “imagine, even the AI didn’t know it!” where the honest answer would have been “your search didn’t return any results”.
Search engines peaked in the early 2010s and have been deteriorating ever since, becoming virtually unusable around 2020.
Seriously, Google has become unusable without adding “site:reddit.com” to almost every search. I would like to see something like Perplexity compared to a proper search engine, if one existed.
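The “site:reddit.com” trick above is easy to automate if you build search URLs from a script. A minimal sketch (the helper name and the idea of always scoping to Reddit are just for illustration):

```python
from urllib.parse import quote_plus

def reddit_scoped_search_url(query: str) -> str:
    """Build a Google search URL that restricts results to reddit.com,
    mirroring the manual 'site:reddit.com' trick from the comment."""
    return "https://www.google.com/search?q=" + quote_plus(f"{query} site:reddit.com")

# The operator is simply appended to whatever you were going to search anyway:
print(reddit_scoped_search_url("best mechanical keyboard"))
```

The `site:` operator works the same way on most engines (DuckDuckGo, Bing, Kagi), so the scoped query string itself is portable even if the base URL changes.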
everything peaked in the 2010s ;)
I agree with the concern about Reddit in responses. Many times they include the shitposts.
AI slop has just accelerated the downfall of search engines. The attention-based economy, advertising, and SEO are the reasons you can’t find anything useful anymore. The Internet itself is broken, and even if there were a good search engine it would struggle not to suffocate in the SEO crap out there.
At this point, I am willing to pay a subscription for a search engine if it’s ad-free and shows me what I’m looking for.
Kagi, searxng, startpage, ddg if lazy…
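Of the options above, SearXNG is the self-hostable one, and if you run your own instance with the JSON output format enabled in its settings, you can even query it from scripts. A rough sketch (the instance URL is a placeholder, and this assumes `json` is listed in the instance's allowed formats):

```python
from urllib.parse import urlencode

# Placeholder: point this at your own SearXNG instance.
SEARXNG_URL = "http://localhost:8888/search"

def searxng_query_url(query: str) -> str:
    """Build a query URL for a self-hosted SearXNG instance.
    Requires 'json' to be among the instance's enabled output formats."""
    return SEARXNG_URL + "?" + urlencode({"q": query, "format": "json"})

# Fetching the URL (e.g. with urllib.request.urlopen) returns a JSON body
# whose "results" list holds dicts with "title" and "url" keys.
print(searxng_query_url("ad-free search engines"))
```

The nice part of self-hosting is that the results aggregation happens on your box, so there is no ad-driven ranking between you and the upstream engines.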
Kagi is great but paid, and some people have feelings about them using Yandex.
There are so many solutions. People are lazy and keep using trash Google that doesn’t even work. People ask me how in the world I find things. Because I am literate and I know how to use a computer. It’s not hard, guys.
Kagi sounds like what I’m looking for. Will consider it. Thanks for the information!
I never said LLMs or generative AI are good. I was talking about the post just being wishful thinking.
Google started making their search engine worse and always pushing things that they thought would make them money. It’s not surprising people are trying something else.
I work in infrastructure, and what’s concerning is that the younger guys are skipping learning to script and automate processes, instead just pasting in slop from LLMs when they have no idea what it’s doing.
Some have also delegated learning problem solving to it as well, so when things go wrong, they’re clueless without it.
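For what it’s worth, the kind of script being skipped here is rarely more than a few lines. A hypothetical example of the sort of automation meant (the log format and pattern are invented for illustration):

```python
import re

def count_failed_logins(log_lines: list[str]) -> int:
    """Count lines that look like failed SSH logins in an auth log.
    The regex is illustrative; adjust it to your actual log format."""
    pattern = re.compile(r"Failed password for (invalid user )?\S+")
    return sum(1 for line in log_lines if pattern.search(line))

sample = [
    "Jan 10 sshd[1]: Failed password for invalid user admin from 10.0.0.5",
    "Jan 10 sshd[2]: Accepted password for alice from 10.0.0.9",
    "Jan 10 sshd[3]: Failed password for root from 10.0.0.5",
]
print(count_failed_logins(sample))  # → 2
```

The point is that something this small is fully understandable, and debuggable, by the person who wrote it, which is exactly what gets lost when it is pasted in wholesale.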
Yeah, and it makes sense for the average person to do that. Because Google, Bing, etc, have enshittified their search results so badly that the first few pages of results for any question are almost guaranteed to be AI-generated websites anyway. So you can take the answer the AI gives you, or you can click through to an AI generated website, which is just using the AI with extra steps. Or you can commit the extra time and energy to actually get a useful result written by a human being, which is significantly harder than it used to be, because the useful results are hidden behind decades of search engine optimization and the last few years of AI slop.
None of those options are actually good.
The ubiquity of LLMs hasn’t made search results better. It’s made people more willing to accept worse results.
“Enshittified”: a new descriptor for my vocabulary. 😀