Or if you are set on using AI Overviews to research products, then be intentional about asking for negatives and always fact-check the output as it is likely to include hallucinations.
If you have to fact-check something every single time you use it, what benefit does it actually provide?
It might be able to give you tables or otherwise collated sets of information about multiple products etc.
I don’t know if Google’s does, but LLMs can. They can also do unit conversions, though you probably still want to check the critical ones. It’s a bit like using an encyclopedia or a catalog, except more convenient and even less reliable.
You can do unit conversions with PowerToys on Windows, Spotlight on macOS, and whatever they call the nifty search bar on various Linux desktop environments, without even hitting the internet, and with exactly the same convenience as an LLM. Doing discrete tasks like that through LLM inference is the most inefficient and stupid way to do them.
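Those launcher tools do this with plain deterministic arithmetic, no inference required. A minimal sketch of the idea (the units and factors here are just illustrative):

```python
# Trivial local unit converter: a lookup table of factors to a base unit
# (meters, for length). No network, no model inference; just arithmetic.
FACTORS = {
    "m": 1.0,
    "km": 1000.0,
    "mi": 1609.344,   # international mile
    "ft": 0.3048,
    "in": 0.0254,
}

def convert(value: float, src: str, dst: str) -> float:
    """Convert `value` from unit `src` to unit `dst` via the base unit."""
    return value * FACTORS[src] / FACTORS[dst]

print(convert(5, "km", "mi"))  # 5 km expressed in miles
```

The answer is exact up to floating-point rounding every single time, which is precisely the property an LLM cannot guarantee.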
All things were doable before. The point is that they were manual extra steps.
It hasn’t stopped anyone from using ChatGPT, which has become their biggest competitor since the inception of web search.
So yes, it’s dumb, but they kind of have to do it at this point. And they need everyone to know it’s available from the site they’re already using, so they push it on everyone.
No, they don’t have to use defective technology just because everyone else is.
Yes. They do.