This exactly. If it just said “Here’s sources with info about that, and a summary of what they say,” that would be helpful. Presenting the info as authoritative is the crux of the problem. People are too stupid *not* to trust it.
Even those summaries with sources should be used with caution. I’ve had plenty of search summaries where the AI just omitted a ‘not’ or some other vital part of the original answer (tbf that’s also the case for human-made summaries, just look at the amount of accidental misinformation on Wikipedia caused by inattentive reading of original sources).
GPT (or any other LLM) is not a source, it’s a relay that detaches information from its original sources and thus washes away whatever credibility they had.