- cross-posted to:
- politics@beehaw.org
AI-generated dossiers from the Jmail team.
Archived version: https://archive.is/newest/https://www.theverge.com/policy/879508/jikipedia-epstein-email-encyclopedia
I think you’re thinking of a search engine; AIs are bad at searching. They just make up a response related to the search term based on their training data.
AI systems designed for searching normally use https://en.wikipedia.org/wiki/Retrieval-augmented_generation afaik, which involves directly referencing documents rather than guessing based on initial training data.
Wouldn’t you just train them on the documents to be searched?
You get a lot fewer hallucinations if it’s presenting data from sources rather than from its neural network alone. Training data isn’t, like, “in” the AI. It’s just used to shape its creation.
Normally no, because that is much more difficult and resource-intensive, and it’s harder to get reliable results that way than by separately looking up the information and including it in the prompt.
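That retrieve-then-prompt pattern can be sketched in a few lines. This is a toy illustration, not any real system: the keyword-overlap scorer stands in for a proper vector search, and the `llm(prompt)` call it would feed is hypothetical.

```python
def retrieve(query, documents, k=2):
    """Rank documents by shared query words and return the top k.
    A real RAG setup would use embeddings, but the idea is the same."""
    terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Paste the retrieved passages directly into the prompt, so the model
    answers from the supplied sources rather than from its weights alone."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return f"Answer using only these excerpts:\n{context}\n\nQuestion: {query}"
```

The point is that the documents never touch the model’s training; they’re fetched at question time and handed over as context, which is why it’s cheaper and more verifiable than retraining on the trove.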
deleted by creator
And if you train them on the document trove, they will be able to answer questions about it. It is a straight up trivial task.
Make up answers about it, you mean. The answers might be right, or they might be wrong; you won’t know unless you read the actual data. So helpful …
“Give me the line numbers corresponding to the saudi sheik saying he liked the torture videos”
Are you trying to be obstinate on purpose?
I don’t think I’m trying to be obstinate, I just am.
A natural! Mazel tov!
If we are talking about LLMs, the other commenter is entirely right about how they function. But I’m not sure you two are talking about the same technology.
Can an LLM provide me the information I want given a search term if trained on the given dataset? Yes. That is all.
Maybe
It can provide you with some information that looks similar to what you’d want. Whether it is correct is another question.
RAG can help to a degree but hallucinations still happen quite a bit.