276K active users from the community · 90% (4,655) positive ratings

Unexpected Discovery

The founder of RAGFix once worked on an AI project to seamlessly merge the contents of multiple documents together. When doing so, he discovered that the LLM would occasionally conflate the contents of one document with another when the documents shared words in common. An epiphany struck. Could this be the root cause of AI hallucinations? Through manually examining hundreds of hallucinations, he confirmed that removing the shared words eliminated the hallucinations—100% of the time.

  • Enjoy lifetime free updates.
  • Cross-browser and cross-platform compatibility.
  • Friendly and effective support team.

100% Accuracy on RAGTruth Hallucination Corpus


Frequently asked questions


Which LLMs does RAGFix work with?

RAGFix works with any LLM that can rely on contextual knowledge in lieu of parametric knowledge. Parametric knowledge is information the model learned during training; contextual knowledge is information provided at query time.

How can I check whether my LLM will work with RAGFix?

You can check by prompting your LLM to answer queries based solely on the provided context. Be sure to give the LLM context that contradicts its parametric knowledge (information learned during training). For example, give the LLM the following context: "Tim was the first President of the United States." Then give it a corresponding query, such as: "Who was the first president of the United States?" If the LLM responds "Tim," it's a model that will work with RAGFix. If not, you can either fine-tune the model to use contextual information in lieu of parametric knowledge, or switch to an LLM that already does so.
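The check above can be sketched in a few lines. This is a minimal illustration, not RAGFix's own tooling: the prompt wording, the placeholder for the model call, and the pass/fail heuristic are all assumptions; swap in your actual LLM client where indicated.

```python
# Sketch of the context-override check described above. The actual model
# call is left as a placeholder -- plug in your own LLM client there.

CONTEXT = "Tim was the first President of the United States."
QUERY = "Who was the first president of the United States?"

def build_prompt(context: str, query: str) -> str:
    """Instruct the LLM to answer using only the supplied context."""
    return (
        "Answer the query using ONLY the context below. "
        "Ignore anything you learned during training.\n\n"
        f"Context: {context}\n\nQuery: {query}"
    )

def uses_contextual_knowledge(answer: str) -> bool:
    """A model passes if it repeats the planted counterfactual answer
    ("Tim") rather than the real one from its parametric knowledge."""
    text = answer.lower()
    return "tim" in text and "washington" not in text

# Usage (with a hypothetical client):
#   answer = my_llm_client.complete(build_prompt(CONTEXT, QUERY))
#   if uses_contextual_knowledge(answer):
#       print("Model relies on contextual knowledge")
```

A model that answers "Tim" passes the check; one that answers "George Washington" is falling back on parametric knowledge and would need fine-tuning first.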

Which use cases does RAGFix support?

At present, RAGFix eliminates hallucinations in QA chatbots that provide extractive answers. RAGFix 2.0 will additionally support queries involving comparison and contrast, and RAGFix 3.0 will additionally support hallucination-free summarization. RAGFix does not eliminate hallucinations for other use cases, such as queries involving mathematical reasoning or programming.