DeCoRe: Decoding by Contrasting Retrieval Heads
By Miniml Research, February 28, 2026
In EMNLP 2025 Findings
When models answer with retrieved context, hallucinations often stem from decoding choices rather than retrieval quality. DeCoRe addresses this by masking the model's retrieval heads to create a hallucination-prone variant, then contrasting the base model's output distribution against the masked model's during decoding, biasing generation toward grounded content.
The key advantage is simplicity: no extra training, no new parameters, and no changes to the retrieval index. It is a decoding-time method that can be applied to existing LLMs and RAG stacks.
Across multiple benchmarks, DeCoRe improves factuality while preserving fluency, making it a practical technique for production systems that need stronger grounding.
Paper: https://arxiv.org/abs/2410.18860
Abstract
Large Language Models (LLMs) often hallucinate, producing unfaithful or factually incorrect outputs by misrepresenting the provided context or incorrectly recalling internal knowledge. Recent studies have identified specific attention heads within the Transformer architecture, known as retrieval heads, responsible for extracting relevant contextual information. We hypothesise that masking these retrieval heads can induce hallucinations and that contrasting the outputs of the base LLM and the masked LLM can reduce hallucinations. To this end, we propose Decoding by Contrasting Retrieval Heads (DeCoRe), a novel training-free decoding strategy that amplifies information found in the context and model parameters. DeCoRe mitigates potentially hallucinated responses by dynamically contrasting the outputs of the base LLM and the masked LLM, using conditional entropy as a guide. Our extensive experiments confirm that DeCoRe significantly improves performance on tasks requiring high contextual faithfulness, such as summarisation (XSum by 18.6%), instruction following (MemoTrap by 10.9%), and open-book question answering (NQ-Open by 2.4% and NQ-Swap by 5.5%).
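The contrastive step the abstract describes can be sketched in a few lines. Below is a minimal, illustrative version of one decoding step: it takes next-token logits from the base model and from the retrieval-head-masked model, scales the contrast by the conditional entropy of the base distribution (stronger contrast when the model is more uncertain), and picks the token the contrast favours. The function name `decore_step`, the exact entropy-scaling formula, and the toy logits are assumptions for illustration, not the paper's precise implementation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over a 1-D logit vector.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def decore_step(base_logits, masked_logits, alpha=1.0):
    """One DeCoRe-style decoding step (illustrative sketch).

    base_logits:   next-token logits from the full model
    masked_logits: logits from the same model with retrieval heads masked
    alpha:         base contrast strength, dynamically scaled by the
                   conditional entropy of the base distribution
    """
    p = softmax(base_logits)
    # Conditional entropy of the base model's next-token distribution;
    # higher entropy (more uncertainty) -> stronger contrast.
    entropy = -np.sum(p * np.log(p + 1e-12))
    scale = alpha * entropy
    # Amplify what the base model favours relative to the
    # hallucination-prone masked model.
    contrasted = (1 + scale) * base_logits - scale * masked_logits
    return int(np.argmax(contrasted))

# Toy example: the base model prefers token 0, while the masked model
# (no retrieval heads) drifts toward token 1; the contrast keeps token 0.
base = np.array([2.0, 1.0, 0.5])
masked = np.array([0.0, 3.0, 0.5])
print(decore_step(base, masked))  # → 0
```

In practice the masked model is obtained by zeroing the identified retrieval heads' attention outputs at inference time, so the whole procedure stays training-free and runs on an unmodified checkpoint.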