Generative AI promises to make legal research faster and more thorough—if a technique aimed at stopping the technology from fabricating information can be perfected.
A method called retrieval-augmented generation, or RAG, is the leading contender to prevent AI hallucinations. Developers of AI tools across a wide range of applications are using RAG, and the method has been widely embraced by the legal technology industry over the last year. But a recent study and real-world use cases are raising questions about how thoroughly RAG eliminates errors in legal research AI systems, according to several people in the ...
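At its core, RAG works by retrieving relevant source documents first and then instructing the model to answer only from those sources rather than from its internal memory. The sketch below illustrates the idea in miniature; the `retrieve` ranker is a toy keyword-overlap stand-in for the vector search a real system would use, and the prompt wording and sample corpus are illustrative assumptions, not any vendor's actual implementation.

```python
def retrieve(query, documents, k=2):
    """Toy retrieval step: rank documents by naive keyword overlap
    with the query (a real system would use embedding similarity)."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query, passages):
    """Augmentation step: ground the model's answer in the retrieved
    passages -- the mechanism RAG uses to curb hallucination."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using ONLY the sources below; if the answer is not "
        "in them, say so.\n"
        f"Sources:\n{context}\n"
        f"Question: {query}"
    )


# Hypothetical mini-corpus of legal snippets, for illustration only.
corpus = [
    "Smith v. Jones (2019) held that e-signatures satisfy the statute of frauds.",
    "The 2021 amendment raised the small-claims limit to $10,000.",
]

prompt = build_prompt(
    "What is the small-claims limit?",
    retrieve("small-claims limit", corpus),
)
```

The grounding instruction in the prompt is the crucial part: the generation model is asked to cite only what was retrieved, which is why retrieval quality—not just the model—determines whether errors still slip through.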