Fine-Tuned Large Language Models for Logical Translation: Reducing Hallucinations with Lang2Logic
Published in 2025 IEEE International Symposium on Networks, Computers and Communications (ISNCC)
Abstract
Recent advances in natural language processing (NLP), particularly large language models (LLMs), have motivated the automatic translation of natural language statements into formal logic without human intervention. This enables automated reasoning and facilitates debugging, finding loop invariants, and checking conformance to specifications in software systems. However, hallucinations (incorrect outputs generated by LLMs) pose a challenge, particularly for logical translation tasks that require precision. This work introduces a novel framework that takes English sentences as input, converts them into logical expressions, and then translates those expressions into Conjunctive Normal Form (CNF) for satisfiability solving. It employs classical NLP techniques with a self-defined grammar, symbolic computation libraries, and a fine-tuned language model to reduce hallucinations.
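The final stage of the pipeline described above (logical expression → CNF → satisfiability check) can be sketched with a symbolic computation library. The snippet below is a minimal illustration, not the paper's implementation: it assumes SymPy as the symbolic library and uses hypothetical atoms `R` and `W` standing in for a translated English sentence.

```python
from sympy import symbols
from sympy.logic.boolalg import to_cnf
from sympy.logic.inference import satisfiable

# Hypothetical atoms for "it rains" (R) and "the ground is wet" (W),
# as might be produced by translating an English sentence to logic.
R, W = symbols('R W')

# Premises: "if it rains, the ground is wet", "it rains", "the ground
# is not wet" -- a deliberately contradictory (unsatisfiable) set.
expr = (R >> W) & R & ~W

cnf = to_cnf(expr)        # convert to Conjunctive Normal Form
result = satisfiable(cnf) # SAT check on the CNF formula

print(cnf)     # (R) & (~W) & (W | ~R)
print(result)  # False -- the premises are contradictory
```

For a satisfiable formula, `satisfiable` instead returns a model, i.e. a dictionary mapping each atom to a truth value that makes the formula true.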
Recommended citation: M. Pan, D. Kodakandla, and M. Farooque. (2025). "Fine-Tuned Large Language Models for Logical Translation: Reducing Hallucinations with Lang2Logic." 2025 IEEE International Symposium on Networks, Computers and Communications (ISNCC). pp. 1-4.
