
Slang Context-based Inference Enhancement via Greedy Search-Guided Chain-of-Thought Prompting

arXiv cs.CL / 3/17/2026


Key Points

  • The paper investigates slang interpretation in LLMs, a task made challenging by contextual, cultural, and linguistic factors and by the lack of domain-specific training data.
  • It presents a greedy search-guided chain-of-thought prompting framework to improve slang meaning inference, with a focus on small language models.
  • The study finds that model size and temperature have limited impact on slang inference accuracy, with larger transformer models not outperforming smaller ones.
  • Experiments show that integrating greedy search with chain-of-thought prompting yields improved slang interpretation accuracy and underscores the value of structured reasoning for context-dependent language tasks.
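The combination described above can be sketched in miniature. This is a hypothetical illustration, not the paper's implementation: `score()` stands in for an LLM's confidence or log-probability over a candidate reasoning step, and the candidate steps per stage stand in for chain-of-thought continuations the model would propose. The greedy search keeps only the highest-scoring step at each stage and conditions later stages on the chain built so far.

```python
def score(context, step):
    # Stub for an LLM-derived score (assumption): here, simple
    # word overlap between the step and the accumulated context.
    ctx_words = set(context.lower().split())
    return len(ctx_words & set(step.lower().split()))

def greedy_cot(context, candidate_steps_per_stage):
    """Greedily pick the best reasoning step at each stage,
    appending it to the context so later stages see the chain."""
    chain = []
    for candidates in candidate_steps_per_stage:
        best = max(candidates, key=lambda s: score(context, s))
        chain.append(best)
        context = context + " " + best
    return chain

# Toy slang-interpretation example (invented for illustration).
context = "That party was fire, everyone was vibing"
stages = [
    ["'fire' literally means flames",
     "'fire' in this party context means excellent"],
    ["so the party was excellent and fun",
     "so the party burned down"],
]
chain = greedy_cot(context, stages)
print(chain[-1])  # → so the party was excellent and fun
```

In a real system the candidate steps would be sampled from a small language model and scored by the same model, but the control flow, greedy selection over structured reasoning steps, is the core of the idea.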

Abstract

Slang interpretation is a challenging downstream task for Large Language Models (LLMs) because slang expressions are inherently embedded in contextual, cultural, and linguistic frameworks. In the absence of domain-specific training data, it is difficult for LLMs to accurately interpret slang meaning from lexical information alone. This paper investigates the challenges of slang inference in LLMs and presents a greedy search-guided chain-of-thought framework for slang interpretation. Our experiments show that model size and temperature settings have limited impact on inference accuracy: transformer-based models with more active parameters do not achieve higher accuracy than smaller models. Building on these empirical findings, we integrate a greedy search algorithm with chain-of-thought prompting for small language models, yielding a framework whose experimental results demonstrate improved accuracy in slang meaning interpretation. These findings contribute to the understanding of context dependency in language models and provide a practical way to enhance slang comprehension through a structured reasoning prompting framework.