Information Asymmetry across Language Varieties: A Case Study on Cantonese-Mandarin and Bavarian-German QA

arXiv cs.CL / 3/17/2026

Key Points

  • The authors construct a novel QA dataset to study information asymmetry between local language editions (Cantonese vs Mandarin; Bavarian vs German) using Wikipedia as the knowledge source.
  • Experiments show LLMs fail to answer questions about information present only in local editions, though providing context from lead sections and translation can substantially improve performance.
  • The findings demonstrate the value of local Wikipedia editions for both regional and global information and raise questions about inclusivity and cultural coverage of LLMs.
  • The work suggests directions to improve LLMs by leveraging localized sources and translations to close knowledge gaps across language varieties.

Abstract

Large Language Models (LLMs) are becoming a common way for humans to seek knowledge, yet their coverage and reliability vary widely. For local language varieties in particular, there are large asymmetries, e.g., information in a local Wikipedia edition that is absent from the standard variety's edition. However, little is known about how well LLMs perform under such information asymmetry, especially for closely related languages. We manually construct a novel challenge question-answering (QA) dataset that captures knowledge conveyed on local Wikipedia pages but absent from their higher-resource counterparts, covering Mandarin Chinese vs. Cantonese and German vs. Bavarian. Our experiments show that LLMs fail to answer questions about information found only in local editions of Wikipedia. Providing context from lead sections substantially improves performance, with further gains possible via translation. Our topical and geographic annotations, together with stratified evaluations, reveal the usefulness of local Wikipedia editions as sources of both regional and global information. These findings raise critical questions about the inclusivity and cultural coverage of LLMs.
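The evaluation conditions the abstract describes, closed-book answering versus answering with the local edition's lead section as context (optionally translated), can be sketched roughly as below. This is an illustrative reconstruction, not the authors' actual prompts or metrics; the function names, prompt wording, and the exact-match scorer are all assumptions.

```python
# Hypothetical sketch of the three evaluation conditions from the paper:
# (1) closed-book, (2) with the local Wikipedia lead section as context,
# (3) with a translated lead section as context. Prompt wording and
# helper names are illustrative assumptions, not the authors' setup.

def build_prompt(question, lead_section=None, translate=None):
    """Assemble a QA prompt; the lead section is prepended when given,
    after an optional translation step."""
    parts = []
    if lead_section is not None:
        context = translate(lead_section) if translate else lead_section
        parts.append(f"Context: {context}")
    parts.append(f"Question: {question}")
    parts.append("Answer:")
    return "\n".join(parts)

def exact_match(prediction, gold):
    """Simple normalized exact-match scoring (a common QA metric;
    whether the paper uses it is an assumption here)."""
    return prediction.strip().lower() == gold.strip().lower()

# Condition 1: closed-book, the model sees only the question.
closed_book = build_prompt("Which Bavarian parish fair is described?")

# Condition 2: lead section from the local edition as context.
with_context = build_prompt(
    "Which Bavarian parish fair is described?",
    lead_section="Der Kirta is a traditional Bavarian parish fair...",
)

# Condition 3: same, but the context is translated first
# (here a placeholder function stands in for a real MT system).
with_translation = build_prompt(
    "Which Bavarian parish fair is described?",
    lead_section="Der Kirta is a traditional Bavarian parish fair...",
    translate=lambda text: text,  # identity stand-in for translation
)
```

Sending each prompt to an LLM and scoring the three conditions separately would reproduce the kind of stratified comparison the abstract reports, where added context and translation close part of the gap.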