Prompt-tuning with Attribute Guidance for Low-resource Entity Matching

arXiv cs.AI · March 23, 2026


Key Points

  • PROMPTATTRIB is a low-resource entity matching method that combines entity-level and attribute-level prompts, using fuzzy logic to infer the final matching label.
  • The approach addresses limitations of prior prompt-tuning EM work by incorporating attribute-level information and improving interpretability.
  • It introduces dropout-based contrastive learning on soft prompts, inspired by SimCSE, to boost EM performance under limited labeled data.
  • Real-world experiments across datasets demonstrate PROMPTATTRIB's effectiveness, showing improved accuracy with minimal supervision and suggesting practical applicability for low-resource EM tasks.
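The summary above says attribute-level prompt scores are fused with fuzzy logic into an entity-level label, but it does not give the exact formulas. The sketch below is a minimal illustration under assumed choices: a Gödel (min) t-norm as the fuzzy AND over per-attribute match degrees, and hypothetical thresholds `hi`/`lo` for mapping the fused score to Same, Different, or Undecidable — none of these specifics are from the paper.

```python
def fuzzy_and(scores):
    """Goedel (min) t-norm: fuzzy conjunction of attribute-level
    match degrees, each in [0, 1]. One common t-norm choice; the
    paper's actual formula may differ."""
    return min(scores)

def infer_label(attr_scores, hi=0.7, lo=0.3):
    """Map the fused attribute score to one of the three EM labels.
    The thresholds hi/lo are illustrative, not taken from the paper."""
    fused = fuzzy_and(attr_scores)
    if fused >= hi:
        return "Same"
    if fused <= lo:
        return "Different"
    return "Undecidable"

# A single very weak attribute (e.g. mismatched product IDs) can
# veto a match under the min t-norm, which is one way attribute-level
# reasoning adds interpretability: the deciding attribute is visible.
print(infer_label([0.9, 0.95, 0.8]))  # high on all attributes
print(infer_label([0.9, 0.1]))        # one attribute strongly disagrees
```

A product t-norm (multiplying the scores) would be an equally standard alternative; min has the advantage that the label can be traced back to a single responsible attribute.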

Abstract

Entity Matching (EM) is an important task that determines the logical relationship between two entities, namely Same, Different, or Undecidable. Traditional EM approaches rely heavily on supervised learning, which requires large amounts of high-quality labeled data. This labeling process is both time-consuming and costly, limiting practical applicability. As a result, there is a strong need for low-resource EM methods that can perform well with minimal labeled data. Recent prompt-tuning approaches have shown promise for low-resource EM, but they mainly focus on entity-level matching and often overlook critical attribute-level information. In addition, these methods typically lack interpretability and explainability. To address these limitations, this paper introduces PROMPTATTRIB, a comprehensive solution that tackles EM through attribute-level prompt tuning and logical reasoning. PROMPTATTRIB uses both entity-level and attribute-level prompts to incorporate richer contextual information and employs fuzzy logic formulas to infer the final matching label. By explicitly considering attributes, the model gains a deeper understanding of the entities, resulting in more accurate matching. Furthermore, PROMPTATTRIB integrates dropout-based contrastive learning on soft prompts, inspired by SimCSE, which further boosts EM performance. Extensive experiments on real-world datasets demonstrate the effectiveness of PROMPTATTRIB.
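The SimCSE-inspired component mentioned in the abstract treats two dropout-perturbed views of the same soft-prompt embedding as a positive pair and other prompts in the batch as negatives, optimizing an InfoNCE objective. The paper's implementation details (dropout rate, temperature, where dropout is applied) are not stated here, so the NumPy sketch below is an assumed minimal version: inverted dropout applied directly to a batch of soft-prompt vectors, cosine similarity, and a temperature `tau` of 0.05 as in the original SimCSE paper.

```python
import numpy as np

def dropout(x, p, rng):
    """Inverted dropout: zero each element with probability p and
    rescale the survivors so the expected activation is unchanged."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def simcse_prompt_loss(prompts, p=0.1, tau=0.05, seed=0):
    """SimCSE-style InfoNCE loss over soft prompts (illustrative).
    prompts: (batch, dim) array of soft-prompt embeddings.
    Two independent dropout masks yield two views of each prompt;
    matching views (the diagonal) are positives, the rest negatives."""
    rng = np.random.default_rng(seed)
    z1 = dropout(prompts, p, rng)
    z2 = dropout(prompts, p, rng)
    # Cosine similarity matrix between the two views, scaled by tau.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau
    # InfoNCE: -log softmax of the positive (diagonal) entry per row.
    logsumexp = np.log(np.exp(sim - sim.max(axis=1, keepdims=True))
                       .sum(axis=1)) + sim.max(axis=1)
    return float(np.mean(logsumexp - np.diag(sim)))

rng = np.random.default_rng(1)
loss = simcse_prompt_loss(rng.normal(size=(4, 16)))
print(loss)
```

In a real prompt-tuning setup the two views would come from two stochastic forward passes through the frozen language model with dropout enabled, and the loss would be backpropagated only into the soft-prompt parameters; the standalone version above just makes the positive/negative structure of the objective concrete.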