CoAction: Cross-task Correlation-aware Pareto Set Learning

arXiv cs.LG / 5/5/2026


Key Points

  • Pareto set learning (PSL) trains neural networks to map preference vectors to Pareto-optimal solutions, but prior work typically solves one multi-objective problem at a time, limiting scalability in multi-task settings.
  • The paper introduces CoAction, a cross-task correlation-aware PSL framework that jointly learns multiple tasks by using a task-aware Transformer architecture.
  • CoAction distinguishes tasks via task-specific embedding vectors while still enabling knowledge sharing and modeling correlations between tasks.
  • The Transformer encoder backbone leverages self-attention to capture complex dependencies across tasks, improving overall multi-task optimization quality.
  • Experiments on multitask test suites (benchmarks and real-world applications) show competitive results across key metrics such as Hypervolume, Range, and Sparsity.
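The metrics named above can be made concrete with a small sketch. Below is a minimal, illustrative implementation of two of them for a 2-objective minimization problem: Hypervolume (area dominated by the front up to a reference point) and Sparsity (mean squared gap between consecutive front points per objective, one common definition). These are standard formulations, not the paper's code; the paper's exact definitions, including its Range metric, may differ.

```python
import numpy as np

def hypervolume_2d(front, ref):
    # front: (n, 2) array of non-dominated points of a 2-objective
    # minimization problem; ref: reference point dominated by all of them.
    pts = front[np.argsort(front[:, 0])]      # sort by first objective
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)  # add one rectangular slab
        prev_f2 = f2
    return hv

def sparsity(front):
    # Mean squared gap between consecutive solutions, per objective
    # (as used in some multi-objective RL/PSL work; definitions vary).
    n = front.shape[0]
    if n < 2:
        return 0.0
    s = 0.0
    for j in range(front.shape[1]):
        v = np.sort(front[:, j])
        s += np.sum((v[1:] - v[:-1]) ** 2)
    return s / (n - 1)

front = np.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]])
print(hypervolume_2d(front, np.array([4.0, 4.0])))  # 6.0
print(sparsity(front))                               # 2.0
```

Higher hypervolume and lower sparsity indicate a front that is both closer to optimal and more evenly spread.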

Abstract

Pareto set learning (PSL) is an emerging paradigm in multi-objective optimization that trains neural networks to map preference vectors to Pareto-optimal solutions. However, existing PSL methods primarily focus on solving a single multi-objective optimization problem at a time. This limitation not only increases computational costs in multi-objective multitask optimization scenarios by requiring a separate model for each task, but also fails to exploit correlations across tasks. To address this, we propose a Cross-tAsk correlation-aware Pareto Set Learning (CoAction) framework, which leverages a task-aware Transformer to handle multiple tasks simultaneously. Specifically, by assigning task-specific embedding vectors to individual tasks, the model effectively distinguishes between tasks while facilitating knowledge sharing among them. We utilize a Transformer encoder as the backbone architecture, leveraging its self-attention mechanism to capture complex task dependencies. The proposed approach is evaluated on comprehensive multitask test suites covering both benchmark problems and real-world applications, demonstrating effectiveness and competitive performance in Hypervolume, Range, and Sparsity.
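The architectural idea in the abstract can be sketched in a few lines of numpy: each task token concatenates a preference vector with a learned task-specific embedding, and single-head self-attention across the task tokens is where cross-task correlations would be modeled. All dimensions and weights here are hypothetical stand-ins for learned parameters; this is a minimal sketch of the idea, not the authors' implementation (which uses a full Transformer encoder).

```python
import numpy as np

rng = np.random.default_rng(0)

T, D_PREF, D_EMB, D = 3, 2, 8, 16  # tasks, preference dim, task-embedding dim, model dim
D_SOL = 4                          # decision-variable dim (hypothetical)

# Task-specific embeddings let the shared model distinguish tasks.
task_emb = rng.normal(size=(T, D_EMB))

# Shared projections (stand-ins for trained weights).
W_in = rng.normal(size=(D_PREF + D_EMB, D)) / np.sqrt(D_PREF + D_EMB)
W_q, W_k, W_v = (rng.normal(size=(D, D)) / np.sqrt(D) for _ in range(3))
W_out = rng.normal(size=(D, D_SOL)) / np.sqrt(D)

def coaction_forward(prefs):
    """Map one preference vector per task to one candidate solution per task.

    prefs: (T, D_PREF) rows on the probability simplex. Self-attention
    mixes the T task tokens, so each task's output can depend on the others.
    """
    x = np.concatenate([prefs, task_emb], axis=1) @ W_in       # (T, D) task tokens
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    scores = q @ k.T / np.sqrt(D)
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)                    # row-wise softmax
    h = attn @ v                                               # cross-task mixing
    return h @ W_out                                           # (T, D_SOL) solutions

prefs = rng.dirichlet(np.ones(D_PREF), size=T)  # one preference vector per task
solutions = coaction_forward(prefs)
print(solutions.shape)  # (3, 4)
```

In training, the weights would be optimized so that, for sampled preferences, each task's output scalarizes well under its own objectives; sharing one network across tasks is what amortizes the per-task cost mentioned in the abstract.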