Relocation of compact sets in $\mathbb{R}^n$ by diffeomorphisms and linear separability of datasets in $\mathbb{R}^n$

arXiv cs.LG / 4/24/2026


Key Points

  • The paper develops a theory showing how a finite collection of compact sets in ℝⁿ can be relocated to arbitrary target domains in ℝⁿ using self-diffeomorphisms of ℝⁿ.
  • It proves that for any such collection, there exists a differentiable embedding into ℝⁿ⁺¹ such that the resulting images are linearly separable.
  • As applications to data science, the authors show that finite datasets in ℝⁿ can be made linearly separable by width-n deep neural networks (DNNs) using Leaky-ReLU, ELU, or SELU activations under a mild condition.
  • The study further demonstrates that any finite number of pairwise disjoint compact datasets in ℝⁿ can be made linearly separable in ℝⁿ⁺¹ using a width-(n+1) DNN.
  • Overall, the results link geometric/topological flexibility (diffeomorphisms and embeddings) with practical separability guarantees for neural-network classifiers.
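The embedding result can be illustrated with a classic toy case (this is not the paper's construction, which relies on diffeomorphisms of ℝⁿ and a differentiable embedding into ℝⁿ⁺¹): two concentric point sets in ℝ² are not linearly separable, but lifting them to ℝ³ with the hypothetical map φ(x, y) = (x, y, x² + y²) makes a separating hyperplane easy to find.

```python
import numpy as np

# Toy illustration (not the paper's construction): two concentric rings
# in R^2 are not linearly separable, but an embedding into R^3 whose
# extra coordinate is the squared norm makes them linearly separable.

rng = np.random.default_rng(0)

def ring(radius, n=200, noise=0.05):
    """Sample n points near a circle of the given radius."""
    theta = rng.uniform(0, 2 * np.pi, n)
    r = radius + rng.normal(0, noise, n)
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

inner, outer = ring(1.0), ring(3.0)

def lift(pts):
    """Differentiable embedding R^2 -> R^3: (x, y) -> (x, y, x^2 + y^2)."""
    return np.column_stack([pts, (pts ** 2).sum(axis=1)])

z_inner = lift(inner)[:, 2]
z_outer = lift(outer)[:, 2]

# The horizontal hyperplane z = c, with c between the two clusters,
# separates the lifted images.
c = (z_inner.max() + z_outer.min()) / 2
print(z_inner.max() < c < z_outer.min())
```

Here the extra dimension does the work: any two disjoint compact sets that are "nested" in ℝⁿ can be pulled apart along the new coordinate, which is the intuition behind the paper's ℝⁿ⁺¹ separability guarantee.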

Abstract

Relocation of compact sets in an n-dimensional manifold by self-diffeomorphisms is of interest in its own right and has significant potential applications to data classification in data science. This paper presents a theory for relocating a finite number of compact sets in \mathbb{R}^n to arbitrary target domains in \mathbb{R}^n by diffeomorphisms of \mathbb{R}^n. Furthermore, we prove that for any such collection, there exists a differentiable embedding into \mathbb{R}^{n+1} such that their images become linearly separable. As applications of the established theory, we show that a finite number of compact datasets in \mathbb{R}^n can be made linearly separable by width-n deep neural networks (DNNs) with Leaky-ReLU, ELU, or SELU activation functions, under a mild condition. In addition, we show that any finite number of mutually disjoint compact datasets in \mathbb{R}^n can be made linearly separable in \mathbb{R}^{n+1} by a width-(n+1) DNN.