Chaotic CNN for Limited Data Image Classification
arXiv cs.CV / 4/17/2026
Key Points
- The paper proposes a chaos-based nonlinear feature transformation to improve CNN generalization under limited training data without increasing model complexity.
- It applies logistic, skew tent, and sine maps to normalized feature vectors before the classification layer to reshape the feature space and enhance class separability.
- Experiments on MNIST, Fashion-MNIST, and CIFAR-10 using CNNs of varying depth show consistent gains over a standalone CNN baseline across all datasets.
- The reported maximum improvements include +5.43% on MNIST (skew tent map, 3-layer CNN, 40 samples/class), +9.11% on Fashion-MNIST (sine map, 3-layer CNN, 50 samples/class), and +7.47% on CIFAR-10 (skew tent map, 200 samples/class).
- The method is computationally efficient, introduces no additional trainable parameters, and is straightforward to integrate into existing CNN architectures for data-scarce image classification.
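The transformation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the map parameters (logistic `r=4.0`, tent breakpoint `p=0.6`), the min-max normalization, and the single-iteration default are assumptions chosen so each map stays chaotic on the unit interval.

```python
import numpy as np

def logistic_map(x, r=4.0):
    # Logistic map x' = r*x*(1-x); fully chaotic at r = 4.0 (assumed parameter)
    return r * x * (1.0 - x)

def skew_tent_map(x, p=0.6):
    # Skew tent map with breakpoint p in (0, 1) (p = 0.6 is an assumption)
    return np.where(x < p, x / p, (1.0 - x) / (1.0 - p))

def sine_map(x, a=1.0):
    # Sine map x' = a*sin(pi*x), mapping [0, 1] onto itself for a = 1
    return a * np.sin(np.pi * x)

def chaotic_transform(features, chaos_map, iterations=1):
    # Min-max normalize the feature vector into [0, 1] so the chaotic
    # maps operate within their defined domain, then apply the map.
    # No trainable parameters are introduced, matching the paper's claim.
    f_min, f_max = features.min(), features.max()
    x = (features - f_min) / (f_max - f_min + 1e-12)
    for _ in range(iterations):
        x = chaos_map(x)
    return x

# Usage: reshape a (hypothetical) 128-dim feature vector before the classifier
feats = np.random.rand(128).astype(np.float32)
transformed = chaotic_transform(feats, skew_tent_map)
```

In practice this would sit between the CNN's penultimate feature layer and its classification head; since all three maps are fixed, element-wise functions on [0, 1], they add negligible compute and no parameters.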


