Projection-Free Evolution Strategies for Continuous Prompt Search
arXiv cs.CL · March 17, 2026
Key Points
- The study investigates continuous prompt search as a computationally efficient alternative to parameter tuning in natural language processing tasks.
- It shows that although the prompt space has a low intrinsic dimensionality, random projections fail to capture this structure.
- The authors propose a projection-free prompt search method based on evolution strategies that optimizes directly in the full prompt space while adapting to the intrinsic dimension.
- A confidence-based regularization mechanism is introduced to improve generalization in few-shot scenarios by increasing the model's confidence in the target verbalizers.
- Experimental results on seven GLUE tasks demonstrate that the proposed approach significantly outperforms existing baselines.
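The search procedure summarized above can be sketched as follows. This is a minimal illustration, not the paper's exact method: the rank-weighted antithetic sampler, the toy linear verbalizer head `W`, the regularizer form `lam * (1 - p_target)`, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def make_loss(W, target, lam=0.1):
    """Cross-entropy on a toy linear verbalizer head (W is hypothetical),
    plus a confidence term that rewards probability mass on the target
    verbalizer -- an assumed form of the paper's regularizer."""
    def loss(p):
        probs = softmax(W @ p)
        ce = -np.log(probs[target] + 1e-9)
        conf_reg = lam * (1.0 - probs[target])  # shrinks as target confidence grows
        return ce + conf_reg
    return loss

def es_prompt_search(loss_fn, dim, iters=200, pop=20, sigma=0.1, lr=0.05, seed=0):
    """Projection-free evolution strategies: perturb the continuous prompt
    directly in its full `dim`-dimensional space (no random projection),
    estimate a descent direction from rank-weighted antithetic samples,
    and take a gradient-style step."""
    rng = np.random.default_rng(seed)
    p = np.zeros(dim)  # the continuous prompt vector being optimized
    for _ in range(iters):
        eps = rng.standard_normal((pop, dim))
        eps = np.concatenate([eps, -eps])        # antithetic pairs reduce variance
        losses = np.array([loss_fn(p + sigma * e) for e in eps])
        ranks = losses.argsort().argsort()       # rank-normalize for scale invariance
        w = ranks / (len(ranks) - 1) - 0.5       # lowest loss -> weight -0.5
        grad = (w[:, None] * eps).sum(axis=0) / (len(eps) * sigma)
        p -= lr * grad                           # step away from high-loss directions
    return p
```

Because only scalar loss values are needed, the same loop applies when the loss comes from black-box queries to a frozen language model rather than this toy head.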