Improving LLM Performance Through Black-Box Online Tuning: A Case for Adding System Specs to Factsheets for Trusted AI
arXiv cs.AI / 3/13/2026
Key Points
- The paper proposes a black-box online controller for LLM serving that maximizes goodput by hill climbing over end-to-end measurements taken in short segments, with no internal instrumentation.
- It presents empirical evidence that this black-box online approach is well-founded and effective in practice.
- The method is demonstrated on LLM serving as a concrete example and shows potential throughput gains while meeting service-level objectives.
- The authors argue for integrating system performance and sustainability metrics into Factsheets to help organizations adopting AI systems.
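The controller described in the first key point can be sketched as a simple hill-climbing loop: measure goodput over a short segment, nudge one tuning knob, and keep the change only if goodput improved. The sketch below is illustrative, not the paper's implementation; the knob semantics (e.g. a batch-size or concurrency limit), the bounds, and the `measure_goodput` callback are all assumptions, and the synthetic goodput curve stands in for real end-to-end measurements.

```python
def hill_climb_controller(measure_goodput, knob, lo, hi, step, n_segments=20):
    """Black-box online tuning via hill climbing.

    measure_goodput(knob) is assumed to run the system for one short
    segment at the given knob setting and return the observed goodput
    (e.g. requests per second that met their SLO). No internal
    instrumentation is needed: only end-to-end measurements.
    """
    best = measure_goodput(knob)
    direction = step
    for _ in range(n_segments):
        # Propose a neighboring setting, clamped to the allowed range.
        candidate = min(hi, max(lo, knob + direction))
        g = measure_goodput(candidate)
        if g > best:
            # Improvement: adopt the candidate and keep moving this way.
            knob, best = candidate, g
        else:
            # No improvement: reverse the search direction.
            direction = -direction
    return knob, best

# Demo with a hypothetical goodput curve that peaks at knob = 12.
def synthetic_goodput(k):
    return 100 - (k - 12) ** 2

best_knob, best_goodput = hill_climb_controller(
    synthetic_goodput, knob=4, lo=1, hi=32, step=2
)
```

In a real deployment, `measure_goodput` would be noisy, so a production controller would average several segments per setting before comparing; the loop above keeps only the core idea.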