If you have an unused phone lying around, you might be sitting on a tiny AI server. I’ve been working on a project where I modified Google AI Edge Gallery and turned it into an OpenAI-compatible API server: [Gallery as Server](https://github.com/xiaoyao9184/gallery). Your phone can run local AI inference, and you can call it just like an OpenAI API (chat/completions, etc.). Instead of letting that hardware collect dust, you can turn it into a lightweight inference node. So yeah, if you have more than one old phone, you can literally build yourself a cluster.
Unused phone as AI server
Reddit r/LocalLLaMA / 4/9/2026
💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage
Key Points
- The article suggests repurposing an unused phone as a small on-device AI inference “server.”
- It describes modifying Google’s AI Edge Gallery into an OpenAI-compatible API server so users can call it using standard chat/completions-style requests.
- It links to a GitHub project (“gallery as server”) that provides the approach for exposing the phone’s local inference via an API.
- It notes that multiple old phones could be combined into a lightweight inference cluster for greater throughput.
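Because the server exposes a standard chat/completions route, any plain HTTP client can talk to the phone. Here is a minimal sketch in Python; the host, port, and model name are hypothetical placeholders, and the `/v1/chat/completions` path is assumed from the OpenAI-compatible convention rather than confirmed from the project itself:

```python
import json
from urllib import request

# Hypothetical address of the phone on the local network; the real host,
# port, and available model names depend on how the Gallery server is set up.
PHONE_API = "http://192.168.1.50:8080/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build a standard OpenAI-style chat/completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask_phone(prompt: str) -> str:
    """POST the payload to the phone and extract the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = request.Request(
        PHONE_API,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.loads(resp.read())
    # OpenAI-compatible servers put the text at choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```

For the multi-phone "cluster" idea, the simplest version is client-side: keep a list of base URLs, one per phone, and rotate through them (e.g. with `itertools.cycle`) when issuing requests.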
Related Articles
- Black Hat USA (AI Business)
- Black Hat Asia (AI Business)
- Meta Superintelligence Lab Releases Muse Spark: A Multimodal Reasoning Model With Thought Compression and Parallel Agents (MarkTechPost)
- I tested and ranked every ai companion app I tried and here's my honest breakdown (Reddit r/artificial)
- Big Tech firms are accelerating AI investments and integration, while regulators and companies focus on safety and responsible adoption. (Dev.to)