Usually for a shell, our options are either to give an LLM direct access to our system or to set up Podman/Docker. This project aims to be a simple alternative: agents can search, edit, and create files like they normally would, in a fully sandboxed environment. It's mainly for Bun/Node.js but should also work fine in the browser. We can mount directories into the shell, and we can define custom programs. It comes with 39 built-in programs, like ls, rm, sed, grep, head, tail, and wc, as well as an SVG renderer and a CLI for editing TOML files.

How to use: this is just a TypeScript library to integrate into a project. There are examples in the README, and I can make an MCP server if anyone is interested.

npm: https://www.npmjs.com/package/wasm-shell
repo: https://github.com/amytimed/wasm-shell
project: WASM shell for LLM agents, easy, no setup, sandboxed
Reddit r/LocalLLaMA / 3/19/2026
💬 Opinion · Developer Stack & Infrastructure · Tools & Practical Usage
Key Points
- The project provides a sandboxed WASM shell for LLM agents to search, edit, and create files without full system access.
- It supports mounting directories, custom programs, and ships with 39 built-in commands plus an SVG renderer and a TOML editing CLI.
- It is mainly designed for Bun/Node.js but should also work in the browser, and is distributed as a TypeScript library with README examples.
- The project is available as an npm package (wasm-shell) and on GitHub (amytimed/wasm-shell), with potential to add an MCP server if there is interest.
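The sandboxing idea in the points above can be sketched as a plain in-memory virtual filesystem: file operations are confined to a map, and host directories are "mounted" by copying a snapshot in. The class and method names below are illustrative only and are NOT the wasm-shell API; see the project README for real usage examples.

```typescript
// Minimal sketch of a sandboxed in-memory filesystem with shell-like
// commands. Illustrative only — not the wasm-shell API.
class SandboxFS {
  private files = new Map<string, string>();

  // "Mount" a host-side snapshot: entries are copied into the sandbox,
  // so the agent never touches the real filesystem.
  mount(prefix: string, entries: Record<string, string>): void {
    for (const [name, content] of Object.entries(entries)) {
      this.files.set(`${prefix}/${name}`, content);
    }
  }

  // ls: list paths under a directory prefix
  ls(dir: string): string[] {
    return [...this.files.keys()].filter((p) => p.startsWith(dir + "/"));
  }

  // cat / write / rm: basic file operations, confined to the map
  cat(path: string): string | undefined {
    return this.files.get(path);
  }
  write(path: string, content: string): void {
    this.files.set(path, content);
  }
  rm(path: string): boolean {
    return this.files.delete(path);
  }
}

// Usage: an agent can read, edit, and create files without ever
// leaving the sandbox.
const sandbox = new SandboxFS();
sandbox.mount("/project", { "config.toml": "name = 'demo'\n" });
sandbox.write("/project/notes.txt", "hello");
console.log(sandbox.ls("/project"));
```

In the real library, built-in programs such as grep or sed would be WASM binaries operating over a virtual filesystem like this, which is what makes the approach work both in Bun/Node.js and in the browser.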
Related Articles

Astral to Join OpenAI
Dev.to

I Built a MITM Proxy to See What Claude Code Actually Sends to Anthropic
Dev.to

Your AI coding agent is installing vulnerable packages. I built the fix.
Dev.to

ChatGPT Prompt Engineering for Freelancers: Unlocking Efficient Client Communication
Dev.to

PearlOS. We gave swarm intelligence a local desktop environment and code control to self-evolve. Has been pretty incredible to see so far. Open source and free if you want your own.
Reddit r/LocalLLaMA