llm-echo 0.4

Simon Willison's Blog / 4/1/2026


Key Points

  • llm-echo 0.4 was released, a debug plugin for the LLM tool that provides an echo model.
  • The update populates new prompt metadata fields, adding `input_tokens` and `output_tokens` to the response.
  • The release focuses on improving observability/debugging for LLM calls by exposing token usage directly in the plugin output.

31st March 2026

Release llm-echo 0.4 — Debug plugin for LLM providing an echo model
  • Prompts now have the input_tokens and output_tokens fields populated on the response.
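The new fields mean an echoed response now carries token counts alongside its text. As a minimal standalone sketch of the idea (not the plugin's actual code), here is how an echo-style model might populate `input_tokens` and `output_tokens` on its response, assuming simple whitespace tokenization:

```python
from dataclasses import dataclass


@dataclass
class EchoResponse:
    # Hypothetical stand-in for a response object; the field names
    # mirror the input_tokens / output_tokens mentioned in the release note.
    text: str
    input_tokens: int
    output_tokens: int


def echo_model(prompt: str) -> EchoResponse:
    """Echo the prompt back, counting whitespace-separated tokens.

    Real tokenizers count differently; this is purely illustrative.
    """
    n = len(prompt.split())
    # An echo model returns its input verbatim, so input and output
    # token counts are identical.
    return EchoResponse(text=prompt, input_tokens=n, output_tokens=n)


resp = echo_model("hello echo model")
print(resp.input_tokens, resp.output_tokens)  # 3 3
```

Exposing the counts directly on the response is what makes an echo model useful for debugging: you can assert on token accounting without calling a real provider.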
Posted 31st March 2026 at 4:48 pm

This is a post by Simon Willison, posted on 31st March 2026.

