MolmoWeb 4B/8B

Reddit r/LocalLLaMA / 2026/3/25

📰 News · Signals & Early Trends · Models & Research

Key Points

  • MolmoWeb is presented as a family of fully open multimodal web agents that achieve top-tier results versus other similarly sized open models.
  • MolmoWeb-8B is reported to outperform larger closed “set-of-marks” agents built on frontier models such as GPT-4o.
  • The article claims significant improvements via test-time scaling using parallel rollouts and best-of-N selection, boosting pass@4 over pass@1 on WebVoyager and Online-Mind2Web.
  • MolmoWeb-4B is described as based on the Molmo2 architecture, using Qwen3-8B as the language component and SigLIP 2 as the vision backbone.
  • Public model artifacts are provided via Hugging Face, including MolmoWeb-4B/8B and their “Native” variants, alongside linked blog and technical report materials.

MolmoWeb is a family of fully open multimodal web agents. MolmoWeb agents achieve state-of-the-art results, outperforming similar-scale open-weight models such as Fara-7B, UI-Tars-1.5-7B, and Holo1-7B. MolmoWeb-8B also surpasses set-of-marks (SoM) agents built on much larger closed frontier models like GPT-4o. We further demonstrate consistent gains through test-time scaling via parallel rollouts with best-of-N selection, achieving 94.7% and 60.5% pass@4 (compared to 78.2% and 35.3% pass@1) on WebVoyager and Online-Mind2Web, respectively.
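To make the pass@1 vs. pass@4 gap concrete: pass@k is commonly estimated with the unbiased combinatorial formula from the code-generation literature (an assumption here; the report may compute it differently), and best-of-N selection simply keeps the highest-scoring of N parallel rollouts under some judge function. The sketch below is illustrative only; `score` stands in for whatever verifier or reward model the authors actually use.

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: probability that at least one of k
    samples drawn without replacement from n rollouts (c of which
    succeeded) is a success."""
    if n - c < k:
        return 1.0  # every size-k sample must contain a success
    return 1.0 - comb(n - c, k) / comb(n, k)

def best_of_n(rollouts, score):
    """Best-of-N selection: return the rollout ranked highest by a
    judge/scoring function (hypothetical placeholder here)."""
    return max(rollouts, key=score)

# If 1 of 4 parallel rollouts solves a task, pass@1 = 0.25 while
# pass@4 = 1.0 -- the same kind of gap the reported numbers reflect.
print(pass_at_k(4, 1, 1))  # 0.25
print(pass_at_k(4, 1, 4))  # 1.0
```

The pass@4 figures therefore measure how often at least one of four parallel rollouts succeeds, which best-of-N selection can turn into a single final answer.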

Learn more about the MolmoWeb family in our announcement blog post and tech report.

MolmoWeb-4B is based on the Molmo2 architecture, which uses Qwen3-8B as the language model and SigLIP 2 as the vision backbone.

https://huggingface.co/allenai/MolmoWeb-8B

https://huggingface.co/allenai/MolmoWeb-8B-Native

https://huggingface.co/allenai/MolmoWeb-4B

https://huggingface.co/allenai/MolmoWeb-4B-Native

submitted by /u/jacek2023