microsoft/harrier-oss 27B/0.6B/270M

Reddit r/LocalLLaMA / 3/30/2026

📰 News · Signals & Early Trends · Models & Research

Key Points

  • Microsoft has released harrier-oss-v1, a family of multilingual text embedding models available in 27B, 0.6B, and 270M sizes via Hugging Face.
  • The models use decoder-only architectures with last-token pooling and L2 normalization to generate dense embeddings for downstream NLP tasks.
  • harrier-oss-v1 targets use cases such as retrieval, clustering, semantic similarity, classification, bitext mining, and reranking.
  • The release reports state-of-the-art performance on the Multilingual MTEB v2 benchmark as of the release date.
  • Multiple model sizes are provided, enabling teams to trade off quality and compute for embedding-related pipelines.

harrier-oss-v1 is a family of multilingual text embedding models developed by Microsoft. The models use decoder-only architectures with last-token pooling and L2 normalization to produce dense text embeddings, which can be applied to a wide range of tasks including retrieval, clustering, semantic similarity, classification, bitext mining, and reranking. The release reports state-of-the-art results on the Multilingual MTEB v2 benchmark as of the release date.
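The pooling scheme described above is a common pattern for decoder-only embedding models: take the hidden state at the last real (non-padding) token of each sequence, then L2-normalize it so cosine similarity reduces to a dot product. A minimal sketch of that logic, assuming hypothetical hidden states and an attention mask as you would get from a Hugging Face tokenizer/model (the post does not document the exact API for these models):

```python
import numpy as np

def last_token_pool(hidden_states, attention_mask):
    """Pick the hidden state at the last non-padding token of each sequence.

    hidden_states: (batch, seq_len, dim) float array
    attention_mask: (batch, seq_len) array of 1s (real tokens) and 0s (padding)
    """
    # Index of the final real token per sequence (assumes right padding).
    last_idx = attention_mask.sum(axis=1) - 1
    batch_idx = np.arange(hidden_states.shape[0])
    return hidden_states[batch_idx, last_idx]

def embed(hidden_states, attention_mask):
    """Last-token pooling followed by L2 normalization."""
    pooled = last_token_pool(hidden_states, attention_mask)
    norms = np.linalg.norm(pooled, axis=1, keepdims=True)
    return pooled / norms

# With unit-norm embeddings, cosine similarity is just a matrix product.
def cosine_sim(a, b):
    return a @ b.T
```

Note that with left padding (common for decoder-only models at inference time) the last position is always a real token, so the indexing step simplifies to taking position `-1`.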

https://huggingface.co/microsoft/harrier-oss-v1-27b

https://huggingface.co/microsoft/harrier-oss-v1-0.6b

https://huggingface.co/microsoft/harrier-oss-v1-270m

submitted by /u/jacek2023