HUOZIIME: An On-Device LLM-enhanced Input Method for Deep Personalization

arXiv cs.AI / 4/17/2026


Key Points

  • The paper introduces HUOZIIME, an on-device mobile input method editor (IME) enhanced with a lightweight LLM to provide personalized, privacy-preserving text suggestions.
  • HUOZIIME achieves initial human-like prediction by post-training a base LLM on synthesized personalization data.
  • It uses a hierarchical memory mechanism to continuously capture and leverage each user’s input history for ongoing personalization.
  • The authors report system-level optimizations specifically aimed at making LLM-based IMEs efficient and responsive within mobile hardware constraints.
  • Experiments indicate that HUOZIIME can run effectively on-device and deliver high-fidelity personalization driven by user memory, with code and a package released on GitHub.
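The hierarchical memory mechanism mentioned above can be pictured as a small multi-level store: recent session inputs answer quickly, while an accumulated long-term profile backs them up. The sketch below is purely illustrative — the paper's summary does not specify the actual data structures, so the two-level `HierarchicalMemory` class, its sizes, and its ranking rule are assumptions, not the authors' design.

```python
from collections import Counter, deque

class HierarchicalMemory:
    """Toy two-level user memory for an IME (hypothetical sketch):
    a short-term buffer of recent inputs plus a long-term frequency
    profile of committed phrases."""

    def __init__(self, short_term_size=5):
        self.short_term = deque(maxlen=short_term_size)  # recent session inputs
        self.long_term = Counter()                       # accumulated phrase counts

    def record(self, phrase):
        # Capture every committed input at both levels.
        self.short_term.append(phrase)
        self.long_term[phrase] += 1

    def suggest(self, prefix, k=3):
        # Prefer recent matches (most recent first, deduplicated),
        # then fall back to historically frequent phrases.
        recent = list(dict.fromkeys(
            p for p in reversed(self.short_term) if p.startswith(prefix)))
        frequent = [p for p, _ in self.long_term.most_common()
                    if p.startswith(prefix) and p not in recent]
        return (recent + frequent)[:k]

mem = HierarchicalMemory()
mem.record("good morning")
mem.record("good luck")
mem.record("good morning")
print(mem.suggest("good"))  # recency wins: ['good morning', 'good luck']
```

In a real LLM-based IME, retrieved memory entries like these would presumably be injected into the model's context to condition generation, rather than returned directly as suggestions.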

Abstract

Mobile input method editors (IMEs) are the primary interface for text input, yet they remain constrained to manual typing and struggle to produce personalized text. While lightweight large language models (LLMs) make on-device auxiliary generation feasible, enabling deeply personalized, privacy-preserving, and real-time generative IMEs poses fundamental challenges. To this end, we present HUOZIIME, a personalized on-device IME powered by an LLM. We endow HUOZIIME with initial human-like prediction ability by post-training a base LLM on synthesized personalization data. Notably, a hierarchical memory mechanism is designed to continually capture and leverage user-specific input history. Furthermore, we perform systemic optimizations tailored to on-device LLM-based IME deployment, ensuring efficient and responsive operation under mobile constraints. Experiments demonstrate efficient on-device execution and high-fidelity memory-driven personalization. Code and package are available at https://github.com/Shan-HIT/HuoziIME.