Hey r/LocalLLaMA!
I'm back with a significant upgrade: htmLLM-124M! It's a specialized base model built specifically for high-fidelity HTML/Bootstrap autocompletion. I wanted to see how far I could push the structural logic compared to my previous 50M version.
The results? It reached a best validation loss of 0.91 and a training-loss floor of 0.27. It now understands complex grid systems and even script dependency chains.
🔗 Links
- Model on HF: LH-Tech-AI/htmLLM-124M
- Training: Open-source .ipynb included. Trains in ~8h on a single T4.
🛠️ What it can do
Sample 1: Zero-shot Bootstrap Login Grid

Input:

```html
<form class="p-4 border rounded">
  <div class="mb-3">
    <label class="form-label">Email</label>
```

Output:

```html
<div class="mb-3">
  <label class="form-label">Email</label>
  <input type="email" class="form-control" id="email"></input>
</div>
<fieldset class="form-inline mb-1">
  <div class="row">
    <div class="col-md-3 text-center">
      <div class="input-group mb-2">
        <span class="input-group-addon"><i class='fa fa-envelope' style="font-size:10px;"></i></span>
      </div>
      <div class="col-md-3 text-center">
        <input type="text" class="form-control" id="password"></input>
      </div>
    </div>
    <div class="col-md-3 text-center">
      <button type="submit" id="submitSubmit" class="btn btn-primary">Send to</button>
    </div>
```

Sample 2: Complex Navbar with Toggler Logic

Input:

```html
<nav class="navbar navbar-expand-lg navbar-light bg-light">
  <div class="container-fluid">
    <a class="navbar-brand" href="#">LH-Tech AI</a>
```

Output:

```html
<button class="navbar-toggler" type="button" data-toggle="collapse" data-target="#navbarSupportedContent" ...>
  <span class="sr-only">Toggle navigation</span>
  <span class="icon-bar"></span>
</button>
<div class="collapse navbar-collapse" id="navbarSupportedContent">
  <ul class="navbar-nav mr-auto">
    <li class="nav-item"><a class="nav-link" href="/">Home</a></li>
    <li class="nav-item"><a class="nav-link" href="/category/programming">Programming</a></li>
  </ul>
```

🚀 Big Release Weekend
As promised, I am also officially releasing the weights and code for the Apex 1.5 Series (350M), including the Coder variant, plus full-precision and INT8 ONNX exports for local-first inference!
- Apex 1.5 Coder: Link to HF
- Apex 1.5 Instruct: Link to HF
I’d love to hear your thoughts on my "Specialization over Scale" philosophy. See you in the comments!
I'm not trying to promote anything; I just want to share my open-source models with the world.
Pro-Tip: Use it for Autocomplete!
While it can handle basic instructions, this 124M model shines as a pure Autocomplete engine. It has a deep understanding of Bootstrap structures, jQuery initialization, and even specific framework syntax like Angular Material. It’s the perfect 'copilot' for your IDE's ghost text.
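If you wire it into your editor yourself, the glue code is tiny. Here's a minimal sketch of the pre/post-processing around the model call (the helper names and defaults are my own for illustration, not part of the release; the actual `transformers` generation call is left out):

```python
import re

def build_prompt(document: str, cursor: int, max_context: int = 1024) -> str:
    """Take the text before the cursor, trimmed to the last max_context
    characters, as the raw completion prompt for a base model."""
    prefix = document[:cursor]
    return prefix[-max_context:]

def truncate_completion(completion: str, max_tags: int = 3) -> str:
    """Cut the raw continuation after a few complete tags so the
    ghost-text suggestion stays short and readable."""
    count, end = 0, 0
    for m in re.finditer(r">", completion):
        count += 1
        end = m.end()
        if count >= max_tags:
            break
    return completion[:end] if count else completion
```

Feed `build_prompt(...)` to the model as a plain continuation prompt (no chat template, it's a base model) and run `truncate_completion(...)` on whatever comes back before showing it as ghost text.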
And it runs on any "potato": at 124M parameters, you can run it alongside your IDE, your browser, and 50 other tabs without even feeling it. :D
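Side note: as the raw samples above show, the model sometimes emits legacy quirks like `</input>`. If you want to filter suggestions before accepting them, a quick structural sanity check with the Python stdlib is enough (again, my own helper, not shipped with the model):

```python
from html.parser import HTMLParser

# Void elements never take a closing tag in HTML5.
VOID = {"area", "base", "br", "col", "embed", "hr", "img",
        "input", "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Track open tags to detect unclosed or mismatched elements."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.balanced = True

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.balanced = False  # stray close, or closes the wrong tag

def is_balanced(fragment: str) -> bool:
    """True if every non-void tag in the fragment is properly closed."""
    checker = TagBalanceChecker()
    checker.feed(fragment)
    return checker.balanced and not checker.stack
```

A check like this correctly rejects `</input>` (a void element should never be closed), so you can silently drop those completions and re-sample.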