Banned from cloud services at work. Is a local AI worth it?

Reddit r/LocalLLaMA / 2026/3/24

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage

Key points

  • A Reddit user’s workplace has prohibited uploading proprietary data to cloud services, pushing them to consider running an AI assistant locally for document analysis and report writing.
  • With a ~$1500 budget and strong requirements for portability, they’re comparing a small startup device (TiinyAI) advertising ~80GB RAM and 190 TOPS versus a Mac mini M4 with 64GB RAM enabled by a trade-in discount.
  • They don’t need very large models but want to run roughly 30B-class models smoothly and potentially run multiple smaller models concurrently.
  • The core decision hinges on practical constraints like hardware capability for local LLM inference, device size, and the risk that a small startup may not ship on time.

My company just banned us from putting any proprietary data into cloud services for security reasons, so I need help deciding between two PCs. My main requirement is portability; the smaller the better. I need an AI assistant for document analysis and writing reports. I don't need massive models; I just want to run 30B models smoothly, and maybe some smaller ones at the same time. I currently have two options with a budget of around $1500:

  1. TiinyAI: I saw their ads. 80GB RAM and 190 TOPS, and the size is very small. However, they are a startup, and I'm not sure if they will ship on time.

  2. Mac mini M4 with 64GB RAM: I can use a trade-in to get about $300 off by giving them my old Mac.

Is there a better choice for my budget? Appreciate your advice.
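As a rough sanity check on whether 64GB of unified memory is enough for a 30B-class model, here is a back-of-envelope sketch. The helper function and the 1.2× overhead factor are assumptions for illustration, not from the post; real deployments also need headroom for the KV cache, context length, and the runtime itself.

```python
# Back-of-envelope memory estimate for running a quantized LLM locally.
# Hypothetical helper: weights = params * bits / 8, plus a rough
# overhead multiplier for runtime/KV-cache headroom (assumed 1.2x).

def model_memory_gb(params_billion: float, bits_per_weight: float,
                    overhead_factor: float = 1.2) -> float:
    """Approximate memory footprint in GB for the model weights plus overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead_factor / 1e9

# A 30B model at 4-bit quantization: 15 GB of weights, ~18 GB with overhead.
print(round(model_memory_gb(30, 4), 1))  # 18.0
# Even at 8-bit (30 GB of weights, ~36 GB total), 64 GB leaves room
# for a second, smaller model alongside it.
print(round(model_memory_gb(30, 8), 1))  # 36.0
```

By this estimate, either machine's advertised RAM would hold a quantized 30B model, so the decision comes down to the other constraints the poster raises: size, price, and shipping risk.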

submitted by /u/daksh_0623