Having an always-on machine running LLMs locally at home while on the move with a lightweight machine - Experiences?

Reddit r/LocalLLaMA / 5/2/2026

💬 Opinion · Developer Stack & Infrastructure · Tools & Practical Usage

Key Points

  • A data science learner with an 8GB MacBook Air is considering upgrading either to a higher-RAM MacBook Pro for on-the-go work or to a Mac Studio/Mac mini with more RAM at home for always-on local LLM use.
  • They want to know whether remote access to the home machine for compute/RAM can feel “seamless” while they’re away, or whether it becomes cumbersome in real life.
  • The post asks for firsthand experiences with both setups, specifically for local LLM workflows and remote development tasks.
  • The core decision centers on balancing laptop portability for café/work travel against the convenience of running LLMs 24/7 at home.
  • They request recommendations from others who have tried similar configurations in order to choose a practical, efficient setup for data science and local AI.

Hi!

I’m currently retraining in data science, and my laptop is an 8 GB MacBook Air, so naturally I’m looking to upgrade. I’m also interested in AI and running LLMs locally, and I’ve been thinking about two options:

a) Get a MacBook Pro with 48-64 GB RAM
b) Get a Mac Studio / Mac mini with 64 GB RAM and keep using my MacBook Air

I’m on the go a lot and often work in cafés etc, so having the power directly in the laptop seems useful. But I’m also intrigued by the idea of having an always-on machine at home, for example running my OpenClaw / local LLM stuff 24/7.

What I'm wondering is: if I need the RAM/compute power of the Mac Studio or Mac mini while I’m out, can I access it remotely in a way that actually feels seamless? Or does that become annoying in practice?
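For context, one common pattern for this kind of remote access (an assumption on my part, not something I've tried yet) is to run an OpenAI-compatible server such as Ollama on the home machine and reach it from the laptop over a VPN like Tailscale or a plain SSH tunnel (`ssh -L 11434:localhost:11434 home`). A minimal Python sketch, assuming Ollama is listening on the home host's default port 11434 and the tunnel is up:

```python
import json
import urllib.request

# Hypothetical address of the always-on home machine. With an SSH tunnel
# or Tailscale, the remote Ollama instance appears as a local port.
HOME_HOST = "http://localhost:11434"

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a chat request for Ollama's OpenAI-compatible endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{HOME_HOST}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_request("Summarize my notes.")
    # Requires the home machine to be reachable; fails fast otherwise.
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

In this setup, "seamless" mostly comes down to how reliably the tunnel or VPN stays up and the latency between the laptop and home.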

Would be interested in experiences from people who have tried either setup, especially for data science, local LLMs, and remote development. What's your recommendation?

Thank you!

submitted by /u/ceo_of_banana