[D] Seeking Advice: WSL2 vs Dual Boot for ML development with an RTX 5080

Reddit r/MachineLearning / 3/15/2026

💬 Opinion · Developer Stack & Infrastructure · Tools & Practical Usage

Key Points

  • The post weighs WSL2 on Windows 11 versus a dual-boot Linux setup for ML development and GPU compute with an RTX 5080.
  • The user has two NVMe drives (Windows 990 PRO and an unused EVO Plus) and is considering whether to install Linux natively on the EVO Plus for future-proof ML work or rely on WSL2 CUDA on Windows.
  • The plan is to SSH from a MacBook Pro into a Linux environment to leverage the GPU, aiming for simultaneous access without frequent reboots.
  • They are seeking real-world experiences and any walls or limitations encountered with WSL2 for ML work.

Hi fellow devs,

I'm getting into ML and trying to figure out the best setup for local development and training. My main question: WSL2 or dual boot Windows 11 / Ubuntu?

My situation:

- My current daily driver is a Windows 11 home PC, and my laptop is an Intel i7 MacBook Pro. The plan is to use the MacBook to SSH into the Linux environment and leverage the GPU for compute.

- I rarely game, so rebooting into Linux isn't a huge dealbreaker, but having Linux available alongside Windows would be more convenient: I already have things set up on Windows, so I wouldn't have to reboot every time I switch over.
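For the remote-access part, either setup looks the same from the MacBook side: plain SSH into whatever Linux environment is running. A minimal sketch of the Mac-side config (hostname, IP, user, and key path are all placeholders, and the Linux side needs an SSH server such as `openssh-server` running):

```
# ~/.ssh/config on the MacBook (names and paths are hypothetical)
Host mlbox
    HostName 192.168.1.50        # the desktop's LAN IP (assumption)
    User you
    IdentityFile ~/.ssh/id_ed25519
```

Then `ssh mlbox` connects, and running `nvidia-smi` on the remote end is a quick check that the GPU is visible. One caveat for the WSL2 case: WSL2 sits behind a NAT by default, so reaching its SSH server from another machine typically needs a Windows port-forwarding rule or WSL's mirrored-networking mode, whereas a native Linux install listens on the LAN directly.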

PC specs:

- RTX 5080

- AMD 9800X3D

- 64GB RAM

- 2TB Samsung 990 PRO (Windows drive)

- 2TB Samsung 990 EVO Plus (completely unused; I was originally reserving this for a dual-boot Linux install before learning about WSL2)

The unused EVO Plus is what's making me lean toward dual boot: it's just sitting there, and a native Linux install feels more future-proof for serious ML work. But WSL2 + CUDA seems like a much faster path to being productive, and I think I can put the WSL2 virtual disk directly on the EVO Plus.
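If the WSL2 route wins out, the distro's virtual disk can indeed live on the second drive. WSL doesn't take an install-location flag directly, but the documented export/import flow relocates it; a sketch assuming the EVO Plus is mounted as `E:` and the distro is named `Ubuntu` (check yours with `wsl --list`):

```powershell
wsl --shutdown                                     # stop all running distros first
wsl --export Ubuntu E:\ubuntu-backup.tar           # dump the distro to a tarball
wsl --unregister Ubuntu                            # remove the copy on the Windows drive
wsl --import Ubuntu E:\wsl\Ubuntu E:\ubuntu-backup.tar   # re-register with the vhdx on E:
```

After the import, the default user may reset to root; it can be restored per-distro via `/etc/wsl.conf` (`[user]` section) inside the distro.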

What would you do in my position, and have you hit any real walls with WSL2 for ML work specifically?

submitted by /u/lipstickpickups