My boss recently spent around $13k on a high-end workstation intended to run local AI (LLMs and similar), and I’ve been tasked with figuring out how to get everything properly set up. Neither of us is particularly technical.
From what I understand, the system includes:
• AMD Threadripper PRO platform
• NVIDIA RTX PRO 6000 (Blackwell) with 96GB VRAM
• 128GB ECC RAM
• Gen5 NVMe storage
• Currently running Windows

One of the main drivers here is security and privacy: he’s especially interested in local-first setups (he’s mentioned tools like Nemoclaw), which is why we’re avoiding cloud solutions.
I’m not looking for setup instructions so much as advice on how to find and vet the right person to do this properly.
Specifically:
• Where do you find people qualified for this type of work?
• What kind of background should I be looking for (ML engineer, MLOps, sysadmin, etc.)?
• What are red flags when hiring for something like this?
• What questions would you ask to confirm they actually know what they’re doing?
• Can this realistically be done remotely, or is in-person better? My boss would strongly prefer someone local (East Brunswick, NJ area) who can work with us in person if possible.
I’d really appreciate any advice on how to approach this the right way — I want to avoid wasting time or hiring the wrong person. Thanks in advance.