Yes… I should have planned better 😅 What is my best option to mount two BIG 3090s in the same home server case when the first card partially obscures the second (bifurcated) PCIe slot? Both cards will be power limited to 220W. I see three possible solutions.
- Option 1: Mount the second 3090 in the lowest possible position, below the motherboard, about half an inch above the top of the power supply, and use a 180° riser cable to loop back above the motherboard into the PCIe slot. Airflow to one of the card's three fans is somewhat restricted.
- Option 2: Same as 1, but move the power supply to the front of the case, giving the second card more airflow.
- Option 3: Same as 2, but use a vertical mount to secure the second card to the case, potentially getting even better airflow.

Options 2 and 3 require finding a way to mount the flipped power supply to the bottom of the case, then running a short extension cord to the back of the case. Is it worth it? If so, please send suggestions for how to secure a power supply to the bottom of the case safely.
Need help with the logistics of two BIG 3090s in the same case.
Reddit r/LocalLLaMA / 3/30/2026
💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage
Key Points
- The post asks for advice on how to physically install two NVIDIA RTX 3090 GPUs (“BIG 3090s”) into a single home server case where one card partially blocks the other’s PCIe slot.
- The author proposes three mounting approaches: placing the second card in the lowest possible position and looping a 180° PCIe riser back to the slot, relocating the PSU to the front of the case to improve airflow, or using a vertical mount for the second card.
- They plan to power-limit both GPUs to 220W, making cooling and airflow management central to the decision (a sketch of applying such a limit follows this list).
- Options 2 and 3 require mounting the PSU at the bottom of the case and running a short extension cord to the back, and the author requests safe, practical suggestions for securing the PSU in that position.
- The discussion is framed as hands-on hardware/logistics problem-solving for a multi-GPU setup commonly used for local inference/training workflows.
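The 220W cap itself is straightforward to script. Below is a minimal sketch, assuming the standard `nvidia-smi` tool shipped with the NVIDIA driver is available and the script runs with root privileges; the GPU indices and helper function are illustrative, and only the 220W figure comes from the post.

```python
#!/usr/bin/env python3
"""Apply a 220 W power limit to both 3090s via nvidia-smi.

Assumes the stock NVIDIA driver tools are installed and the script
runs as root; GPU indices 0 and 1 are an assumption, not from the post.
"""
import subprocess

POWER_LIMIT_WATTS = 220   # per-card cap mentioned in the post
GPU_INDICES = [0, 1]      # assumed enumeration order of the two 3090s

def set_power_limit(index: int, watts: int) -> None:
    # Enable persistence mode so the driver stays loaded between workloads.
    subprocess.run(["nvidia-smi", "-i", str(index), "-pm", "1"], check=True)
    # Apply the power cap (nvidia-smi takes the limit in watts).
    subprocess.run(["nvidia-smi", "-i", str(index), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    for idx in GPU_INDICES:
        set_power_limit(idx, POWER_LIMIT_WATTS)
        print(f"GPU {idx}: power limit set to {POWER_LIMIT_WATTS} W")
```

Note that power limits set this way do not survive a reboot, so a setup like this is typically reapplied at boot, for example from a systemd unit.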