For those wondering about the power consumption of a dual 3090 rig while inferencing
Reddit r/LocalLLaMA / 5/5/2026
> "Mine is ~760W measured at the wall by a smart plug. Idle is 90W-ish. I haven't tweaked the power limit of the cards or done anything fancy."
💬 Opinion · Signals & Early Trends · Tools & Practical Usage
Key Points
- A Reddit user reports that a dual GeForce RTX 3090 setup draws about 760W during inference, measured at the wall with a smart plug.
- The system's idle power draw is around 90W.
- The figures are offered as a real-world baseline: the user has not lowered the GPUs' power limits or applied any other power tuning.
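For readers gauging running costs, the reported figures translate into energy use with simple arithmetic. The sketch below uses the post's 760W load and 90W idle numbers; the daily duty cycle and electricity price are assumptions for illustration, not from the post:

```python
# Rough energy-cost estimate from the reported wall-power figures.
# Assumptions (not from the post): 4 h/day at the 760 W inference load,
# 20 h/day at the 90 W idle draw, and $0.15/kWh electricity.

LOAD_W = 760          # reported draw under inference, at the wall
IDLE_W = 90           # reported idle draw
LOAD_HOURS = 4        # assumed daily hours under load
IDLE_HOURS = 20       # assumed daily hours idle
PRICE_PER_KWH = 0.15  # assumed electricity price, USD

# Watt-hours per day, converted to kilowatt-hours.
daily_kwh = (LOAD_W * LOAD_HOURS + IDLE_W * IDLE_HOURS) / 1000
monthly_cost = daily_kwh * 30 * PRICE_PER_KWH

print(f"{daily_kwh:.2f} kWh/day, ~${monthly_cost:.2f}/month")
```

Under these assumptions the rig works out to 4.84 kWh per day, roughly $22 per month; scale the hours and rate to your own usage.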
Related Articles
- Black Hat USA (AI Business)
- Singapore's Fraud Frontier: Why AI Scam Detection Demands Regulatory Precision (Dev.to)
- First experience with Building Apps with Google AI Studio: Incredibly simple and intuitive. (Dev.to)
- Meta will use AI to analyze height and bone structure to identify if users are underage (TechCrunch)
- Google, Microsoft, and xAI will allow the US government to review their new AI models (The Verge)