For those wondering about the power consumption of a dual 3090 rig while inferencing

Reddit r/LocalLLaMA / 5/5/2026

💬 Opinion · Signals & Early Trends · Tools & Practical Usage

Key Points

  • A Reddit user reports that a dual GeForce RTX 3090 setup during inference consumes about 760W as measured at the wall using a smart plug.
  • They note that the system’s idle power usage is around 90W (“Idle is 90W-ish”).
  • The measurement is presented as a real-world baseline without special optimization.
  • The user states they have not adjusted or tweaked the GPUs’ power limits or performed any other advanced power-saving configurations.
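The reported figures make it easy to estimate running costs. The sketch below is illustrative, not from the post: the electricity rate and duty cycle are assumptions, while the 760W load and 90W idle numbers are the measurements reported by the user.

```python
# Rough energy-cost estimate from the wall measurements in the post.
# The electricity rate and duty cycle below are illustrative assumptions,
# not figures from the original post.

LOAD_W = 760   # reported draw during inference, measured at the wall
IDLE_W = 90    # reported idle draw

def daily_kwh(load_w, idle_w, load_fraction):
    """Average kWh per day for a rig spending `load_fraction` of its
    time inferencing and the rest idle."""
    avg_w = load_w * load_fraction + idle_w * (1 - load_fraction)
    return avg_w * 24 / 1000

# Hypothetical scenario: inferencing 10% of the day at $0.15/kWh.
kwh = daily_kwh(LOAD_W, IDLE_W, 0.10)
print(f"{kwh:.2f} kWh/day, ~${kwh * 0.15:.2f}/day")  # → 3.77 kWh/day, ~$0.57/day
```

At a 10% duty cycle the idle draw dominates the daily total, which is why the user's unoptimized 90W idle figure matters as much as the peak number.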

Mine is ~760W measured at the wall by a smart plug.

Idle is 90W-ish.

I haven't tweaked the power limit of the cards or done anything fancy.

submitted by /u/sdfgeoff