Background

I’m planning to build a secondary server that can handle more intense tasks than my current basic home server: tasks such as light gaming (think “remote Steam Deck”), with the option to later upgrade it with an Nvidia graphics card for AI tasks such as LLMs and Stable Diffusion (SD).

The problem

While I have no problem picking parts to build this as a “desktop computer”, I’m completely lost when trying to make it power efficient for idle load (if it’s even possible with a power-hungry Nvidia card). I’d appreciate some guidance even if it’s not a full parts list suggestion!

Watching Wolfgang’s videos has unfortunately not translated knowledge into practice for me yet. At least I know TDP isn’t an absolute determining factor anymore.
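For what it’s worth, the software side of idle tuning is the easiest part to experiment with before buying anything. A minimal sketch using powertop on a Linux host (package names and exact output vary per distro; this is not persistent across reboots):

```shell
# Run powertop interactively to see per-device power estimates
# and which tunables are still in the "Bad" state
sudo powertop

# Apply all recommended power-saving tunables in one go
# (not persistent; re-run at boot via a service if it helps)
sudo powertop --auto-tune

# Generate an HTML report to compare before/after settings
sudo powertop --html=powertop-report.html
```

Actual idle numbers still need a wall-plug meter, since software estimates don’t account for PSU or VRM losses.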


Planning the build

Due to a limited budget, the idea is to split the build into two phases.

Phase 1 (gaming):

  • Budget: $1000 (ideally below ~$800)
  • Use for local headless gaming (with Bazzite?)
  • At least as powerful as a Steam Deck
  • Parts:
    • APU:
      • Perhaps: “AMD Ryzen 7 7800X3D ($400)”
      • or “AMD Ryzen 5 7600X ($300)”?
    • Motherboard:
      • No specific requirement, will mainly just use the PCIe x16 slot for a single GPU when upgrading in “phase 2”.
      • Okay with gigabit ethernet and basic I/O.
    • Power supply:
      • Power-efficient PSU that can handle a 4080/4090-class card.
    • Cooling:
      • Air-cooled preferred
    • Storage:
      • Samsung 990 PRO NVMe M.2 SSD 1TB (~$100)
    • RAM:
      • Corsair Vengeance LPX Black DDR4 3200MHz 2x16GB (~$70)
    • Case:
      • As long as it fits a large graphics card

Phase 2 (AI: LLM/SD):

  • Budget: ~$2000
  • After 1–2 years, upgrade with an Nvidia graphics card.
    • Ideally something with 24GB VRAM, like the 4090.
    • Prefer Nvidia due to compatibility with SD.
  • Open to suggestions, since wanting low power draw with a 4090 might sound contradictory.
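On the “low power draw with a 4090” point: Nvidia cards expose a software power cap through nvidia-smi, which is commonly used to trade a small amount of performance for a much lower power ceiling. A sketch (the 300 W figure is an arbitrary example, not a recommendation):

```shell
# Keep the driver loaded between jobs; on headless boxes this
# avoids repeated driver init and can reduce idle draw
sudo nvidia-smi -pm 1

# Show the card's default, current, and min/max enforceable limits
nvidia-smi -i 0 -q -d POWER

# Cap board power to 300 W (example value; must fall inside the
# min/max range printed above; not persistent across reboots)
sudo nvidia-smi -i 0 -pl 300
```

Note this caps load power; idle draw is governed by the driver’s own clock gating, which is why whole-system idle tuning matters separately.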

In case I missed any crucial information, let me know!

  • Mantis8497@lemm.ee (OP) · 10 months ago

    With all the helpful comments shared in this thread, I’m starting to realize that this approach is likely the only viable solution.

    Previously, when doing my research, I was naive enough to assume that when people said “…30W at idle”, they meant their GPU specifically, not their whole system. Now things make a lot more sense.