I currently have a Synology 220+ and a couple of VPSes, and I'm looking to consolidate while getting out of Synology's walled garden. I've already got a couple of 3.5" drives in the Synology and four 2.5" drives lying around, and I'm planning on running a number of Docker containers and a couple of VMs.

That said, I've never built anything before, and basically just went to PCPartPicker, started with the case, filtered for 5-star ratings on each component, and went from there. So… how absurd is my build?

PCPartPicker Part List

Type Item Price
CPU AMD Ryzen 5 5600X 3.7 GHz 6-Core Processor $135.00 @ Amazon
CPU Cooler Cooler Master MasterLiquid 360L Core ARGB Liquid CPU Cooler $90.71 @ Amazon
Motherboard MSI MAG B550 TOMAHAWK ATX AM4 Motherboard $165.99 @ B&H
Memory TEAMGROUP T-Force Vulcan Z 16 GB (2 x 8 GB) DDR4-3200 CL16 Memory $26.99 @ Amazon
Storage Seagate IronWolf NAS 8 TB 3.5" 7200 RPM Internal Hard Drive Purchased For $179.00
Storage Seagate IronWolf NAS 8 TB 3.5" 7200 RPM Internal Hard Drive Purchased For $179.00
Storage Seagate IronWolf NAS 8 TB 3.5" 7200 RPM Internal Hard Drive $159.99 @ Adorama
Case Fractal Design Meshify 2 ATX Mid Tower Case $173.89 @ Newegg
Power Supply Corsair RM650 (2023) 650 W 80+ Gold Certified Fully Modular ATX Power Supply $89.99 @ Corsair
Prices include shipping, taxes, rebates, and discounts
Total $1200.56
Generated by PCPartPicker 2025-05-23 19:32 EDT-0400
  • DesolateMood@lemm.ee · 5 months ago

    I don't know what kind of server you're running, but if you plan to host any video then you want a dedicated GPU or a CPU with integrated graphics (and even if you're not, I think it's a good idea anyway), which the 5600X doesn't have.

    I also think a $90 water cooler is overboard. Just get an air cooler for half or a third of the price.

    • themadcodger@kbin.earth (OP) · 5 months ago

      Someone else mentioned the cooler too, so that's out for sure. To be honest, I never really thought about graphics in the traditional sense, but I do need something for at least Jellyfin transcoding, and maybe a small LLM. Would it be better to get a dedicated GPU or a CPU with integrated graphics?

      • jws_shadotak@sh.itjust.works · 5 months ago

        The gold standard for transcoding these days is a newer-gen Intel CPU with integrated graphics.

        The integrated GPU on those Intel chips can do Intel Quick Sync Video (QSV), which will handle a dozen streams at once without breaking a sweat.
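        For reference, hardware transcoding in a containerized Jellyfin setup mostly comes down to passing the iGPU's render node through to the container. A minimal Docker Compose sketch (the media path, config path, and render-node device are assumptions; check `/dev/dri` on your host):

        ```yaml
        # Hypothetical Jellyfin service with Intel iGPU passthrough for QSV/VA-API.
        services:
          jellyfin:
            image: jellyfin/jellyfin
            devices:
              - /dev/dri/renderD128:/dev/dri/renderD128  # iGPU render node (path may differ)
            volumes:
              - ./jellyfin-config:/config   # assumed config location
              - /mnt/media:/media:ro        # assumed media library, read-only
            ports:
              - "8096:8096"                 # Jellyfin's default HTTP port
        ```

        You'd then enable QSV under Dashboard → Playback → Transcoding in the Jellyfin admin UI.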

      • DesolateMood@lemm.ee · 5 months ago

        Dedicated GPUs are obviously going to be more powerful. I've never run AI before, so maybe someone else can weigh in on the requirements for it, but I can say for sure that an iGPU is good enough for Jellyfin transcoding. It also depends on your budget: do you want to spend the extra money just for a dedicated GPU?

        If you go the iGPU route, I think Intel is recommended over AMD, but you should probably do extra research on that before buying.

        • yaroto98@lemmy.org · 5 months ago

          My AMD iGPU works just fine for Jellyfin. LLMs are a little slow, but that's to be expected.

          • themadcodger@kbin.earth (OP) · 5 months ago

            Yeah, I'm not sure if I really want to deal with an LLM. It would mostly be for Home Assistant, so nothing too crazy.

            • yaroto98@lemmy.org · 5 months ago

              I built a very similar NAS. The Home Assistant usage doesn't really even move the needle. I'm running around 50 Docker containers and chilling at about 10% CPU.

              • themadcodger@kbin.earth (OP) · 5 months ago

                Is it the LLM for Home Assistant, or just HA in general, that doesn't move the needle? My HA is also pretty low-key, but I was considering running my own small LLM with HA to get off of OpenAI. My current AI usage is very small, so I wouldn't need too much on the GPU side, I'd imagine, but I don't know what's sufficient.

                • yaroto98@lemmy.org · 5 months ago

                  Just Home Assistant doesn't move the needle. The LLMs hit the iGPU hard, and my CPU usage spikes to 70–80% when one is thinking.

                  But the LLMs I'm running are through Ollama and InvokeAI, each with several different models, just for fun.
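                  If you want to try the Home Assistant route, a minimal sketch of an Ollama container that HA's Ollama integration could point at (11434 is Ollama's default API port; the volume path and model tag below are assumptions):

                  ```yaml
                  # Hypothetical Ollama service for a small local LLM.
                  services:
                    ollama:
                      image: ollama/ollama
                      ports:
                        - "11434:11434"            # Ollama's default API port
                      volumes:
                        - ./ollama:/root/.ollama   # assumed model storage location
                  ```

                  After it's up, pull a small model with something like `docker exec ollama ollama pull llama3.2:3b` (pick whatever small model fits your hardware), then point the Home Assistant Ollama integration at `http://<host>:11434`.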

      • Oniononon@sopuli.xyz · 5 months ago

        People build Jellyfin media servers out of NASes which have complete garbage computing power. It really doesn't take much unless you have like four 4K TVs and fell for the Dolby Atmos scam.