• qjkxbmwvz@startrek.website
    link
    fedilink
    English
    arrow-up
    32
    ·
    6 months ago

    Googling around, it seems a 21" CRT draws around 100W, which isn’t as much as I thought; it’s kind of a fluorescent light with more steps. A fluorescent-backlit LCD doesn’t use a whole lot less, and a modern 30-something-inch LED-backlit monitor uses, as far as I can tell, about 1/3 of that. So, for typical-sized monitors, only ~70W more per CRT.
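
    A quick back-of-the-envelope check of those figures (the 100W CRT and the ~1/3 LED estimate are googled guesses, not measurements):

```python
# Assumed figures: 21" CRT ~100 W, modern LED-backlit LCD roughly a third of that.
crt_watts = 100
led_watts = crt_watts / 3

extra = crt_watts - led_watts
print(f"CRT draws about {extra:.0f} W more per monitor")
```

    Which rounds up to the ~70W figure.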

    In contrast, the GPU wars mean that (I think?) power consumption in gaming desktops has gone up substantially — a 500W PSU was fairly beefy in 2003, whereas 1000W or more is pretty standard for a gaming computer now (obviously it’s not drawing rated power, but assuming the headroom percentage is roughly the same…).

    My completely unsubstantiated guess would be that a LAN party setup as pictured would draw more power at idle, but a modern LAN party would draw more under load.

    • MystikIncarnate
      link
      fedilink
      English
      arrow-up
      15
      ·
      6 months ago

      I know enough about power systems to know that we’re going to hit a hard limit on how much max power we’re going to be able to plumb into a computer soon.

      A single North American power circuit is approximately 120v and limited to 15A. The numbers wiggle a little from place to place; I’ve seen many running 115v or even closer to 110v. The 15A limit is not quite the whole story either, since it’s not recommended to load a circuit beyond 80% of its rating for any continuous load, so your realistic maximum continuous draw is around 12A.

      Some newer homes are being built with 20A, but most homes are still generally using 15A breakers.

      At 120v, on a 15A circuit, you shouldn’t consistently pull more than 12A, or 1440W. The line will max out at 1800W when the breaker/fuse will start to trigger.

      So as power supplies hit 1200 to 1400 watts, you’ll need to ensure that nothing else on the circuit will draw any significant power. A few displays and whatnot are fine, but with a 1200w PSU, you can’t exceed 240w of additional draw while operating within the recommendations.
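
      That math is simple enough to sketch (the 80% rule and the 120v/15A numbers are the assumptions above):

```python
def continuous_limit_watts(volts: float, breaker_amps: float) -> float:
    """Recommended max continuous draw: 80% of the breaker rating."""
    return volts * breaker_amps * 0.8

limit = continuous_limit_watts(120, 15)  # 1440 W recommended ceiling
headroom = limit - 1200                  # what's left beside a 1200 W PSU
print(f"{limit:.0f} W limit, {headroom:.0f} W left for displays and whatnot")
```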

      There are a few solutions to this. The obvious one is moving to 20A, which allows 1920W within the recommended draw for the circuit, so you could have an 1800w system and about 120w of additional items before hitting the recommended limit, and 480w of total overhead before the breaker goes. The downside is that such circuits require thicker in-wall cable (12AWG, where 14AWG is far more common in homes).

      One option I’m aware of that nobody seems to consider is that in NA, the power delivered to the home is approximately 240v split-phase. 240v is generally only used for things like stovetops and ranges, electric dryers, water heaters, resistive heating in forced-air furnaces, and air conditioning/heat pump systems. It is entirely possible, though, to convert a simple outlet to 240v: to be safe you will need new receptacles, but you can reuse the wires already in the wall.

      I believe the NEMA 6-15R is the one rated for 240v operation in NA, and it’s not dissimilar to the standard NEMA 5-15R that is the typical “North American” receptacle. However, a standard plug (aka a NEMA 5-15P) will not fit into the 240v receptacle. Any outlets on the same circuit would need to be changed so a 120v-only device cannot get plugged into the 240v receptacle. Wired for 240v at 15A, the circuit can provide up to 2880W of power without rewiring the house.
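
      For comparison, the same 80% rule across the three circuit options (a sketch, not electrical advice):

```python
# (volts, breaker amps, label) for each circuit option discussed.
options = [(120, 15, "standard 120v/15A"),
           (120, 20, "120v/20A"),
           (240, 15, "240v/15A (NEMA 6-15)")]

for volts, amps, label in options:
    continuous = volts * amps * 0.8  # recommended continuous ceiling
    absolute = volts * amps          # where the breaker starts to go
    print(f"{label}: {continuous:.0f} W continuous, {absolute:.0f} W absolute")
```

      That reproduces the 1440W, 1920W, and 2880W figures.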

      The trick will be to find a NEMA 6-15P to C13 cable for the PC and peripherals, and to double-check that everything will accept 240v input (all monitors and whatnot need to handle 240v too). On older power supplies this is a simple switch on the outside of the PSU; newer units are auto-ranging and will detect the input voltage automatically.

      I’m not recommending anyone do that, but if you do, talk to an electrician to ensure you’re complying with all local ordinances.

      Despite that, it is an option that most don’t seem to consider.

      Personally, I need to have some rewiring done in my residence and I’m going to see if I can get a dedicated 240v 20A single receptacle circuit installed for my computer, with the appropriate receptacles and everything.

      To note: most 240v connections also have a neutral line (like what you would find with an oven or dryer), which the NEMA 6 receptacles do not have; they are 240v only. This, along with the fact that most 240v receptacles are rated for 30A+, makes them generally very large. The NEMA 6-15R and NEMA 6-20R are the 15A/20A 240v versions AFAIK, and they’re not really any larger than a standard receptacle.

      • dejected_warp_core@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        ·
        6 months ago

        I never even thought about that. So we’re back to having a dedicated appliance circuit installed, just like microwaves back in the 1980s.

        And the idea of running a 240v outlet to my PC? I already need one installed for the car. May as well do both at once now. I’m gonna need a bigger breaker box.

        Oh, and Technology Connections did a video a while back that touched on having a “smart panel” to work around limited house service power. In it, I learned that not everyone is wired for 100A; some homes have 50A instead. And the neighborhood itself may limit how many folks can upgrade to 100A. So, some people are going to hit that wall with power-hungry devices like EVs and 1.5+ kilowatt PCs. IIRC, electrical codes limit how many more circuits you can add before you’re “maxed out”, regardless of how much total power the whole house typically draws.

        • A7thStone@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          6 months ago

          You still should have a dedicated circuit for your microwave, and a second for your refrigerator, in the States, according to the National Electrical Code.

        • MystikIncarnate
          link
          fedilink
          English
          arrow-up
          3
          ·
          6 months ago

          Yup. There should be dedicated circuits for your fridge, stove/range, etc. already. Having a dedicated microwave circuit will depend on your kitchen design. If there’s a dedicated location for your microwave, that circuit should be dedicated. If not (and it’s just sitting on a counter, plugged into the kitchen outlets), then it’s probably not.

          That seems to be the difference I’ve found.

          My older home has a shared fridge circuit, something I’m planning to have fixed soon.

          The kitchen outlets in my home are also connected incorrectly.

          Kitchen outlets, given most kitchen appliances are high-draw, should be split-phase top/bottom. Installers usually use 14/3 cable here (or 12/3 if you’re running 20A breakers), which has two hots (L1 and L2) plus neutral and ground (the ground isn’t counted in the cable labeling, since it should be included in all cables). So if you go from hot (top outlet) to hot (bottom outlet) with a meter, it should show 240v or something close to it. If it shows zero (or close to it), then it’s wired wrong. Those outlets should also be GFCI or AFCI protected, usually by a single GFCI outlet (which will be the first in the line) or via the breaker… you can get GFCI/AFCI breakers.

          This allows double the current to be handled at the outlet in the kitchen, reducing the probability that you’ll blow a fuse/breaker when using a couple of high power appliances (like blenders, etc).

          There’s a lot of interesting engineering that happens with home electrical wiring, and I’m certainly not an authority on the matter. I’ve learned a lot from Technology Connections and other electrician YouTubers.

          I need to know a lot of this stuff as a hazard of my work. Since I work in IT, I need to be able to calculate power budgets on the fly and figure out if we’re overloading a circuit (or if we are likely to), when setting up racks of equipment.

          So you can sometimes find me in the server room looking at power supply labels doing math to figure out if we’re okay. Let me say, people are not happy when you tell them the 500VA UPS they just bought isn’t sufficient to run the hardware they bought to run on it.

          Converting between volts, amps, watts, and VA is kind of a requirement. Pro tip: for most things, VA is roughly equivalent to watts. Enjoy.
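
          The server-room math looks roughly like this (hypothetical nameplate numbers; the power factor of 1.0 is the “VA is roughly watts” shortcut, which holds for modern PFC supplies but not necessarily older gear):

```python
loads_watts = [250, 200, 150]  # hypothetical nameplate draws for a small rack
power_factor = 1.0             # the "VA is roughly watts" assumption

total_watts = sum(loads_watts)
total_va = total_watts / power_factor
amps_at_120v = total_watts / 120

print(f"{total_watts} W ~= {total_va:.0f} VA, {amps_at_120v:.1f} A at 120v")
```

          600 VA of load makes it obvious why the 500 VA UPS wasn’t going to cut it.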

      • BlackAura@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        1
        ·
        6 months ago

        Also efficiency.

        All PSUs have efficiency curves and are rated differently. You can’t always trust the published numbers either.

        Your 750W PSU might pull closer to 937W from the wall when you’re asking it for a full load, if it’s rated at 80% efficiency. Those ratings are not linear though and usually have a sweet spot around 80% of max load.

        I.e., it will be most efficient at around 600W (80% of the 750W max load). At higher loads it will be even less efficient, possibly drawing closer to 1000W from the wall.
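
        In code form (a sketch; the flat 80% efficiency is the example figure above, real curves vary with load):

```python
def wall_draw_watts(dc_load_watts: float, efficiency: float) -> float:
    """AC power pulled from the wall to deliver a given DC load."""
    return dc_load_watts / efficiency

print(wall_draw_watts(750, 0.80))  # 937.5 W from the wall at full rated load
```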

      • Trainguyrom@reddthat.com
        link
        fedilink
        English
        arrow-up
        1
        ·
        6 months ago

        One option I’m aware of that nobody seems to consider is that in NA, the power delivered to the home is approximately 240v split-phase

        Or component makers could… y’know, focus a bit more on efficiency again, so we aren’t talking about shifting electrical standards just to play Crysis in a decade?

        On a related note, there are still tons of homes with electrical circuits that are only two-prong and don’t have a dedicated ground wire. Better to encourage those to get replaced long before trying to get people to add multiple 240V circuits anywhere they think there might be a computer plugged in.

        • MystikIncarnate
          link
          fedilink
          English
          arrow-up
          1
          ·
          6 months ago

          I agree. I’m kind of outside the normal use case. I run several computer-lab-type things at my residence. My “main” home lab is all virtual; I have several systems running virtual machines. I also normally have a physical lab set up around my workstation, including a lot of network equipment. So I tend to need more power than the average user.

          I’ve heard, but I have no idea if this is correct, that idle consumption for PSUs is lower on a higher supply voltage. E.g., the same PSU running a 60w computer load on a 120v supply will draw more relative amps than on a 240v supply. The numbers should scale to double the amps on 120v compared to 240v, but apparently it’s a bit more than that?

          I dunno. Just something I heard. It’s not why I’m thinking of doing it. Sometimes I just need to fire up a few kW of equipment to run some lab testing…

          • Trainguyrom@reddthat.com
            link
            fedilink
            English
            arrow-up
            2
            ·
            edit-2
            6 months ago

            Homelabbers absolutely are a strange edge case where a 240v circuit or two starts making a ton of sense, but we’re far from the norm. Heck, the average American doesn’t even own a desktop or laptop computer anymore, let alone the 1-2 dozen computers IT types and homelabbers might have on hand. It’s all about perspective, really…

            Edit: Yikes, that was a reading comprehension fail. You already said it’s an unusual use case.

    • Kecessa@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      9
      ·
      6 months ago

      Oh yeah, modern PCs use a lot of power under load. I was renovating the main room in my condo in the middle of a Canadian winter and kept the place warm by having two PCs mining crypto, since I didn’t have a baseboard heater!

      • errer@lemmy.world
        link
        fedilink
        English
        arrow-up
        9
        ·
        6 months ago

        Key here is “under load.” When a computer is sitting idle nowadays it can use only tens of watts. My older computers often would burn 200w at all times just sitting there doing nothing.