Yeah, sending 48V would be more appropriate, but unfortunately that's not really backwards-compatible. (Well, the way to do that is typically to let the PSU sit farther from the point of use: think of a datacenter with cabinets about as tall as an adult (racks; they have 4 posts for holding the equipment, which is standardized to occupy one or more slots of 1¾ inch height and is free to occupy the entire depth), with 4 AC/DC converter blocks in each cabinet powering a few layers of servers/equipment both above and below the PSU layer.)
They do this so that the PSU can be made up of multiple redundant modules, and to integrate some batteries that can bridge the gap while a generator starts up. That way they don't need to pay for a dedicated UPS with its own battery, chargers, and inverters just to convert the power back to lower-voltage DC near the server.
They can make it so that, for example, 5-8 PSUs spread over different electrical circuits share the load of the servers, even if 2-3 of them fail
(so they, for example, only need 2 spares on top of the 5 that can power the servers and the 1 that recharges the battery (62.5% load per module if everything works and the servers are running full-power), or they can even rely on the servers occasionally running below full power and spread the full rated server power over 6 modules with 2 spares (75% load per module under the same conditions)).
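The sizing arithmetic above can be sketched like this (the module counts are the ones from the comment; the even-load-sharing model is my simplifying assumption):

```python
# Hypothetical sizing math for a redundant PSU shelf, assuming the load
# spreads evenly across every installed module (8 modules in both cases).
def load_fraction(load_modules: int, total_modules: int) -> float:
    """Fraction of its rating each module carries when all are healthy."""
    return load_modules / total_modules

# 5 modules' worth of server load shared across 8 installed modules
# (5 for the servers + 1 battery charger + 2 spares):
print(f"{load_fraction(5, 8):.1%}")  # 62.5%

# Full rated server power spread over 6 modules, plus 2 spares:
print(f"{load_fraction(6, 8):.1%}")  # 75.0%
```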
A big issue here, though, is that Nvidia skimps heavily on board space for the 4090 Founders Edition cards, seemingly to free up the back of the card for heat-sink fins/air-flow cross-section.
It would take up some board area, but in theory they could add a 48V power option that keeps everything sleek and pretty, possibly with a 12V->48V adapter for people who don't already have a 48V supply.
Maybe its own power supply? Although that would require electrical isolation from the PC, because both are switching power supplies. I see an opportunity here for Jensen and his marketing people to come up with a fancy name for it.
In countries like Australia and China, 240V is the standard, and you need step-down transformers for 110V devices. China has both voltages/plugs running in its cities; not sure how they manage that.
A common misconception is that the US is only 110V. 220V service is standard in American homes; it's just that ordinary wall outlets get 110V because of split-phase breaker boxes. This makes 220V available for large appliances like washers and dryers, which use both legs of the split phase, while the rest of the house gets 110-120V from a single leg.
In my next house, I think I'll request a 220V outlet in my kitchen with a UK style plug, so I can have my fast tea kettle.
Doesn't seem like a misconception. It seems like y'all have 110V power outlets. Having 220-240V somewhere else is kind of moot... The power socket IS the interface, so if the interface is 110V, then the breaker box, the power line, the high-voltage transmission line, and the power plant's voltage aren't really relevant.
You plug the AC cable into the PC's power supply, right? That converts to 5V and 12V DC and feeds it to the GPU internally. There's no way to plug external power directly into the GPU.
Often, anything above 48 or 50 volts is considered high voltage and in some places may be subject to regulation, not to mention being more dangerous to handle.
The problem is not external; 500 or 1000W is not that much in the grand scheme of things (a Europlug allows up to 625W, unearthed).
The issue is internal: because the top PSU rail is only 12V, high-power devices (mostly GPUs) need to push very high total amperage, and because of how PC internals are laid out, they can't do that through large wires with huge connectors.
Exactly. This is why high-wattage USB-C charging relies on the device negotiating a higher voltage with the power supply. Sending 100W over 5V would be ridiculous (20A). Instead, the device tells the power supply to switch to 20V, and then the cable only needs to be rated for 5A. Something like that for GPUs would make sense.
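The trade-off is just Ohm's-law arithmetic; a quick sketch (the 600W GPU figure at the end is my added illustration, not from the comment):

```python
# P = V * I: for a fixed power budget, raising the voltage cuts the
# current the cable (and its connector pins) must carry.
def required_current_a(power_w: float, voltage_v: float) -> float:
    return power_w / voltage_v

print(required_current_a(100, 5))   # 20.0 A -- absurd for a thin USB cable
print(required_current_a(100, 20))  # 5.0 A  -- a standard 5 A rated cable

# Same idea at GPU scale: 600 W at 12 V needs 50 A; at 48 V, only 12.5 A.
print(required_current_a(600, 12))  # 50.0
print(required_current_a(600, 48))  # 12.5
```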
It is KINDA external. 1000+W is starting to get into "trip the breaker" territory pretty easily if you have anything else on that circuit with any kind of transient load, like, say, a minifridge.
> 1000+w is starting to get into “trip the breaker” pretty easily
I was wondering where you'd get that from, and then I remembered that you're probably American, so your baseline is only 1800W, in which case... yeah, that's true. If you're installing a really powerful PC in the US, you probably want either a dedicated circuit or a 20A circuit (or both).
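For reference, the headroom math (the 120V nominal figure and the common 80% continuous-load derating are my assumptions, not from the comment):

```python
# Rough continuous-load budget for a North American branch circuit.
# Assumptions: 120 V nominal, 80% derating for continuous loads.
def continuous_limit_w(volts: float, breaker_amps: float,
                       derate: float = 0.8) -> float:
    return volts * breaker_amps * derate

print(continuous_limit_w(120, 15))  # 1440.0 W -- a 1000+ W PC leaves little room
print(continuous_limit_w(120, 20))  # 1920.0 W on a 20 A circuit
```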
A gaming PC draws maybe 20-30% of the power of an average kitchen appliance like a kettle or a blender, and those plug into standard wall sockets just fine.
Depends on how many GPUs are in the PC. Circa 2011 I had quad Radeon 6970 machines pulling ~1000W each from 1200W power supplies, measured with a Kill A Watt.
Those machines ran 24/7 for at least 36 months before I had to start servicing stuff like GPU fans and heat sinks.
This stuff can be designed NOT to melt the dang 6-pin 12V rail.
Ha ha only serious. There have been cases on local news where homes burned down because a PC caught fire. A circuit breaker isn't enough; please give me something (an IR sensor?) that will trip and cut power if anything in the room or inside the PC enclosure exceeds, say, 120°C.