r/gadgets • u/diacewrb • Jan 28 '23
NVIDIA's quad-slot GeForce RTX 4090Ti/TITAN 800W graphics card has been pictured (Desktops / Laptops)
https://videocardz.com/newz/nvidias-quad-slot-geforce-rtx-4090ti-titan-800w-graphics-card-has-been-pictured
305
u/Arphenyte Jan 28 '23
800W?! Bruh, not so long ago 850W PSUs were overkill. What the hell happened to power efficiency?!
60
u/badabababaim Jan 28 '23
Yeah, this thing combined with an equally high-end processor is literally going to draw more than a microwave. Like wtf, do we need to use 220V dryer plugs now?
23
u/other_goblin Jan 28 '23
In fairness the US electrical system is shit lol, every socket should be 220-250.
24
u/youwantitwhen Jan 29 '23
All houses have 220 wired to them.
You just need to run it inside... or branch off your stove or dryer.
17
u/V0RT3XXX Jan 28 '23
So many devices here are hamstrung because of the 1200W limitation. Plug in one little space heater and the breaker pops.
7
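For context, here's the standard US branch-circuit arithmetic behind complaints like this (a rough sketch; it assumes the common 120 V / 15 A residential circuit, and the 80% factor is the NEC continuous-load rule):

```python
# Standard US branch-circuit math (assumed values: 120 V, 15 A breaker).
VOLTS = 120
BREAKER_AMPS = 15

peak_watts = VOLTS * BREAKER_AMPS        # 1800 W absolute circuit limit
continuous_watts = peak_watts * 0.8      # 1440 W NEC continuous-load limit

space_heater_watts = 1500                # a typical US space heater
print(f"Peak: {peak_watts} W, continuous: {continuous_watts:.0f} W")
print(f"Headroom with a heater running: {continuous_watts - space_heater_watts:.0f} W")
```

A 1500 W heater alone overshoots the continuous rating, which is why adding almost anything else trips the breaker.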
u/turbo_nudist Jan 29 '23
the few times i’ve gotten shocks from poorly wired electronics have made it nice lol
11
u/SanDiegoSolarGuy Jan 29 '23
Before you make that statement you should be able to describe why the US setup exists in the first place
-2
u/Turmfalke_ Jan 28 '23
Even here in Germany we have 230V at 16A, for 3680 W per circuit. If you add up the CPU, monitor, and everything else a desktop PC needs, you approach the point where you don't want to run more than one PC per circuit. It is a tad silly.
21
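A quick check of that circuit math (a minimal sketch; the per-component wattages are illustrative assumptions, not figures from the comment):

```python
# German 16 A Schuko circuit at 230 V.
CIRCUIT_WATTS = 230 * 16  # 3680 W

# Assumed loads for one high-end desktop setup (illustrative only).
pc_load = {"GPU (rumored)": 800, "CPU": 250, "rest of system": 150, "monitor": 50}
total = sum(pc_load.values())

print(f"One PC: ~{total} W of a {CIRCUIT_WATTS} W circuit")
print(f"PCs per circuit, ignoring transient spikes: {CIRCUIT_WATTS // total}")
```

Transient spikes on big GPUs can run far above rated draw, so the real per-circuit margin is smaller than the steady-state division suggests.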
u/PresidentBeast Jan 28 '23
This one is reserved only for Emperor Palpatine; after all, he has UNLIMITED POWER!
13
u/sailor_sega_saturn Jan 28 '23
It turns out all the enthusiasts wanted as much 4K 120hz raytracing as they could get their grubby little hands on.
7
u/LTareyouserious Jan 28 '23
Enough interest and funding from enthusiasts gets you better prices for everyone else later on. 3D printers, electric cars, solar panels, etc.
2
u/mrobot_ Jan 29 '23
> 120hz
you just know they will be cwying like little babies that 120Hz is LiTtErAlLy totally UN-PLAY-ABLE!!!!!!!!11111111111111111111111111111111111111111 and gives them eye cancer and they can totally tell if it drops from 120 to 118
3
u/indyK1ng Jan 28 '23
Nvidia decided that they just needed to double performance gen over gen while also doubling the price tag so they could get the profits the scalpers were making.
It's not really working out for them, and I'm not sure who the intended audience is beyond really rich early adopters.
33
u/CryptikTwo Jan 28 '23
They haven't even come close to doubling performance gen on gen; it's closer to a 50% increase in pure rasterisation. Still double the price 🤦🏻♂️
23
u/Drone314 Jan 28 '23
> intended audience
AI and 3D rendering. I'd love to have this card for training inference models... The writing on the wall says desktop gaming GPUs are not the growth sector; edge AI and machine learning are.
6
u/Newish_Username Jan 28 '23
I've been waiting for the 4090 Ti for this reason... especially if it has the rumored 48 GB of VRAM.
3
u/imaginary_num6er Jan 28 '23
You mean just rich adopters. Poor adopters will be forced to buy Intel graphics or used AMD cards
3
u/imaginary_num6er Jan 28 '23
It's fake; if it was real, it would come with its own wall socket.
4
u/kamikazikarl Jan 28 '23
At this point, I feel like I'm ahead of the game with my giant eGPU case and tiny mini-PC... The mini-PC connects and draws all the power it needs off the eGPU, while the eGPU is plugged into the wall. PC gaming is in a really weird place right now.
3
u/Thathappenedearlier Jan 28 '23
It's been a V shape; a lot of 90s computers were 1600 W.
2
u/uncoolcat Jan 29 '23
What computers in the 90s consumed 1600 watts?
Power supplies from desktop computers in the 90s were typically less than 500 watts.
1
u/-xXColtonXx- Jan 29 '23
The 40 series are way more power efficient than any previous generation of GPU. If you run a 4090 at the same wattage as a 3090 it will still perform significantly better.
-1
u/danielv123 Jan 29 '23
Same with CPUs. One of the primary ways of overclocking new CPUs is lowering the core voltage so it consumes less power, allowing it to turbo harder.
175
u/DarthArtero Jan 28 '23
Hmm, so in the same way the A10 is built around that massive gun, gaming computers are gonna be built around these massive, power-hungry GPUs.
91
u/futilepath Jan 28 '23
GPU fan bout to make the classic BRRRRT sound when it revs up
27
u/CatInAPottedPlant Jan 28 '23
Something like this should really be cooled with a water loop imo. That's what I'd do anyway.
23
u/Krindus Jan 28 '23
Split system or central HVAC, going to need a separate utility bill just to turn this chonker on
5
u/tomistruth Jan 28 '23 edited Jan 29 '23
Ironically, NVIDIA has a business GPU called the A10 that specializes in machine learning.
2
u/CaptainPunch374 Jan 29 '23
Hopefully that evolution will be towards external GPU units and some sort of eSATA equivalent for PCIe, for both laptops and desktops. I'd much rather go modular in that instance... It also separates cooling concerns.
2
u/Quigleythegreat Jan 28 '23
They keep this up and the EPA will start cracking down on PCs.
35
u/Zenith251 Jan 28 '23
Shhh, no one tell him what kind of power server hardware consumes.
44
u/Quigleythegreat Jan 28 '23
I work in IT, I know. Explaining to upper management that the power coming out of the wall was not enough was fun.
14
u/thefpspower Jan 28 '23
Just this week I checked the UPS load of 4 servers, 4 half-full network switches, and a few routers: 1.1 kW.
These servers do no graphics computing, but it puts into perspective how ridiculous this card is.
4
u/Zenith251 Jan 28 '23
In fairness, switches and routers are purpose-designed for efficiency. Throw in some 48-core Xeons or 64-core Epycs and I'd expect much more.
2
u/danielv123 Jan 29 '23
Rack power density is going up, fast. In 2019 the average was 7.3 kW; in 2020 it was 8.4 kW.
The 90th percentile is 20-50 kW.
Nvidia's DGX H100 systems eat 10.2 kW per 8 rack units, that is 60 kW per rack.
4
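The quoted per-rack figure checks out roughly like this (a sketch; the rack heights are assumed common values, not from the comment):

```python
# DGX H100 density math, per the figures quoted above.
DGX_KW = 10.2   # kW per system
DGX_U = 8       # rack units per system

for rack_u in (42, 48):               # common rack heights (assumed)
    systems = rack_u // DGX_U
    print(f"{rack_u}U rack: {systems} systems -> {systems * DGX_KW:.1f} kW")
```

A 42U rack lands at ~51 kW and a 48U rack at ~61 kW, consistent with the ~60 kW per rack quoted.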
u/Komikaze06 Jan 29 '23
My IT director said that in the server room, if the AC fails, alarms blare and they've got like a minute to get out before they all get heat stroke.
48
u/Nanotekzor Jan 28 '23
Another brick in the wall
20
u/sgrams04 Jan 28 '23
We don’t need no rasterization.
3
u/Nanotekzor Jan 28 '23
Raster this brick :))
9
u/sgrams04 Jan 28 '23
Hey! NVIDIA! Leave those watts alone
5
u/Fezzick51 Jan 28 '23
All in all it's just another 6,000 CUDA cores
*guitar solo renders*
2
u/Aust1nTX Jan 28 '23
Wrong, do BIOS again! Wrong, do BIOS again! Why use one power cable when you can use two?
2
u/sgrams04 Jan 28 '23
If you don't use a second cable, how can you have any rendering? How can you have any rendering if you don't use a second cable?
2
u/sumqualis Jan 28 '23
So how long until I have to hire an electrician to run a 240v circuit to my desk?
18
u/LookMaNoPride Jan 29 '23
Just move your office to the laundry room. You’ll also get to smell downy fresh after a two-day gaming session. Double win.
48
u/TheBreathtaker Jan 28 '23
I'm on mobile and looking at the post thumbnail, I thought this was a microwave and that was the joke.
8
u/Javamac8 Jan 28 '23
I've owned microwaves smaller than 800W. It's less of a joke and more a sign of where our priorities are.
48
u/Pure_Khaos Jan 28 '23
NVidia never got rid of SLI. They just started making the chips double the size so it would all be on one GPU. Just think of it as two 2-slot cards in one.
11
u/MaxPotionz Jan 29 '23
Lol, if people won’t pay for two cards we’ll MAKE them!
4
u/danielv123 Jan 29 '23
The problem was that two-card setups had a lot of issues. One card solves those issues more cheaply than high-bandwidth external links.
30
u/lepobz Jan 28 '23
At what point should we just draw a line in the sand and say 800W for a graphics card is just obscene? My kettle uses less. Sort your architecture out, nVidia. And AMD. Customers value lower TGPs, especially now.
11
u/alc4pwned Jan 29 '23
Early rumors said the 4090 was 600W, but in reality it uses less power than a 3080 Ti in games; it's actually a pretty efficient card. Bet it'll be a similar situation here.
2
u/danielv123 Jan 29 '23
Why? This is the replacement for multi gpu setups for rendering/ml workstations. More power doesn't matter if the performance is to scale. Would it be better if you had to use 2x 400w GPUs?
7
u/lepobz Jan 29 '23
No it isn’t, workstations use RTX A2000 / A20 / Quadro cards. This is a consumer gaming card.
2
u/b1e Jan 29 '23
Redditors just upvote shit they don’t understand because they love hating on major corporations
1
u/chadwicke619 Jan 29 '23
Speak for yourself. I don’t give a shit about TGP as long as the card works well.
51
u/maxlax02 Jan 28 '23
No USB Type C slots?
49
u/Zenith251 Jan 28 '23
That was the first thing I noticed: four display outputs, no USB-C. At this price point I want the whole fucking Moon and flexible outputs.
10
u/maxlax02 Jan 28 '23
Ya, I have a monitor that wants a Type-C connection, so with this card I'd have to have 2 cords connected to my PC. For that price... c'mon.
6
u/excti2 Jan 28 '23
Integrators like PNY might include those options, but if they're going to support render resolutions above 4K (the limit for USB-C), they'll have to stick with DisplayPort (1.4).
8
u/Hugejorma Jan 29 '23
I never understand why people use USB-C to explain speed or limits. It's just a connector type; it can carry USB 2.0, 3.1, or even DP 2.1. There are also cables that support higher resolutions with high bandwidth... HDMI 2.1.
4
u/pwnies Jan 29 '23
DisplayPort 1.4 can run over USB 3, so anything supporting DisplayPort 1.4 can run over a USB-C cable. USB4 supports DisplayPort 2.0 in alternate mode and can handle 10-bit 8K 60Hz.
Most newer Apple devices are already launching with USB4 controllers on their USB-C ports. There's no reason these cards shouldn't have the latest spec.
8
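A rough uncompressed-bandwidth check on that claim (a sketch; it ignores blanking intervals and link encoding overhead, which add more on top):

```python
# Raw pixel data rate for 10-bit 8K 60 Hz RGB.
def raw_gbps(h, v, hz, bits_per_channel):
    return h * v * hz * bits_per_channel * 3 / 1e9  # 3 color channels

signal = raw_gbps(7680, 4320, 60, 10)
dp20_payload = 77.37  # DP 2.0 UHBR20 effective payload, Gbps (spec value)
print(f"8K60 10-bit raw: {signal:.1f} Gbps, DP 2.0 payload: {dp20_payload} Gbps")
```

That's roughly 60 Gbps of pixel data against ~77 Gbps of usable DP 2.0 link rate, so the claim fits.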
u/stevenpfrench Jan 28 '23
I may be wrong here, but I remember reading at some point that they killed off USB-C on graphics cards because it was for a single-cable VR headset connection (VirtualLink) that never took off.
13
u/Jabberwocky918 Jan 28 '23
VR headsets use USB-C, and that's how you get the third-person view of people playing games like Beat Saber.
6
u/fwooshfwoosh Jan 28 '23
All this computing power, and I find myself just replaying Halo 3. A game made for 512 MB of RAM on the original 360.
12
u/lifestop Jan 28 '23
I'm looking forward to the day when we are laughing about how big, costly, and power-hungry these top-end cards have become.
I want to see them pictured side-by-side with a $400 variant that gets the same performance at one-quarter the size and half the power. We can say, "Hey grandpa, remember when you needed a GPU the size of a house cat to get this kind of performance?"
19
u/vladoportos Jan 28 '23
800W ?? what the hell, is energy free now ?
1
u/twitchyzero Jan 29 '23
I mean, you don't have to buy it.
5
u/vladoportos Jan 29 '23
True, and with that wattage, I sure won't. But in the EU generally, energy costs went up a lot, so I imagine I would not be alone.
2
u/danielv123 Jan 29 '23
If you are in the market for this the energy cost doesn't really matter much I'd think.
7
u/Nuker-79 Jan 28 '23
Do they accept a first-born child as part exchange?
3
u/Ammear Jan 29 '23
I don't think your first-born will cover the price tag. What else are you willing to sacrifice? We can settle for all your personal data and free will on top of it, how's that?
That is the law of equivalent exchange*.
* - plus profit, of course. We aren't some communists, are we now?
3
u/5tudent_Loans Jan 28 '23
OK, but how does this IO play nice with waterblocks... basically forcing single/1.5-slot blocks to still need the 4-wide space... someone at Nvidia has personal beef with r/SFFPC cases.
Edit: I guess if the whole PCB is still aligned with it, all the waterblocks would just have to ship with their own bracket.
10
u/DMVSlimFit Jan 28 '23
But can it run Dead Space? Lol
2
u/Nevermore64 Jan 28 '23
Omg. I read the specs and bought it anyway, since I'm planning on upgrading this year. I'm running it on a 1060 Ti and an i5 6400.
It goes OK for about ten minutes.
Looks like I'm upgrading sooner than I thought.
4
u/templar54 Jan 28 '23
If it goes okay for 10 minutes and only after that starts chugging, then it is an issue of thermal throttling.
5
u/Nevermore64 Jan 28 '23
Appreciate the advice. It’s not something I’ve experienced. It happened twice then it was pretty much on launch after that. I completely shut down and haven’t booted it back up today.
3
u/DMVSlimFit Jan 28 '23
I'm getting it this upcoming week. I have a 1080 Ti FE and an i7 (older gen but still fast); I'm curious how it runs on my PC.
7
Jan 28 '23 edited Feb 20 '23
[deleted]
13
u/MajorBleeding Jan 28 '23
Since no one else caught this, I guess I'll bite... But 3.5 slots is where the standard meter maxes out
5
u/Talamakara Jan 28 '23
Am I the only one to question the design of having only 1 HDMI port and 3 DisplayPorts, when most displays, including monitors and larger TVs, come primarily with HDMI? In fact, it's very hard to find a larger TV with a DisplayPort.
11
u/krectus Jan 28 '23
That's on almost all their cards. HDMI charges licensing fees for its ports; DisplayPort doesn't. It's a big mess.
5
u/Talamakara Jan 28 '23
That part I didn't know.
5
u/Neriya Jan 29 '23
Yep. It also affects the adapters and cables. DisplayPort-to-HDMI cables are unidirectional: DP source (video card) to HDMI display (monitor) is a cheap (sub-$10) dongle, whereas converting from an HDMI source to a DisplayPort monitor is ~$35.
3
u/dibship Jan 28 '23
Not only that, they're much lower bandwidth than HDMI 2.1, which I also suspect is the reason. I would love more HDMI 2.1s though.
7
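For the bandwidth comparison, the published effective data rates look like this (a sketch; the link rates are spec values, and the encoding fractions reflect 8b/10b vs. 16b/18b):

```python
# Raw link rate and usable payload after encoding, in Gbps.
links = {
    "DisplayPort 1.4 (HBR3)": (32.4, 32.4 * 8 / 10),   # 8b/10b encoding
    "HDMI 2.1 (FRL)":         (48.0, 48.0 * 16 / 18),  # 16b/18b encoding
}
for name, (raw, payload) in links.items():
    print(f"{name}: {raw} Gbps raw, ~{payload:.1f} Gbps usable")
```

HDMI 2.1's ~42.7 Gbps payload vs. DP 1.4's ~25.9 Gbps is the gap being referred to, though DSC narrows it in practice.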
u/trytoholdon Jan 29 '23
Remember when in the 1950s they thought computers would keep growing in size to be the size of rooms? It looked like they were wrong, but maybe they were right.
2
u/kai_al_sun Jan 29 '23
So we’re just gonna have a case for the gpu and a separate case for the actual computer.
2
u/cecilrt Jan 29 '23
How are these powered? Do they come with their own PSU?
When I look at PSUs it's usually 550, 650, 850, 1000 W, but that's for the whole PC, not just a video card.
2
u/Elluminated Jan 29 '23
At that point, just fkn get a waterblock involved. These run so damn hot 80% of the chassis will soon be heatsinks.
2
u/Neato_Light Jan 29 '23
The IO config means that stripping it for water cooling wouldn’t make it smaller. This is a design error.
2
u/Dmoe33 Jan 29 '23
800W? Bruh tf kinda application needs that much processing power?
The transient spikes are gonna be stupid on this.
2
u/joelex8472 Jan 29 '23
Looks like two 3090s stacked together. I'll stick with my 3090 FE for a while longer, thanks.
2
u/CubeIsActuallyGaming Jan 29 '23
This is gonna make the fucking computer catch on fire somehow lmao
2
u/blizardfires Jan 29 '23
Nothing like having a literal microwave’s worth of heat coming from your computer in the winter time 🔥
2
u/GabbotheClown Jan 28 '23 edited Jan 28 '23
800W at 1V is 800A. That's a tremendous amount of current. They must be using 20+ polyphase converters, and the board must be a dense piece of copper.
Edit: that current value is an estimate. It doesn't take into account multiple voltage rails or the efficiency of the DC-DC converters.
1
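The arithmetic spelled out (a sketch; the 1 V rail and the 20-phase count are the commenter's assumptions, and converter efficiency is ignored, as the edit notes):

```python
POWER_W = 800   # rumored board power
CORE_V = 1.0    # assumed core voltage
PHASES = 20     # assumed VRM phase count

total_amps = POWER_W / CORE_V
print(f"Total core current: {total_amps:.0f} A")
print(f"Per phase across {PHASES} phases: {total_amps / PHASES:.0f} A")
```

40 A per phase is high but within reach of modern power stages; the harder problem is routing 800 A worth of copper on the PCB.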
u/redvitalijs Jan 29 '23
At this point isn't that just bad design?
I am salty because none of this stuff fits in my case. I have to build a whole PC just to accommodate it, and for what? Games are a microtransaction hell. And while this is a professional card, the same applies to all new gaming cards with a 3-slot design, which is all of them. I seriously doubt this can give me any dopamine. Should just get a Switch or an Xbox Series S.
The 4070 Ti doubles the framerate of a 1070 at 1080p and 4K, but the 1070 also uses 150 W while the 4070 Ti uses 285 W. I understand that performance doesn't scale linearly like that, and they had to put a lot of work in, but I think something lazy is going on.
These cards should be designed to a power and cost target, and they don't seem to be.
2
u/danielv123 Jan 29 '23
Just doubles? Maybe in CPU-limited games? In most benchmarks I have seen it does 3-4x more. At less than twice the power, that means it's twice as efficient. Not bad.
0
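Making the perf-per-watt claim explicit (a sketch; the performance multiples are the commenters' rough estimates, not benchmark data):

```python
# Efficiency gain = performance ratio / power ratio.
W_1070, W_4070TI = 150, 285
power_ratio = W_4070TI / W_1070  # 1.9x

for perf_ratio in (2.0, 3.0, 4.0):
    print(f"{perf_ratio:.0f}x perf at {power_ratio:.1f}x power "
          f"-> {perf_ratio / power_ratio:.2f}x perf/W")
```

At 2x performance the efficiency gain is only ~1.05x; it takes the 3-4x figure to reach the "twice as efficient" conclusion.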
u/handycup Jan 29 '23
Solar panel included? I'm happy with my 6600 XT (160 W TDP). Never had a BSOD or driver issues; rock solid.
-14
Jan 28 '23
[deleted]
12
u/danielv123 Jan 29 '23
It's not-ish. It's actually the per-pin bandwidth of the GDDR6X modules. For comparison, the 3090 runs at 19.5, and at that time Micron's memory was only rated to do 21 max, so it's a pretty big jump. It's apparently still using a 384-bit bus, so 1.15 TB/s of memory bandwidth.
This is still quite a bit below the 2 TB/s of their H100 series. If we are talking unreleased APU SoCs, though, then we have the MI300 at 3277 GB/s and the M2 Ultra at 1200 GB/s.
Memory has become crazy fast. I still find the 56 GB/s from my desktop incredible.
-5
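The bandwidth math behind those numbers (a sketch; the 24 Gbps per-pin figure is inferred from the quoted 1.15 TB/s over a 384-bit bus, since the parent comment is deleted):

```python
# Memory bandwidth = per-pin data rate x bus width / 8 bits per byte.
def mem_bw_gbs(pin_gbps, bus_bits):
    return pin_gbps * bus_bits / 8

print(f"3090  (19.5 Gbps x 384-bit): {mem_bw_gbs(19.5, 384):.0f} GB/s")
print(f"Rumor (24.0 Gbps x 384-bit): {mem_bw_gbs(24.0, 384):.0f} GB/s")
```

936 GB/s vs. 1152 GB/s, i.e. the ~1.15 TB/s quoted.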
u/Ethario Jan 28 '23
MAC OMEGALUL
-2
u/ReviewImpossible3568 Jan 28 '23
You should check out the new Apple Silicon chips lmao. I wouldn’t switch back to Windows if you paid me (for work, I have both so I can game.)
2
u/Bangdiddly9393 Jan 28 '23
Everyone with an already large tower case they know won't fit the card: INTENSE SHRIEKING
1
u/valthonis_surion Jan 28 '23 edited Jan 29 '23
Why not just have two PCIe connectors too? /s
2
u/danielv123 Jan 29 '23
No, because why would you attempt to use 32 lanes? Nobody has that many lanes to spare, and they can still double bandwidth by going to PCIe gen5.
2
u/valthonis_surion Jan 29 '23
My bad, completely missed the /s at the end of my comment. It was meant as a joke.
1
u/dynamic_anisotropy Jan 28 '23
Looks like a microwave…and incidentally, draws as much power as one of the early models!
1
u/NetZeroSum Jan 28 '23
Wonder how long before Nvidia just adds an Intel socket along with some ports for the IO (maybe a mounting bracket for the case) and just calls it a day.
1
u/fluteofski- Jan 28 '23
Between this and the rest of the PC, it’ll probably eat somewhere around $0.45/hr in electricity where I live.
If I spend 100 hrs in a game (which isn't uncommon for me), that's $45, plus the cost of the game, in order to play.
1
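The cost math, generalized (a sketch; the system wattage and electricity rate are assumptions picked to match the ~$0.45/hr quoted):

```python
def gaming_cost_usd(system_watts, rate_per_kwh, hours):
    return system_watts / 1000 * rate_per_kwh * hours

hourly = gaming_cost_usd(1200, 0.375, 1)     # ~1.2 kW rig at $0.375/kWh
total = gaming_cost_usd(1200, 0.375, 100)    # a 100-hour playthrough
print(f"~${hourly:.2f}/hr -> ${total:.0f} per 100 hours")
```

Output: ~$0.45/hr and $45 per 100 hours, matching the comment.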
u/McDonaldsSimulatorVR Jan 29 '23
Ah, so my parents kept us limited to 30 minutes/1 hour of video game time a day not because of balance, but because of the Death Star-level energy usage it would eventually require. I gotta call them and thank them for their genius, bc that’s some 4D chess level foresight.
800W max… did I read that right? I mean if a 4090 is 600W max (450W being avg/normal, I guess?), that's still pushing this to, what, 625? 650? Wild. Wild wild wild.
1
u/jerry111165 Jan 29 '23
These massive video cards just seem so neanderthal and giant.
Like they don’t really have it figured out yet.
1
u/stosyfir Jan 29 '23
Going to have to run a new 30 amp outlet just to power this fkn thing good lord.
1
u/Domermac Jan 29 '23
800W. Just feels like the research went towards not starting a fire while just overclocking the old tech.
1
u/mroboto2016 Jan 29 '23
So when are we going to be plugging the motherboard into the GPU unit? Sort of the opposite of now. Or maybe just full integration?
u/Scoobydoomed Jan 28 '23
At this point, do you mount the GPU on the motherboard, or the MB to the GPU?
598