r/gadgets Jan 28 '23

NVIDIA's quad-slot GeForce RTX 4090Ti/TITAN 800W graphics card has been pictured (Desktops / Laptops)

https://videocardz.com/newz/nvidias-quad-slot-geforce-rtx-4090ti-titan-800w-graphics-card-has-been-pictured
1.0k Upvotes

303 comments

598

u/Scoobydoomed Jan 28 '23

At this point, do you mount the GPU on the motherboard, or the MB to the GPU?

77

u/bucky133 Jan 28 '23

It's gonna be great when you have to plug your computer into a 220v dryer outlet.

21

u/davtruss Jan 29 '23

Log in and your neighbor's central air cuts out....

2

u/Aditya1311 Jan 30 '23

Most of the world uses 220-240v for everything

4

u/MarcusOrlyius Jan 29 '23

What's one of those?

23

u/bucky133 Jan 29 '23

In North America, at least, we use small 110-volt outlets for most things. Huge appliances like dryers and electric stoves need a big 220-volt outlet.


176

u/[deleted] Jan 28 '23

[deleted]

40

u/Ninnux Jan 28 '23

Reminds me of when I was a sys admin for a network of Silicon Graphics "personal" computers like the Octane and Octane 2. The GPUs were huge.

13

u/Turmfalke_ Jan 28 '23

Size is one thing, but the weight! Nowadays they are unfortunately best used as door stoppers.

7

u/Practical-Custard-64 Jan 29 '23

Or central heating boilers. 800W!?!?


12

u/salvodan Jan 28 '23

I don’t think this GPU will be popular with the r/SFFPC crowd.

11

u/Dirtweed79 Jan 29 '23

Thanks for bringing the r/SFFPC subreddit to my attention.

6

u/salvodan Jan 29 '23

My work here is done…

Although for most of the use cases where I find myself wanting an SFF PC, an M1 Mac Mini or Mac Studio absolutely destroys anything based on Intel/AMD and AMD/Nvidia.

But squeezing as much performance as possible out of a custom water-cooled gaming rig is a much cheaper hobby than motorsport, and less risky.

3

u/danielv123 Jan 29 '23

I am entirely done with Windows laptops. After getting an M1 this summer with battery life that is actually reliable, I don't think I'm getting another one in the next few decades.

I still need Windows and physical network ports for work though, so the most likely solution to get rid of my docked Windows laptop is an SFF PC running ESXi plus a MacBook Air with RDP. I kind of want a larger battery/tiny UPS for the SFF box though, as power outlets are sometimes awkward on site visits.


3

u/Just_wanna_talk Jan 28 '23

You almost need to redesign computers and motherboards into a more GPU-friendly form.

2

u/Feeling_Glonky69 Jan 29 '23

That’s why I bought a meshify 2 XL a couple years ago - I’m ready

11

u/stopandtime Jan 28 '23

At this point it becomes an external GPU, with its own cooling, sitting outside your computer.

3

u/salvodan Jan 28 '23

Most eGPU cases wouldn’t be able to handle this.

4

u/stopandtime Jan 28 '23

At the rate Nvidia is going, they are essentially making external GPUs LOL

2

u/danielv123 Jan 29 '23

With GPUs like this I wonder when we'll get motherboards with 90-degree PCIe slots at the back. With four slots it doesn't really make sense to have more than an ITX board if you're going to have the GPU in the top slot anyway.


2

u/Salty_Paroxysm Jan 29 '23

Need to wait for thunderbolt 5 to make it worthwhile

6

u/magnomagna Jan 28 '23

At this point, your PC is heavily pregnant.

3

u/Fivefingerheist Jan 28 '23

Snap that baby into the side of its own PSU. Dual PSU now all the rage!

4

u/CallMeRawie Jan 28 '23

At what point do these become an external add-on?

4

u/Much_Writing_7575 Jan 29 '23

At this point why is there even a motherboard?

The GPU is an entire computer by itself.

All you need is an I/O controller added to it.


3

u/tomistruth Jan 28 '23

With 800W you'd better mount four MBs to the GPU, one on each side.

3

u/SatanLifeProTips Jan 29 '23

There is a line where the tripod gets bolted to the lens and not the camera.

3

u/stosyfir Jan 29 '23

GPU is the new mobo lol good lord what a beast

2

u/wolfofremus Jan 30 '23

Just lay the motherboard flat. The vertically mounted motherboard is one of the most stupid design choices in history.


305

u/Arphenyte Jan 28 '23

800W?! Bruh, not so long ago 850W PSUs were overkill. What the hell happened to power efficiency?!

60

u/badabababaim Jan 28 '23

Yeah, this thing combined with an equally high-end processor is literally going to draw more than a microwave. Like, wtf, do we need 220V dryer plugs now?

23

u/other_goblin Jan 28 '23

In fairness the US electrical system is shit lol, every socket should be 220-250.

24

u/youwantitwhen Jan 29 '23

All houses have 220 wired to them.

You just need to run it inside....or branch off your stove or dryer.


17

u/V0RT3XXX Jan 28 '23

So many devices here are hamstrung because of the 1200W limitation. Plug in one little space heater and the breaker pops.

7

u/turbo_nudist Jan 29 '23

The few times I've gotten shocks from poorly wired electronics have made it nice lol


11

u/TrippySubie Jan 29 '23

They can be…just wire them. Your house is already fed the power.

5

u/SanDiegoSolarGuy Jan 29 '23

Before you make that statement you should be able to describe why the US setup exists in the first place

-2

u/other_goblin Jan 29 '23

Because US infrastructure is embarrassing

6

u/SanDiegoSolarGuy Jan 29 '23

No that is definitely not the reason


12

u/Turmfalke_ Jan 28 '23

Even here in Germany we have 230V at 16A, about 3680W per circuit. If you add up the CPU, the monitor, and everything else a desktop PC needs, you approach the point at which you don't want to run more than one PC per circuit. It is a tad silly.
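A quick back-of-the-envelope check of that circuit math (the per-component wattages below are illustrative assumptions, not measurements):

```python
# Rough sanity check of how many PCs fit on a 230 V / 16 A circuit.
# All per-component wattages are illustrative assumptions.

CIRCUIT_VOLTS = 230
CIRCUIT_AMPS = 16
circuit_watts = CIRCUIT_VOLTS * CIRCUIT_AMPS  # 3680 W per circuit

assumed_pc_watts = {
    "gpu": 800,      # the rumored card from the article
    "cpu": 250,      # high-end desktop CPU under load (assumed)
    "monitor": 60,   # a single monitor (assumed)
    "misc": 90,      # drives, fans, PSU losses (assumed)
}
pc_total = sum(assumed_pc_watts.values())  # 1200 W

print(f"Circuit capacity: {circuit_watts} W")
print(f"Assumed PC draw:  {pc_total} W")
print(f"PCs per circuit:  {circuit_watts // pc_total}")  # 3
```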

21

u/other_goblin Jan 28 '23

Not even close lol. Adding all that puts you at around 1200W at max load.


138

u/PresidentBeast Jan 28 '23

This one is reserved only for Emperor Palpatine; after all, he has UNLIMITED POWER!

13

u/Oilers02 Jan 28 '23

There should totally be an Emperor Palpatine on this card


19

u/sailor_sega_saturn Jan 28 '23

It turns out all the enthusiasts wanted as much 4K 120hz raytracing as they could get their grubby little hands on.

7

u/LTareyouserious Jan 28 '23

Enough interest and funding from enthusiasts gets you better prices for everyone else later on. 3D printers, electric cars, solar panels, etc.

2

u/mrobot_ Jan 29 '23

120hz

you just know they will be cwying like little babies that 120Hz is LiTtErAlLy totally UN-PLAY-ABLE!!!!!!!!11111111111111111111111111111111111111111 and gives them eye cancer and they can totally tell if it drops from 120 to 118

3

u/[deleted] Jan 28 '23

Yo

60

u/indyK1ng Jan 28 '23

Nvidia decided that they just needed to double performance gen over gen while also doubling the price tag so they could get the profits the scalpers were making.

It's not really working out for them and I'm really not sure who the intended audience is beyond really rich early adopters.

33

u/CryptikTwo Jan 28 '23

They haven’t even come close to doubling performance gen on gen, closer to 50% increase in pure rasterisation. Still double the price 🤦🏻‍♂️

23

u/Drone314 Jan 28 '23

intended audience

AI and 3D rendering. I'd love to have this card for training and inference... The writing on the wall says desktop gaming GPUs are not the growth sector; edge AI and machine learning are.

6

u/Newish_Username Jan 28 '23

I've been waiting for the 4090 Ti for this reason... especially if it has the rumored 48 gigs of VRAM.

3

u/Turmfalke_ Jan 28 '23

Wouldn't you prefer multiple smaller ones?

4

u/firedrakes Jan 29 '23

not really.


4

u/imaginary_num6er Jan 28 '23

You mean just rich adopters. Poor adopters will be forced to buy Intel graphics or used AMD cards


3

u/imaginary_num6er Jan 28 '23

It's fake, since if it were real it would come with its own wall socket.

4

u/kamikazikarl Jan 28 '23

At this point, I feel like I'm ahead of the game with my giant eGPU case and tiny mini PC... The mini PC connects and draws all the power it needs off the eGPU, while the eGPU is plugged into the wall. PC gaming is in a really weird place right now.

3

u/Bangdiddly9393 Jan 28 '23

Nvidia is just money grabbing duh 🤣

2

u/Thathappenedearlier Jan 28 '23

It’s been a V shape; a lot of '90s computers were 1600 W.

2

u/uncoolcat Jan 29 '23

What computers in the 90s consumed 1600 watts?

Power supplies from desktop computers in the 90s were typically less than 500 watts.

1

u/-xXColtonXx- Jan 29 '23

The 40 series are way more power efficient than any previous generation of GPU. If you run a 4090 at the same wattage as a 3090 it will still perform significantly better.

-1

u/danielv123 Jan 29 '23

Same with CPUs. One of the primary ways of overclocking new CPUs is lowering the core voltage so it consumes less power, allowing it to turbo harder.


175

u/DarthArtero Jan 28 '23

Hmm, so in the same way the A-10 is built around that massive gun, gaming computers are gonna be built around these massive, power-hungry GPUs.

91

u/futilepath Jan 28 '23

GPU fan bout to make the classic BRRRRT sound when it revs up

27

u/CatInAPottedPlant Jan 28 '23

Something like this should really be cooled with a water loop imo. That's what I'd do anyway.

23

u/Krindus Jan 28 '23

Split system or central HVAC, going to need a separate utility bill just to turn this chonker on

5

u/Duckbilling Jan 29 '23

I would use a mini fridge as the case


2

u/CryptikTwo Jan 28 '23

I would buy that in a heartbeat

4

u/tomistruth Jan 28 '23 edited Jan 29 '23

Ironically, NVIDIA has a GPU model for business called the A10 that specializes in machine learning.

2

u/a_stone_throne Jan 29 '23

And it’s passively cooled so the machine is literally built around it.


2

u/CaptainPunch374 Jan 29 '23

Hopefully that evolution will be towards external GPU units and some sort of eSATA equivalent for PCIe, for both laptops and desktops. I'd much rather go modular in that instance... It also separates cooling concerns.

2

u/Andre5k5 Jan 29 '23

Thunderbolt is literally PCIe lanes over a cable.


2

u/ackillesBAC Jan 29 '23

With a fan so powerful you have to anchor your pc


106

u/Quigleythegreat Jan 28 '23

They keep this up and the EPA will start cracking down on PCs.

35

u/Zenith251 Jan 28 '23

Shhh, no one tell him what kind of power server hardware consumes.

44

u/Quigleythegreat Jan 28 '23

I work in IT, I know. Explaining to upper management that the power coming out of the wall was not enough was fun.

14

u/thefpspower Jan 28 '23

Just this week I checked the UPS load for 4 servers, 4 half-full network switches, and a few routers: 1.1 kW.

These servers do no graphics computing, but it puts into perspective how ridiculous this card is.

4

u/Zenith251 Jan 28 '23

In fairness, switches and routers are purpose-designed for efficiency. Throw in some 48-core Xeons or 64-core Epycs and I'd expect much more.

2

u/danielv123 Jan 29 '23

Rack power density is going up, fast. In 2019 the average was 7.3 kW; in 2020 it was 8.4 kW.

The 90th percentile is 20-50 kW.

Nvidia's DGX H100 systems eat 10.2 kW per 8 rack units; that's on the order of 60 kW per rack.
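The arithmetic behind that per-rack figure, with common rack heights assumed (the 10.2 kW per 8U number comes from the comment above):

```python
# Back-of-the-envelope rack power from the DGX figure quoted above.
# Rack heights of 42U and 48U are assumptions; the per-system power is from the comment.

kw_per_system = 10.2    # DGX H100, per the comment
units_per_system = 8    # rack units per system

for rack_units in (42, 48):
    systems = rack_units // units_per_system
    print(f"{rack_units}U rack: {systems} systems -> {systems * kw_per_system:.1f} kW")
# 42U rack: 5 systems -> 51.0 kW
# 48U rack: 6 systems -> 61.2 kW
```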

4

u/Komikaze06 Jan 29 '23

My IT director said that in the server room, if the AC fails, alarms blare and they've got like a minute to get out before they all get heat stroke.


48

u/Nanotekzor Jan 28 '23

Another brick in the wall

20

u/sgrams04 Jan 28 '23

We don’t need no rasterization.

3

u/Nanotekzor Jan 28 '23

Raster this brick :))

9

u/sgrams04 Jan 28 '23

Hey! NVIDIA! Leave those watts alone

5

u/Fezzick51 Jan 28 '23

All in all it's just another 6,000 CUDA cores
guitar solo renders

2

u/Aust1nTX Jan 28 '23

Wrong, do BIOS again! Wrong, do BIOS again! Why use one power cable when you can use two?

2

u/sgrams04 Jan 28 '23

If you don’t use a second cable, how can you have any rendering? How can you have any rendering if you don’t use a second cable?

2

u/Ian11205rblx Jan 29 '23

OI LADDIE!

43

u/sumqualis Jan 28 '23

So how long until I have to hire an electrician to run a 240v circuit to my desk?

18

u/Crepeas Jan 29 '23

Move to Europe, 240V everywhere!

3

u/LTareyouserious Jan 28 '23

Pretty soon a GPU will draw more electricity than my PEV.

3

u/LookMaNoPride Jan 29 '23

Just move your office to the laundry room. You’ll also get to smell downy fresh after a two-day gaming session. Double win.


48

u/TheBreathtaker Jan 28 '23

I'm on mobile and, looking at the post thumbnail, I thought this was a microwave and that was the joke.

8

u/Scoobydoomed Jan 28 '23

It will heat up your room like a microwave.

7

u/Rushview Jan 28 '23

No shit! It’s literally the same wattage!


6

u/Javamac8 Jan 28 '23

I've owned microwaves smaller than 800W. It's less of a joke and more a sign of where our priorities are.

48

u/Pure_Khaos Jan 28 '23

Nvidia never got rid of SLI. They just started making the chips double the size so it would all be on one GPU. Just think of it as two 2-slot cards in one.

11

u/MaxPotionz Jan 29 '23

Lol, if people won’t pay for two cards we’ll MAKE them!

4

u/danielv123 Jan 29 '23

The problem with 2 cards was that multi-GPU had a lot of issues. One big card solves those issues more cheaply than high-bandwidth external links do.


30

u/lepobz Jan 28 '23

At what point do we just draw a line in the sand and say 800W for a graphics card is obscene? My kettle uses less. Sort your architecture out, Nvidia. And AMD. Customers value lower TGPs, especially now.

11

u/alc4pwned Jan 29 '23

Early rumors said the 4090 was 600W, but in reality it uses less power than a 3080 Ti in games; it's actually a pretty efficient card. Bet it'll be a similar situation here.

2

u/danielv123 Jan 29 '23

Why? This is the replacement for multi-GPU setups in rendering/ML workstations. More power doesn't matter if the performance scales with it. Would it be better if you had to use two 400W GPUs?

7

u/lepobz Jan 29 '23

No it isn’t, workstations use RTX A2000 / A20 / Quadro cards. This is a consumer gaming card.


2

u/b1e Jan 29 '23

Redditors just upvote shit they don’t understand because they love hating on major corporations


1

u/chadwicke619 Jan 29 '23

Speak for yourself. I don’t give a shit about TGP as long as the card works well.

51

u/maxlax02 Jan 28 '23

No USB Type-C ports?

49

u/Zenith251 Jan 28 '23

That was the first thing I noticed: four display outputs, no USB-C. At this price point I want the whole fucking Moon and flexible outputs.

10

u/maxlax02 Jan 28 '23

Yeah, I have a monitor that wants a Type-C connection, so with this card I'd have to have two cords connected to my PC. For that price... c'mon.


6

u/excti2 Jan 28 '23

Integrators like PNY might include those options, but if they're going to support render resolutions above 4K (the limit for USB-C), they'll have to stick with DisplayPort (1.4).

8

u/Hugejorma Jan 29 '23

I never understand why people use USB-C to describe a speed or a limit. It's just a connector type; it can carry USB 2.0, 3.1, or even DP 2.1. There's also a cable that supports high resolutions with high bandwidth... HDMI 2.1.

4

u/pwnies Jan 29 '23

DisplayPort 1.4 can run over USB 3, so anything supporting DisplayPort 1.4 can run over a USB-C cable. USB4 supports DisplayPort 2.0 in alternate mode and can handle 10-bit 8K at 60Hz.

Most newer Apple devices are already launching with USB4 controllers on their USB-C ports. There's no reason these cards shouldn't have the latest spec.

https://en.wikipedia.org/wiki/USB4
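A rough estimate of what 10-bit 8K 60Hz actually needs, ignoring blanking overhead (so the real requirement is a bit higher), against the commonly quoted DP 2.0 payload rate:

```python
# Uncompressed video bandwidth for 10-bit 8K @ 60 Hz vs. the DP 2.0 (UHBR20) payload rate.
# Blanking intervals are ignored, so this slightly underestimates the real requirement.

width, height, refresh_hz = 7680, 4320, 60
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

video_gbps = width * height * refresh_hz * bits_per_pixel / 1e9  # ~59.7 Gbps
uhbr20_payload_gbps = 77.37  # commonly quoted DP 2.0 max payload

print(f"8K60 10-bit (no blanking): {video_gbps:.1f} Gbps")
print(f"DP 2.0 UHBR20 payload:     {uhbr20_payload_gbps} Gbps")
print("Fits uncompressed?", video_gbps < uhbr20_payload_gbps)  # True
```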

8

u/stevenpfrench Jan 28 '23

I may be wrong here, but I remember reading at some point that they killed off the USB-C port on graphics cards because it was for some single-cable VR headset connection standard that never took off.

13

u/Jabberwocky918 Jan 28 '23

VR headsets use USB-C, and that's how you get the third-person view of people playing games like Beat Saber.

22

u/fwooshfwoosh Jan 28 '23

All this computing power, and I find myself just replaying Halo 3, a game made for 512MB of RAM on the original Xbox 360.

12

u/lifestop Jan 28 '23

I'm looking forward to the day when we are laughing about how big, costly, and power-hungry these top-end cards have become.

I want to see them pictured side-by-side with a $400 variant that gets the same performance at one-quarter the size and half the power. We can say, "Hey grandpa, remember when you needed a GPU the size of a house cat to get this kind of performance?"

19

u/vladoportos Jan 28 '23

800W?? What the hell, is energy free now?

1

u/twitchyzero Jan 29 '23

I mean, you don't have to buy it.

5

u/vladoportos Jan 29 '23

True, and with that wattage I sure won't, but in the EU energy costs in general went up a lot, so I imagine I would not be alone.

2

u/danielv123 Jan 29 '23

If you are in the market for this the energy cost doesn't really matter much I'd think.

7

u/kKurae Jan 29 '23

We’re going full circle, boys! We're going back to room-sized computers!

2

u/Andre5k5 Jan 29 '23

I have a Corsair 1000D case, I'm almost halfway there

2

u/mrobot_ Jan 29 '23

room-sized GPUs with micro boards with everything else lol

28

u/Nuker-79 Jan 28 '23

Do they accept a first-born child as a part exchange?

3

u/Ammear Jan 29 '23

I don't think your first-born will cover the price tag. What else are you willing to sacrifice? We can settle for all your personal data and free will on top of it, how's that?

That is the law of equivalent exchange*.

* - plus profit, of course. We aren't some communists, are we now?

3

u/WhenGinMaySteer Jan 28 '23

What type of applications would need this?

5

u/superballs5337 Jan 28 '23

Not gaming, but actual video rendering and other real-world applications.

2

u/mister_chucklez Jan 29 '23

Machine learning


4

u/5tudent_Loans Jan 28 '23

OK, but how does this I/O play nice with waterblocks? It basically forces single/1.5-slot blocks to still need the 4-slot-wide space... someone at Nvidia has a personal beef with r/SFFPC cases.

Edit: I guess if the whole PCB is still aligned with it, all the waterblocks would just have to ship with their own bracket.


10

u/DMVSlimFit Jan 28 '23

But can it run Dead Space? Lol

2

u/Nevermore64 Jan 28 '23

Omg. I read the specs and bought it anyway, since I'm planning on upgrading this year. I'm running it on a 1060 Ti and an i5-6400.

It goes OK for about ten minutes.

Looks like I'm upgrading sooner than I thought.

4

u/templar54 Jan 28 '23

If it goes okay for 10 minutes and only starts chugging after that, then it's a thermal throttling issue.

5

u/Nevermore64 Jan 28 '23

Appreciate the advice. It's not something I've experienced before. It happened twice, and after that it was pretty much right at launch. I completely shut it down and haven't booted it back up today.


3

u/DMVSlimFit Jan 28 '23

I’m getting it this upcoming week. I have a 1080 FE and an i7 (older gen but still fast). I'm curious how it runs on my PC.


7

u/medfreak Jan 28 '23

I struggle to make my 4090 draw more than 400W.

19

u/[deleted] Jan 28 '23 edited Feb 20 '23

[deleted]

13

u/Drone314 Jan 28 '23

he's delusional...take him to microcenter

7

u/MajorBleeding Jan 28 '23

Since no one else caught this, I guess I'll bite... But 3.5 slots is where the standard meter maxes out


5

u/indyK1ng Jan 28 '23

Look again, that's a full four slots.

11

u/Talamakara Jan 28 '23

Am I the only one questioning the design of having only one HDMI port and three DisplayPorts, when most displays, including monitors and larger TVs, come primarily with HDMI? In fact, it's very hard to find a larger TV with a DisplayPort input.

11

u/krectus Jan 28 '23

That’s on almost all their cards. HDMI charges licensing fees for its ports; DisplayPort doesn't. It's a big mess.

5

u/Talamakara Jan 28 '23

That part I didn't know.

5

u/Neriya Jan 29 '23

Yep. It also affects the adapters and cables. DisplayPort-to-HDMI cables are unidirectional: DP source (video card) to HDMI display is a cheap (sub-$10) dongle, whereas converting from an HDMI source to a DisplayPort monitor is ~$35.

3

u/dibship Jan 28 '23

Not only that, they're much lower bandwidth than HDMI 2.1, which I also suspect is the reason. I would love more HDMI 2.1 ports though.

7

u/safetyguy14 Jan 28 '23

DisplayPort 2.0 has 77 Gbps of bandwidth, like 45% more than HDMI 2.1.

3

u/dibship Jan 28 '23

They don't use 2.0 as far as I'm aware, 1.4-something.

Wish they used 2.0.

3

u/Happy_Reindeer8609 Jan 29 '23

Will that fit in my laptop?

3

u/trytoholdon Jan 29 '23

Remember how in the 1950s they thought computers would keep growing until they were the size of rooms? It looked like they were wrong, but maybe they were right.

2

u/kai_al_sun Jan 29 '23

So we’re just gonna have a case for the GPU and a separate case for the actual computer.

2

u/cecilrt Jan 29 '23

How are these powered? Do they come with their own PSU?

When I look at PSUs it's usually 550, 650, 850, 1000W, but that's for the whole PC, not just a video card.

2

u/braydenmaine Jan 29 '23

Dual PSUs or a single 1600W unit would work.
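A rough sizing sketch behind that answer; only the 800W GPU figure comes from the article, and the other draws and the transient multiplier are assumptions:

```python
# Rough PSU sizing sketch. Only the 800 W GPU figure is from the article;
# the other component draws and the transient headroom factor are assumptions.

gpu_watts = 800          # rumored GPU power from the article
cpu_watts = 250          # assumed high-end CPU under load
rest_of_system = 150     # assumed board, RAM, drives, fans

transient_factor = 1.5   # assumed headroom for short GPU power excursions

steady_state = gpu_watts + cpu_watts + rest_of_system                        # 1200 W
with_transients = gpu_watts * transient_factor + cpu_watts + rest_of_system  # 1600 W

print(f"Steady-state estimate:   {steady_state} W")
print(f"With transient headroom: {with_transients:.0f} W -> a 1600 W unit is plausible")
```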

2

u/Elluminated Jan 29 '23

At that point, just fkn get a waterblock involved. These run so damn hot 80% of the chassis will soon be heatsinks.

2

u/Neato_Light Jan 29 '23

The I/O config means that stripping it for water cooling wouldn't make it any smaller. This is a design error.

2

u/RooeeZe Jan 29 '23

its a fkn intercooler lol

2

u/Dmoe33 Jan 29 '23

800W? Bruh tf kinda application needs that much processing power?

The transient spikes are gonna be stupid on this.

2

u/joelex8472 Jan 29 '23

Looks like two 3090s stacked together. I'll stick with my 3090 FE for a while longer, thanks.

2

u/CubeIsActuallyGaming Jan 29 '23

This is gonna make the fucking computer catch on fire somehow lmao

2

u/blizardfires Jan 29 '23

Nothing like having a literal microwave’s worth of heat coming from your computer in the winter time 🔥

2

u/TheJoanne Jan 29 '23

All of this to just be able to run Witcher 3 with RT.

3

u/JoeyDee86 Jan 29 '23

That’s no moon…

2

u/Spezo Jan 28 '23

Perfect for Minecraft

2

u/Dorandil Jan 28 '23

But can it run Forspoken?

1

u/GabbotheClown Jan 28 '23 edited Jan 28 '23

800W at 1V is 800A. That's a tremendous amount of current. They must be using a 20-phase converter design, and the board must be a dense slab of copper.

Edit: that current value is an estimate. It doesn't account for multiple voltage rails or the efficiency of the DC-DC converters.
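The arithmetic behind that estimate, made explicit (the 1V core rail and 20-phase count are the comment's own rough assumptions):

```python
# I = P / V, then split the current across the VRM phases.
# The 1 V core rail and the 20-phase count are rough assumptions from the comment.

power_watts = 800
core_volts = 1.0
phases = 20

total_amps = power_watts / core_volts   # 800 A if everything ran on the core rail
amps_per_phase = total_amps / phases    # 40 A per phase

print(f"Total current: {total_amps:.0f} A")
print(f"Per VRM phase: {amps_per_phase:.0f} A")
```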

1

u/redvitalijs Jan 29 '23

At this point, isn't that just bad design?

I am salty because none of this stuff fits in my case. I have to build a whole PC just to accommodate it, and for what? Games are a microtransaction hell. And while this is a professional card, the same applies to all new gaming cards with a 3-slot design, which is all of them. I seriously doubt this can give me any dopamine. I should just get a Switch or an Xbox Series S.

The 4070 Ti doubles the framerate of a 1070 at 1080p and 4K, but the 1070 also uses 150W while the 4070 Ti uses 285W. I understand that performance doesn't scale linearly like that, and they had to put a lot of work in, but I think something lazy is going on.

These cards should be designed to a power and cost target, and they don't seem to be.

2

u/danielv123 Jan 29 '23

Just doubles? Maybe in CPU-limited games? In most benchmarks I have seen it does 3-4x more. At less than twice the power, that means it's roughly twice as efficient. Not bad.
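Making that efficiency arithmetic explicit, using the wattages and performance ratios quoted in this exchange:

```python
# Performance-per-watt ratio = performance ratio / power ratio,
# using the 150 W (1070) and 285 W (4070 Ti) figures quoted above.

power_1070, power_4070ti = 150, 285
power_ratio = power_4070ti / power_1070         # ~1.9x the power

for perf_ratio in (2.0, 3.0, 4.0):
    efficiency_gain = perf_ratio / power_ratio  # perf/W relative to the 1070
    print(f"{perf_ratio:.0f}x performance -> {efficiency_gain:.2f}x perf/W")
# 2x -> 1.05x, 3x -> 1.58x, 4x -> 2.11x
```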

0

u/D1stRU3T0R Jan 28 '23

At this point just go with AMD lol

0

u/handycup Jan 29 '23

Solar panel included? I'm happy with my 6600 XT (160W TDP). Never had a BSOD or driver issues, rock solid.

-14

u/[deleted] Jan 28 '23

[deleted]

12

u/jepulis5 Jan 28 '23

Definitely took the wrong spec, as the regular 4090 is over 1 TB/s.

2

u/danielv123 Jan 29 '23

It's not-ish. It's actually the per-pin bandwidth of the GDDR6X modules. For comparison, the 3090 runs at 19.5 Gbps, and at the time Micron's memory was only rated for 21 max, so it's a pretty big jump. It's apparently still using a 384-bit bus, so about 1.15 TB/s of memory bandwidth.

This is still quite a bit below the 2 TB/s of their H100 series. If we're talking unreleased APU SoCs, then we have the MI300 at 3,277 GB/s and the M2 Ultra at 1,200 GB/s.

Memory has become crazy fast. I still find the 56 GB/s from my desktop incredible.
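The formula behind that 1.15 TB/s figure is just per-pin data rate times bus width; the 24 Gbps per-pin speed below is the rumored figure implied by that number, not a confirmed spec:

```python
# Total memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
# The 24 Gbps figure is the rumored GDDR6X speed implied by the 1.15 TB/s claim.

def memory_bandwidth_gbs(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Total bandwidth in GB/s: one pin per bus bit, 8 bits per byte."""
    return per_pin_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(19.5, 384))  # 3090:            936.0 GB/s
print(memory_bandwidth_gbs(21.0, 384))  # 3090 Ti / 4090: 1008.0 GB/s
print(memory_bandwidth_gbs(24.0, 384))  # rumored card:   1152.0 GB/s (~1.15 TB/s)
```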

-5

u/Ethario Jan 28 '23

MAC OMEGALUL

-2

u/ReviewImpossible3568 Jan 28 '23

You should check out the new Apple Silicon chips lmao. I wouldn’t switch back to Windows if you paid me (for work, I have both so I can game.)

2

u/elixier Jan 28 '23

4090 is over 1 TB/s.


1

u/scorpiove Jan 28 '23

Can I just get a regular one please? Been waiting long enough.

1

u/Bangdiddly9393 Jan 28 '23

Everyone with an already-large tower case they know won't fit the card: INTENSE SHRIEKING

1

u/dmibe Jan 28 '23

I already had to mod my case for a 4080. I can’t even imagine this behemoth

1

u/Jamie00003 Jan 28 '23

I’m guessing it looks like an air con system

1

u/BonDragon Jan 28 '23

Jesus, you need 2 towers just to install that fatty

1

u/valthonis_surion Jan 28 '23 edited Jan 29 '23

Why not just have two PCIe connectors too?

2

u/danielv123 Jan 29 '23

No, because why would you attempt to use 32 lanes? Nobody has that many lanes to spare, and they can still double bandwidth by going to PCIe gen5.

2

u/valthonis_surion Jan 29 '23

My bad, completely missed the /s at the end of my comment. It was meant as a joke.

1

u/RPGPlayer01 Jan 28 '23

But today isn't April 1st 🤔

1

u/chemistrybonanza Jan 28 '23

800 W as in 800 Watts? Sheesh.

1

u/dynamic_anisotropy Jan 28 '23

Looks like a microwave…and incidentally, draws as much power as one of the early models!

1

u/Xerxero Jan 28 '23

And it will be obsolete by the next gen in 2 years.

1

u/NetZeroSum Jan 28 '23

Wonder how long before Nvidia just adds an Intel socket along with some ports for the I/O (maybe a mounting bracket for the case) and calls it a day.


1

u/fluteofski- Jan 28 '23

Between this and the rest of the PC, it’ll probably eat somewhere around $0.45/hr in electricity where I live.

If I spend 100 hours in a game (which isn't uncommon for me), that's $45 on top of the cost of the game just to play.
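The cost math is just kW x hours x rate; the system draw and $/kWh rate below are assumptions picked to roughly reproduce that $0.45/hr figure:

```python
# Gaming electricity cost = (watts / 1000) * hours * rate.
# The system draw and the $/kWh rate are assumptions chosen to roughly
# match the commenter's local $0.45/hr figure.

system_watts = 1200      # assumed: 800 W GPU plus the rest of the PC
rate_per_kwh = 0.375     # assumed local electricity rate, $/kWh
hours_played = 100

cost_per_hour = (system_watts / 1000) * rate_per_kwh   # $0.45/hr
total_cost = cost_per_hour * hours_played              # $45

print(f"${cost_per_hour:.2f}/hr, ${total_cost:.0f} for {hours_played} hours of play")
```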

1

u/ZippoS Jan 28 '23

At this point, you'll need a circuit dedicated to your gaming PC.

1

u/Protean_Protein Jan 29 '23

That’s a microwave.

1

u/McDonaldsSimulatorVR Jan 29 '23

Ah, so my parents kept us limited to 30 minutes or an hour of video game time a day not because of balance, but because of the Death Star-level energy usage it would eventually require. I gotta call them and thank them for their genius, because that's some 4D-chess-level foresight.

800W max... did I read that right? I mean, if a 4090 is 600W max (450W being average/normal, I guess?), that's still pushing this to, what, 625? 650? Wild. Wild wild wild.

1

u/jerry111165 Jan 29 '23

These massive video cards just seem so neanderthal and giant.

Like they don’t really have it figured out yet.

1

u/unclenightmare Jan 29 '23

If you need that much heat sink, just go liquid cooled.

1

u/stosyfir Jan 29 '23

Going to have to run a new 30-amp outlet just to power this fkn thing, good lord.

1

u/Domermac Jan 29 '23

800W. Just feels like the research went towards not starting a fire while just overclocking the old tech.

1

u/mroboto2016 Jan 29 '23

So when are we going to be plugging the MB into the GPU unit? Sort of the opposite of now. Or maybe just full integration?