> Would an 850W power supply be sufficient to run a PC with a 12900k CPU, a 4090 GPU and almost no peripherals?

I'm planning to do that (with more peripherals), but I'm also planning to cap the 4090 at ~70% TDP.
> According to the "notify me" email sent today at 03.00 by nvidia, the 4090 FE will be available to purchase today at 15.00 CEST.

Yeah, I also saw that. I'm curious what the supply situation will be like.
> Yeah, I also saw that. I'm curious what the supply situation will be like.

Only links to third party models for me, no FE sold through nvidia.com.
I'm not in a huge hurry.
> As far as I know, for Germany (and Austria) the FE should be available at notebooksbilliger.de, but for now I only see third party models there.

Do you know if NVIDIA stopped selling the FE through nvidia.com? Hmmm, I am not sure if I should monitor nvidia.com or one of the retailers selling third party models.
Those have remained in stock for a while now though, so I very much doubt we'll see a repeat of the 30-series stock situation.
> They certainly did. Unless they restarted it since the 30 series.

I got my 3090 from nvidia.com as far as I remember. It was impossible to get at launch though; I had to check nvidia.com daily for a couple of weeks to get one.
I got my 3090 FE from NBB in 2020.
> I'm planning to do that (with more peripherals), but I'm also planning to cap the 4090 at ~70% TDP.

How would you do that, exactly?
> How would you do that, exactly?

Alt-Z to open the GeForce Experience overlay -> Performance -> Power slider.
> Nvidia's dropping some 4080 16GB & 12GB benchmarks on their blog; performance is way lower than a 4090 in A Plague Tale: Requiem.

So the 4080 and 4080 12 GB are slower than a 3080?
Most likely via MSI Afterburner, there's a power limit slider.
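For a software-independent route, NVIDIA's `nvidia-smi` command-line tool can also set a power cap (a sketch; the 315 W value is illustrative, roughly 70% of the 4090's 450 W default, the command needs admin rights, and the allowed range depends on the specific board):

```shell
# Show the current, default and allowed power limits
nvidia-smi -q -d POWER

# Cap GPU 0 at 315 W (~70% of the 450 W default; requires admin rights)
nvidia-smi -i 0 -pl 315
```

The cap resets on reboot unless you reapply it, e.g. via a startup task.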
> So the 4080 and 4080 12 GB are slower than a 3080?

Why not check the DLSS OFF part of the graph? Looks like the 12GB performs quite similarly in that game. Marginally better than the 3080.
I mean, they compare DLSS3 to DLSS2, so each DLSS3 result should be cut in half, right?
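A rough back-of-the-envelope for that "cut in half" intuition (a sketch under the assumption that DLSS 3 frame generation inserts one interpolated frame per rendered frame, so displayed fps is roughly double the rendered fps):

```python
def rendered_fps(displayed_fps: float, frame_generation: bool) -> float:
    """Estimate natively rendered fps from a displayed fps figure.

    Assumes frame generation roughly doubles displayed frames, so the
    rendered rate is about half of what the benchmark graph shows.
    """
    return displayed_fps / 2 if frame_generation else displayed_fps

# Example: a 120 fps DLSS 3 result vs a 70 fps DLSS 2 result
print(rendered_fps(120, frame_generation=True))   # 60.0 rendered fps
print(rendered_fps(70, frame_generation=False))   # 70.0 rendered fps
```

By this estimate the DLSS 3 bar only renders ~60 fps natively, so comparing it directly against a DLSS 2 bar overstates the raw performance gap.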
> Why not check the DLSS OFF part of the graph? Looks like the 12GB performs quite similarly in that game. Marginally better than the 3080.

The difference between the 16GB and 12GB is more than just VRAM. It has fewer CUDA, Tensor and RT cores, and less memory bandwidth. Basically, what many people are saying is that the 4080 12GB should be a 4070 or even a 4060 Ti based on its specs.
How does the additional 4GB of VRAM give that version of the 4080 an extra 10 fps, though? Does the game constantly choke streaming textures, or what?
Artifacts aside, has anyone tested 30 fps boosted with DLSS 3 to see what it feels like to play? I imagine it's still crappy for racing games and shooters? Or did they all test 60 to 120, etc.?
> So the 4080 and 4080 12 GB are slower than a 3080?

I think the DLSS off results allow for a good comparison of rasterised performance.
> I think 8K gaming is a percent of a percent luxury. Even most 4090 owners won't be playing games at that resolution.

If you care about crazy high PPI on your monitors, 8K might be interesting at anything above 27 inch to get near 300 PPI. Of course, such a high pixel density is pretty pointless unless your face is literally glued to your screen when playing games.
I mean, let’s be honest, who has space for a TV big enough to ever reap the benefits of 8K? And is there even a point in 8K monitors?
Someone correct me if I’m wrong but surely you won’t even be able to tell the difference between 8K and 4K on most home-sized TVs. We’re talking cinema-size screens before 8K and 4K can be differentiated.
I guess if you want to downsample 8K to 4K you might want to render at that resolution but again, I think I’d rather take the performance than have a microscopic visual enhancement.
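The ~300 PPI figure is easy to check (a quick sketch; PPI is just the diagonal pixel count divided by the diagonal size in inches):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 8K vs 4K on a 27-inch panel
print(round(ppi(7680, 4320, 27)))  # ~326 PPI
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
```

So an 8K 27-inch monitor would indeed land above 300 PPI, double the pixel density of the same panel at 4K.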
> Thinking of skipping the FE (not even available in my country).

I don't know if it's a known thing, but I have been speculating that nvidia pays partners to offer one of the partner cards at the same price as the FE cards in countries where the FE isn't available.
The Asus TUF Gaming 4090 seems to be the somewhat sane choice. Big heat sink, not too loud, and the same price as the FE.
> While it's not great that they are cheaping out a bit on a card with this price, I don't think it will actually matter if you run it without OC (or even better, at a 350W power limit for 96% of the performance and much better efficiency).

I would have preferred an FE, but with the official non-availability here in Sweden and the scarcity elsewhere, it seems like I would have to pay an unreasonable import premium to get one, while also getting a lesser warranty.
Also, the 4090 is a lot harder to buy than I initially expected. I haven't even seen any hint of an FE yet.
So you just get a 4cm long piece of wood and use that to keep the wire straight?
And 4090/12VHPWR connector owners might want to read this anyway if you don't have the time to watch the video:
12VHPWR Cable Guide – CableMod
> Maybe Nvidia can ship that for future GPUs. Call it "RTX Ultra Support Stick" or something, and put some RGB lights on it or something.

You might be on to something here....
With how huge these cards are and the placement of the power connector, it wouldn't surprise me at all if a lot of people got some bent power cables in their cases as a result.
Kind of scary to be honest.
> The 5800X3D is a gaming beast that goes toe-to-toe with their 7000 series CPUs in some games.

One is DDR4 and the other DDR5. And the 7000 series has some kind of GPU on every CPU. Is there finally boost on all CPU cores, or just one at a time?
You’d be set for years if you got one of those.
> One is DDR4 and the other DDR5. And the 7000 series has some kind of GPU on every CPU. Is there finally boost on all CPU cores, or just one at a time?

I doubt we'll see DDR5 benefits for a while. If you really want DDR5, I'd wait a CPU generation or two.
> The DDR5 issue will be, for a while, the total platform cost. You need a new CPU, mobo and RAM. DDR5 is still relatively expensive, and at least the AM5 boards so far have been pretty pricey too.

Yeah.. the only real benefit is with the iGPU: 20-30% more performance with DDR5. But that matters once they release the G series CPUs..
As the last hurrah for the AM4 platform, the 5800X3D is quite beefy for gaming in particular; games like that big-boy cache. I thought about upgrading to it too, but the 5600X is more than enough for me, for now anyway.
> Posted in the Steam thread, but looking for a new build soon(?), a little TBD on timescales but within the next few months/Q1 2023 I expect.
> First stab at a new build here.
> Keyboard, mouse, monitor and GPU would be carry-overs from the current build.
> Ideally a prebuilt (yes yes yes I know, DIY etc etc, but no, and I'm willing to pay extra if the builder is good).
> I expect this is fairly good? Any concerns? Anything that would age fairly quickly?

I would consider splashing out for a PCIe Gen 4.0 NVMe.
> I would consider splashing out for a PCIe Gen 4.0 NVMe.

Thanks, updated the list.
You won’t see a difference in current games but DirectStorage is on the horizon and you will want to make the most of your IO.
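On the IO point, here is a crude way to sanity-check a drive's sequential read throughput (a rough sketch; proper benchmarks like CrystalDiskMark control for OS caching and queue depth, which this does not):

```python
import os
import tempfile
import time

def sequential_read_mbps(path: str, chunk_size: int = 4 * 1024 * 1024) -> float:
    """Read a file front to back and return throughput in MB/s."""
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return total / elapsed / 1e6

# Demo on a small temporary file. OS caching will inflate the number;
# point it at a large, freshly written file on the drive you want to test.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(16 * 1024 * 1024))  # 16 MB of random data
print(f"{sequential_read_mbps(tmp.name):.0f} MB/s")
os.remove(tmp.name)
```

A Gen 4 NVMe should report several thousand MB/s on large uncached files, versus roughly 500 MB/s for SATA SSDs.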
You can get a Ryzen 7 5800X for a little bit more than the 5700X so I would consider that instead.