Discussion in 'Video Cards & Monitors' started by Sphinx2000, Sep 1, 2020.
The issue is that the SF750 only comes with two PCIe 6+2 cables.
Daisy-chaining two of the 8-pin connectors on 3x8-pin cards has largely been called fine; even EVGA's Jacob on Twitter was telling people it was no problem for the FTW3. Where it's not fine is when you have a 2x8-pin card and daisy-chain the whole thing. It's probably a different story if you're loading a 500W+ BIOS and trying to crank the dial to 11, but then you really should be outside SFX PSUs anyway IMO, and are probably just benching for epeen scores.
It should be fine - the wire gauge of the 6+2 connector pairs should be enough to support the current.
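For a rough sanity check on the daisy-chain question, here's a back-of-envelope sketch. The spec ratings (150W per 8-pin, 75W from the slot) are standard PCIe figures, but the even load split across connectors is my simplifying assumption, not something measured in the thread:

```python
# Rough estimate of the load carried by a single daisy-chained PCIe cable.
# Assumptions: slot supplies up to 75 W (PCIe spec); the remaining board
# power splits roughly evenly across the 8-pin connectors.

def cable_load(board_power_w, n_connectors, connectors_on_daisy_cable):
    """Watts carried by the one cable that feeds the daisy-chained connectors."""
    slot_w = min(75, board_power_w)
    per_conn = (board_power_w - slot_w) / n_connectors
    return per_conn * connectors_on_daisy_cable

# 3x8-pin FTW3-style card at 400 W: the daisy cable feeds 2 of 3 connectors
print(round(cable_load(400, 3, 2)))  # -> 217

# 2x8-pin card at 320 W, fully daisy-chained: one cable feeds both connectors
print(round(cable_load(320, 2, 2)))  # -> 245
```

Same cable, but the fully daisy-chained 2x8-pin case pushes noticeably more current through it - which lines up with the advice above.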
A little unrelated, but it seems Gigabyte has quietly updated the Aorus Master to have 3x 8-pin as well. It probably didn't sell too well given the high asking price without the extra power headroom on offer.
I think you can just buy an SF750 PCIe 8-pin cable from major retailers if you want to shove a 3x8-pin card into your ITX case.
They're $30 per cable.
For those interested, undervolting/power-limiting is really the way to go to reduce the heat being dumped into your room this summer.
I haven't bothered to show the 450W benching profile, as it would probably crash out after some time in a heavily ray-traced game like Cyberpunk 2077, though it's happy to run other benchmarks all day long at 2100MHz+.
With just a 10fps effective difference (55fps vs 65fps) between the 220W and 400W profiles, that's a small price to pay for reduced heat this summer. For what it's worth, the 220W profile still completely spanks my overclocked RTX 3070 on a 270W profile - from memory, similar settings gave about 40fps, so still well down on a heavily undervolted/power-limited 3080 running at just 220W.
For those interested, with the 220W profile I also dropped the memory OC somewhat. I'm pretty sure this lets the core sit about 15-30MHz higher, though I've yet to log enough data to definitively prove it.
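To put numbers on the efficiency trade-off, a quick perf-per-watt calculation using the fps and wattage figures quoted above (just arithmetic on the thread's own numbers):

```python
# Perf-per-watt comparison: 55 fps at 220 W vs 65 fps at 400 W.
fps_uv, w_uv = 55, 220   # 220 W undervolt profile
fps_st, w_st = 65, 400   # 400 W profile

eff_uv = fps_uv / w_uv   # 0.25 fps/W
eff_st = fps_st / w_st   # 0.1625 fps/W

print(f"undervolt: {eff_uv:.4f} fps/W, 400W profile: {eff_st:.4f} fps/W")
print(f"efficiency gain: {eff_uv / eff_st - 1:.0%}")  # ~54% more fps per watt
print(f"fps given up:    {1 - fps_uv / fps_st:.0%}")  # ~15% of the frame rate
```

Roughly 54% better efficiency for a ~15% fps loss, which is why the heat argument holds up.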
Agreed - the performance impact is minimal.
I'm already done benching this 3080 I got today. It will sit at 2130MHz all day in 3DMark slurping 400W, but I've now settled on a reasonable undervolt sitting at 2010MHz on 0.918v, drawing under 300W. The card tops out at 55-60c and is whisper quiet - all while dropping maybe 5fps.
For anyone interested in arguably the best waterblock for the FTW3/FTW3 Ultra, I just had this email response to my inquiry:
We'll be opening up orders for the FTW3 blocks in a couple of weeks with delivery a week or so after that.
Optimus Advanced Water Cooling
That’s actually sexy af.
Yup. As seen here:
When you undervolt, do you run any core voltage boosts, or do you just undervolt the power curve to default clocks?
Dang... 0.918v doing 2010MHz stable is a good chip - is that Cyberpunk stable? Again, I can run 3DMark stable like 80MHz higher... but at best I can hold 2025-2040MHz stable in CP2077 with all settings dialled up, and that requires 1.03v and 400W+ to remain stable.
And I choked when I saw EKWB's current prices for waterblocks. I remember crying over spending $180 on a GPU block - wtf's going on atm!
That's why I didn't mind waiting and paying $1260 for my 3080... plenty of room left to buy a premium block and outpace an $1800 Strix.
In Afterburner, I raise the clock/voltage curve so that 0.918v sits at 2025MHz, then cap all higher voltage points to the same clock speed so it'll never go higher. Then I cap power usage at 80% to create a 300W TDP cap.
This nets me a 300W and 0.918v cap, with a frequency that jiggles between 1995MHz and 2010MHz.
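The recipe above can be sketched as plain arithmetic. Afterburner itself isn't scriptable like this; the 375W stock power limit is my inference from 80% mapping to a 300W cap, and the clamp function just illustrates the flattened curve:

```python
# Sketch of the Afterburner undervolt recipe described above.
# Assumption: a 375 W stock power limit, since 80% of it gives the 300 W cap.
STOCK_LIMIT_W = 375
POWER_LIMIT_PCT = 0.80
UV_POINT = (0.918, 2025)   # (volts, MHz): the raised curve point

print(f"Effective TDP cap: {STOCK_LIMIT_W * POWER_LIMIT_PCT:.0f} W")  # 300 W

def clamped_clock(voltage_v, curve_mhz):
    """Flattened V/F curve: every point at or above 0.918 V is pinned
    to 2025 MHz, so the card never requests a higher voltage."""
    v_cap, mhz_cap = UV_POINT
    return mhz_cap if voltage_v >= v_cap else curve_mhz

print(clamped_clock(1.05, 2130))  # -> 2025: high-voltage point clamped
print(clamped_clock(0.85, 1900))  # -> 1900: below the cap, untouched
```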
I'm waiting for a few patches to fix the bugs before buying Cyberpunk, so I can't tell you, sorry.
It's been stable in all benches (Time Spy, Port Royal, Heaven, Superposition), and so far in RDR2 for a couple of hours. I've just confirmed it stable (although only for 15 minutes) in Control with everything maxed and RT on - about as close to Cyberpunk as I can get, I think.
3080 owners with factory-overclocked cards are already faster than a stock 3090 for gaming. I got a 3090 because 10GB of VRAM seemed limiting, but it really does feel like a dumb decision when the 3090 can't even run my games properly at 5120x1440. DLSS looks like hot garbage because it upscales from a much lower resolution and the performance gain is pretty poor, while 4K offers better graphics at better performance.
My CPU is also holding back some performance...
And then you have to remember the prices listed are in US dollars. $500 (not including shipping) for a sexy waterblock is hard to justify.
Haha dw, I remembered - it's why I bothered to rant. I guess a $300 AUD EKWB block is mid-range now.
Digital Foundry / Everyone disagrees.
Yeah the Optimus is nuts, but it does have a Delta T of <10 degrees.
That's a bit confusing TBH.
5120x1440 is a lower resolution than 4K (3840x2160) - which the 3080 runs perfectly fine, let alone the 3090.
As for DLSS, DLSS 2 on the 'Quality' setting has pretty much been shown to be as good as (and in some cases better than) native 4K rendering in terms of image quality - although I'm not sure exactly how DLSS works at super-ultrawide resolutions.
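The resolution claim is easy to verify with pixel counts (quick arithmetic, nothing assumed beyond the two resolutions themselves):

```python
# Pixel-count comparison behind the resolution claim above.
superultrawide = 5120 * 1440   # 7,372,800 px
uhd_4k = 3840 * 2160           # 8,294,400 px

print(superultrawide < uhd_4k)                                    # True
print(f"5120x1440 is {superultrawide / uhd_4k:.0%} of 4K's pixels")  # 89%
```

So a card that holds up at 4K has headroom to spare at 5120x1440, all else being equal.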