
Crazy video card pricing 2021-2022

Discussion in 'Video Cards & Monitors' started by Slug69, Mar 16, 2021.

  1. groovetek

    groovetek Member

    Joined:
    Oct 19, 2003
    Messages:
    3,805
    Location:
    Melbourne
    I agree with The Beast, though 10-20% doesn't turn 40fps into 60fps. Still, any gains within that 40-60fps range are worthwhile.

    Sadly, at least with the RTX 30 series, say a 3080, a 10% gain is possible but not much beyond that. For those interested, I've tried to make the difference as apparent as I can in the example below.

    [Image: 320W Stock Clocks vs 400W OC Clocks.jpg]

    Back-to-back comparison below using 320W with the reference 1710MHz 3D/boost clock, versus my 400W 100% stable OC gaming profile, representing the two ends of the spectrum: 50fps vs 56fps (it happened to show 57fps for a moment when I took the screenshot, but it was mostly sitting at 56).

    This represents a 12% gain. Of course, if you buy a high-end 3x 8-pin RTX 3080, your starting point is neither a 320W power limit nor a 1710MHz boost clock. In my example, the default out-of-the-box profile was sitting at 53fps.

    So in this case, OC'ing is only a 6% gain over my particular card's defaults. Is it worth doing then? Yes: when it brings the experience closer to 60fps, it's noticeable enough to be worthwhile.
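    For anyone wanting to sanity-check the arithmetic, a quick sketch using the fps figures from the post:

```python
# FPS figures quoted above
stock_fps = 50    # 320W power limit, reference 1710MHz boost
default_fps = 53  # card's out-of-the-box profile
oc_fps = 56       # 400W stable OC gaming profile

def pct_gain(base, new):
    """Percentage uplift of `new` over `base`."""
    return (new - base) / base * 100

print(f"stock   -> OC: {pct_gain(stock_fps, oc_fps):.0f}%")   # 12%
print(f"default -> OC: {pct_gain(default_fps, oc_fps):.0f}%") # 6%
```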
     
    jjjc_93 and ashenIC like this.
  2. ashenIC

    ashenIC Member

    Joined:
    Sep 21, 2019
    Messages:
    187
    I'm not saying developers are lazy; I'm saying developers have a set of known resolution variables and hardware which they should target.
     
  3. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    5,137
    Location:
    Gaufres. Bière. Chocolat. Moules Frites.
    Yeah, I'm a derp. I was thinking 50 > 60 but wrote 40 cos derp.
    I only get a 10-12% uplift with my 3080 in CP2077 too, though with frames that low, 10% is important.
    FWIW I have a 450W BIOS, but CP2077 doesn't tap more than 400W (like your screenshot above).
    A better comparison would be to return your RAM and CPU settings to default (per people who 'don't overclock') and compare the total package.

    The uplift with RDNA2 is even greater.

    Sorry, I don't understand. Known hardware and resolution is console development territory, in which case I agree: devs should absolutely do the work to ensure framerates are acceptable. PCs are quite the opposite, clearly.
     
  4. ashenIC

    ashenIC Member

    Joined:
    Sep 21, 2019
    Messages:
    187
    How is PC hardware any kind of unknown? The games they are developing on console (fixed resolution) or PC (known variables) are running on something (known) which the developer has access to. Where do I get this opposite hardware you speak of? Developers should absolutely do the work to ensure framerates are acceptable on all the platforms they release games on.
     
  5. power

    power Member

    Joined:
    Apr 20, 2002
    Messages:
    68,462
    Location:
    brisbane
    they do: resolution options, inbuilt benchmarks, and low/medium/high settings all exist. Games don't start locked at FULL RES ULTRA.....
     
  6. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    5,137
    Location:
    Gaufres. Bière. Chocolat. Moules Frites.
    You realise how many "known variables" there are, right? Suggesting devs can magically optimise for every hardware combination and find 10-20% more performance than they currently do is ludicrous. Even with completely fixed hardware (consoles), devs struggle to find the gains you're talking about. That's why overclocking remains relevant for most gamers: you can get 10-20% more out of your particular hardware, on top of any optimisation done in-game.

    Returning to your original question...

    ...yes, in my humble experience the juice is worth the squeeze.

    As I've said before, when you're trying to achieve a locked 60, 90, 120 or 144fps, 10-20% can make a noticeable difference to your minimum framerate and frame-time consistency. You might think a 130fps average is plenty for an F1 title, but if you're running at a locked 119fps on a 120Hz monitor (as I do) and get the odd drop to 110fps or 105fps, that blip or stutter in frame time can completely throw you off as you're racing. So you really want to hold a 150fps average to ensure the minimums stay above 120fps. It's even worse in VR; reprojection helps, but ideally you don't want to rely on those techniques. Yes, you can guard against drops by reducing settings and/or resolution, but sometimes that's a compromise you don't want to make (or can't, in VR for instance), and brute power is required.
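    To put rough numbers on the frame-time point (my own worked arithmetic, not figures from anyone's benchmark):

```python
def frame_time_ms(fps):
    """Time budget per frame in milliseconds."""
    return 1000.0 / fps

target = frame_time_ms(120)  # ~8.33 ms per frame at a 120Hz lock
dip = frame_time_ms(105)     # ~9.52 ms during a dip to 105fps
# The ~1.2 ms jump between consecutive frames is the stutter you feel.
print(f"target {target:.2f} ms, dip {dip:.2f} ms, blip {dip - target:.2f} ms")
```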

    This appears to be a hard concept to grasp.
     
  7. randomman

    randomman Member

    Joined:
    Oct 21, 2007
    Messages:
    5,290
    I put myself on EVGA's waitlist in March; they just emailed me saying they're making it fairer by letting people who haven't bought one yet buy first, and you also have to cut your "preferred list" down to 2. I've got 2x 3060 Tis on my list. Wonder if I'll be allowed to buy one soon. :o

    I've got a 3700X, but the 3060 Ti is probably still my best bang for buck even though I can't use resizable BAR, right?
     
  8. phreeky82

    phreeky82 Member

    Joined:
    Dec 10, 2002
    Messages:
    9,827
    Location:
    Qld
  9. Deus_Ex_Machina

    Deus_Ex_Machina Member

    Joined:
    Jun 6, 2007
    Messages:
    877
    Location:
    Geelong VIC
    It depends on your appetite for DLSS and ray tracing. Personally I'm not too fussed about RT; DLSS is great tech, and as more games support it, it would really start to weigh into my decision. Right now you can buy a comparable 6700 XT $200 cheaper, with raster performance over the 3060 Ti greater than what you're talking about with resizable BAR (which the 6700 XT can use too, by the way).

    They are all too bloody dear though lol. Just food for thought.
     
  10. phreeky82

    phreeky82 Member

    Joined:
    Dec 10, 2002
    Messages:
    9,827
    Location:
    Qld
    I bought my 6700 XT some time back for < $900 new and was called crazy for spending that much. Turns out it wasn't such a bad decision.

    I've also got a "crappy" 5500 XT 4GB that was a stand-in until I got a better card, now used for the occasional 1080p time-killer RDR2/GTA5 session on the lounge room TV (plays really well btw, and yeah, it gets SAM/resizable BAR too). It was also called a bad buy at the time by many, but I suspect plenty of people would jump at it now for what I paid.

    So it seems that any GPU purchase made in the last 2yr (prob longer honestly) has turned out to be a wise purchase.
     
  11. Current

    Current Member

    Joined:
    Aug 10, 2021
    Messages:
    1,230
    Do you get MSRP or pay inflated bullshit price on the waitlist ?
     
  12. Deus_Ex_Machina

    Deus_Ex_Machina Member

    Joined:
    Jun 6, 2007
    Messages:
    877
    Location:
    Geelong VIC
    Agreed. Especially the 2070S/1080Ti/5700XT range.

    The last few cards I have owned, and the changeover costs over the last two years:

    Gigabyte G1 GTX970 - Gigabyte 1660 = $70
    Gigabyte 1660 - MSI Titanium 1070Ti = $100
    MSI Titanium 1070Ti - Sapphire Nitro 5700XT = $100
    Sapphire Nitro 5700XT - Sapphire Nitro 6700XT = $225

    So I have spent $495 on GPUs in 2 years and had a pretty solid gaming experience. If you are in the market and keep a keen eye it's not so bad; the price of entry right now, though, is shiiiiiiit.

    You know what's funny though? I paid $600 for a GeForce 2 GTS upon release 21 years ago, I reckon that was about 3 weeks full time apprentice wages for me. So nothing much has changed at the top end lol.
     
  13. randomman

    randomman Member

    Joined:
    Oct 21, 2007
    Messages:
    5,290
    I wouldn't mind a 6700 XT either but the EVGA waitlist is the only truly "hands off" wait and see approach I've found and they only do NVIDIA.

    Saw that but NVIDIA says no to Ryzen 3000 series, only yes to 5000 series from what I could find.

    Well their website still lists them from $450-$530 (USD) so I'd hope MSRP.
     
  14. adamsleath

    adamsleath Member

    Joined:
    Oct 17, 2006
    Messages:
    20,650
    Location:
    Sunnybank Q 4109
    Last edited: Oct 8, 2021
  15. Yehat

    Yehat Member

    Joined:
    Aug 4, 2002
    Messages:
    663
    Location:
    Melbourne
    I was genuinely interested to know the answer to your question as well, so I've done some digging and summarised it below.

    There are two parts to the answer.
    1. Does the 3700X support resizable BAR / SAM? Yes, the CPU does; you just need your motherboard to support it (some models may require a BIOS update). It was originally for Zen 3 processors, but as per the AMD site link in the footnotes, it also extends to Ryzen 3000 series processors.
    2. Does Nvidia's resizable BAR implementation for RTX 3xxx GPUs support the 3700X?
    Officially, no: https://www.nvidia.com/en-us/geforce/news/geforce-rtx-30-series-resizable-bar-support/
    At the time of writing they officially support Zen 3 CPUs (Ryzen 3/5/7/9 5xxx) on a 400 or 500 series chipset (provided the motherboard vendor has the required BIOS).

    Also, depending on your video card, you may need a vBIOS update to enable rebar on the GPU.
    What about the 3700X and other Ryzen 3xxx CPUs, unofficially? In their own words, "your mileage may vary". In that:
    The Ryzen 3xxx processors are not officially supported by Nvidia, but rebar may still work; several people on reddit and other sites have reported it working, e.g. this person who did some comparative benchmarks with a Ryzen 3600: https://www.reddit.com/r/Amd/comments/mhirm6/ryzen_3600_performance_rebar_on_vs_off_with/
    Some titles benefit from rebar on, in others it makes no difference, and in some it can actually hurt performance. What Nvidia have done in their GeForce Game Ready Drivers is selectively enable or disable rebar optimisations per title, based on whether their testing shows the title will benefit.

    The "your mileage may vary" part appears to be that some games (even titles rebar-enabled by the Nvidia drivers, indicating they should see a benefit) may actually deliver less performance with rebar on when paired with a Ryzen 3xxx, while other games still show a benefit. As per the reddit link above, it's theorised that the Nvidia drivers are not optimising rebar with Ryzen 3xxx CPUs in mind (rather the Ryzen 5xxx, per their officially supported status), so some titles can see performance drops as shown in that post.

    So one approach for users running the combination of a Ryzen 3xxx CPU and an Nvidia 3xxx GPU is to check benchmarks for the particular game they're going to play with rebar on/off (or benchmark it themselves) to see the optimal setting for that case. Someone correct me if I'm mistaken, but afaik the downside is that while the Nvidia driver can selectively choose which games rebar is enabled for, there's currently no option for the user to set this per game from within Windows?
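    On the "is rebar actually active" question: on Linux you can infer it from the GPU's BAR1 size in `lspci -vv` output. A rough sketch of parsing that output (the field layout is assumed from typical lspci formatting, so treat it as illustrative):

```python
import re

def bar_sizes(lspci_text):
    """Pull memory region sizes (e.g. '256M', '8G') out of lspci -vv output."""
    return re.findall(r"Region \d+: Memory at \S+ .*\[size=(\d+[KMGT])\]", lspci_text)

# A BAR roughly the size of the card's VRAM (e.g. 8G on an 8GB card) suggests
# resizable BAR is active; a traditional 256M BAR means it isn't.
sample = "Region 1: Memory at 7c00000000 (64-bit, prefetchable) [size=8G]"
print(bar_sizes(sample))  # ['8G']
```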
     
    Last edited: Oct 9, 2021
    RnR likes this.
  16. randomman

    randomman Member

    Joined:
    Oct 21, 2007
    Messages:
    5,290
    RTX 2070 Super on FB marketplace for $825 CAD (basically AUD). Is this considered reasonable? The 3060 Ti is going for at least $950 on FB. It's a blower model, so it would be great for my NR200, but it also seems $$$ for old tech.
     
  17. Deus_Ex_Machina

    Deus_Ex_Machina Member

    Joined:
    Jun 6, 2007
    Messages:
    877
    Location:
    Geelong VIC
    In the current environment I suppose it is, but I wouldn't pay it, especially not for a blower/reference style card. You could prob get a 6600 XT brand new locally for a touch more.
     
  18. phreeky82

    phreeky82 Member

    Joined:
    Dec 10, 2002
    Messages:
    9,827
    Location:
    Qld
    IMO it would be more worthwhile putting the effort in to get ReBAR/SAM working if you're VRAM limited. Ironically, the more modern GPUs with official SAM/ReBAR support are probably the ones with the lesser need.
     
    Yehat likes this.
  19. Yehat

    Yehat Member

    Joined:
    Aug 4, 2002
    Messages:
    663
    Location:
    Melbourne
    Last edited: Oct 11, 2021
    RnR likes this.
  20. groovetek

    groovetek Member

    Joined:
    Oct 19, 2003
    Messages:
    3,805
    Location:
    Melbourne
    Does this imply that reBAR helps work around VRAM capacity limitations? I wasn't aware of anything like this - can you confirm/cite some source for this?
     
