
NVIDIA GeForce RTX 30 Series | RTX 3090 3080 3070 | Discussion

Discussion in 'Video Cards & Monitors' started by Sphinx, Sep 1, 2020.

  1. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,498
    Location:
    Tas
    As I was saying before, that's because they're clowns and the card is in a power-limited state - that's why their '2.8GHz OC' runs at 2550MHz. Have a look at the Valhalla results I posted; you're basing your opinion on bad information.
     
  2. macktheknife

    macktheknife Member

    Joined:
    Jul 26, 2005
    Messages:
    1,882
    Hardware Unboxed might be the worst people to take any information from. They're AMD shills and they're incompetent.
     
  3. Elchupacabraj

    Elchupacabraj Member

    Joined:
    Dec 24, 2017
    Messages:
    858
    I hope AMD & Nvidia continue this trend of releasing very competitive products that we can purchase based on features and which product best fits our personal requirements.

    This is what's happening, or at least it would be if availability weren't the primary factor when actually trying to buy. It's why there can be arguments about which is the superior product and the guy with the AMD card can keep a straight face - it hasn't been like that for a while.
     
  4. maxrig

    maxrig Member

    Joined:
    Jul 28, 2011
    Messages:
    712
    Nah, that's about the limit of the AMD reference 6900 XT without mods.

    Here's a review of the Sapphire RX 6900 XT Toxic with a 360mm AIO liquid cooler hitting 2.7GHz, and it's a whopping 4% faster than a manually overclocked reference 6900 XT.

    Sapphire RX 6900 XT Toxic, 360mm AIO Liquid Cooler, 2.7 GHz Overclock - YouTube
    You do realise Steve at Gamers Nexus also got similar results to HU when he did his review of the AMD reference 6900 XT.
     
  5. maxrig

    maxrig Member

    Joined:
    Jul 28, 2011
    Messages:
    712
  6. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,498
    Location:
    Tas
    The AMD reference 6900XT has the same VRM setup and power delivery capability as many of the higher-TDP AIBs. It's not a 'mod', it's called overclocking; just because it's not an 'enable OC' button in some tray-trash software doesn't make it something else. It's simply about what the card is capable of, not what the default drivers and software let you do. By the same logic you can't overclock an Nvidia card at all, because you need third-party software to do that.

    Anyway, we're at the end of this discussion. The mindset of some seems to be that Nvidia is the default option until AMD or someone else comes along and blows them out of the water in every aspect, regardless of use case or preferences/priorities; it was a similar story with Ryzen vs Intel. I just came in to contest and challenge that mindset. I like to tinker with hardware and have a fair bit of experience with both manufacturers across several generations. I found Ampere lacking while RDNA2 was a nice surprise in many ways, and is in my opinion quite underrated - which is in part due to incompetent tech media, but also due to fluctuations in the retail price and availability of cards from both manufacturers.

    I do think this is a bit of a turning point or milestone for AMD, and that we will start seeing some strong competition in the near future, especially with MCM GPUs on their way.
     
    Last edited: May 30, 2021
    mAJORD likes this.
  7. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    4,490
    Location:
    Gaufres. Bière. Chocolat. Moules Frites.
    Nice for us too - they go into excruciating detail, transparently quantifying and qualifying their analysis, so stubbornly ignoring it isn't particularly convincing. I suspect most people are going to accept DF's conclusions over a subjective "it looks like shit".

    Here are some quotes from the analysis that you clearly didn't read:

    Gotta wonder why AMD are working hard on their DLSS rival if it's no good! :cool:

    Don't you know, all tech reviews are incompetent shills! Only OCAU forum members know the truth! :rolleyes:

    Sure, that's a reasonable position, but then you dropped clangers like this:

    That's some pretty slippery calculations there.

    3080 default core boost clock is 1710MHz; I happily run at 2100MHz on mine (+23%), but it's very common to be stable around 2000MHz (+17%)
    3080 default GDDR6X memory runs at 2376MHz; I happily run at 2613MHz (+10%)
    3090 is similar, but with an even lower default boost clock of 1695MHz

    Flashing a BIOS to increase TDP is literally as trivial as using MorePowerTool; in fact it's quicker and not limited to a per-system registry hack.

    6800/6900 XT default core boost clock is spec'd at 2250MHz, so 2750MHz is +22%
    6800/6900 XT default GDDR6 memory runs at 2000MHz; the max permissible overclock is 2150MHz (+7.5%)

    RDNA2 is a great product (I own three) but I'm not seeing the huge overclock advantage you're referring to.
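    For anyone who wants to check the arithmetic, here's a quick Python sketch of those gain-over-spec percentages (the clocks are just the figures quoted in this post, not measured data):

        # Overclock gain relative to the rated/spec clock, using the numbers above.
        def oc_gain(spec_mhz: float, oc_mhz: float) -> float:
            """Return the percentage gain of oc_mhz over spec_mhz."""
            return (oc_mhz / spec_mhz - 1) * 100

        examples = {
            "3080 core (1710 -> 2100 MHz)":     (1710, 2100),
            "3080 GDDR6X (2376 -> 2613 MHz)":   (2376, 2613),
            "6900 XT core (2250 -> 2750 MHz)":  (2250, 2750),
            "6900 XT GDDR6 (2000 -> 2150 MHz)": (2000, 2150),
        }

        for label, (spec, oc) in examples.items():
            print(f"{label}: +{oc_gain(spec, oc):.1f}%")
        # Prints roughly +22.8%, +10.0%, +22.2%, +7.5% - matching the figures above.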
     
  8. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,498
    Location:
    Tas
    I've read it before. Comparing stationary images shows a pretty compelling result; however, in game with motion - whether from the POV camera or the environment - the actual experience changes dramatically. I don't understand how others can't see it.

    Either way, you're focusing far too much on this, like a straw-man side quest. I'm not addressing / am dismissive of your DLSS content articles because I don't care about it; it isn't currently relevant to my use case. However good the technology is or isn't, or wherever it's going in the future, it still isn't relevant for the majority of games people actually play. As I said before, I'm basing my opinion on my own disappointing experience the one time I found myself actually wanting to play a game that supported it (and RT).

    It's clearly useful technology. Regardless, Nvidia have set the narrative (as I'm sure was the intention) - just look at how people preach about it as a must-have. AMD obviously have to provide their own equivalent to compete.

    The 2250MHz clock on the 6800/6900 is the actual gaming frequency, whereas the 1710MHz is just the rated minimum boost clock. In game, even a non-OC-edition RTX 3080 reference card generally runs at ~1900 ± 50MHz in a reasonably cooled system, and this is the frequency it would be running at in reviewers' performance comparison benchmarks. This is where the 5-10% comes from - how much you can improve on out-of-box operation, not how much you can increase the numbers over the rated specifications.
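    To put numbers on that (using only the clocks quoted in this thread), a tiny Python sketch of headroom measured against the typical in-game clock rather than the rated boost:

        # Headroom over the out-of-box in-game clock, not the rated boost spec.
        # The ~1900 MHz in-game figure and the 2000/2100 MHz OC targets are the
        # ones quoted in this thread, not benchmark data.
        def headroom(in_game_mhz: float, oc_mhz: float) -> float:
            """Percentage improvement over the typical out-of-box in-game clock."""
            return (oc_mhz / in_game_mhz - 1) * 100

        for oc in (2000, 2100):
            print(f"3080 OC to {oc} MHz: +{headroom(1900, oc):.1f}% over ~1900 MHz in-game")
        # Prints roughly +5.3% and +10.5% - i.e. the 5-10% figure above.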
     
    Last edited: May 31, 2021
  9. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    4,490
    Location:
    Gaufres. Bière. Chocolat. Moules Frites.
    I mean, by your own admission you've only tried it on one game, at 1440p. For many, DLSS is a game changer that absolutely changes the value equation when comparing Nvidia against AMD.

    All I'm saying is you're not comparing apples with apples. Both AMD and Nvidia automatically boost gaming frequency above the rated spec; the difference is that AMD are more conservative with their boost algorithm and leave more on the table for the savvy user to go and find. Nvidia do a pretty good job of getting close to the maximum out of each individual card based on dynamic conditions. You could therefore either say Ampere are poor overclockers, or that Nvidia engineers are good optimisers - it just depends on the angle you want to spin.
     
  10. groovetek

    groovetek Member

    Joined:
    Oct 19, 2003
    Messages:
    3,402
    Location:
    Melbourne
    Ultimately, Skitz has his reasons for believing RDNA2 is a "far superior product". Seeing things through red-tinted glasses to some extent for sure, but nothing wrong with that ha.

    It's true that if you take, say, a 6800XT vs a 3080 and strip away DLSS and RT, then, as Techspot/HWU have shown, at 1440p they are overall even, and at 4K the 6800XT is on average only 6% behind. Both OC'd, on average that should put the 6800XT on par with the 3080 at 4K, where the extra framerate really matters. For example, in many of the latest AAA titles these cards are at 100fps or higher at 1440p, whereas at 4K they're down at 40fps or less.

    If we end the discussion there (where Skitz seems to have), then yeah, one could argue the 6800XT is a (slightly) better product, due to the slightly lower MSRP and lower power consumption (at full load).
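    Back-of-the-envelope, the "on par once both are OC'd" argument looks something like this sketch - the 6% stock deficit is the Techspot/HWU figure above, but the OC gains are just illustrative assumptions, not measurements:

        # Rough illustration only: stock 4K deficit from the figures above,
        # OC gains are placeholder assumptions for the sake of the argument.
        baseline_3080 = 100.0    # normalise the stock 3080 at 4K to 100
        baseline_6800xt = 94.0   # ~6% behind on average at 4K

        oc_gain_3080 = 0.05      # assume ~5% out-of-box headroom (see earlier posts)
        oc_gain_6800xt = 0.11    # assume ~11% if RDNA2 has more left in the tank

        oc_3080 = baseline_3080 * (1 + oc_gain_3080)        # ~105
        oc_6800xt = baseline_6800xt * (1 + oc_gain_6800xt)  # ~104

        print(f"Stock: 3080 {baseline_3080:.0f} vs 6800XT {baseline_6800xt:.0f}")
        print(f"OC'd:  3080 {oc_3080:.1f} vs 6800XT {oc_6800xt:.1f}")
        # With those assumptions the two end up within about a percent at 4K.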
     
  11. maxrig

    maxrig Member

    Joined:
    Jul 28, 2011
    Messages:
    712
    Yup, every reviewer is an Nvidia shill and incompetent; anyone that praised DLSS and RTX is a shill. At the start of this generation AMD had zero interest in ray tracing and DLSS-like features, and yet now they'll be supporting them. I guess AMD will be shilling for themselves once they release a competing DLSS-like feature.

    These comments bring back memories: wait for Vega, wait for Big Navi, wait for RDNA 1 >>><<< 2, or maybe RDNA 3? 4? I'm waiting, bring it on, please.
     
  12. c4dderly

    c4dderly Member

    Joined:
    Aug 2, 2001
    Messages:
    1,391
    Location:
    Melbourne
    Hey look at me… I need to justify my GFX purchase so I will shitpost this thread and go around in circles with my head up my ass ;)
     
  13. BlueRaven

    BlueRaven Brute force & optimism

    Joined:
    Jul 29, 2010
    Messages:
    5,375
    Location:
    2076
    F*** the technical discussion, I wanna talk shit about people who talk shit about nvidia. Or AMD for that matter.
    :D
     
    macktheknife likes this.
  14. Antmanoz

    Antmanoz Member

    Joined:
    Jan 30, 2018
    Messages:
    16
    Go Go GO! Intel

    :)
     
    BlueRaven likes this.
  15. BlueRaven

    BlueRaven Brute force & optimism

    Joined:
    Jul 29, 2010
    Messages:
    5,375
    Location:
    2076
    Oh god what have I done.
     
    mAJORD likes this.
  16. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,498
    Location:
    Tas
    Haha, at least it's more exciting than moaning about card prices and crypto mining :p
     
    The Beast, BlueRaven and raincloudx like this.
  17. raincloudx

    raincloudx Member

    Joined:
    Dec 18, 2007
    Messages:
    1,989
    Location:
    Victoria
    Speaking of cards, anyone know when the Ti will be released?
     
    nCrypt likes this.
  18. nCrypt

    nCrypt Member

    Joined:
    Feb 12, 2010
    Messages:
    2,507
    Location:
    Melbourne
    I am a sucker for Tis, so I guess I'll have to try and get one...

    Need to time it and offload the 2080 Ti at the right moment; may even keep the 2080 Ti and offload the 3080, since prices are stupid.
     
    raincloudx likes this.
  19. 151528

    151528 Member

    Joined:
    Jan 23, 2007
    Messages:
    1,264
    Location:
    Vic
    raincloudx likes this.
  20. c4dderly

    c4dderly Member

    Joined:
    Aug 2, 2001
    Messages:
    1,391
    Location:
    Melbourne
