NVIDIA GeForce RTX 30 Series | RTX 3090 3080 3070 | Discussion

Discussion in 'Video Cards & Monitors' started by Sphinx, Sep 1, 2020.

  1. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,505
    Location:
    Tas
    2600 MHz is a 15% increase over stock. Also, as per my previous point, regardless of the frequency it is possible that the TDP or TDC (wattage or current) limit is being hit, which would impact performance at any given clock. From my own tinkering, even on the 6900 XT performance does scale linearly with clocks in most cases, despite the same 256-bit bus and Infinity Cache that the 68xx cards have.
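
    For what it's worth, a quick back-of-envelope check of that figure, treating the ~2260 MHz average game clock HU measured on the reference 6900 XT as the stock baseline (a minimal sketch, nothing more):

    Code:
    # Rough clock-uplift check: percentage increase of a manually set clock
    # over an assumed stock game clock (~2260 MHz per HU's reference 6900 XT).
    stock_mhz = 2260       # assumed stock average game clock
    target_mhz = 2600      # manually set clock

    uplift = (target_mhz / stock_mhz - 1) * 100
    print(f"{uplift:.1f}% over stock")   # ~15.0%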

    You introduced that; I was just talking about RDNA2 vs Ampere in general. My discussion leans towards the high end though, as that's where I have the most experience on both sides.


    That is what they typically do. But what they also do is introduce new tiers and phase out others, like with the Supers. The 3080 might take a step back, with the 3070 Ti and 3080 Ti instead becoming the options at either end.

    8nm Kepler.

    If Nvidia had kept pushing their performance and improving their architecture, AMD would never have caught up to them like they have now. Their architecture is stagnant and near end of life, which is why their marketing focuses so heavily on RT and DLSS performance.
     
    Last edited: May 29, 2021
  2. BlueRaven

    BlueRaven Brute force & optimism

    Joined:
    Jul 29, 2010
    Messages:
    5,446
    Location:
    2076
    I'm mildly concerned that the '70 tier' cards are going to be significantly overvalued in the current market (crypto "bust" or not, discussion elsewhere), and that will flow on to high prices in the secondhand market a few years from now for the 3070 etc.

    Looking to continue my fine streak of 970/1070/... at good value/lifespan. We'll see... :Paranoid:
     
  3. maxrig

    maxrig Member

    Joined:
    Jul 28, 2011
    Messages:
    712
    Yeah, but HU got 3% more from a manual O/C.

    Quote,

    At 1440p we’re looking at a mere 3% performance improvement for the Nitro+ over the AMD reference card, while we were able to squeeze a further 3% with a manual overclock.

    That's not how companies operate, i.e. you never kick your competition to the ground, you do just enough to stay ahead. Nvidia's CEO once said it's the competition that sets the prices.

    Hence why Nvidia is a very successful company: they innovate. Just imagine if people owned GPUs capable of 1080p/1440p (there are shitloads of them by now), then there would be no reason to upgrade. But if they introduce new technology such as ray tracing that tanks your GPU, it gives people a reason to upgrade. Once ray tracing is conquered you can bet they'll introduce new tech to make people upgrade. It's a never-ending cycle.
     
    Last edited: May 29, 2021
    Sankari likes this.
  4. BlueRaven

    BlueRaven Brute force & optimism

    Joined:
    Jul 29, 2010
    Messages:
    5,446
    Location:
    2076
    By your own earlier statements, they innovate just enough to stay ahead of the market, as does AMD, or at least that's the C-suite approved plan.
    Neither company is afraid of stretching the perceived "1%-3%" benefits of refinements on a current tech node, got to keep the marketing folk employed somehow. :)

    I'd honestly really like to see Intel get themselves out of their current process funk, hire some talented/dedicated folk, and mix it up in the discrete GPU space.
    Wouldn't that put the cat amongst the pigeons? :)
     
  5. maxrig

    maxrig Member

    Joined:
    Jul 28, 2011
    Messages:
    712
    Agreed, 2 players in a market is just not enough for competition, especially when both CEOs are of the same bloodline. lol
     
    groovetek and BlueRaven like this.
  6. groovetek

    groovetek Member

    Joined:
    Oct 19, 2003
    Messages:
    3,456
    Location:
    Melbourne
    I think the consensus is that historically, NVIDIA can bounce back hard and fast whenever they feel the heat from AMD. RDNA2 is a great trigger for NVIDIA to push a little harder with the upcoming refresh cards, and also for next gen.

    I've been an AMD guy since the Radeon 8500 days, but only when it's been truly the superior product. RDNA2 has come close this gen, but not close enough. I had fun tuning/tweaking an MSI RX 6800 Gaming X Trio, and to be honest, the gains were not linear with clock speed.

    On the same note, having also owned 3070s and 3080s, I've found you can undervolt a 3080 quite significantly, to the point where at 200W you can get 3070-equalling performance. Quite impressive if you consider that perspective. No doubt they pushed well past the efficiency sweet spot for the 320W "stock" profile. Running my Aorus Xtreme 3080 @ 450W, for example, is literally pointless other than for bench numbers.
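
    If anyone wants to reproduce that 200W experiment quickly, a straight power cap gets you most of the way; it isn't a proper V/F-curve undervolt (that's still Afterburner territory), but it does force the card back toward its efficiency range. A minimal sketch using NVML via the pynvml package, assuming the 3080 is GPU 0 and you're running with admin rights (setting the limit needs them):

    Code:
    # Cap a card to ~200 W via NVML. This is a board power limit, not a true
    # voltage/frequency-curve undervolt, but it's the quickest way to see how
    # the card behaves when pulled back toward its efficiency range.
    from pynvml import (
        nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
        nvmlDeviceGetPowerManagementLimit,
        nvmlDeviceGetPowerManagementLimitConstraints,
        nvmlDeviceSetPowerManagementLimit,
    )

    TARGET_WATTS = 200  # the 200 W point mentioned above

    nvmlInit()
    try:
        gpu = nvmlDeviceGetHandleByIndex(0)  # assumes the 3080 is device 0
        lo, hi = nvmlDeviceGetPowerManagementLimitConstraints(gpu)  # milliwatts
        current = nvmlDeviceGetPowerManagementLimit(gpu)
        print(f"current limit {current / 1000:.0f} W, allowed {lo / 1000:.0f}-{hi / 1000:.0f} W")

        target_mw = max(lo, min(hi, TARGET_WATTS * 1000))  # clamp to the board's range
        nvmlDeviceSetPowerManagementLimit(gpu, target_mw)  # requires admin/root
        print(f"power limit set to {target_mw / 1000:.0f} W")
    finally:
        nvmlShutdown()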
     
  7. BlueRaven

    BlueRaven Brute force & optimism

    Joined:
    Jul 29, 2010
    Messages:
    5,446
    Location:
    2076
    Went from a pair of crossfired 5870s under water to 2 x 560 Tis in SLI, air-cooled, because there was no arguing with the perf/watt equation at that point, well before taking OCs into account.

    Really hope AMD manage to shift the architectural perf/watt gains they've made in the CPU space over to RDNA in short order. Somehow. Shit is basically black magic at this point. :lol:
     
  8. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,505
    Location:
    Tas
    Outside of very specific circumstances I find that extremely unlikely. You might also have found yourself limited by TDP and TDC at a given clock.

    Yes, it's definitely the case that Nvidia has pushed the performance of this architecture and node far outside of its efficiency window (cough, because it's inadequate), so undervolting of course provides significant gains. This is also why there is so little OC headroom, and why power usage goes through the roof and becomes unviable for daily use. My main point on power consumption with RDNA cards is how the power draw changes dynamically based on load, as does the frequency. Nvidia cards do of course do this, but to a far lesser extent.
     
    BlueRaven likes this.
  9. BlueRaven

    BlueRaven Brute force & optimism

    Joined:
    Jul 29, 2010
    Messages:
    5,446
    Location:
    2076
    You reckon NV still has the edge on low-level dynamic performance vs. load adaptation?
     
  10. groovetek

    groovetek Member

    Joined:
    Oct 19, 2003
    Messages:
    3,456
    Location:
    Melbourne
    I don't think we're really disagreeing on anything, other than me saying the comment "RDNA2 is a far superior product" is a little far-fetched. Perhaps if someone offered you a 3080 and a 6800 XT for the same price, or for free let's say, you'd pick the 6800 XT, but I'd say the vast majority would still pick the 3080. There are overall more compelling reasons to go with Nvidia still this time round.

    As for the scaling - it shouldn't come as a surprise that performance gains are not linear vs clock speed. I'm not saying there weren't good gains - certainly better than the 3080, percentage-wise from stock. But can you show some numbers? i.e. lock your 6800 XT/6900 XT core to 2000 MHz, then again to 2600 MHz, and show that a game goes from 100fps up to 130fps? I'd be surprised, if you don't mind my skepticism!
     
    BlueRaven likes this.
  11. BlueRaven

    BlueRaven Brute force & optimism

    Joined:
    Jul 29, 2010
    Messages:
    5,446
    Location:
    2076
    Mah feels tell me this.
    Brain says "but why bro?"
    It's closer than I remember it being since... oh, I dunno... the last time I looked into it.

    Anecdotes aren't data apparently. :)
     
  12. 151528

    151528 Member

    Joined:
    Jan 23, 2007
    Messages:
    1,286
    Location:
    Vic
    Citation needed

    I don't remember Nvidia ever hitting it out of the park after being a year behind AMD, and AMD hasn't really been behind for as long as it feels; it's just that when they caught up last time, miners bought all the GPUs, so gamers were basically forced to stay team green.

    Considering the 'battlefield' AMD has been playing on against both Nvidia and Intel, if anything they have proven to be the drivers of innovation; if it wasn't for them, Intel and Nvidia would be selling overpriced "good enough" hardware.
     
    BlueRaven likes this.
  13. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    4,575
    Location:
    Waffles. Beer. Chocolate. Moules frites.
    Digital Foundry and anyone who has used DLSS 2.0 completely disagree, but you do you.

    Ok, but what if you didn't have to do either? That's what DLSS is.

    And high-end GPUs. I've used DLSS with Cyberpunk, Warzone, F1 2020, Control, Deliver Us The Moon, BF V, Metro Exodus, Outriders and SOTR on my 3080 to run 4K with all the bells and whistles (+RT where I like it) at 100-120fps.

    It often makes the image look better; there is literally no reason not to use it.

    Ask me how I know you don't understand how DLSS works. :D

    Nvidia do need to find more efficiency per watt, but let's not pretend that even in pure raster comparisons RDNA2 is streets ahead. The 6800 XT/6900 XT are about 12-14% more efficient (perf per watt) than the 3080/3090, and perf per dollar is about the same for RDNA2/Ampere (excluding the 6900 XT and 3090, which are both overpriced on fps/$). These are non-DLSS, no-RT comparisons.
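
    For anyone wanting to sanity-check that sort of perf-per-watt / perf-per-dollar claim against their own numbers, it's just ratios. A minimal sketch - the fps figures below are placeholders to show the method (not measured results); the board powers are the official 320W/300W TBPs and the prices are launch MSRPs:

    Code:
    # Perf-per-watt and perf-per-dollar comparison. The fps values are
    # placeholders to illustrate the method; swap in your own benchmark averages.
    cards = {
        #              fps (placeholder), board power (W), launch MSRP (USD)
        "RTX 3080":   {"fps": 100.0, "watts": 320, "usd": 699},
        "RX 6800 XT": {"fps": 100.0, "watts": 300, "usd": 649},
    }

    def advantage(metric):
        """6800 XT's advantage over the 3080 in fps per unit of `metric`."""
        a, b = cards["RX 6800 XT"], cards["RTX 3080"]
        return (a["fps"] / a[metric]) / (b["fps"] / b[metric]) - 1

    print(f"6800 XT perf/W advantage: {advantage('watts'):+.1%}")
    print(f"6800 XT perf/$ advantage: {advantage('usd'):+.1%}")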

    As above, this is misguided. Also, the 6800/6800 XT/3080/3090 are definitely 4K gaming cards; excluding 4K because you use 1440p is disingenuous.

    Of these cards, the Steam Hardware Survey shows the 3080/3090 at 1.24%; the RX 6000 series doesn't even rank.
    Meanwhile, the same survey shows 4K (3840 x 2160) usage at 2.41%.

    So, way more 4K users than high-end Ampere/RDNA2 users; therefore most high-end GPU purchasers are probably looking for 4K performance, not 1440p.

    People over-extended, probably spent more than they could afford, and now the earning rate isn't at the level they thought.

    500 MH/s is about US$1000 per month (before costs), so if this guy is AUD $11-12K in the hole for this rig, perhaps he is concerned that a slump in the Ethereum price won't allow him to cash out with much profit.
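
    The rough arithmetic behind that kind of figure, for anyone curious - a minimal sketch where the network hashrate, block reward and ETH price are ballpark assumptions (plug in current numbers yourself), and it ignores pool fees and power costs:

    Code:
    # Ballpark ETH mining revenue from hashrate. Network figures and price are
    # rough, illustrative assumptions; this ignores pool fees, MEV and power costs.
    MY_HASHRATE_MH = 500          # the rig in question, MH/s
    NETWORK_HASHRATE_TH = 580     # total network hashrate in TH/s (assumed)
    BLOCKS_PER_DAY = 6500         # ~13 s block time (assumed)
    REWARD_PER_BLOCK = 2.0        # base block reward in ETH, excluding fees (assumed)
    ETH_PRICE_USD = 2600          # spot price (assumed)

    share = (MY_HASHRATE_MH / 1e6) / NETWORK_HASHRATE_TH   # MH/s -> share of network TH/s
    eth_per_month = share * BLOCKS_PER_DAY * REWARD_PER_BLOCK * 30
    print(f"~{eth_per_month:.2f} ETH/month, about ${eth_per_month * ETH_PRICE_USD:,.0f} USD before costs")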

    Depending on how old the rig is, he may already have 2-3 ETH, be happy to sit on that, and want to take $5-6k AUD profit on the rig while prices are still high.

    None of that indicates a crash yet; though ETH is currently trending down, by the time you read this it will be back up.
     
    Straferight likes this.
  14. OJR

    OJR Member

    Joined:
    Jan 19, 2013
    Messages:
    5,765
    Location:
    Melbourne
    DLSS is awesome in Warzone. But don't go above Balanced or it starts to look rough.
     
  15. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,505
    Location:
    Tas
    Um, OK? nice for them I suppose.

    It looks fine at 4K, but to me at 1440p it looks like shit. Even the Quality DLSS setting is roughly a 1080p render; while it doesn't look as bad as 1080p native, it looks nowhere near as good as 1440p native, even with DLSS 2.0. I admit I haven't played any of the other games you have listed, so it might look better in those. But for the couple of games I have played where it was an option, it has been noticeably worse than native, particularly in Cyberpunk 2077. When you are close to your screen there is a huge drop in detail, particularly when close up to, say, an NPC in game, but really I noticed it everywhere. I actually did want to play this title with RT on, but the performance hit combined with the DLSS quality hit made it worse overall, which was my only point.

    The tech is fantastic, but it is also a distraction from what actually matters at the moment in the games people actually play. My experience of the quality is anecdotal and only my opinion, based on the one title that effectively uses any of these technologies which I've actually played for a significant amount of time since owning my Turing card.

    please :rolleyes:

    No, I actually use a 4K screen on my 6900 XT. I use a 1440p high-refresh monitor for high fps in multiplayer titles on my 2080 Ti, like most other people with high-end GPUs do.

    This is not useful data and can't even be correlated with your assumption.
     
    Last edited: May 30, 2021
  16. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,505
    Location:
    Tas
    I think RDNA2 performs better in this space. My experience with Nvidia cards is that outside of idle and really low GPU utilisation they are basically at full boost, whereas RDNA2 scales very well based on load. This is just an observation; I don't actually care about power usage in a desktop PC. The only reason power usage is a concern is that it usually comes with increased temperatures, which flow on to both increased noise and lower performance - which I do care about.

    I've had every generation of both AMD and Nvidia cards for the last several years; RDNA2 is a significant improvement and I think a significant turning point. I can't wait to see how it performs in laptops, which I am in the market for.

    Yeah, so that's what I did, after having both a 3080 and a 6900 XT (actually two 6900 XTs - by accident). It was an easy choice at the time based on the performance in the games I played, the features/functionality that I like, noise, etc. In fact I also decided it wasn't worth replacing my water-cooled 2080 Ti with an air-cooled 3080 either - the value of each card at the time vs the quality-of-life impact and additional costs (like a waterblock for the 3080), considering most of the games I play on my desktop don't actually need more performance than what a 2080 Ti can produce. I have a G-Sync monitor which ties me into the Nvidia ecosystem on my desktop PC for now.

    That said, in hindsight, knowing what I know now about GPU prices and mining profitability, I would have purchased a 3090 and hashed its face off :p.

    Sure can, though I did mean near linear, not exactly linear, in how the card's performance scales across the whole range. Basically meaning that there wasn't just a wall because it ran out of puff on the memory side (like I was expecting when I first got this card) - more core clock continues to increase performance. So say 1500-2000 shows pretty similar scaling to 2000-2500. You can even see that scaling from 2500 to 2700.

    I only have a few games installed (that I play/have played through) that have in-game benchmarks. I've already wasted too much time on this for the moment, but I might revisit a few more down the track. I was hoping to include RDR2 in this, but as my display is only 60 Hz it won't scale past that, and the benchmark is pretty bad anyway.

    HZD and Valhalla are a funny couple of games to use for this purpose and are probably at opposite ends, though I have spent around a hundred hours in each so they're representative of my use case. HZD is a good example of a game that does get a bit choked on the memory side at 4K - so I've added 1440p results as well.

    The quality settings under "custom" for Valhalla are actually Ultra settings with a few things tweaked down where there is no or little perceivable visual difference, as per HWUB's recommended settings. I didn't want to change this and set it all up again; I did leave HZD on the top preset though (which I run with only minor changes, like motion blur).

    To summarise: going from the 2000 MHz clock to the 2750 MHz clock results in about a 25% increase in performance for a 37.5% increase in clock speed (with the exception of HZD at 4K [17.5%]), which brings it in line with the 15-20% performance increase over stock I was talking about.
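
    Put another way, the scaling efficiency in those runs works out to roughly two-thirds; the same arithmetic as a quick sketch, using only the numbers from the summary above:

    Code:
    # Clock-scaling efficiency from the locked-clock runs above.
    low_clock, high_clock = 2000, 2750       # MHz
    perf_gain = 0.25                         # ~25% average fps uplift (17.5% for HZD at 4K)

    clock_gain = high_clock / low_clock - 1  # 0.375 -> 37.5%
    efficiency = perf_gain / clock_gain      # fraction of the clock uplift realised as fps
    print(f"clock +{clock_gain:.1%}, perf +{perf_gain:.0%}, scaling efficiency {efficiency:.0%}")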

    I think you would see more than this on the 6800 or 6800 XT, as they have the same memory bandwidth as the 6900 XT but fewer CUs, so I imagine there is a bit more headroom.
     

    Attached Files:

    Last edited: May 30, 2021
  17. maxrig

    maxrig Member

    Joined:
    Jul 28, 2011
    Messages:
    712
    I don't see how that's happening; you're overselling it.

    According to HU, the AMD reference 6900 XT runs @ 2260 MHz out of the box.

    Quote,

    After 30 minutes in Shadow of the Tomb Raider we saw an average clock frequency of 2260 MHz which is around 50 MHz higher than the 6800 XT.

    Again, RDNA2 is nothing amazing at O/C, just like Nvidia's Ampere.

    Quote,

    Just like our AMD reference 6800 XT, the 6900 XT graphics card isn’t a great overclocker. Pushing it up to 2.7 GHz saw a typical clock frequency in games of around 2550 MHz, and while that’s a 13% increase, it only boosted performance in Assassin’s Creed Valhalla by 5%, 3% in Shadow of the Tomb Raider and 5% in Watch Dogs Legion.

    AMD Radeon RX 6900 XT Review | TechSpot
     
  18. Sledge

    Sledge Member

    Joined:
    Aug 22, 2002
    Messages:
    9,047
    Location:
    Adelaide
    Is there a way to configure it? Or is it just a matter of turning it on in game?
    I've got it on in F1 2020, but can't see a way to choose the quality level at all...
     
  19. PiNoY

    PiNoY Member

    Joined:
    Jan 29, 2004
    Messages:
    1,378
    Location:
    doncaster east, vic
  20. OJR

    OJR Member

    Joined:
    Jan 19, 2013
    Messages:
    5,765
    Location:
    Melbourne
    It depends on how the game implements it. Some just have enable/disable, while others have more granular control over DLSS.
     
