AMD RX 6000 series (RDNA2) Discussion thread

Discussion in 'Video Cards & Monitors' started by SnooP-WiggleS, Mar 6, 2020.

  1. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,500
    Location:
    Tas
As jjjc_93 said, just set a ceiling on your frame rate; that should take care of any coil whine caused by excessive FPS, which is likely the culprit in older games. Settings like Anti-Lag can make it worse too, as the card runs at its peak to serve up each frame milliseconds sooner.

I have been playing a fair bit of Genshin Impact lately (locked 60 fps) and initially had mild coil whine, with the card running at its peak OC and Anti-Lag enabled in global settings. I cut my GPU clocks in half in a custom game profile and it still renders 4K60 easily, with silent fans and coils and half the power consumption. Radeon Chill is also worth playing with, especially if you don't want to dial in custom clocks.
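For anyone curious why a cap helps, here's a minimal sketch of the principle in Python (purely illustrative; this is not how Radeon Chill or the driver-level limiter is actually implemented): the loop sleeps away whatever is left of the frame-time budget, so the GPU never runs flat out.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

def render_frame():
    # Stand-in for the real GPU work; in an uncapped older game this
    # might take only 2-3 ms, letting FPS (and coil whine) soar.
    time.sleep(0.003)

for _ in range(600):  # ~10 seconds of capped rendering
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    # Sleep off the rest of the budget instead of starting another frame.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```

With most of each frame spent idle, clocks, power draw and whine all drop, which is what the custom low-clock profile achieves from the other direction.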
     
  2. Gibbon

    Gibbon grumpy old man

    Joined:
    Jun 30, 2001
    Messages:
    6,574
    Location:
    2650
    Interesting. I've only used my 6900XT Red Devil with an EVGA 1200W P2 PSU, and haven't noticed coil whine. Maybe that's why.
     
  3. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    4,510
    Location:
    Gaufres. Bière. Chocolat. Moules Frites.
I get coil whine from my 6000 series cards (the ones I have tested) under load, not necessarily when FPS is crazy high. For instance, if I run any taxing benchmark it'll squeal, even if the FPS is ~60.

It's not a PSU issue for me (I have a Corsair AX1200i); it's just the GPU. None of my Nvidia cards whine, but that could just be luck of the draw.
     
  4. OJR

    OJR Member

    Joined:
    Jan 19, 2013
    Messages:
    5,661
    Location:
    Melbourne
    No coil whine on my 6900XT Red Devil either.
     
  5. pH@tTm@N

    pH@tTm@N Member

    Joined:
    Jun 27, 2001
    Messages:
    2,242
    Location:
    BRISBANE
If a PSU is a factor in coil whine, it's more likely voltage ripple than insufficient capacity, so it's always worth testing with another PSU if you have one available.
     
  6. OP
    OP
    SnooP-WiggleS

    SnooP-WiggleS Member

    Joined:
    Sep 20, 2004
    Messages:
    3,239
    Location:
    Ballarat, Victoria
  7. mAJORD

    mAJORD Member

    Joined:
    Jun 4, 2002
    Messages:
    12,068
    Location:
    Griffin , Brisbane
I get Steve's point overall... all GPU prices are bad. So what do you do? I mean, its in-store price is amazing value compared to anything else, so what does that make those other cards if this is "insultingly bad value"? Is there any point in his loose comparison to the 3060 at the end when they're twice as much in store, regardless of having the same MSRP?

Is there an alternative to the 6600 at the same price in store? How does it compare? (*cough* 1650 or 1660 *cough*)

If not, then it's a bit hard to justify slaughtering a specific card in this market, especially when, ironically, it's the 'best value'.
     
    DangerMaus, RnR, 151528 and 2 others like this.
  8. 151528

    151528 Member

    Joined:
    Jan 23, 2007
    Messages:
    1,268
    Location:
    Vic
Most GPU review conclusions for the last 9+ months have been a joke; not enough reviewers are basing their verdicts on real-world pricing...

GN Steve's taken this dumb approach for a few reviews now, and I would've thought he'd be smart enough to ignore MSRP when it's as irrelevant as it's ever been.
     
  9. Sorrow

    Sorrow Member

    Joined:
    Jun 11, 2003
    Messages:
    7,606
    Location:
    Brisbane
Yeah, agreed. I did an upgrade earlier this year on everything except the GPU and have been hanging out for prices to go down. Feels like I'm going to have to bite the bullet soon though, as the reason I upgraded was my unstable graphics card.

I've never spent more than $1k on a GPU before, but this will likely be the last time I spend seriously, and I've got more disposable income than I did in my 20s. I'm looking at the best performance per dollar spent, and the 3080 Ti, while still stupidly expensive, is decent. It also has a lower % markup over MSRP than most of the others. But I've been delaying for weeks on biting the bullet, as I'm not sure I want to spend the $$.
     
  10. Phido

    Phido Member

    Joined:
    Jun 20, 2003
    Messages:
    7,551
    Location:
    Dark City
Oh wow, the 6600 is pretty unimpressive; the 5600 would have been a much better buy.
So glad I got my 6800 for $950. It's at least got decent rasterization performance.

Surprised at the leap in performance from the 6600/6600 XT/6700 XT to the 6800.
     
  11. OP
    OP
    SnooP-WiggleS

    SnooP-WiggleS Member

    Joined:
    Sep 20, 2004
    Messages:
    3,239
    Location:
    Ballarat, Victoria
Depending on your resolution etc., the 6800 XT is worth considering. PCCG has the PowerColor 6800 XT Red Devil (a very well-built/OC model) in stock for $1900, compared to $2500 for a basic Gigabyte 3080 Ti (~32% dearer). Ignoring ray tracing/DLSS etc., the 3080 Ti is 3% faster at 1440p but 12% faster at 4K on average per TechSpot/Hardware Unboxed, and I'd expect the gap to be smaller given we're comparing an overclocked PowerColor AMD model to a base-spec Gigabyte Nvidia model.
    https://www.techspot.com/review/2264-geforce-rtx-3080-ti/
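A quick back-of-envelope check on those numbers, using only the prices and perf deltas quoted above (so treat them as an October 2021 snapshot):

```python
# Prices and perf figures quoted in the post above.
rd_6800xt_price = 1900   # PowerColor 6800 XT Red Devil at PCCG
gb_3080ti_price = 2500   # basic Gigabyte 3080 Ti

premium = (gb_3080ti_price - rd_6800xt_price) / rd_6800xt_price
print(f"3080 Ti price premium: {premium:.0%}")  # ~32%

# TechSpot / Hardware Unboxed average leads for the 3080 Ti.
for res, lead in {"1440p": 0.03, "4K": 0.12}.items():
    rel_value = (1 + lead) / (1 + premium)
    print(f"{res}: {lead:.0%} faster, {1 - rel_value:.0%} worse perf/$")
```

So even at 4K, where the 3080 Ti leads by 12%, it's delivering roughly 15% less performance per dollar at those in-store prices.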
     
  12. Kelvin

    Kelvin Member

    Joined:
    Jul 1, 2001
    Messages:
    2,923
    Location:
    Melbourne 3103
I think you have to think of GPUs like property prices: they've risen to an unreasonable new level, but realistically I don't think they're going to come down any time soon.

I paid around $1200 for my 1080 Ti during one of the cryptomining booms and swore I would never pay over $1000 for another video card.
Then I grabbed an RTX 2080 new for $1000, just after release...

Then, when the 30xx series was released, I felt sick and cheaped out on a $1139 EVGA 3080. It took nearly 9 months to arrive, but at least it did arrive.

Now you just have to look for reasonable bang for buck.

    https://imgur.com/q6Ho35P
    https://imgur.com/Wwv6q9F

See the links above.

Pretty much the 3060 Ti is the best bang for buck at $1099 and in stock (other than the AMD 6600 and 6600 XT).
The 3080 Ti is 40% faster, but costs 117% more.

If you're only gaming at 1440p, that's obviously fine... if you're intending on 4K gaming, the choices are pretty limited.
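Putting those two figures together (both straight from above, so the usual caveats about day-to-day pricing apply), the bang-for-buck gap works out like this:

```python
# Relative to the 3060 Ti at $1099, per the figures above.
rel_perf = 1.40    # 3080 Ti is 40% faster
rel_price = 2.17   # ...and costs 117% more

print(f"3080 Ti perf per dollar: {rel_perf / rel_price:.2f}x the 3060 Ti's")
# ~0.65x, i.e. roughly 35% less performance per dollar spent.
```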


     
    Last edited: Oct 16, 2021
    mAJORD likes this.
  13. anthonyl

    anthonyl Member

    Joined:
    Jan 13, 2002
    Messages:
    11,404
    Location:
    Brisbane
That's why I jumped ship to AMD and got an ASUS ROG STRIX 6800XT LC OC. Similar performance (in some games) to a 3080 Ti / 3090 for far less... I'm happy.
     
  14. Winterheart

    Winterheart Member

    Joined:
    Jul 4, 2001
    Messages:
    809
    Location:
    Canberra
I'm severely tempted to trade my second card for an AMD one, but my main holdback is my monitor being G-Sync, and I don't want to drop another $2k+ on a second 4K high-refresh screen just to run both with FreeSync... :(
     
  15. OP
    OP
    SnooP-WiggleS

    SnooP-WiggleS Member

    Joined:
    Sep 20, 2004
    Messages:
    3,239
    Location:
    Ballarat, Victoria
    RnR likes this.
  16. Winterheart

    Winterheart Member

    Joined:
    Jul 4, 2001
    Messages:
    809
    Location:
    Canberra
  17. OP
    OP
    SnooP-WiggleS

    SnooP-WiggleS Member

    Joined:
    Sep 20, 2004
    Messages:
    3,239
    Location:
    Ballarat, Victoria
Dude, you have a G-Sync Compatible monitor, i.e. a FreeSync/Adaptive-Sync monitor with no G-Sync module in it. It will work fine on any AMD card and the new consoles. Nvidia's FUD is working as planned, tricking you into thinking an industry standard will only work with your Nvidia card.
     
  18. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    4,510
    Location:
    Gaufres. Bière. Chocolat. Moules Frites.
Neither company makes it entirely clear. Why AMD had to rename something they didn't invent and had no patent on (FreeSync) is beyond me.

FWIW, the Sony PS5 doesn't yet have VRR (another stupid name for the same thing).

    This explainer from Viewsonic is quite good.

    AMD FreeSync VS NVIDIA G-Sync: What’s the Difference?
    AMD FreeSync is no different from VESA Adaptive Sync. It utilizes VESA’s royalty-free technology to sync the refresh rate to the FPS. It also works on most monitors, which keeps the prices down. However, AMD has left the framerate range in the hands of the manufacturers which reduces the usefulness of the sync technology.

    NVIDIA G-Sync uses the same principle as Adaptive Sync. But it relies on proprietary hardware that must be built into the display. With the additional hardware and strict regulations enforced by NVIDIA, monitors supporting G-Sync have tighter quality control and are more premium in price.


Both solutions are also hardware bound. If you own a monitor equipped with G-Sync, you will need to get an NVIDIA graphics card. Likewise, a FreeSync display will require an AMD graphics card. However, AMD has also released the technology for open use as part of the DisplayPort interface. This allows anyone to enjoy FreeSync on competing devices. There are also G-Sync Compatible monitors available in the market to pair with an NVIDIA GPU.
     
    Last edited: Oct 19, 2021 at 5:24 PM
  19. jjjc_93

    jjjc_93 Member

    Joined:
    Jul 3, 2009
    Messages:
    3,451
    Location:
    Perth
Pretty much all FreeSync monitors will work with Nvidia cards; Nvidia just goes the extra step of testing and certifying some as G-Sync Compatible. Likewise, newer G-Sync modules are FreeSync-compatible and will work with AMD cards. Basically, just look up the spec sheet and reviews for your monitor to see what it supports, as often enough they'll work both ways.
     
  20. The Beast

    The Beast Member

    Joined:
    Jun 27, 2001
    Messages:
    4,510
    Location:
    Gaufres. Bière. Chocolat. Moules Frites.
Be careful with that assumption. FreeSync is AMD's proprietary extension to the HDMI standard (yes, they did it too), allowing adaptive sync to run over HDMI 2.0, but only with AMD cards, because Adaptive-Sync proper is a DisplayPort 1.2a extension.

While FreeSync monitors don't require an AMD-licensed module and are therefore cheaper, it's a myth to suggest FreeSync is just Adaptive-Sync.

Nvidia doesn't, or can't (to my knowledge), support adaptive sync over HDMI 2.0, only DisplayPort, where it was designed.

Meanwhile, VRR is an HDMI 2.1 standard (but can be added to 2.0; confused yet?), and HDMI 2.1 is actually called "Ultra High Speed" HDMI now, just to befuddle even more people.

    It's a mess.
     
