
AMD Vega GPU

Discussion in 'Video Cards & Monitors' started by Agg, Jan 6, 2017.

  1. sTeeLzor

    sTeeLzor Member

    Joined:
    Dec 12, 2005
    Messages:
    2,651
    Yeah, I was struggling a bit with overclocking my Vega 64. If I try HBM above 1025MHz it seems to just stick at 800MHz, even with voltage increases, so my best bet was to leave it on Auto Voltage with HBM at 1025. If I try to undervolt the core or increase the core frequency it typically doesn't work either; leaving it on auto was my best option.

    In the end I bumped the fan curve up a smidgen and upped the Power Limit to +50%, along with HBM at 1025MHz.

    My results in 3DMark Fire Strike:

    6700K + 980Ti (overclocked hard) - 17572
    6700K + Vega 64 balanced - 17920
    6700K + Vega 64, HBM 1025 + Power Limit +50% - 19084

    Anyone got any tips for UV/OC on a Vega 64?
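    (For anyone scripting instead of clicking through Wattman: below is a minimal Linux-side sketch of the same two tweaks, assuming the amdgpu driver with its overdrive interface enabled via amdgpu.ppfeaturemask. The card/hwmon paths, HBM state index 3, the 950mV value and the ~330W cap are illustrative assumptions, not the poster's actual settings; check your own card's pp_od_clk_voltage output first.)

```python
# Sketch only: adjust the top HBM2 memory state and the board power limit
# through amdgpu's sysfs interface (needs root and the overdrive bit set in
# amdgpu.ppfeaturemask). State index, voltage and wattage are assumptions.
from pathlib import Path

DEV = Path("/sys/class/drm/card0/device")      # assumed card index
HWMON = next((DEV / "hwmon").iterdir())        # first hwmon node under the GPU

def set_hbm_clock(mhz: int, mv: int) -> None:
    """Edit the highest HBM2 state (index 3 on Vega 10) in the OD table."""
    (DEV / "power_dpm_force_performance_level").write_text("manual")
    od = DEV / "pp_od_clk_voltage"
    od.write_text(f"m 3 {mhz} {mv}\n")          # memory state: clock in MHz, voltage in mV
    od.write_text("c\n")                        # commit; newer kernels expect this, older ones may not

def set_power_cap(watts: int) -> None:
    """Raise the board power limit; power1_cap is expressed in microwatts."""
    (HWMON / "power1_cap").write_text(str(watts * 1_000_000))

if __name__ == "__main__":
    set_hbm_clock(1025, 950)   # HBM at 1025MHz, voltage is a guess
    set_power_cap(330)         # roughly +50% over a ~220W default (illustrative)
```

    The two values map onto the same HBM frequency and Power Limit sliders Wattman exposes on Windows.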
     
  2. Aussiejuggalo

    Aussiejuggalo Member

    Joined:
    Sep 14, 2010
    Messages:
    2,073
    Location:
    Sunshine Coast Queensland
    From what I've heard about that Asus card, it's just a 1080 cooler slapped onto Vega, so cooling is pretty meh. Probably also why they did fuck-all tweaks to the card itself.
     
  3. aussie-revhead

    aussie-revhead Member

    Joined:
    Feb 21, 2005
    Messages:
    19,483
    Location:
    St Clair 2759
    The old 980Ti holds up well.

    :thumbup:
     
  4. sTeeLzor

    sTeeLzor Member

    Joined:
    Dec 12, 2005
    Messages:
    2,651
    Yeah it does. Wouldn't have replaced it had it not melted down and let out the magic smoke.
     
  5. adamsleath

    adamsleath Member

    Joined:
    Oct 17, 2006
    Messages:
    18,418
    Location:
    Sunnybank Q 4109
    a lot of graphics cards (nvidia as well) seem to neglect to cool these components. wtf. built-in redundancy.

    surely it is a simple enough exercise to put some thermal pads on them :rolleyes:

    https://www.nowinstock.net/computers/videocards/amd/rxvega56/full_history.php - :lol: cya later amd vega
     
    Last edited: Sep 7, 2017
  6. Aussiejuggalo

    Aussiejuggalo Member

    Joined:
    Sep 14, 2010
    Messages:
    2,073
    Location:
    Sunshine Coast Queensland
    They probably ignore VRMs because they can easily run over 100°C, so why bother focusing on cooling them? Gotta save dem precious pennies :rolleyes:.
     
  7. boneburner

    boneburner Member

    Joined:
    Sep 17, 2009
    Messages:
    2,667
    It occurs to me that the biggest casualty of this tepid Vega saga is better availability/choice of FreeSync monitors *sigh*
     
  8. adamsleath

    adamsleath Member

    Joined:
    Oct 17, 2006
    Messages:
    18,418
    Location:
    Sunnybank Q 4109
    so why bother putting coolers on them on a motherboard / PCB?
     
  9. Frontl1ne

    Frontl1ne Member

    Joined:
    Mar 10, 2008
    Messages:
    3,886
    On the flip side (I went with nvidia after Vega's launch), seeing all the G-Sync monitors cost an extra $200 over the FreeSync variants sucks :(
     
  10. sTeeLzor

    sTeeLzor Member

    Joined:
    Dec 12, 2005
    Messages:
    2,651
    Yeah, if you really wanted variable sync it made more sense to go Vega (at a respectable price) + FreeSync. I got the $669 Vega 64, so it's more than worth it.

    Just working to get the most out of it.
     
  11. Aussiejuggalo

    Aussiejuggalo Member

    Joined:
    Sep 14, 2010
    Messages:
    2,073
    Location:
    Sunshine Coast Queensland
    Because people will bitch lol, and CPUs can pull a shitload more power / produce more heat than GPUs can if you seriously push them.

    The VRM design on Vega is pretty expensive according to GN, so saving a few cents on the cooling is probably the best way to save money, unless they charge an utter fortune for the cards and never drop the price.

    That'd be the proprietary G-Sync module bumping up the price. Nvidia can easily charge $200 extra for its manufacture plus its use because, well, Nvidia.

    I'd love to see someone do a breakdown of the module and work out the exact cost of the components. I bet it'd be worth maybe $50 - $75 total build, probably much less than that even.
     
    Last edited: Sep 7, 2017
  12. luke o

    luke o Member

    Joined:
    Jun 15, 2003
    Messages:
    3,621
    Location:
    WA
    Agreed! The lack of a 30" IPS FreeSync screen is what's stopping me.
     
  13. Tazor

    Tazor Member

    Joined:
    May 23, 2010
    Messages:
    2,385
    Location:
    Canterbury, 3126
    There are plenty of VA panels though, and some WMVA panels are just as good as, if not better than, IPS in some respects.

    Check out Samsung's CHG70, which looks like an awesome piece of kit.
     
  14. SnooP-WiggleS

    SnooP-WiggleS Member

    Joined:
    Sep 20, 2004
    Messages:
    3,089
    Location:
    Ballarat, Victoria
    Not exactly a teardown, but this review has a high-res photo of one. It's got an Altera Arria V FPGA rather than, say, a general-purpose ARM chip or an ASIC, so it's probably dearer to produce than a Raspberry Pi. They might have revised the module since then.

    http://www.anandtech.com/show/7582/nvidia-gsync-review
     
    Last edited: Sep 7, 2017
  15. maxrig

    maxrig Member

    Joined:
    Jul 28, 2011
    Messages:
    618
    This monitor is trash for HDR. People need to watch reviews before buying an HDR PC monitor these days, just to be safe. 300-400 nits is not enough for a good HDR experience.

    https://www.youtube.com/watch?v=5BuKtMfHT00&t=66s
     
  16. Tazor

    Tazor Member

    Joined:
    May 23, 2010
    Messages:
    2,385
    Location:
    Canterbury, 3126


    Didn't know there were reviews. I'll check it out, thanks for sharing.
     
  17. boneburner

    boneburner Member

    Joined:
    Sep 17, 2009
    Messages:
    2,667
    The g-sync module's history is a tortured one. It was only necessary in the beginning - "In order to get shipping variable refresh monitors in the market last year, NVIDIA had no choice but to create a module to replace the scalar in modern displays. No monitors existed then with embedded DisplayPort interfaces and the standard had not been created for Adaptive Sync" - PC Perspective

    Now it feels as welcome as any proprietary plug/interface and simply reeks of padding the bottom line.
     
  18. SKITZ0_000

    SKITZ0_000 Member

    Joined:
    Jul 19, 2003
    Messages:
    1,207
    Location:
    Tas
    representn' :Pirate:

    Just bought a HIS Vega 56, couldn't help myself. Will tinker with it, flash the 64/Liquid BIOS onto it, then see if I can adapt some sort of AIO to it.
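    (If anyone goes down the same path, here is a rough sketch of the usual dump-verify-flash sequence. The amdvbflash/ATIFlash command names and flags are the commonly documented ones and are assumptions here, as are the file names and adapter index; check your tool version's help output and keep the backup off the card before writing anything.)

```python
# Sketch of a sanity-checked Vega 56 -> 64 cross-flash. File names, adapter
# index 0 and the vendor-tool flags are assumptions; verify before running.
import hashlib
import subprocess
from pathlib import Path

BACKUP = Path("vega56_backup.rom")   # hypothetical file names
TARGET = Path("vega64_air.rom")

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# 1. Dump the card's current BIOS first so there is something to roll back to.
subprocess.run(["amdvbflash", "-s", "0", str(BACKUP)], check=True)

# 2. Basic sanity checks on the replacement image before touching the card.
assert TARGET.stat().st_size == BACKUP.stat().st_size, "ROM sizes differ"
print("backup :", sha256(BACKUP))
print("target :", sha256(TARGET))

# 3. Flash; a force flag is typically needed since the subsystem IDs differ.
subprocess.run(["amdvbflash", "-p", "0", str(TARGET), "-f"], check=True)
```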
     
  19. adamsleath

    adamsleath Member

    Joined:
    Oct 17, 2006
    Messages:
    18,418
    Location:
    Sunnybank Q 4109
  20. partybear

    partybear Member

    Joined:
    Jun 7, 2011
    Messages:
    868
    Location:
    Ballarat, Victoria
    I was messing around with overclocking and wasn't having much luck, so I went on hwbot to see the top overclocks, and they're only like 40MHz more than I got. The cards just don't seem to overclock very well.
     
