
What has happened to the GPU market?

Discussion in 'Video Cards & Monitors' started by Gonadman2, Jan 18, 2016.

  1. eightyeight

    eightyeight (Banned or Deleted)

    Joined:
    May 17, 2013
    Messages:
    2,729
    Well yes, they could cram more transistors into the same area. But as it sits the die is simply too big for them to cool. They NEED to make it smaller so they can cool it enough to make a dual GPU card.

    Otherwise they're just back to the "who has the biggest single" competition of old.
     
  2. dirtyd

    dirtyd Member

    Joined:
    Jan 4, 2006
    Messages:
    4,150
    Location:
    Melbs
    There was a fabrication issue: AMD cancelled their 20nm GPU for unspecified reasons - it was either too expensive to produce or the performance was lacking. That's why the 300-series was fairly uninspiring architecturally; they had to scramble.

    28nm planar and 14nm FinFET are chalk and cheese, so doing a shrink is not straightforward. Also, I wouldn't trust Wikipedia for information about anything upcoming.
     
  3. mesaoz

    mesaoz Member

    Joined:
    Jan 15, 2015
    Messages:
    8,387
    Location:
    South East QLD
    AMD Fury X4

    Quad Xfire on a single card, now that milkshake would bring the boys to the yard.
     
  4. eightyeight

    eightyeight (Banned or Deleted)

    Joined:
    May 17, 2013
    Messages:
    2,729
    The question becomes whether the next gen is 20nm or 14nm. Something tells me 14nm just won't be ready yet.
     
  5. terrastrife

    terrastrife Member

    Joined:
    Jun 2, 2006
    Messages:
    18,819
    Location:
    ADL/SA The Monopoly State
    There's no reason why AMD can't make a dual Fury X right now; the Nano is 175W and is a Fury X at a slightly lower clock.
    It would only need one 6-pin and one 8-pin connector.
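
    A rough back-of-envelope on that power budget (the 75/75/150W limits are standard PCIe spec figures; the dual-card draw is my assumption, not an AMD spec):

        # Hypothetical dual-Fiji power budget - a sketch, not an AMD spec.
        PCIE_SLOT_W = 75    # PCIe x16 slot limit per spec
        SIX_PIN_W   = 75    # one 6-pin connector per spec
        EIGHT_PIN_W = 150   # one 8-pin connector per spec

        budget_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300W available
        draw_w   = 2 * 175                                # two Nano-class GPUs at full TDP

        # 350W vs 300W: at full Nano TDP it's ~50W over budget, so the
        # clocks (and per-chip draw) would need to come down a bit further.
        print(f"budget {budget_w}W, draw {draw_w}W, margin {budget_w - draw_w}W")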
     
  6. greenhawk

    greenhawk Member

    Joined:
    Dec 18, 2003
    Messages:
    791
    The GPU market is following Intel and other manufacturers: decreasing power consumption rather than increasing performance.

    While I do agree with you, if you go back to the early 2000s, you will most likely find that mid-range was about $400 then too. The bump around 2012 is more of an exception than the norm for hardware in general.
     
  7. dirtyd

    dirtyd Member

    Joined:
    Jan 4, 2006
    Messages:
    4,150
    Location:
    Melbs
    AMD Polaris is confirmed 14nm at GlobalFoundries, who have licensed Samsung's process(es) for this node. It's almost certainly going to be the new and improved LPP variant rather than the earlier LPE that Samsung used for their phone SoCs.

    With the process advantage it should comfortably take the lead; the only questions are by how much, and where Nvidia are with Pascal.
     
  8. eightyeight

    eightyeight (Banned or Deleted)

    Joined:
    May 17, 2013
    Messages:
    2,729
    Yes, but they can't COOL it. The 295X2 was already water cooled, and the Fury is even bigger/hotter.

    Good to know. A solid 6 months away at least though.
     
  9. mesaoz

    mesaoz Member

    Joined:
    Jan 15, 2015
    Messages:
    8,387
    Location:
    South East QLD
    Remember, the Fury X cooler is rated for something like double the heat output the chip currently produces - a pretty good indicator that the shit they're already using can do it.
     
  10. Chaffe

    Chaffe Member

    Joined:
    Aug 6, 2010
    Messages:
    1,648
    Location:
    Shitney
    GPUs are about power, so I doubt that's the plan by choice for either maker. It's more likely that both chipmakers are forced to head in that direction due to being stuck on the 28nm node for so long.
     
  11. eightyeight

    eightyeight (Banned or Deleted)

    Joined:
    May 17, 2013
    Messages:
    2,729
    o_O Doesn't the Fury X overclock like 40MHz if you're lucky?
     
  12. mesaoz

    mesaoz Member

    Joined:
    Jan 15, 2015
    Messages:
    8,387
    Location:
    South East QLD
    What does that have to do with making a dual chip card?
     
  13. terrastrife

    terrastrife Member

    Joined:
    Jun 2, 2006
    Messages:
    18,819
    Location:
    ADL/SA The Monopoly State
    Have you seen what cools the Nano? It's about a quarter of the size of a Gigabyte Windforce solution, and can cool the card at a 150% power target.

    The Fury X and Nano are the most power-efficient high-end cards out. The Fury is bigger due to HBM, but it is NOT hotter.
     
  14. eightyeight

    eightyeight (Banned or Deleted)

    Joined:
    May 17, 2013
    Messages:
    2,729
    keeping up with the heat output?
    But the chips are still massive...

    The way I see it, they've had to watercool the SINGLE GPU. If they need water for one, what on earth do they do with two?
     
  15. mesaoz

    mesaoz Member

    Joined:
    Jan 15, 2015
    Messages:
    8,387
    Location:
    South East QLD
    Exactly what I was trying to say before, which you seem to be missing the point of. The cooler it already has for the single chip is rated for 500 watts of thermal capacity. They shouldn't have to radically change anything to support 2 chips :confused:
     
  16. eightyeight

    eightyeight (Banned or Deleted)

    Joined:
    May 17, 2013
    Messages:
    2,729
    Dual chip is far more than 500, no?

    I'm not sure where they get the rating from, though. It seems to me it can barely keep up with one GPU.
     
  17. mesaoz

    mesaoz Member

    Joined:
    Jan 15, 2015
    Messages:
    8,387
    Location:
    South East QLD
    Shouldn't be; the Fury X should be pumping out less than its 275W (some say 300?) max draw as heat. 500W should still be adequate, and if not, give it a longer radiator - still not a hard fix.

    What load temps are you getting? 52°C supposedly for the Fury X and 72°C for the non-X, which shows how effective the water cooler is...
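
    As a sanity check on those numbers (the 500W cooler rating and 275W board power are the figures quoted in this thread; the per-chip gaming load is my assumption):

        # Dual-Fiji thermal back-of-envelope using figures quoted in-thread.
        COOLER_RATING_W = 500   # claimed thermal capacity of the Fury X AIO
        BOARD_POWER_W   = 275   # Fury X rated board power
        GAMING_LOAD_W   = 230   # assumed per-chip draw in real games (not measured)

        worst_case_w = 2 * BOARD_POWER_W   # 550W if both chips peg max draw
        typical_w    = 2 * GAMING_LOAD_W   # ~460W under a typical game

        # Slightly over the rating at absolute worst case, under it in
        # practice - which is roughly the argument being made above.
        print(f"worst {worst_case_w}W, typical {typical_w}W vs {COOLER_RATING_W}W rating")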
     
    Last edited: Jan 20, 2016
  18. philscomputerlab

    philscomputerlab Member

    Joined:
    Sep 29, 2014
    Messages:
    1,843
    One thing that annoys me about the GPU market is that quite power-efficient cards with a single PCIe 6-pin plug need to have 2 or 3 XXL fans slapped on them just so they "look faster".

    I miss the days of the 8800GT: single slot, and it packed a punch.

    The 750 Ti is a good example of what's possible; I'm not sure why there is no follow-up with a 9-series card.

    I'm really looking forward to these 14nm cards. I think it will be a game changer, and many folks like me on a GTX 660 can finally upgrade to something with decent value.
     
  19. ShadowBurger

    ShadowBurger Member

    Joined:
    Feb 19, 2008
    Messages:
    2,744
    Location:
    Melbourne
    In reply to OP: I agree. I've been looking for an excuse to upgrade my GTX 680 to a 980 Ti, but the improvement is marginal (my monitor is only 60Hz anyway) and the upgrade won't actually improve the gameplay, so I really can't justify the cost. As far as I'm aware, there are no new features I want to be gained from an upgrade. I'm in the same boat with my monitor (I paid $200 for a 1440p Korean display which has absolutely fantastic image quality, much better than every other monitor I have seen), so until I find a cost-effective upgrade to a 4K display there wouldn't be a point buying a faster GPU.
     
  20. munchkin1

    munchkin1 Member

    Joined:
    Jun 6, 2010
    Messages:
    4,657
    I'm in a similar boat: 1440p with 770 SLI. I just don't see the value in upgrading the GPUs right now, despite having had my 770s for over 2 years now. The only way to get a real performance improvement would be going SLI again, which would cost heaps, and as of right now I can still play any new game on pretty high settings...

    Patiently awaiting next gen, and same goes for CPUs
     
