Nvidia GeForce GTX 500 Series | 580 570 560 | Reviews | General Discussion

Discussion in 'Video Cards & Monitors' started by DiGiTaL MoNkEY, Oct 15, 2010.

Thread Status:
Not open for further replies.
  1. DiGiTaL MoNkEY

    DiGiTaL MoNkEY Inverted Monkey

    Joined:
    Jun 28, 2005
    Messages:
    26,896
    Location:
    Melbourne, Victoria


    Launch Dates:

    GeForce GTX 500 Series
    Nvidia GeForce GTX 59X Launch = 1H 2011
    Nvidia GeForce GTX 580 Launch = REVIEWS | RELEASED!
    Nvidia GeForce GTX 570 Launch = December ~7th 2010
    Nvidia GeForce GTX 560 Launch = Q1 2011
    Nvidia GeForce GTS 550 Launch = Q1 2011


    _________________________


    Nvidia GeForce GTX 580:


    Core Clock: 772MHz
    Process: 40nm
    Shader Clock: 1544MHz
    Memory Capacity: 1536MB
    Memory Clock: 4008MHz (4x 1002MHz)
    Memory Type: GDDR5
    Memory Interface: 384-bit
    CUDA Processors: 512
    Standard DirectX Version: 11.0
    Nvidia Technologies: SLI, CUDA, PhysX, 3D Vision Surround
    Display Connectors: Dual DVI + Mini-HDMI
    Power Connector(s): One 6-pin & One 8-pin PCI-E Connector
    Card Length: 10.5 inches (267 mm)
    Launch MSRP: $499USD


    Nvidia GeForce GTX 570:

    Core Clock: 732MHz
    Process: 40nm
    Shader Clock: 1464MHz
    Memory Capacity: 1280MB
    Memory Clock: 3800MHz (4x 950MHz)
    Memory Type: GDDR5
    Memory Interface: 320-bit
    CUDA Processors: 480
    Standard DirectX Version: 11.0
    Nvidia Technologies: SLI, CUDA, PhysX, 3D Vision Surround
    Display Connectors: Dual DVI + Mini-HDMI
    Power Connector(s): Two 6-pin PCI-E Connectors
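
    A rough sanity check on those memory figures (my arithmetic, not Nvidia's spec sheet): GDDR5 transfers data four times per base clock, so theoretical peak bandwidth is just the effective clock times the bus width. A quick Python sketch:

    [code]
    # Theoretical peak memory bandwidth from the spec lists above
    def bandwidth_gbs(effective_clock_mhz, bus_width_bits):
        # MHz x bits -> /8 for bytes, /1000 for GB/s
        return effective_clock_mhz * bus_width_bits / 8.0 / 1000.0

    print("GTX 580: %.1f GB/s" % bandwidth_gbs(4008, 384))  # ~192.4 GB/s
    print("GTX 570: %.1f GB/s" % bandwidth_gbs(3800, 320))  # ~152.0 GB/s
    [/code]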


    _________________________


    Post Launch Rumours, Topics and Handy Links:
    NVIDIA GeForce GTX 570 on December 7th, Specifications Leaked http://vr-zone.com/articles/nvidia-geforce-gtx-570-on-december-7th-specifications-leaked/10392.html
    Nvidia's dual chip card waits for Antilles http://www.fudzilla.com/graphics/item/20924-nvidias-dual-chip-card-waits-for-antilles
    NVIDIA Readying Dual-GF110 GPU, Eying Total Performance Leadership http://forums.overclockers.com.au/showthread.php?t=919484
    Disable GeForce GTX 580 Power Throttling using GPU-Z http://www.techpowerup.com/134460/Disable-GeForce-GTX-580-Power-Throttling-using-GPU-Z.html
    Geforce GTX 560 comes next year has 384 shaders http://fudzilla.com/graphics/item/20835-geforce-gtx-560-comes-next-year-has-384-shaders
    NVIDIA GeForce GTX 570 Rumored for Release, Replaces the GTX 480 http://news.softpedia.com/news/NVID...for-Release-Replaces-the-GTX-480-166253.shtml
    NVIDIA to release GeForce GTX 570 and GTX 560 soon? http://www.hexus.net/content/item.php?item=27478
     
    Last edited: Nov 30, 2010
  2. AEKaBeer

    AEKaBeer Member

    Joined:
    Dec 4, 2003
    Messages:
    4,146
    Location:
    North Melbourne
    This I gotta see.

    I think Nvidia's spreading those rumours themselves so people will think they should wait ("it won't be that far behind ATI's release"), then six months after the promised date, tadaaa, not what we were expecting! :tongue:

    EDIT: Naturally, I'll be happier if it's the real deal.
     
  3. DiGiTaL MoNkEY

    DiGiTaL MoNkEY Inverted Monkey

    Joined:
    Jun 28, 2005
    Messages:
    26,896
    Location:
    Melbourne, Victoria
    Quite possible; with Nvidia having been very quiet on anything new in the high-end realm, and AMD holding the top spot, I'd be surprised if it's a quiet one for the rest of Nvidia's year.

    Since AMD had time to revamp their series, I'm sure Nvidia have had the same opportunity.
     
  4. DARREN

    DARREN Member

    Joined:
    Jun 28, 2001
    Messages:
    1,373
    Location:
    Arana Hills -Nth Brisbane
    Sounds promising. Nvidia NEED something good to bring back some balance in the force.
    Maybe this is the chosen one..........:D
     
  5. Wootwagon

    Wootwagon Member

    Joined:
    Jun 10, 2010
    Messages:
    462
    Location:
    Melbourne (boronia)
    lol, I was just saying in the 6000 series discussion thread that this needed to happen :thumbup::thumbup:
     
  6. Danthemanz

    Danthemanz Old School Admin

    Joined:
    Jun 27, 2001
    Messages:
    2,443
    Location:
    Sydney
    I'm saying no. Too big, too hot, not enough time for Nvidia.
     
  7. Alba

    Alba Member

    Joined:
    Jan 2, 2002
    Messages:
    4,754
    Location:
    Shellharbour
    Don't know how they will manage to keep this at the same TDP as the 480 while adding 32 CUDA cores, 512-bit memory up from 384-bit, 68 extra TMUs and 512MB more VRAM, and even if they do, it will still be a hot and loud card. The performance increase won't make it a better card; at best it will keep up with a 6970 but lose out in every other way.

    Methinks this is another wood screw saga to get the fanbois all revved up and stop them, or at least hold them off for a while, from buying a 6000 series.

    Maybe this time they'll upgrade to stainless

     
  8. mrpiccolod

    mrpiccolod Member

    Joined:
    Feb 8, 2009
    Messages:
    1,014
    Location:
    Brisbane
    Agreed. This sounds like the exact same crap that they pulled last year in an effort to put customers off buying ATI. I don't think as many people will fall for it the second time round.
     
  9. DeejW

    DeejW Member

    Joined:
    May 18, 2008
    Messages:
    1,946
    Location:
    Margaret River WA
    Yeah, I bet they don't want to call it the GTX 485, because people were put off the 480 even though it's pretty much the same card.
     
  10. terrastrife

    terrastrife Member

    Joined:
    Jun 2, 2006
    Messages:
    18,453
    Location:
    ADL/SA The Monopoly State
    I don't see the complaints about it being hot and noisy. The 480 is no hotter or noisier than the 5870 when it counts. Why are you even arguing power consumption at this end of the scale?

    Also, what's with the screw? Nvidia delivered, albeit late; the 480 is pretty much twice what the 280 was, and pretty much what it was originally specced to be.

    It's been a LONG time since AMD/ATI have bested Nvidia for a single-GPU solution (9700 Pro).

    I'm not a fanboi, but I don't like what AMD have been doing the last 3 years; nothing's dramatically changed or revolutionised the industry for them since the HD 2900 XT (which I had), whereas NV have made an effort to push new technologies... like the Nintendo Power Glove from the 80's, now everyone has motion control :)

    I'll be waiting to see what happens. I currently have a 280 and don't feel the need to upgrade to anything this gen or the next half-gen.
     
  11. Hive

    Hive Member

    Joined:
    Jul 8, 2010
    Messages:
    5,177
    Location:
    ( ͡° ͜ʖ ͡°)
    This cannot end well for NVIDIA... lol, an 8-pin and a 6-pin...

    I miss Nvidia back in the G80 days.
     
    Last edited: Oct 15, 2010
  12. terrastrife

    terrastrife Member

    Joined:
    Jun 2, 2006
    Messages:
    18,453
    Location:
    ADL/SA The Monopoly State
    What's wrong with 8-pin and 6-pin? The 2900XT was first to have that :D

    In fact, I see it as a status symbol not of how much power the card uses, but of the engineering gone into the power regulation. Kinda like how the 5970 should've totally been at least one 8-pin, or two for good measure. For example, the 2900XT had 6+8 and had brilliant voltage regulation. The 280 is also 6+8, and that's the design so many cards since have used (Volterra/CHiL ICs); the 480, once again 6+8, all very flexible out of the box/reference with voltage and stability :)

    It's more important nowadays that MANY PSUs do not run all 6/8 wires for their PCI-E connectors, but 4/5.
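
    For anyone keeping score, the PCI-E spec puts hard numbers on those configs: 75W from the slot, 75W per 6-pin and 150W per 8-pin. A quick sketch of the resulting board-power ceilings (textbook limits, not measured draw):

    [code]
    # PCI-E power budget: slot 75W, 6-pin 75W, 8-pin 150W
    SLOT_WATTS = 75
    CONNECTOR_WATTS = {"6pin": 75, "8pin": 150}

    def max_board_power(*connectors):
        # every card draws from the slot plus its auxiliary connectors
        return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in connectors)

    print(max_board_power("6pin", "8pin"))  # 300W: 2900XT, GTX 280, GTX 480/580
    print(max_board_power("6pin", "6pin"))  # 225W: GTX 570
    [/code]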
     
  13. itsmydamnation

    itsmydamnation Member

    Joined:
    Apr 30, 2003
    Messages:
    10,259
    Location:
    Canberra
    No it isn't; go look at the original DP performance they promised: they missed their shader clock by a big margin. Even Jen-Hsun admitted their G100 mistakes.

    Adding 32 CUDA cores isn't going to do much for performance, and neither is an extra 128 bits on the bus; what they need is much higher clocks. The original rumour for NI/SI (whichever one it really is) was 20% perf over a 512-core G100, so clocks, not core count, will be critical.

    That said, the 470/480 have sold very poorly, especially compared to the 460, and Barts is about to hit that price point. A 580 isn't what they need; they have already lost that round. They need a better 460, but then they will hit 470 performance, and that will be a big issue for NV, as it means the G100 salvage part is useless.

    Edit: the number of pins has nothing to do with voltage regulation circuits...
     
    Last edited: Oct 15, 2010
  14. FromPaul

    FromPaul Member

    Joined:
    Oct 14, 2006
    Messages:
    1,183
    Location:
    Sydney
    What cooler would they chuck on that to make it run reasonably cool? I predict a 3-slot cooler if it comes out, with a fat wodge of fins to get the heat down.
     
  15. itsmydamnation

    itsmydamnation Member

    Joined:
    Apr 30, 2003
    Messages:
    10,259
    Location:
    Canberra
    The hotter a card runs, the more power it needs to do the same task; leakage current rises with temperature, a property of the semiconductor materials. So not only do you need more power to run higher clocks and more shaders, you need a lot more cooling to keep the card at the same temperature, otherwise you need even more power again.
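
    To put illustrative numbers on that: subthreshold leakage rises roughly exponentially with temperature, and a common rule of thumb is that leakage power roughly doubles every ~10°C. A sketch under that assumption (the doubling interval is a generic rule of thumb, not a measured GF110 figure):

    [code]
    # Illustrative only: assume leakage power doubles every 10 C
    def leakage_scale(temp_c, ref_temp_c=60.0, doubling_c=10.0):
        return 2 ** ((temp_c - ref_temp_c) / doubling_c)

    for t in (60, 70, 80, 90):
        print("%d C -> %.0fx the leakage power at 60 C" % (t, leakage_scale(t)))
    [/code]

    Under that assumption, shaving 20-30°C off with a better cooler meaningfully cuts the power the card wastes just sitting at temperature.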
     
  16. FromPaul

    FromPaul Member

    Joined:
    Oct 14, 2006
    Messages:
    1,183
    Location:
    Sydney
    I vaguely remember that from first-year digital & analog fundamentals :upset:

    It'd be the same die size as the current 480 though, which would mean you couldn't cram in more heat pipes; all you could do is chuck more fins on or make the heat pipes more efficient (need to go dig up the Atomic with the article on heatpipes in it for a layman's refresher).

    A 3-slot cooler with heatpipes on the front and the back, now there's an idea.
     
  17. Hive

    Hive Member

    Joined:
    Jul 8, 2010
    Messages:
    5,177
    Location:
    ( ͡° ͜ʖ ͡°)
    I don't know how they added stuff, didn't chop anything off, and still got lower power consumption... at least that's what they say.
     
  18. EpicPcCases

    EpicPcCases (Banned or Deleted)

    Joined:
    Apr 3, 2009
    Messages:
    1,200
    Location:
    Brisbane
    It's getting quite interesting actually, with AMD (man, that just sounds wrong... bring back ATI) and Nvidia bringing out their flagships in the December time frame.

    I don't really care that much about performance per watt, more so price vs performance. At the end of the day, whoever brings out the cheaper top-range offering will win out.
     
  19. porridge

    porridge New Member

    Joined:
    Feb 25, 2004
    Messages:
    890
    Location:
    Sydney
    I would have believed that until I read this bit: "a TDP close to that of the GTX 480".
    Something radical would have to be going on here if this is true. An absolutely game-changing something.
     
  20. flu!d

    flu!d Ubuntu Mate 16.04 LTS

    Joined:
    Jun 27, 2001
    Messages:
    13,158
    And this is the way people used to think before the ATi marketing team made performance per watt the most important feature when buying a video card. Personally, I can't see the point of performance-per-watt figures when, realistically, the difference on my power bill is going to be negligible. Price, performance and driver stability are the three important considerations for me.
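
    For what it's worth, the back-of-envelope maths supports "negligible". Assuming (my figures, purely illustrative) a 100W load difference between cards, a few hours of gaming a day and a typical tariff:

    [code]
    # Illustrative annual running-cost gap between two cards
    watts_difference = 100    # assumed load gap between two high-end cards
    hours_per_day = 3         # assumed daily gaming time
    tariff_per_kwh = 0.25     # assumed $/kWh

    kwh_per_year = watts_difference * hours_per_day * 365 / 1000.0
    print("~$%.0f per year" % (kwh_per_year * tariff_per_kwh))  # ~$27
    [/code]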
     