Nvidia's GeForce GTX 470/480 | GF100/Fermi | Discussion

Discussion in 'Video Cards & Monitors' started by DiGiTaL MoNkEY, Apr 23, 2009.

Thread Status:
Not open for further replies.
  1. DiGiTaL MoNkEY

    DiGiTaL MoNkEY Inverted Monkey

    Joined:
    Jun 28, 2005
    Messages:
    26,896
    Location:
    Melbourne, Victoria

NVIDIA GT300 "Fermi" Detailed


    Written by: http://brightsideofnews.com/news/2009/4/22/nvidias-gt300-specifications-revealed---its-a-cgpu!.aspx
    With information from: http://www.hardware-infos.com/news.php?news=2904

    http://vr-zone.com/articles/amd-vs-nvidia-dx11-gpu-war-heats-up-this-christmas/7740.html?doc=7740

    _________________________


    Pre-release News:

    - Fermi Architecture GPUs Will Only “Hit the Full Stride” in Q2 – CEO of Nvidia.
    - Nvidia's Fermi GTX480 is broken and unfixable (SA)
    - NVIDIA GF100 (Fermi) Technology preview
    - Fermi-based Geforce GTX 480 to debut at Cebit
    - Fermi is Geforce GTX 480 and GTX 470
    - Nvidia works on second generation Fermi (Fud)
    - Official: NVIDIA Fermi / GF100 Architectural Details Revealed
    - Nvidia GeForce 300 Videos Leak Ahead Of Tomorrow's Big Reveal
    - Fermi GF-100 NDA Ends Tomorrow at 9 PM – Sunday, 1/17/2010
    - Fermi-based cards to end up hotter
    - NVIDIA - Fermi Up and Running, Tegra 2
    - Fermi expected in March
    - Nvidia GF100 3-way SLI setup rockin' a tech demo at CES
    - Nvidia Fermi to come in Nvidia’s Q1
    - Nvidia GF100 to be 30 to 40% faster than a single HD 5870?
    - NVIDIA shows us GF100 Fermi video card at Digital Experience
    - Fermi live at CES 2010 - Active system running Unigine Heaven DX11
    - Nvidia plans to launch a 40nm GDDR5 memory-based Fermi-GF100 GPU in March
    - NVIDIA Fermi pushed back to March, ATI prepping midrange refresh for early Q1?
    - Possible Fermi reduction in shader cores?
    - Fermi A3 silicon is in the oven
    - Oak Ridge cans Nvidia based Fermi supercomputer
    - Fake GeForce GTX 360 and 380 Benchmarks Surface
    - Three flavors of Fermi in 2010?
    - GF100/Fermi SLI system pictured
    - NVIDIA talks future, Fermi and GeForce GTX rumors
    - GT300/Fermi specifications and power consumption
    - Nvidia remains "committed" to gaming industry
    - Nvidia Fermi GF100 first working desktop card spotted
    - [Press Release] Nvidia announces Fermi Tesla GPUs
    - Nvidia to Demonstrate Fermi Graphics Chip at Super Computing Conference.
    - Nvidia Fermi final silicon to be A3 revision, Q2 2010?
    - Nvidia to Ramp Up Production of Fermi Graphics Cards Only in 2010.
    - Nvidia Fermi Renders Look Ultra Realistic
    - Inside Fermi: Nvidia's HPC Push
    - nVidia GT300's Fermi architecture unveiled: 512 cores, up to 6GB GDDR5
    - Nvidia GT300 Details - Fermi Codename Revealed
    - Nvidia GT300 Architecture Details revealed

    _________________________
     
    Last edited: Feb 21, 2010
  2. Shinanigans

    Shinanigans Member

    Joined:
    Apr 7, 2006
    Messages:
    4,833
    Location:
    Central Coast
    I understood about 40% of that article, but shitballs, it sounds tasty :D:D:D
     
  3. theicemagic

    theicemagic Member

    Joined:
    Aug 6, 2001
    Messages:
    3,933
    Location:
    Melbourne, Australia
    Hey... Now that's pretty nice. ^_^
     
  4. mcrow5

    mcrow5 Member

    Joined:
    Oct 10, 2007
    Messages:
    3,050
    Lol, yeah, same here, it's all mumbo jumbo to me, but I like the sound of the 2x performance increase at the same clocks :weirdo:
     
  5. anthony256

    anthony256 Member

    Joined:
    Jul 3, 2001
    Messages:
    10,781
    Location:
    Adelaide
    It's about time they got their finger out of their ass and did something; the 8800GTX was the last true big jump, before that the 6 series, and before that their original cards.

    Other than that, it's all rehashes of old tech or strapped-on stuff, so this definitely sounds promising. Let's hope they've got some decent drivers for it and we don't need to wait until 2019 or something for "Big Bang 3".
     
  6. slyls1

    slyls1 Member

    Joined:
    Jun 4, 2006
    Messages:
    1,527
    Location:
    Adelaide
    Yes, but will Nvidia still use the old GDDR3 for the memory? I mean, they keep on using it while ATI has been using GDDR5 for a while. Is there a reason for this?
     
  7. OP
    OP
    DiGiTaL MoNkEY

    DiGiTaL MoNkEY Inverted Monkey

    Joined:
    Jun 28, 2005
    Messages:
    26,896
    Location:
    Melbourne, Victoria
    Not sure; most people are expecting them to move to GDDR5, but I guess that depends on the memory architecture they put forth on the chip.

    I’m curious to see some GDDR3 vs GDDR5 pricing comparisons.
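
    (A rough illustration of why the memory choice matters: peak bandwidth is the effective transfer rate times the bus width, and GDDR5 moves four data transfers per memory clock versus GDDR3's two. A minimal Python sketch, using GTX 285-style GDDR3 and HD 4870-style GDDR5 figures as representative examples, not Fermi specs:)

        # Peak memory bandwidth: transfers per second x bus width in bytes.
        # GDDR3 is double data rate (2 transfers/clock); GDDR5 is effectively
        # quad-pumped (4 transfers/clock), so it can match or beat GDDR3
        # bandwidth on half the bus width.

        def bandwidth_gb_s(mem_clock_mhz, transfers_per_clock, bus_width_bits):
            transfers_per_s = mem_clock_mhz * 1e6 * transfers_per_clock
            return transfers_per_s * bus_width_bits / 8 / 1e9

        # GTX 285-style GDDR3: 1242 MHz on a 512-bit bus -> ~159 GB/s
        print(bandwidth_gb_s(1242, 2, 512))
        # HD 4870-style GDDR5: 900 MHz on only a 256-bit bus -> ~115 GB/s
        print(bandwidth_gb_s(900, 4, 256))

    (So GDDR5 buys comparable bandwidth on a much narrower, cheaper-to-route bus, which is presumably where the pricing comparison gets interesting.)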
     
  8. Shinanigans

    Shinanigans Member

    Joined:
    Apr 7, 2006
    Messages:
    4,833
    Location:
    Central Coast
    The GTX280 fair slaughters the 9800GTX though. Not as big a jump as the 7900 > 8800, sure, but still a fairly decent improvement. :)
     
  9. OP
    OP
    DiGiTaL MoNkEY

    DiGiTaL MoNkEY Inverted Monkey

    Joined:
    Jun 28, 2005
    Messages:
    26,896
    Location:
    Melbourne, Victoria
    While I agree with that, if you take into consideration the high-end card of that generation, the 9800GX2, the GTX 280 isn't that far off. But we know that GPU was mainly designed to be a GPGPU monster.
     
  10. ayles

    ayles Member

    Joined:
    Mar 29, 2005
    Messages:
    504
    +1 for SLI 295s when they're $399 a pop!

    :D
     
  11. Error-prone

    Error-prone Member

    Joined:
    Sep 30, 2008
    Messages:
    786
    Location:
    Queensland
    All hail Nvidia!! Bow to the awesomeness of their power!!! :p

    Sounds interesting for sure. :thumbup:
     
  12. MrSmoke

    MrSmoke Member

    Joined:
    May 22, 2008
    Messages:
    2,368
    Location:
    NSW, Blue Mtns NizzleBiX
    Hah, glad I didn't get a GT200-based card :D
    Those cards sound nasty :weirdo:
     
  13. Shinanigans

    Shinanigans Member

    Joined:
    Apr 7, 2006
    Messages:
    4,833
    Location:
    Central Coast
    Well if we're going to compare sandwiches, then we may as well compare the 9800GX2 to the GTX295 :p
     
  14. OP
    OP
    DiGiTaL MoNkEY

    DiGiTaL MoNkEY Inverted Monkey

    Joined:
    Jun 28, 2005
    Messages:
    26,896
    Location:
    Melbourne, Victoria
    Well, I'll be comparing the GTX 295 to the GTX 380, or whatever they call it.
     
  15. mshagg

    mshagg Politburo

    Joined:
    May 8, 2007
    Messages:
    23,090
    Location:
    Adelaide
    I wonder how much power the next round of monstrosities will consume :p
     
  16. Shinanigans

    Shinanigans Member

    Joined:
    Apr 7, 2006
    Messages:
    4,833
    Location:
    Central Coast
    Whoa! Whoa!... Slowwwww doowwwnnn :lol: :)

    Wait... is that some insider information you just let slip? :leet:
     
  17. ltd73

    ltd73 Member

    Joined:
    Apr 14, 2005
    Messages:
    1,724
    I just love this kind of article: an author with a little bit of clue, but not really.

    This sentence stands out: "Before the chip tapes-out, there is no way anybody can predict working clocks".

    I work with silicon/ASICs, and the statement makes no sense. Nvidia will know exactly what clocks they can hit; the real question is how far OVER those clocks they can go.
     
  18. OP
    OP
    DiGiTaL MoNkEY

    DiGiTaL MoNkEY Inverted Monkey

    Joined:
    Jun 28, 2005
    Messages:
    26,896
    Location:
    Melbourne, Victoria
    True, but it's better than some other articles I've read on this.

    Maybe he meant it in the context of people who don't work for Nvidia or its partners, as saying Nvidia wouldn't know what clocks to expect would be silly.
     
    Last edited: Apr 23, 2009
  19. n3wbi379

    n3wbi379 Member

    Joined:
    May 15, 2006
    Messages:
    4,600
    Location:
    Melb
    Double the power of the GT200 = massive CPU bottleneck...
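
    (To see why doubling GPU throughput can hit a CPU wall, here's a toy frame-time model with made-up numbers purely for illustration: a frame can't complete faster than the slower of the CPU and GPU stages.)

        # Toy model: per-frame time is bounded by the slower stage.
        def fps(cpu_ms, gpu_ms):
            return 1000.0 / max(cpu_ms, gpu_ms)

        cpu_ms = 12.0                 # hypothetical fixed CPU cost per frame
        print(fps(cpu_ms, 20.0))      # GPU-bound: 50 fps
        print(fps(cpu_ms, 10.0))      # 2x faster GPU: CPU-bound at ~83 fps, not 100
        print(fps(cpu_ms, 5.0))       # 4x faster GPU: still ~83 fps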
     
  20. anthony256

    anthony256 Member

    Joined:
    Jul 3, 2001
    Messages:
    10,781
    Location:
    Adelaide
    Not when games like Rage start coming out ;)

    Edit: and when using 3D Vision!
     