2060 to be re-launched! (8nm supply constrained, 12nm not?)

Discussion in 'Video Cards & Monitors' started by Myne_h, Jan 22, 2021.

  1. Myne_h

    Myne_h Member

    Joined:
    Feb 27, 2002
    Messages:
    11,245
    https://videocardz.com/newz/nvidia-to-reintroduce-geforce-rtx-2060-and-rtx-2060-super-to-the-market

    Though the article doesn't explicitly state that this is the reason, it's my best educated guess as to why.

    It makes sense that all the 8nm capacity is busy making high-margin 3070, 3080 and 3090 chips, leaving nothing spare to produce the relatively lower-margin 3060.

    It does seem a bit sad they haven't at least attempted to bring the 3060 to 12nm, but with an extra 2000 CUDA cores, I suppose the chip would be too big to be economical.

    Add in the recent "gotta beat AMD" move (make the 3060 12GB) and the extra RAM probably adds too much to the cost.

    Can't make enough 8nm.
    Can't actually add all the ram you promised and meet the price.
    Have board partners screaming for chips.
    Customers screaming for cards.

    So, what to do?

    Have a perfectly functional 12nm line sitting idle...
    No time to rejig everything...

    Relaunch!
     
    Last edited: Jan 22, 2021
  2. RnR

    RnR Member

    Joined:
    Oct 9, 2002
    Messages:
    16,468
    Location:
    Brisbane
    Shortages everywhere... so what's a business to do when there's pent-up demand for your products?

    https://www.igorslab.de/en/more-det...hic-cards-are-so-hard-to-buy-and-manufacture/

    Lots of interesting stuff in the article.
     
  3. Sledge

    Sledge Member

    Joined:
    Aug 22, 2002
    Messages:
    8,587
    Location:
    Adelaide
    Yeah but who's going to want to buy them?
    Anyone who would have wanted one would surely have bought one already?
     
  4. power

    power Member

    Joined:
    Apr 20, 2002
    Messages:
    65,595
    Location:
    brisbane
    People on mid-tier 10 series cards can pick up a reasonably priced brand-new 20 series.
     
  5. Phido

    Phido Member

    Joined:
    Jun 20, 2003
    Messages:
    7,411
    Location:
    Dark City
    Miners?

    The shortage is quite bad. Even very old cards are now going up in value.
    A 2060 6GB is probably going to be a very cheap card to manufacture, and for a lot of people it would be a pretty good upgrade. If you're sitting on a 4GB 1050 Ti or a 3GB 1060 (the most popular Nvidia card ever made), a 6GB 2060 would be a decent step up.
     
  6. Sledge

    Sledge Member

    Joined:
    Aug 22, 2002
    Messages:
    8,587
    Location:
    Adelaide
    Would it be able to do VR?
     
  7. OP
    OP
    Myne_h

    Myne_h Member

    Joined:
    Feb 27, 2002
    Messages:
    11,245
    Speculating a bit on what this might mean in the future:

    Let's pretend you're AMD/Nvidia.

    7nm and 8nm production is jam-packed. They literally can't make enough, and it's going to be a good year before the next fab comes online.
    We've seen that Nvidia have decided to re-release the 12nm 2060.
    AMD still appear to be making 14nm RX 580s - they're still for sale new, but they're a bit slow.
    In the next few months, I think there's a reasonable chance they'll cut down and test-produce Big Navi a bit (or, say, rejig the Vega memory controller for DDR) and stick it on 14nm.
    It won't clock like Big Navi, but it'll happily sit where the bulk of the consumer spend goes - and do well at 1080p.

    Moving forward, if the last decade's price/availability cycle is the new normal, I suspect that low-to-mid range cores will, at a bare minimum, be designed, small-batch produced and tested on 'last gen' nodes in parallel with 'current gen' nodes, with production all but ready to go if the demand is there. (Side note: you can't just resize the same chip for a different node and expect it to work - it has to be optimised for that node.)

    Obviously they're not really going to want to use last-gen nodes if they can avoid it, because last gen = big die. But last gen also = cheaper wafers, better yields, and available capacity.
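
    Quick napkin maths on why the big die isn't necessarily a dealbreaker - a rough sketch only, and the die sizes, wafer prices and defect densities below are made-up illustrative numbers, not real figures:

    import math

    def dies_per_wafer(die_mm2, wafer_mm=300):
        # standard approximation for rectangular dies on a round wafer
        r = wafer_mm / 2
        return math.pi * r * r / die_mm2 - math.pi * wafer_mm / math.sqrt(2 * die_mm2)

    def cost_per_good_die(die_mm2, wafer_cost, defects_per_cm2):
        # simple Poisson yield model: yield = exp(-defect density x die area)
        die_yield = math.exp(-defects_per_cm2 * die_mm2 / 100)
        return wafer_cost / (dies_per_wafer(die_mm2) * die_yield)

    # made-up numbers: leading node = smaller die, dearer wafer, worse yield;
    # mature node = bigger die, cheaper wafer, better yield
    print(round(cost_per_good_die(280, 9000, 0.15)))  # leading node, roughly 64 per good die
    print(round(cost_per_good_die(450, 4000, 0.08)))  # mature node, roughly 46 per good die

    With those made-up numbers the bigger old-node die actually comes out cheaper per good chip - the real question is whether the capacity exists at all.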

    So... in NV land, it'd be:
    X060 and below = last gen node, X070 and above = current gen.

    And AMD would be:
    "little X" (navi) = last gen, "big X" = current gen.
     
    RnR likes this.
  8. RnR

    RnR Member

    Joined:
    Oct 9, 2002
    Messages:
    16,468
    Location:
    Brisbane
    GlobalFoundries have a 12LP+ process now that is substantially better than their old 12nm from the Zen+ days, let alone their 14nm from Zen 1.

    https://www.anandtech.com/show/1490...nology-massive-performance-power-improvements

    The problem is design validation if they want to put some of their RDNA CUs on this silicon. That takes time and money. Radeon VII was on TSMC's 7nm, and all their great Vega work for mobile plus RDNA 1 & 2 are on TSMC's 7nm as well. So they would have to spend effort and money to chase the low-end market on GloFo silicon. But yeah, maybe this is why we haven't heard boo about the low end. RDNA2 efficiency gains together with GloFo's 12LP+ silicon should do much better than an RX 580.
     
    Myne_h likes this.
  9. OP
    OP
    Myne_h

    Myne_h Member

    Joined:
    Feb 27, 2002
    Messages:
    11,245
    Ah, didn't know that. 12nm makes more sense.

    Yeah, I did touch on the design validation part. They probably have to make a hundred test products before they finally have a design that works properly.

    What choice do they have though? Current reality says demand for the cutting edge far outstrips the current and foreseeable capacity. It's not like most people care what node it's on. It'll run slightly hotter and use more watts, but it'll sell.
     
  10. RnR

    RnR Member

    Joined:
    Oct 9, 2002
    Messages:
    16,468
    Location:
    Brisbane
    There are also other shortages - see the Igor's Lab article I linked above. GDDR6, voltage controllers etc. are all in short supply. A 40CU RDNA2 part on LPDDR5 with a 256-bit bus, maybe? Half the memory bandwidth of an RX 6800, but it would still have the funky cache to make up for the lower bandwidth.
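
    Napkin maths on that bandwidth comparison (the LPDDR5 speed grade is just an assumption for illustration):

    # peak bandwidth (GB/s) = bus width (bytes) x data rate (MT/s) / 1000
    def peak_bw_gbs(bus_width_bits, data_rate_mtps):
        return bus_width_bits / 8 * data_rate_mtps / 1000

    rx6800 = peak_bw_gbs(256, 16000)   # 256-bit GDDR6 at 16 Gbps -> 512.0 GB/s
    lpddr5 = peak_bw_gbs(256, 6400)    # 256-bit LPDDR5 at an assumed 6400 MT/s -> 204.8 GB/s
    print(rx6800, lpddr5, round(lpddr5 / rx6800, 2))   # 512.0 204.8 0.4

    So closer to 40% than half at that speed grade, but same ballpark - and the funky cache (Infinity Cache) would be doing the heavy lifting anyway.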
     
    Myne_h likes this.
  11. Phido

    Phido Member

    Joined:
    Jun 20, 2003
    Messages:
    7,411
    Location:
    Dark City
    For mid-range cards the node is less important.
    As long as it's still under 200W, it's fine.

    GDDR5 is also phasing out... I doubt anyone will go back there.
    But slower-speed GDDR6 is a thing, as is DDR4/DDR5, and on a mid-range card anything 8GB+ is decadent.
    The 2060 wasn't in a bad spot, it just cost too much.
     
  12. Skramit

    Skramit Member

    Joined:
    Oct 28, 2004
    Messages:
    4,326
    Location:
    Melbourne
    Watch the latest Hardware Unboxed video; he makes a good point that Nvidia never officially killed the 2060, so this news of restarting production needs some tempering - it's possibly not really news...
     
  13. nope

    nope Member

    Joined:
    May 8, 2012
    Messages:
    3,320
    Location:
    nsw meth coast
    Worst time to buy a GPU ever - 3-year-old performance for brand-new prices.
     
    TRG.dOinK and Skramit like this.
  14. 2_stroke

    2_stroke Member

    Joined:
    Aug 9, 2008
    Messages:
    1,322
    Location:
    cranbourne 3977
    lol, happy I bought my 2070 Super when I did. I was going to wait for the 3000 series - thank god I didn't. Was funny seeing people selling their 2080s for $600 hahah
     
  15. RnR

    RnR Member

    Joined:
    Oct 9, 2002
    Messages:
    16,468
    Location:
    Brisbane
    Well, it seems they don't even have that choice.

    [Attached image: upload_2021-1-28_7-33-49.png]

    https://twitter.com/piefkee/status/1354406211781881859

    Since the GloFo 12nm process is essentially a Samsung process, for AMD to use its TSMC-based designs would require a fark tonne of work and effort. They ain't gonna do that, especially not for low-end, low-margin territory.
     
    Myne_h likes this.
  16. power

    power Member

    Joined:
    Apr 20, 2002
    Messages:
    65,595
    Location:
    brisbane
    When I invested in a 1080 Ti, I didn't think it would be a literal investment.
     
    SnooP-WiggleS, sammy_b0i and Ck21 like this.
  17. OP
    OP
    Myne_h

    Myne_h Member

    Joined:
    Feb 27, 2002
    Messages:
    11,245
    Question is...

    How organic is the design process from day 1?

    It all starts out in a chip design tool, right? Something like CAD.
    Once they're happy with the logic of it, and they've done their theoretical benchmarks on whatever supercomputer they have, they send it to their fab partner and begin the iterative process of tape-outs and modification - right?

    Presumably that process is about the same length on most nodes, though a known node - like one you've previously used for CPUs - might be faster. You never really hear about how long it takes to port a chipset or other low-value chips, but it gets done.

    Assuming they made the decision in September or earlier, and used their contractually obliged version 1.0 with only the errata changes added, it could theoretically be done by now - at least something good enough to ship. Companies do things like this all the time, so it's still possible.

    Yeah, nah?
     
  18. OP
    OP
    Myne_h

    Myne_h Member

    Joined:
    Feb 27, 2002
    Messages:
    11,245
    It's getting to the age where it might be worth looking into the exact capacitors and MOSFETs on it. A pre-emptive replacement before they fail wouldn't hurt.

    Here's what they look like when they go bang

    [Attached image: 1080blownmosfet.JPG]
     
  19. power

    power Member

    Joined:
    Apr 20, 2002
    Messages:
    65,595
    Location:
    brisbane
    If it up and dies I'll just join the conga line of people trying to buy something new. That, and I can just play one of the hundreds of other games I have on console to tide me over.
     
  20. RnR

    RnR Member

    Joined:
    Oct 9, 2002
    Messages:
    16,468
    Location:
    Brisbane
