NVIDIA Announces the GeForce RTX 20 Series | RTX 2080 Ti 2080 2070 | Discussion

Discussion in 'Video Cards & Monitors' started by DiGiTaL MoNkEY, Aug 21, 2018.

  1. WRX_STi

    WRX_STi Member

    Joined:
    Sep 6, 2001
    Messages:
    1,745
    Location:
    Cairns
    What's a gravics card?
     
  2. bennyg

    bennyg Member

    Joined:
    Dec 16, 2005
    Messages:
    3,635
    Location:
    Melbourne, Oztraya
    Not that many in desktopland will give much of a toss, but FYI the launch of the mobile range (2060/2070/2080/2070 Max-Q/2080 Max-Q) has pretty much gone down like a lead balloon ... just like the desktop launch did:

    - Clocks, power limits and scores are all over the place, varying laptop to laptop
    - Lower power limits and low clocks leave them well behind their desktop counterparts
    - Most mobile 2080s only perform about as fast as an overclocked mobile 1080
    - No SLI/NVLink, so even the fastest 2080s (Clevo) are still outbenched and outperformed by about 40% by the previous-gen SLI 1080 monsters
    - Laptop variants lose an even bigger percentage of performance with RTX on than desktops do, and it looks like the only way a mobile 2060 will play with RTX is at 720p
    - Max-Q is a total joke: the 2080 Max-Q is 80W and half the speed of a desktop 2080, down near a desktop 2060
    - Very, very expensive: most direct model replacements are +$500-$1000 for a 10-20% improvement over the 10 series
     
  3. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    35,683
    Location:
    Brisbane
    Lots of important information in this. It demonstrates how far this tech still has to grow for realtime ray/path tracing.

    In film, we bounce many millions of photons/rays around, taking hours per frame. These GPUs are clocking in at thousands of photons/rays, 60 times per second. Certainly incredible stuff, but still a while off from switching over completely.
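    For scale, here's the back-of-envelope version. The exact figures below are illustrative assumptions (ray counts and hours per frame vary wildly by show), but the ratio is the point:

    [code]
    # Illustrative numbers only - the point is the ratio, not the absolutes.
    film_rays_per_frame = 50_000_000      # "many millions" of rays per frame
    film_seconds_per_frame = 10 * 3600    # call it 10 hours per frame

    game_fps = 60
    game_seconds_per_frame = 1 / game_fps  # ~16.7 ms per frame

    print(f"{film_rays_per_frame:,} rays over {film_seconds_per_frame:,} s per film frame")
    print(f"time ratio: {film_seconds_per_frame / game_seconds_per_frame:,.0f}x")
    # -> time ratio: 2,160,000x more wall-clock time per frame in film, which
    #    is why real-time engines lean so hard on smoothing and averaging.
    [/code]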

    Good talk too of how smoothing and averaging are handled, and why high-motion, high-detail stuff will look a little weird for a while.

    I still think we're a minimum of two hardware generations away from it getting really exciting. But this is a good start. With any luck we'll see more experimental stuff like this happening both in open source and in newer engines.

    It's going to make the "1440p 120FPS or death" crowd a little upset too, as that's not going to be achievable for a while. But these are the sacrifices we need to make to move toward better lighting realism in real-time engines.
     
  4. RnR

    RnR Member

    Joined:
    Oct 9, 2002
    Messages:
    12,466
    Location:
    Brisbane
    pffft... frames means frags... and frags are good :leet:
     
  5. Court Jester

    Court Jester Member

    Joined:
    Jun 30, 2001
    Messages:
    2,473
    Location:
    Gold Coast
    I disagree.
     
  6. power

    power Member

    Joined:
    Apr 20, 2002
    Messages:
    57,376
    Location:
    brisbane
    you've been wrong in your life but never wronger than this.
     
  7. Luke212

    Luke212 Member

    Joined:
    Feb 26, 2003
    Messages:
    9,500
    Location:
    Sydney
    4-5 years ago I invented how to do this properly. You don't need to do it 60 times per second, and you don't need to render the whole frame 60 times per second, if you make assumptions about what is important. Everyone strives for perfect, but that's a mistake - in life in general!

    The trick is knowing what's important and what's perceivable. And you can do it within current hardware.

    Specular and moving light sources are what create time-critical work for a path tracer. Using AI you can reduce that load dramatically. I wonder how long it will take these companies to catch on.
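    To make it concrete, here's a minimal sketch of the idea (every heuristic and number below is hypothetical, just to show the shape of importance-driven sample allocation):

    [code]
    import numpy as np

    def importance(specular_mask, motion_mag, base=0.1):
        # Hypothetical heuristic: specular surfaces and moving lights are the
        # time-critical work, so they get the lion's share of the ray budget.
        w = base + 0.6 * specular_mask + 0.3 * np.clip(motion_mag, 0.0, 1.0)
        return w / w.sum()

    def allocate(weights, ray_budget):
        # Spend a fixed per-frame budget in proportion to perceived importance
        # instead of sampling every pixel uniformly every frame.
        return np.floor(weights * ray_budget).astype(int)

    # Toy 4-"pixel" frame: one shiny moving region, three diffuse static ones.
    spec = np.array([1.0, 0.0, 0.0, 0.0])
    motion = np.array([0.8, 0.0, 0.1, 0.0])
    print(allocate(importance(spec, motion), ray_budget=1000))
    # -> [740  78 102  78]: most rays land on the shiny moving region;
    #    the static diffuse pixels can reuse last frame's result.
    [/code]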
     
    Last edited: Feb 6, 2019
  8. power

    power Member

    Joined:
    Apr 20, 2002
    Messages:
    57,376
    Location:
    brisbane
    speaking of people being wrong.
     
    fredhoon likes this.
  9. Luke212

    Luke212 Member

    Joined:
    Feb 26, 2003
    Messages:
    9,500
    Location:
    Sydney
    just ahead of the curve as usual :p
     
  10. Perko

    Perko Member

    Joined:
    Aug 12, 2011
    Messages:
    3,554
    Location:
    NW Tasmania
    Amazing how elvis comes in here, and then his alt straight afterwards and throws a grenade. :D
     
  11. Luke212

    Luke212 Member

    Joined:
    Feb 26, 2003
    Messages:
    9,500
    Location:
    Sydney
    we have similar interests unfortunately.
     
  12. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    35,683
    Location:
    Brisbane
    Don't worry, you lot will still have your legacy engines to satisfy you in the short term. The competitive FPS market isn't going to change overnight.

    But for story-driven games, the level of immersion and realism has an exciting opportunity to improve dramatically, as a test case before the more framerate-demanding genres.
     
  13. mAJORD

    mAJORD Member

    Joined:
    Jun 4, 2002
    Messages:
    9,525
    Location:
    Griffin, Brisbane
    Good to see you've come to the party on that one :p
    Not just story-driven - MP and dumb shooters can be immersive too.
     
  14. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    35,683
    Location:
    Brisbane
    I've never been anti-graphics. Merely anti "only graphics matter". And, for what it's worth, my whole job is about selling realistic computer graphics. :)

    Take a look at the "Anthem" thread. Page 11 is the first time someone actually talks about whether or not the game is fun. Before that, endless talk of framerates and server ping times. And while I understand why they're important to that genre, it's fairly typical that discussion of whether or not a game is actually fun is lost on a certain demographic. One of my favourite games ever - Okami - gets constant derision because it's frame-locked to 30FPS (which has zero effect on the game). What an amazing experience people are throwing away over a meaningless number.

    Where graphics enhance a game's "funness", great. When they become the obsessive, singular drive of people at the expense of what this art form best delivers, then we've lost our way.

    On topic, ray tracing has long been considered impossible for real-time engines, and we're finally at a point where something akin to 90s CGI is achievable. The good news, IMHO, is that it's not going to take 20 years to get to the 2010 level. My best guesstimate is 2 generations of tech, which I think will happen in the next 5 years.

    So in layman's terms, we're at "Toy Story 1" now (don't let recent movies fool you - go and re-watch that to see how terrible it was, graphically). I reckon 5 years and we'll be at "Finding Nemo" / "Incredibles" at 60FPS.

    Between here and there, it'll be interesting to see how devs blend the tech. I don't know if it's even possible to do so - can they do traditional dumb/cheating lighting for the core game, with ray-traced overlays for environmental stuff? I'm not sure. But I'm keen to see if and how these two eras of technology can work together until we can jump 100% across.
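    If I had to guess at the shape of it, something like the toy below - all stubbed and hypothetical, not any real engine's API:

    [code]
    def rasterize(scene):
        # Traditional "dumb/cheating" lighting: cheap, covers every pixel.
        return [0.5 for _ in scene]

    def trace_effects(scene):
        # Ray-traced overlay: sparse and expensive, only where it pays off
        # (reflections, shadows, environmental bounce light).
        return [0.2 if surface == "shiny" else 0.0 for surface in scene]

    def composite(base, rt_layer):
        # Blend the two eras of tech into one frame.
        return [b + r for b, r in zip(base, rt_layer)]

    scene = ["matte", "shiny", "matte"]
    print(composite(rasterize(scene), trace_effects(scene)))
    # -> [0.5, 0.7, 0.5]: raster base everywhere, ray-traced polish on top.
    [/code]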
     
  15. Perko

    Perko Member

    Joined:
    Aug 12, 2011
    Messages:
    3,554
    Location:
    NW Tasmania
    Patent numbers or gtfo.
     
    Luke212 likes this.
  16. dzajroo

    dzajroo Member

    Joined:
    Dec 23, 2010
    Messages:
    802
    Because it's multiplayer, you will always have competitiveness over looks. Even in a casual MP game like Anthem, you still want to be the best of the 4 players, hence the frames and latency - performance is dominant over looks. With other competitive games, most players will run minimal settings to get max frames; that's how it is.

    30fps - maybe no effect on Okami, but F-me, trying to do perfect rolls/blocks in Dark Souls on PC with the 30fps lock was sadistic; luckily there was the 60fps unlock mod. But yes, in general, in singleplayer games I don't mind running 60-80 frames and enjoying max settings; the minute I switch to an MP game, it's ultra low + 144Hz here I go.

    I think 2 generations to get raytraced reflections/refractions and shadows running in games with solid frames, and probably much longer for fully raytraced rendering. Unless they come up with some solution that cuts rendering time significantly, or Nvidia somehow magically multiplies their performance from generation to generation.
     
  17. RnR

    RnR Member

    Joined:
    Oct 9, 2002
    Messages:
    12,466
    Location:
    Brisbane
    Speculation here of course, but I really think that AMD's Navi will have a decent raytracing implementation - enough for 4K @ 60Hz. My reasoning is that Navi will be going into the next-gen consoles, and I doubt very much that Sony and Microsoft will like sitting on the raytracing sidelines for 5+ years while the PC platform gets seriously pretty imagery. Again, 'completely-out-of-my-arse speculation(tm)'. All I have is that AMD has said they have been working on their raytracing implementation for some time.
     
  18. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    35,683
    Location:
    Brisbane
    Sure. But is it fun? You remember fun? It's what we had as kids. It's what we forget about when we turn into adults, and when we worry about everything else around fun.

    Fun is still independent of competitiveness, and there are plenty of competitive games (video and real-world) that are not fun for reasons unrelated to the competition part.

    I don't begrudge people talking about framerates in an online game at all - I totally understand why they matter. But I am saddened that it took OCAU members over 150 posts before someone actually mentioned whether or not they enjoyed playing the game. Staring at a triangle rendering at 6000 FPS isn't fun, but by golly has it got one hell of a framerate. This isn't some sort of "exclusive OR" scenario. We can talk about both things at the same time, it just so happens that lately, so many gamers appear to concentrate on only one of these things. We've lost the ability to critically analyse the art, it feels.

    Yes, this is how Moore's Law works. We double capacity roughly every 18 months, and ray tracing is all about capacity. Can you double the photons you cast, and then re-double them again? Yes, in 2 cycles of Moore's Law.

    https://en.wikipedia.org/wiki/Moore's_law

    3 cycles of Moore's Law is 4.5 years, and is 2^3=8 times the capacity, or almost an order of magnitude. That's when things will get interesting, and that's why I plucked "5 years" out of the air, as it's 3 cycles plus lead time for marketing and distribution.
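    Quick sanity check of that, with the 18-month doubling as the assumption:

    [code]
    DOUBLING_MONTHS = 18  # assumed Moore's Law cycle, per the above

    def capacity_multiplier(years):
        cycles = years * 12 / DOUBLING_MONTHS
        return 2 ** cycles

    for years in (1.5, 3.0, 4.5):
        print(f"{years} years -> {capacity_multiplier(years):.0f}x capacity")
    # 1.5 years -> 2x
    # 3.0 years -> 4x
    # 4.5 years -> 8x  (the "almost an order of magnitude" above)
    [/code]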

    Over in the film and VFX world (where I work), we've been using this for years. Have our frame rates improved? No, because we accept upper bounds on time for the maximum quality we can produce. So when performance doubles every 18 months for us, we use that to make things prettier and more realistic, but still put up with 5-50 hours per frame, because we have that "luxury". Real-time engines like games are similar, but with a lower maximum bound. What's the prettiest thing you can draw in 8-16ms today? What about in 18 months, when we can do twice as much in the same 8-16ms window? Rinse and repeat.

    And, FWIW, GPUs are starting to creep into film/VFX rendering. The upper bound there is video memory. Our render nodes clock in at 192-256GB RAM each, while the best cards we can buy at sensible price volume are more like 12GB. But we're getting closer and closer to GPUs being a valuable thing for non-realtime, photo-realistic rendering. And with that I can see a merging of game and movie technology that's already starting to happen (the Unreal Engine gets used more and more in low-end VFX, and that's rapidly creeping into the high end, which has fascinating outcomes for both games and movies in the next 5-10 years).

    I think AMD are also a while off yet. Part of this is R&D at a design/knowledge/IP level. But a big part of it is just how many transistors you can slap on a silicon wafer, which is not something AMD or Nvidia can push beyond anyone else.

    Looking at the film/VFX world, there's still a lot of stuff done in HD/2K even today. Netflix are pushing 4K hard, but you'd be surprised at just how many big budget action films render 3D/VFX/particles at 2K and scale to 4K in post.

    UHD/4K is 4 times the pixels of HD/2K. That's a massive jump in computation power when you look at the maths behind it. Pulling numbers out of my arse, the grunt required for "ray tracing" (or what video games call ray tracing) at 4K/60FPS is around 10-20 times what traditional game engines require for the same output. AMD isn't getting there in 2019/2020, and nor is anyone else (at least, not in the sub-$2000 price range - I could imagine a $20K GPU that might, along with the liquid cooling it would need to not set your house on fire).
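    The resolution part of that is straight arithmetic (the 10-20x grunt figure stays a gut call):

    [code]
    hd = 1920 * 1080    # HD, and close to 2K delivery
    uhd = 3840 * 2160   # UHD/4K

    print(uhd / hd)     # 4.0 - four times the pixels per frame
    print(uhd * 60)     # 497,664,000 pixels/second to shade at 4K/60FPS,
                        # before multiplying by rays per pixel
    [/code]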
     
    Last edited: Feb 6, 2019
  19. dzajroo

    dzajroo Member

    Joined:
    Dec 23, 2010
    Messages:
    802
    The best fun was back in the 386 era, the 486 and the first few Pentiums :) Not sure if it was because I was young and all the games were new, never seen before and packed with content, or because I'm getting older and not that interested in games these days (or because nowadays everything is MP with 4 maps and skins and minimal content). :)

    Yeah, based on Moore's Law that makes sense; however, I was taking into consideration Intel's and Nvidia's statements. Can't find the Intel one, but here's Nvidia's: https://www.extremetech.com/computing/256558-nvidias-ceo-declares-moores-law-dead . Pretty sure it's not dead, but it might "slow down a bit".
     
  20. mAJORD

    mAJORD Member

    Joined:
    Jun 4, 2002
    Messages:
    9,525
    Location:
    Griffin, Brisbane
    Moore's Law is dead as we knew it, i.e. the time frames have changed and the cost benefit is gone.
    This will have... scratch that, is already having a profound effect on how this technology progresses, and on who can afford it.
     
