Moore's Law isn't dead. Moore's Law simply states that transistor counts will double roughly every 18 months to two years. And that is still true. Moore's Law says nothing about visual quality. What Nvidia can do with double the transistors isn't giving them double the visual quality, but that has nothing to do with Moore's Law. That has to do with the upper bounds of how current-generation lighting engines in video games and real-time 3D engines work.

Likewise, people often incorrectly assume that Moore's Law means things get "twice as fast" every 18 months. That's not true either. Previously, twice the transistor count made CPUs "feel" faster for our day-to-day usage. Right now your Windows 10 desktop doesn't feel twice as fast when you upgrade your CPU, because of other bottlenecks elsewhere in the system. Ditto for your games. Nvidia (and others) are trying to manage customer expectations here, which in turn are incorrectly set through a misunderstanding of how technology progresses.

Moore's Law is still alive and well, but video game engine technology, even with all the advancements in DirectX 12 and Vulkan, has been largely stagnant for the better part of a decade in the way it "cheats" something as fundamental as how light bounces from a source, off an object, and into the eye/camera.

Go outside for a moment and stare at a real-world object. Anything. A single brick, a blank bit of wall, the side of a car, whatever. Really concentrate, and think about what you are seeing and how that works at a physics level. You're not seeing the object. You're seeing energy that came from some distant source, travelled through space, lost energy along the way (atmospheric interference, refraction, absorption by the materials it bounced off), and ends up with your eye interpreting colour and luminance to give the illusion of shape and texture.
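To make the "cheat" concrete, here's a toy sketch (all scene values are made up for illustration) contrasting the classic rasteriser shortcut — one dot product against a known light direction, zero bounces — with what a path tracer actually does: fire lots of random rays and average the light they find. Real engines are vastly more sophisticated than this, but the shape of the difference is the point.

```python
# Toy contrast: rasteriser-style "cheat" shading vs a brute-force
# Monte Carlo estimate of bounced light. Vectors are (x, y, z) tuples.
import math
import random

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def rasteriser_cheat(normal, light_dir, light_intensity):
    """Classic Lambert shading: one dot product, no bounces at all."""
    return light_intensity * max(0.0, dot(normal, light_dir))

def random_hemisphere_dir(normal):
    """Uniform random direction on the hemisphere around `normal`."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0 < dot(d, d) <= 1:  # rejection-sample inside the unit ball
            n = math.sqrt(dot(d, d))
            d = (d[0] / n, d[1] / n, d[2] / n)
            return d if dot(d, normal) > 0 else (-d[0], -d[1], -d[2])

def path_traced_bounce(normal, incoming_radiance, samples=10_000):
    """Monte Carlo estimate of diffuse lighting: average many random rays.
    `incoming_radiance(direction)` says how much light arrives from there."""
    total = 0.0
    for _ in range(samples):
        d = random_hemisphere_dir(normal)
        total += incoming_radiance(d) * dot(d, normal)
    # Uniform hemisphere sampling (pdf = 1/2pi), so estimate = 2pi * mean
    return 2 * math.pi * total / samples

if __name__ == "__main__":
    up = (0.0, 0.0, 1.0)
    print("cheat: ", rasteriser_cheat(up, (0.0, 0.0, 1.0), 1.0))
    # A sky that's uniformly bright from every direction above the surface;
    # the estimate converges to pi, the exact integral of cos over the hemisphere
    print("traced:", path_traced_bounce(up, lambda d: 1.0))
```

The cheat costs a handful of multiplies per pixel; the honest version costs thousands of random rays per pixel, and a real path tracer repeats that for every bounce. That gap is the whole story below.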
Now imagine how small a photon is (hint: really small), how fast it travels (hint: fast), how many of those you need to have enough information to build an image (hint: a lot), and how many billion transistors you'd need to calculate and sample that information every few milliseconds (hint: a lot squared). In human history, we've been nowhere near that, so we bullshitted it up until now with hacks and approximations. Right now we're closer than ever, but still a while off doing it at the fidelity and frame rates the customer base expects a video game to run at.

Easy to comprehend, hard to explain to shareholders that they have to wait a little while for shit to catch up. So as a CEO, how do you mitigate that? You announce "Moore's Law is dead". It isn't. But it's a double-edged sword: demonstrating a technology that's 5-10 years off mainstream use now, while keeping people patient enough for the technology to catch up.

Don't confuse marketing bullshit with science and engineering. We're still progressing as always. But right now this is like the Nvidia Riva 128 just got released, and everyone wants GeForce-level technology and graphics (remember when "Hardware TnL" appeared, and we all were amazed?). Shit takes time, yo.

I hope this also explains a little better why, after this long working in professional 3D, I'm not so fussed about all this progress. Improvement is inevitable, as are forums full of people pissing and moaning that improvement isn't happening fast enough, that the sky is falling, and that we've apparently reached a plateau.
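For anyone who wants rough numbers on that "a lot squared" point, here's a back-of-envelope sketch. Every value is an illustrative assumption (sample and bounce counts especially), not a measurement, but the orders of magnitude tell the story.

```python
# Back-of-envelope ray budget for a brute-force, physically honest
# renderer at real-time rates. All inputs are illustrative assumptions.
WIDTH, HEIGHT = 3840, 2160       # 4K frame
FPS = 60                         # real-time target
SAMPLES_PER_PIXEL = 1000         # assumed rays per pixel for a clean, converged image
BOUNCES = 8                      # assumed light bounces traced per sample

pixels = WIDTH * HEIGHT
rays_per_frame = pixels * SAMPLES_PER_PIXEL * BOUNCES
rays_per_second = rays_per_frame * FPS

print(f"rays per frame:  {rays_per_frame:.2e}")   # ~6.6e10
print(f"rays per second: {rays_per_second:.2e}")  # ~4.0e12
```

First-generation RTX hardware was marketed at roughly ten billion rays per second — a couple of orders of magnitude short of the brute-force budget above, which is exactly why real-time ray tracing today leans on a handful of samples per pixel plus aggressive denoising rather than honest convergence. Shit takes time, as established.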