No. Nowhere near. The basics are much, much better, but the lack of developer support in both directions (devs <-> AMD) means you can cop a significant performance penalty in individual titles and be stuck with it for effing ages.

Still, it's not as bad as it was. These days the people saying AMD drivers are a shitshow usually only use Nvidia; conversely, most of those who say AMD drivers are great and/or better than Nvidia's wouldn't touch an Nvidia card with a pole. That's just people justifying their purchasing decisions with denial and assumption (have a look in the Apple forum for the number one example).

My view at this point is that Nvidia are the premium gaming brand and AMD are the compute champ. It boils down to OpenCL being a standard that AMD can design their software to, plus an architectural advantage, while gaming is a disjointed mess that Nvidia are prepared to throw cash at to give that user group a better experience. Teaching game devs to build to a higher standard, rather than hacking bits together at the end, isn't an option apparently.

Yes, in this instance the high price of the RTX 2080 made the Radeon VII viable in the marketplace, but it's the classic AMD extremist line to spend the months before a release claiming that AMD are the sole source of goodness in the industry and exist to lower prices, and then blame Nvidia or Intel when prices don't come down.