Discussion in 'Video Cards & Monitors' started by SnooP-WiggleS, Mar 6, 2020.
Funny how the non-XT cards keep popping up on scorp.
Guess I'm gonna be waiting till after Xmas for ANY card.
xmas... 2021 yeah?
I mean not really, the consoles have a shared pool for everything, including the OS.
People are really overselling the importance of VRAM quantity for gaming; even at 4K, 10GB is going to be just fine this generation.
Some games can chew through VRAM though. Remember GTA IV? Needed more VRAM than most cards had at the time and it still had issues.
Ignoring the VRAM thing, I think a lot of people are overlooking the fact that AMD is supplying BOTH world-leading consoles with hardware. I damn well know what most multi-platform game developers are going to say when they need to get a game up on all platforms - "AMD is definitely gonna be easier, as it's already there in two consoles"
And in saying that, I think devs are gonna look at 16GB and shiver with delight. You can fit a lot of textures into 16GB.
AMD have supplied the chips for both Xbox & PlayStation since the 2013 generation lol, they were the cheaper option and still are. It's also much easier dealing with a single company that can stick the CPU & GPU on the one chip.
The 16gb will take a while to be utilised, probably a couple of years and by then PC will be well past that on the GPU anyway.
Just submitted an enquiry to mwave as my 6800 order is still in processing, with no card allocated... I'm also surprised that on order cancellation, they don't refund the original transaction fee.
Weird. Wasn't aware that was something retailers did
That's kinda the point. GTA V is still relatively ambitious by today's standards - a massively dense open-world game with thousands of bespoke assets and textures - and it uses less than 5GB of VRAM at 4K. Yes, 4GB-5GB at 4K was a lot of VRAM for a game to use back in 2015, but it isn't today; we are well past that. 8GB is going to be sufficient for all but the most extreme settings at 4K, and 10GB more so.
If game devs can't make their games look beautiful in 4K with < 8GB of textures, they aren't really trying. The next big leap isn't more fast RAM to pre-cache assets, it's on-the-fly streaming from super-fast SSD storage - then VRAM quantity becomes far less important beyond a certain point.
There may be some of that translating to PC, but from my understanding it's not quite that straightforward. Yes both consoles have AMD hardware, but devs are using the APIs provided by MS and Sony respectively, not programming to the metal per se.
Take Digital Foundry's latest analysis of Ass Cred Valhalla... the PS5 is actually outperforming the Xbox Series X despite being at a teraflop disadvantage on paper. This is likely because Sony have a better/easier API for devs right now, not because the devs are intimately familiar with AMD hardware.
I'm not a dev, so I might be talking pure shit here, but I would think that if you're using an engine like Unreal and hooking into an API like DirectX, you don't give a crap what the end hardware is. That's the hardware manufacturer's problem: write drivers that work well with the API of choice (DirectX, Vulkan etc). And it's the game engine creator's job to make sure the engine performs well with the various APIs that sit just above the hardware layer.
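For what it's worth, that layering can be sketched in a few lines. This is a purely illustrative toy in Python (all class names are made up, real APIs like DirectX/Vulkan are obviously far richer): the engine codes against a stable API, and each vendor plugs its own implementation in underneath.

```python
# Toy illustration of the layering: game -> engine -> graphics API -> vendor driver.
# Nothing here is real engine or driver code; names are invented for the sketch.

class GraphicsAPI:
    """The stable interface (think DirectX/Vulkan) that engines target."""
    def draw(self, mesh: str) -> str:
        raise NotImplementedError

class VendorDriver(GraphicsAPI):
    """Each hardware vendor ships its own implementation of the same API."""
    def __init__(self, vendor: str):
        self.vendor = vendor

    def draw(self, mesh: str) -> str:
        return f"{self.vendor} driver rendered {mesh}"

class Engine:
    """The engine (think Unreal) talks to the API, never to the hardware."""
    def __init__(self, api: GraphicsAPI):
        self.api = api

    def render_frame(self, mesh: str) -> str:
        return self.api.draw(mesh)

# Identical engine code runs unchanged on either vendor's driver:
for driver in (VendorDriver("AMD"), VendorDriver("Nvidia")):
    print(Engine(driver).render_frame("cube"))
```

Which is basically the point being made: the game/engine side is vendor-agnostic, and it's the driver underneath that has to make the API fast.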
VRAM might be a concern, though at 2560x1440 the most I'm seeing is 8GB out of my RTX 2080 Ti's 11GB, and granted, a chunk of that could be just pre-caching. I doubt 16GB is going to be saturated at my resolution, though at 4K and up, perhaps.
So it is easy-ish to port from Xbox to PC. Sony is altogether a completely different set of structures...
But back to xbox - easy to move to PC?
No. Controlled hardware in the Xbox means you know exactly what you have to target. In the PC world there are millions of combinations of hardware and software to try and take into account. No doubt DirectX and MS have tools to assist moving across platforms, but Sony won't be helping you.
I'm the same, 6800 MSI, no allocation yet or ETA, but no cancellation either, fingers crossed.
Never really liked Nvidia's unusual approach with numbers for VRAM and bus width. Should just stick with binary numbers to make life easier. Also, given there's no CrossFire, I will offload a 6800 locally in Brisbane no doubt.
The thing is, 16GB may leave plenty of headroom for now, but in a generation or two it may be the bare minimum. Consider the 1080-series buyer, who may not have upgraded to the 2080 series because the price/performance increase wasn't there for them. They could have had that 1080 for years, in which time graphical requirements could change substantially. So 16GB makes a lot of sense for future-proofing for those that don't upgrade every generation, like me.
hey, still using a 1080 here! take it easy!
Me too. Wtb the 6800XT I have on order.
Also Nvidia upset me with their silly price jack going into the 20 series and their 30 shenanigans haven’t impressed me either.
I get the argument, but I don't think it's something people need to be concerned with.
We've had 4K assets for a number of years now and VRAM usage simply isn't exploding. Yes, more VRAM means devs could use higher-resolution textures, or more unique textures, but at what cost? Producing higher-quality and higher-resolution assets isn't free, and keeping your resource budget in check generally helps your game run on a range of hardware. I honestly can't see 4K games blowing past 8GB very often at all.
The 3080 and 6800XT aren't 8K cards, the 3090 and 6900XT will be barely passable at 8K as well, so we're really talking about 4K max here. Frankly even 4K is having a slow uptake, but the 3080/6800XT are very capable at 4K, I genuinely can't see either being left behind over the next 4 years. Now watch Cyberpunk release and immediately prove me wrong
We've seen the system requirements and recommendations for Cyberpunk 2077 and they seem reasonable, even for 4K. Though they have said RT will not be enabled for AMD GPUs at release. Perhaps in the future, AMD's DXR performance might catch up to Nvidia's RTX.
anyone got their amd card yet? How is it?