Discussion in 'Video Cards & Monitors' started by Agg, Jan 6, 2017.
they won't be launching a dual GPU card this launch.
At least for the consumer line-up, because CrossFire has not been upgraded to work well in ALL titles and under all OS options, so it's not a good consumer option IMO. I just hate the issues you get with a dual-GPU setup; it drove me mad when I attempted it not long ago.
I think that's what's planned for NAVI, isn't it?
Multiple smaller GPU dies produced, with 1, 2 or 3 on the board, appearing as one to the OS.
[ HyRax1 edit: I was serious when I imposed the new this-thread-only rules. See you in 24 hours. ]
Have we forgotten? Keep on topic.
That's a software, not a hardware issue.
The same principle applies to CPU cores and threading. You could throw 1,000 cores at a program, but if the software is coded to run only one thread, no amount of extra cores will improve performance.
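That ceiling is just Amdahl's law. A quick sketch (the 95% figure is a made-up example, not a claim about any real game) of why extra cores stop helping once the serial fraction dominates:

```python
# Amdahl's law: speedup from n cores when a fraction p of the
# work is parallelisable (the rest runs on a single thread).
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Fully single-threaded code (p = 0): 1000 cores change nothing.
print(speedup(0.0, 1000))               # 1.0

# Even 95%-parallel code tops out around 20x, no matter the core count.
print(round(speedup(0.95, 1000), 1))    # 19.6
```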
AMD cards make great miners so I'm super keen to see how Vega performs with crypto mining.
Just like with Ryzen, I think they really need to do something different with their GPU architecture if they want to compete with NVIDIA.
AMD started the long term game with Mantle.
There is no point attempting anything else but waiting for all games to be DX12/Vulkan which is going to happen eventually.
As long as AMD prices their cards on the fact that most games are not DX12/Vulkan right now, they will do fine.
Their past does seem to show they prefer to price on potential rather than reality, though.
GPUs ARE massively multithreaded, with thousands of cores... What we are talking about here is *properly* stitching two or more GPU dies together in hardware so they appear as one, instead of a software-based mechanism trying to distribute the workload, which requires extra driver and dev work; both of those have always made SLI and CrossFire somewhat unappealing. DX12 is supposed to do this better with less dev work needed, but its implementation is lagging badly FWICT.
Because which of the CPU and GPU market leaders *profits* from games needing less powerful CPUs and GPUs?... /conspiracytheory
But it still needs to be implemented in DX12 games by the developers for it to work.
We need a 100% in-hardware multi-GPU solution where the devs don't need to do anything.
Even in newer DX12 games we still don't have universal multi-GPU support, and in the Vulkan pin-up game we still don't have multi-GPU support either.
I would guess this will mostly be done once Unreal/Unity have it implemented and game developers start using it.
The problem is most game developers don't upgrade their Unreal/Unity version, so we will be waiting for games 2-5 years out.
you sound like you're arguing even when everything you've said is in complete agreement
That's not how any of it works; the new low-level APIs and their ability to do multi-GPU require more work than SLI/CrossFire, as it is up to the dev to implement it correctly.
There is no way to have some hardware that can stitch 2 GPUs together as one, as far as I am aware.
It's either up to the driver to do it (with all the correct coding pre-done by the vendor) or up to the dev to do it directly (DX12/Vulkan).
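To make "up to the dev" concrete, here's a toy Python model (deliberately not a real D3D12/Vulkan API; the function and workload names are invented for illustration) of the kind of manual work-splitting explicit multi-GPU pushes onto the application instead of the driver:

```python
# Toy model: under explicit multi-GPU APIs the application, not the
# driver, decides which device runs which work.  Here we just
# round-robin a frame's draw calls across the available GPUs.
def split_frame(draw_calls, num_gpus):
    """Return one list of draw calls per GPU (hypothetical scheme)."""
    buckets = [[] for _ in range(num_gpus)]
    for i, call in enumerate(draw_calls):
        buckets[i % num_gpus].append(call)
    return buckets

work = ["shadows", "geometry", "lighting", "post"]
print(split_frame(work, 2))  # [['shadows', 'lighting'], ['geometry', 'post']]
```

The point isn't the round-robin itself; it's that every such decision (and the synchronisation between devices afterwards) is now the developer's problem, where SLI/CrossFire hid it in the driver.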
And I wonder how many of the titles in production atm, with estimated releases 1-2 years out, will have any of these feature sets?
I guess it will be at least 1, maybe 2, gens of cards away before the real benefits are realised. Mantle was sorta the same: "it will be great", etc.
Still, all said and done, some days I have to just remind myself how far it has all come. It's a long way from the 3DFX Voodoo cards, which were almost an overnight (pardon the pun) game changer, and now here we are looking at single GPUs to push 100+ fps @ 4K.
Is multi-GPU in DX12 more work because it is natively harder, or because the SDK (or whatever it's called) that NVIDIA/AMD provide to devs is nowhere near as refined as for DX11?
Voodoo cards had multi-GPU support in hardware, not relying on developers to support it; I don't see how AMD and NVIDIA could go backwards to the point where SLI/CrossFire is basically useless in newer games.
Under the new APIs it should be hardware-agnostic (i.e. running AMD and NVIDIA hardware together), so it shouldn't be a vendor-side issue re: dev kits or SDKs.
It's harder because instead of just saying "enable SLI", you have to code it yourself: you are presented with the GPUs as one (kind of) and have to allocate everything manually.
The specific implementation is beyond me, it's not easy, that's all I know.
3DFX used AFR just as the current lot do; this is easy and requires little to get working, but if you don't do any tweaking it can run pretty badly, with a lot of stutter. That's why you have SLI/CrossFire profiles tailored for specific games to make them run a bit better (but still not great).
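A toy Python model (invented numbers, nothing here is a real driver API) of why untweaked AFR stutters: even/odd frames go to alternate GPUs, and if the second GPU isn't started exactly half a frame-time behind the first, the presented frame intervals alternate short/long even though the average framerate looks fine:

```python
# Toy AFR model: GPU 0 renders even frames, GPU 1 renders odd frames,
# with GPU 1 started `offset` ms after GPU 0.  Each GPU takes 16 ms
# per frame.  Returns the timestamps at which frames complete.
def present_times(num_frames, frame_ms=16.0, offset=8.0):
    next_done = [0.0, offset]        # next completion baseline per GPU
    times = []
    for frame in range(num_frames):
        gpu = frame % 2              # alternate frames between GPUs
        next_done[gpu] += frame_ms
        times.append(next_done[gpu])
    return times

def gaps(ts):
    return [round(b - a, 1) for a, b in zip(ts, ts[1:])]

# Ideal pacing (offset = half a frame-time): a frame every 8 ms.
print(gaps(present_times(6)))              # [8.0, 8.0, 8.0, 8.0, 8.0]

# Start GPU 1 just 4 ms late and the gaps alternate 4/12 ms --
# the classic microstutter, despite the same average framerate.
print(gaps(present_times(6, offset=4.0)))  # [4.0, 12.0, 4.0, 12.0, 4.0]
```

The per-game profiles the post mentions are, loosely speaking, the vendor hand-tuning this kind of pacing (and resource synchronisation) so the gaps stay even.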
But let's not forget the long list of games that either don't work in CF or scale negatively.
I always thought it was hilarious when I'd play a new-release game and finish it before I'd get a CrossFire profile for it... but then I just stopped getting any profiles at all.
Don't forget microstutter, frame pacing and the jump in power usage either.
I am happy to just upgrade the GPU more often now rather than mess with multiple GPUs.
Although my old 6950s in CrossFire lasted a long time, they tended to perform poorly when both GPUs were pushed hard, like in Battlefield 4, where a single Radeon 7870 provided a better experience overall even if it had a lower framerate.
To me it's a real shame there hasn't been 'ground breaking' work with multi-gpu card gaming.
Horizontal scaling is far easier to do than vertical, and yet it's still very much a vertical industry.