Discussion in 'Video Cards & Monitors' started by the_antipop, May 27, 2019.
Real OpenGL, G-Sync, PhysX, RTX, a long list of VR features for starters...
This thread is like déjà vu every page
OK, let's draw the line in the sand.
- All Nvidia cards are the best, and best value.
- This will be DOA. Do not buy it. It does not have RTX, so no slideshow RT effects for you if you buy it.
Fresh discussion below, hopefully.
Why is it that every time Court Jester enters a thread/convo it turns in to a steaming pile of shit?
So a Great Wall ute has the same performance as a 600SEL?
If you were to believe AMD's marketing on the matter.
So this is really interesting (I love hearing about archs). It seems AMD has done a lot to GCN here, probably enough to call it a pretty big shift. It seems to go a long way to fixing some of the issues GCN has while keeping backwards compatibility with games made for GCN (consoles).
Because that's what he is as a person. A steaming pile of shit.
I guess it takes one to know one
I KNOW YOU ARE, BUT WHAT AM I?
WHY? WHY? WHY? WHY? WHY? WHY?
There's your responses set for the next year.
Have they announced when more of the cards are coming? Or are the two so far (and the console ones) all we know about for now?
The thing that always baffles me about AMD in the here and now is that they are clearly having a crack. They come up with all this fancy stuff (HBM, 7nm) and really try, but it just ends up being not that great in the long run. In all honesty, to an extent it makes Nvidia just look better and better, imo.
I do, however, think they should stop hobbling themselves by aiming only at the mid-range. Give us a big bad killer card to go with the mid-range.
I thought we were adults here. Can we just keep it on topic, and if you haven't got anything worthwhile to add, go somewhere else and ramble on about whose e-penis is bigger.
So going off the video from GN, I think it will be interesting to see how well games written to take advantage of the new arch will perform. There should be a significant improvement in IPC compared to Vega.
AMD, and ATI before them, tended to try new tech. Sometimes it pays off and sometimes it doesn't; if HBM prices had stayed where they were, it would have been good.
As for a high-end card, that should be interesting based on the new arch. Maybe next year, with the second version of it, they may be able to scale it up past 64 ROPs and have it actually give gains.
But will developers do that? Most of them have proven over and over again to just be flat-out lazy and rely on the hardware doing the job, so it might be more in the hands of people like Epic to bring those improvements to UE4, or EA to bring them to Frostbite, rather than the developers themselves.
Yeah, I don't think this is specifically something devs need to do; it's maybe more of a driver/game engine thing. I need to get a better understanding of how it works.
AMD just don't seem to understand the industry at all.
1. They wasted money on the assumption DX12/Vulkan etc. would explode and all developers would scramble to change quickly. I think gamers knew this wasn't going to happen; it's concerning that highly paid professionals working at AMD thought otherwise. DX12/Vulkan still hasn't come very far yet.
2. They invested in HBM and released overpriced cards that seemed to shift their timeline backwards. Surely whoever was responsible for research and development should have realised it wasn't worth it, unless those same dreamers at AMD thought HBM prices were going to drop heavily in the future.
3. AMD have tried multiple times to make new tech to compete with Nvidia, but every time they made something very average and not developer friendly. The only reason FreeSync worked was because it was part of a standard without a module; given the way AMD treated FreeSync, it would otherwise have been dead in the water.
Their alternative would be to stick to what they did best in the past: making an Nvidia competitor that is much better value because it doesn't include Nvidia features/benefits. That's much better than pretending to be a viable competitor in performance and features and pricing it based on that.
I don't know why people carry this bullshit sentiment. At no point in the last 15 years has AMD ever made tech to compete with Nvidia's top end. At no point has AMD ever publicly pushed a "Ti killer". AMD's cards are also very developer friendly, as they a) don't shit on standards and b) are quite active with the community. If by "not developer friendly" you mean they don't lock game studios into unfair advantages by selling them bullshit like GameWorks, then sure, I guess.
The thing that PCMR nerds don't realise is that there is a bell curve, and that bell curve is a large market that AMD is going after, and they've been doing fine in it. AMD realises it doesn't yet have the chops to go after the top end, and the products aren't positioned that way.
Lol, what a load of crap.
In those 15 years AMD has had the outright best-performing card in several different generations.
Can we just stop this derailing crap and discuss Navi? There's a great 25-minute architecture video to discuss here.
So what are TressFX, FreeSync, Mantle, LiquidVR, TrueAudio and anything else they have made for?
These are all attempts at enhancing the value of their GPUs.
Instead they switched to HBM, DX12/Vulkan and other things that Nvidia hardware is not very good at, in an attempt to get an advantage that way.
And it worked... for a few games that implemented the tech correctly and were optimised.
lol - what a load of crap. The only one here to blame for the current DX12 failure is Microsoft, for refusing to bring DX12 to Windows 7 at the beginning. For multi-year projects, of course devs would pick DX11 to max out their potential market.
OK, so what is your explanation for all the games without DX12, or for the fact that most of the games with DX12 don't work very well?
Still Microsoft's fault, I guess? I guess they pay off the studios to implement DX12 poorly?
Enough time has passed that your Windows 7 excuse won't work.