Discussion in 'Video Cards & Monitors' started by Agg, Jan 6, 2017.
Be lucky to see one locally, I reckon, lol.
Since when has it been a paper launch? I thought it was going to be launched with AIB cards coming in late August / early September.
How can you say if it includes shipping when you didn't see the product?
It did include shipping and everyone had a chance to buy one.
The point is that it's an indicator of what the real price is when there's actual stock.
The technology/scaler in G-Sync monitors is superior. It's been said that FreeSync 2 will solve all of that and force monitor manufacturers to meet the standard, but whether it will match G-Sync we'll have to wait and see.
You have to remember that Nvidia created the whole idea of synced refresh rates; AMD had no plans until Nvidia released G-Sync, as usual. And even then, a lot of FreeSync implementations are bad because it's completely open to the manufacturer to do what they want, unlike G-Sync.
Maybe once all the latest FreeSync monitors match G-Sync we'll see Nvidia want to play nicely, but until then they do what they always do: make money.
Trust me, if AMD were putting huge $$$ and staff into it and releasing crazy amounts of superior technology, they wouldn't make it all open source and invite Nvidia to use it for free. That's charity, not business.
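To make the "synced refresh rates" idea concrete, here's a rough sketch of why an adaptive-sync window feels smoother than a fixed refresh rate. The numbers (a 144 Hz panel, a 40-144 Hz sync window) are illustrative assumptions, and this isn't how either vendor's hardware actually works, just the timing logic in miniature:

```python
# Minimal sketch (not any vendor's real implementation) of fixed refresh vs
# adaptive sync. With a fixed-rate panel plus vsync, a frame that misses a
# refresh deadline waits for the next one (judder); with adaptive sync the
# panel refreshes when the frame is ready, as long as the frame time sits
# inside the monitor's supported window.

FIXED_HZ = 144.0                  # hypothetical fixed-refresh panel
VRR_WINDOW_HZ = (40.0, 144.0)     # hypothetical adaptive-sync range

def present_fixed(frame_ms: float) -> float:
    """Time on screen (ms) when locked to a fixed refresh with vsync."""
    interval = 1000.0 / FIXED_HZ
    ticks = -(-frame_ms // interval)   # ceiling division: next refresh boundary
    return ticks * interval

def present_vrr(frame_ms: float) -> float:
    """Time on screen (ms) when the panel refreshes on frame delivery."""
    lo_ms = 1000.0 / VRR_WINDOW_HZ[1]  # fastest the panel can refresh
    hi_ms = 1000.0 / VRR_WINDOW_HZ[0]  # slowest before fallback behaviour kicks in
    # Inside the window the panel simply waits for the frame; outside it,
    # what happens depends on the implementation (LFC, vsync, tearing).
    return min(max(frame_ms, lo_ms), hi_ms)

if __name__ == "__main__":
    for frame_ms in (6.9, 9.5, 13.0, 18.0):   # roughly 145, 105, 77, 55 fps
        print(f"{frame_ms:5.1f} ms frame -> fixed {present_fixed(frame_ms):5.1f} ms,"
              f" adaptive {present_vrr(frame_ms):5.1f} ms on screen")
```

At a 9.5 ms frame time the fixed-refresh path holds each frame for roughly 13.9 ms, which is the "looks like shit below 100 fps" effect mentioned further down the thread.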
Yeah, I guess it's still early days for this sort of tech. It'll be interesting to see what AMD does in 2-3 years.
I run Ark just fine @ 4K on (for the most part) epic settings, 60+ fps on a Ti.
Nvidia is still overcharging, but they can because FreeSync is just a guide, isn't it, and the onus is on the manufacturer to be accurate.
G-Sync vs FreeSync implementation
G-Sync: the manufacturer must buy a chip, and the monitor must be able to meet specific parameters or it cannot be certified as G-Sync.
FreeSync: it's the responsibility (honesty) of the manufacturer to determine pixel response and refresh range, and to develop firmware for the monitor's scaler to AMD's guidelines.
This is exactly right, but that's what FreeSync 2 is meant to fix. You can't put the FreeSync logo etc. on your product unless you follow the standard.
Either way it's not relevant to Vega, as today's FreeSync monitors are not a fair comparison to G-Sync.
How far off is the comparison overall? Like I said before, I've never used either. I saw the X34 in-game on a 1080 Ti for about 30 seconds in BF1, but I couldn't tell from that small preview vs my Dell.
I can say my "shitty doesn't work rubbish" 1440p 144Hz FreeSync monitor is smooth as hell through a wide fps range with a Fury X. My previous 1080p 144Hz monitor with a GTX 980 looked like shit if frames fell below 100.
There are plenty of companies in the world that compete to deliver a product that meets a standard. They aren't giving each other charity just because a consumer isn't locked in to their particular tech.
Nvidia have gsync to create another revenue stream and make more money. They don't do it to be your friend. Their main objective above all else is to make it harder for a consumer to justify switching teams.
Anyone else see this?
Gonna work out to around $600 USD for the top RX Vega gaming card.
Can't even keep up with a 1080...
This pretty much explains the last 100 pages in this thread.
So many hopes will be dashed, it's very obvious
I doubt that... I get 60fps at 2K, and even lower at times.
I hope for a compelling reason to skip this generation and spend my toy money on something else slightly less unnecessary.
Parts of Ragnarok do dip a bit, but that map has some interesting issues, both client and hosting side (undocumented patch changes are a pain at the moment). I run a cluster with all 4 maps, and overall ARK is now running a lot better than 12+ months ago.
TBH, at 4K I really haven't found anything yet that I would class as unplayable with a Ti and the eye candy turned up.
Are you an idiot? Is everybody a fucking idiot?
You can all get your Kermit flail on WHEN WE SEE THE RX VEGA BENCHMARKS.
All you've done is post a link to a page that has speculative pricing with graphs from the FE benches; the FE, not the RX.
No, the FE isn't great, but when the RX version drops, if it's shit then we can collectively sigh and eye-roll knowing we're looking at real numbers, not some speculative crystal-ball shit that everybody summons forth from their colon.
Seeing a lot of rumours and so-called "confirmed" pricing on Reddit today: Vega will supposedly be around €1155 without VAT, the South African wholesale price is quoted at $1081 standard / $1227 watercooled, and a Spanish blog "leaked" RX Vega prices in Europe saying it'll be around $700 - $900 without taxes.
What do you guys think?
So when do we get real info?
Saturday some time?
Sunday or Monday probably, Siggraph starts on the 30th.
The "glass half full" part of me says there's got to be a silver lining.
If they know it will perform worse than a GTX1080Ti, why would they price it above? It won't sell.
So either the prices are wrong, or the performance indications so far are wrong.
Maybe it's a killer for ETH mining somehow and they're banking on selling a large number of initial units to miners?
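For what it's worth, the miner logic is just arithmetic: hashrate times payout, minus power cost, then payback time on the card. Every figure below is a made-up placeholder (no real Vega hashrate or ETH numbers are implied), so treat it as the shape of the calculation rather than a prediction:

```python
# Back-of-envelope sketch of the "sell it to miners" reasoning. All inputs are
# hypothetical placeholders; only the structure of the calculation matters.

hashrate_mhs = 40.0              # hypothetical ETH hashrate in MH/s
power_watts = 300.0              # hypothetical card power draw while mining
eth_per_mhs_per_day = 0.0007     # hypothetical network payout per MH/s per day
eth_price_usd = 200.0            # hypothetical ETH price
electricity_usd_per_kwh = 0.25   # hypothetical electricity cost
card_price_usd = 600.0           # the rumoured RX Vega price from this thread

revenue_per_day = hashrate_mhs * eth_per_mhs_per_day * eth_price_usd
power_cost_per_day = power_watts / 1000.0 * 24.0 * electricity_usd_per_kwh
profit_per_day = revenue_per_day - power_cost_per_day

print(f"Revenue/day: ${revenue_per_day:.2f}")
print(f"Power/day:   ${power_cost_per_day:.2f}")
print(f"Profit/day:  ${profit_per_day:.2f}")
if profit_per_day > 0:
    print(f"Payback:     {card_price_usd / profit_per_day:.0f} days")
```

With those placeholder inputs the card pays for itself in a few months, which is the kind of sum that would make miners clear out launch stock regardless of how it games.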
If they do a Siggraph reveal and the performance is >1080 Ti, then a higher price will make sense... but that's a big if. All indicators so far point to it not being a very nice experience.
Bracing for impact.