
Disappointed in system improvements post 2014.

Discussion in 'Intel x86 CPUs and chipsets' started by ExcessionOz, Dec 5, 2016.

  1. Reaper

    Reaper Member

    Joined:
    Jun 27, 2001
    Messages:
    12,159
    Location:
    Brisbane, Qld, Australia
    That was actually awesome. You know why? It wasn't just the hardware getting better, but the software too. Why did you leave that out? The reason hardware lasts longer now is that software has more or less stopped getting better, and it was software that drove the hardware. I blame the mobile and console movement. If anything, this world sucks.

    There's the problem: you're stuck in the member berries of the T-bird being good. The few of us who have no expectations of AMD got there through experience, not based on one thing many years ago. I have very little faith in Zen. Intel has already beaten it. It'll come in at predictable places in benchmarks, just like Radeon vs. Nvidia does. Experience has led us to expect nothing. If it does any good, then good. I just have no expectations, so I'm not going to get disappointed.

    Funny thing is you're looking at a 1070 video card. Why? I have a 6GB 1060 driving a 1080p display just fine. Why not go for an RX 480 if they're so much cheaper, and your experience was better with AMD? Baffling.
     
  2. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    46,644
    Location:
    Brisbane
    You'd be one of those guys who confuses "prettier" with "better".

    There are plenty of ugly-but-fun games, and there are plenty of pretty-but-shit games. Sure, amazing graphics are amazing, but that doesn't dictate fun.

    I'm still playing and enjoying games from the 80s, side by side with games made this year. I urge you to do the same, and play a back catalogue of awesome stuff I 100% guarantee you've missed (because nobody can play everything).
     
  3. mAJORD

    mAJORD Member

    Joined:
    Jun 4, 2002
    Messages:
    12,528
    Location:
    Griffin , Brisbane
    It's not just about being prettier on the surface - more advanced physics, effects, complex environments, weather, the list goes on. These all potentially add to immersion. IMO they're enablers for better / more innovative gameplay, but I agree, they're not much good in isolation.

    It's funny, elvis and Reaper, you're on opposite ends of the spectrum: some enjoy having games put cutting-edge hardware to the test, others just enjoy games no matter what, and the rest sit in the middle. I think I'm somewhere 60:40, maybe 70:30, on that spectrum, so I'm biased towards pushing the boundaries, but I'd like to think those who aren't would still appreciate that we need to progress. I feel the general gaming community, and even developers
    to an extent, are too accepting, burying their heads in the sand and focusing on gameplay innovations inside false boundaries. The [false] assumption here being that 'graphics/engine tech is as good as it gets', bar some texture and polygon count bump-up.

    Anyway, moving back towards the OP's hardware dilemma: for PCs and PC gaming, the stagnation of hardware is for the most part caused by:

    - Process technology scaling difficulties (CPUs and GPUs): it is painfully difficult and expensive to produce current 14-16nm FinFET chips, and more so moving forward. It's not that these processes aren't getting results - they are, especially with GPUs (low clocks) - but it's at a cost, and with greater time.

    Since this is now the only limiting factor, though, GPUs are picking up pace again on the new process nodes.

    - CPU per-core performance wall. Known as the end of the free lunch (and there's a somewhat old but good publication with this title you should read - although they predicted the end too early, thanks to Conroe). Intel and AMD both know this is a hard wall, and at the pointy end, performance will only go up with core counts and with improving per-core efficiency to allow this.

    Programmers have struggled to deal with this in relation to game performance for a while, simply because they relied on single-threaded programming for so long (the 'free lunch' being provided by runaway frequency scaling at the time).

    Progress is slow but steady, and you should factor this in. Your original proposal of an i5 probably isn't a good idea: they're getting increasingly left behind by 4c+HT and 6-core+ i7s in newly released titles. Low-level APIs are helping too.
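    The per-core wall described above is usually illustrated with Amdahl's law: if some fraction of a game's frame work is inherently serial, extra cores stop helping quickly. A minimal sketch (the serial fractions below are made-up illustrative numbers, not measurements of any real engine):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on speedup over 1 core when `serial_fraction`
    of the work cannot be parallelised (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# A workload that is 40% serial barely benefits past 4 cores,
# while a 10%-serial one keeps scaling usefully.
for serial in (0.4, 0.1):
    for cores in (2, 4, 8, 16):
        print(f"serial={serial:.0%} cores={cores:2d} "
              f"speedup={amdahl_speedup(serial, cores):.2f}x")
```

    This is why 'more cores' only pays off once engines restructure their serial hot paths, which is exactly the transition game programmers have been struggling through.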
     
  4. bennyg

    bennyg Member

    Joined:
    Dec 16, 2005
    Messages:
    3,795
    Location:
    Melbourne, Oztraya
    The real-world difference even between SATA and NVMe drives is not noticeable.

    Rough calcs: say you're loading 200MB of sequential data. A 500MB/s SATA drive will do that in 400ms. A 3000MB/s NVMe drive will do it in 66ms. While that means something if you do it ten times in a row, in the individual situation a third of a second really is not something noticeable by normal humans.
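    Those rough numbers can be reproduced directly (the 200MB payload and the 500 and 3000 MB/s figures are taken at face value as sustained sequential-read rates; real loads are rarely purely sequential):

```python
def load_time_ms(payload_mb: float, throughput_mb_s: float) -> float:
    """Time in milliseconds to sequentially read `payload_mb`
    at a sustained rate of `throughput_mb_s`."""
    return payload_mb / throughput_mb_s * 1000.0

sata = load_time_ms(200, 500)    # 400 ms
nvme = load_time_ms(200, 3000)   # ~67 ms
print(f"SATA: {sata:.0f}ms, NVMe: {nvme:.0f}ms, "
      f"saved per load: {sata - nvme:.0f}ms")
```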

    An OS that incorporates prediction of user actions will make much more of a difference - you're typing "Exc" into the search bar and it starts to load Excel data in the background.

    I have just installed a 4K panel in my 17 inch laptop, up from 1080p, so that's about as direct a comparison as there can be. I can absolutely notice it in the clarity of text and graphs vs the 1080p. A huge improvement. Can I notice it in games? Not much so far. Being able to scale UI elements smaller without losing text resolution is the main benefit I've noticed. Haven't tried many yet so time will tell, but I'm not expecting to see anything from FPSes.

    But to get a top-notch 4K panel for $150 and DIY it was well worth it. If it were retail $600+ like the factory option for this model? Probably not. Could I do this on my 5 year old laptop? Nope, as it's LVDS. This is where the forced obsolescence has occurred... sockets and ports. Because otherwise the midrange 3720QM from 4 years ago, unlocked and (partial +400MHz turbo) overclocked, is still around a midrange 6700HQ, and at stock only avg ~25% slower than the stock 6700K.
     
    Last edited: Dec 6, 2016
  5. demiurge3141

    demiurge3141 Member

    Joined:
    Aug 19, 2005
    Messages:
    2,351
    Location:
    Melbourne 3073
    Moore's law hasn't really stopped, it's just gone in a different direction.

    SB: E5-2690 8-core
    IB: E5-2697v2 12-core
    HW: E5-2699v3 18-core
    BW: E5-2699v4 22-core

    The amount of pure computational power you can get on a CPU has grown enormously since 2014.
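    A back-of-envelope way to see that growth is aggregate core-GHz per socket for the chips listed above. The base clocks below are from memory and ignore IPC gains between generations, so treat the ratios as rough:

```python
# (cores, approx base clock in GHz) for the Xeons listed above;
# clocks are approximate and IPC differences are ignored.
xeons = {
    "E5-2690 (SB)":   (8, 2.9),
    "E5-2697v2 (IB)": (12, 2.7),
    "E5-2699v3 (HW)": (18, 2.3),
    "E5-2699v4 (BW)": (22, 2.2),
}

base = 8 * 2.9  # Sandy Bridge aggregate as the baseline
for name, (cores, ghz) in xeons.items():
    agg = cores * ghz  # aggregate core-GHz per socket
    print(f"{name}: {cores} x {ghz}GHz = {agg:.1f} core-GHz "
          f"({agg / base:.2f}x SB)")
```

    Even on these crude numbers the Broadwell part lands at roughly double the Sandy Bridge aggregate, despite per-core clocks falling, which is the "different direction" being described.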
     
  6. OP
    ExcessionOz

    ExcessionOz Member

    Joined:
    Mar 5, 2014
    Messages:
    34
    e5-2690, $AU3,500 ebay

    e5-2697, $AU5,400 ebay

    e5-2699, $US3,400 ($AU4,600) Amazon

    e5-2699V4, $US4,200 ($AU5,650) Amazon

    It seems unreasonable (to me) to expect consumer-level interest in these Xeon processors. They're data-centre products, not home-gaming oriented.

    My qualms about current Intel/AMD offerings not being significantly better than the 2013 ones are in no way, shape or form affected by these possibilities. Even if I could afford one of these chips, it wouldn't result in fast(er) gaming. It would allow me to twiddle my CPU's thumbs quicker, but to what end? :)

    An interesting viewpoint though. I'm sure there are enthusiasts who would be very pleased to own such hardware, but they're definitely not playing Minecraft for amusement value only :)
     
  7. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    46,644
    Location:
    Brisbane
    We have a term in professional IT for people who like to play with hardware and not achieve anything, and we call it "technical masturbation".

    And sure, it can be fun. I get a nerdy sense of enjoyment staring at benchmark numbers of anything - whether it's some shit hot gaming rig, or some new storage array's IOPS and throughput, or the crypto benchmarks of some new CPU extension on Intel or ARM CPUs. I'm a numbers nerd, and I like numbers.

    But sometimes you have to stop and ask "what's the point?". Like big number benchmarks of all the huge enterprise stuff I buy at work, at some point I have to stop and remember that it's there for a purpose, not just pure numbers. If all I'm doing is getting wood over one number being bigger than another, there's far cheaper ways to achieve that.

    And to some degree, work does satisfy my "benchmark whore" requirements. We buy render nodes by the dozen with 56 cores of CPU, 128+ GB RAM, and have storage that can flood 40GbE worth of fibre network interconnects for days without breaking a sweat. I can throw out benchmarks across our clusters that would make people's eyes bleed, and our $6K rigs we put together for our VR developers would shit all over most gaming systems on OCAU.

    But at some point, I also just want to play a bloody game without buying a new video card. Because games can be fun, even when they're ugly.

    I'm playing Red Dead Redemption on PS3 right now. 1152x640 internal res upscaled to 720p, 30FPS for most of the game. Do I care? Shit no. Fantastic game, fantastic story, completionist boners everywhere, heaps of fun. Zero fucks given that it looks like balls, because it's just that cool. And to not play it because it's ugly would be missing out on a fantastic experience for no good reason other than pure elitism.

    At some point, you just need to shut up and game.
     
  8. OP
    ExcessionOz

    ExcessionOz Member

    Joined:
    Mar 5, 2014
    Messages:
    34
    Way to misquote. It was an example of how things did change for the better for consumers. Evidence to support my argument, not merely a blip on the historical radar. I actually bought a T-bird and it rocked.

    No doubt that the nostalgia factor is at play here, "ooh, remember when the AMD processors flattened the Intel ones for a much cheaper cost, ah that was sweet!". I was there man, I was there! :) :)

    Today's CPU market is dominated by Intel, and what do they do with their dominance? Offer a 10-core processor for $AU2,500, because they can charge that kind of money due to no competition in that area.

    Competition is vital for us, as consumers, otherwise future products will be lacklustre AND more expensive, because there is no requirement by Intel to offer anything better, since nobody else is offering anything better.

    Please, offer your own opinion without trying to grandstand and cite the masses as supporting your argument.

    I readily accept your opinion that you don't think AMD is going to come up with the goods, fine, predict-away.

    I do NOT accept that your opinion trumps fact when the facts are not yet available: we have only a slim idea of what performance is on offer and zero idea of what price AMD are going to market their CPU at.

    You're entirely guilty of being brainwashed into the ridiculous 'Blue is better than Red' (Holden vs Ford, vi vs emacs, yada yada yada) dualism which only exists in people's heads. There are many people who have current AMD systems and are happy and satisfied with what they have. They are not 'lesser' people because they buck the current Intel-biased trends of PC enthusiasts. That is what your opinion boils down to: "if you don't buy <x> then you're less capable than me and my mates".

    This is effectively condensing what I'd said in my original screed -- it's disappointing that the status quo isn't going to be toppled anytime soon with anything significantly better. Small improvements here and there, but essentially nothing that will offer real-world noticeable improvements.

    Getting my hopes up that things in Computer Tech are going to be amazingly better, next year, is pointless.

    My stated disappointment stems from lack of 'wow' factor in any new system I buy, compared to what I already have. That's one of those 'first world' problems that people speak of. Nothing is stopping me from the consumer dream of acquiring new gizmos, if I have the money to burn I'm free to do so. But at the end of the day, if I'm not getting anything of value for my money, then there is no reason to experience 'buyers remorse'.

    Funny that I have reasons ... hey, let me expand upon that, just so you know that it isn't wholly hypocritical of me to espouse supporting either 'side' of the GPU market:

    There are a number of immediate reasons; lower wattage is primary (the R9 290X just chews up electricity, it's ridiculously watt-hungry), but I also have this love/hate relationship with AMD video drivers. I've had nightmare situations with AMD drivers and Windows 10, currently resolved, but they were painful when they happened. I've had all sorts of issues with games 'not working' because of driver-level inconsistencies. Perhaps Nvidia's drivers are better at the moment than AMD's? I don't know.

    I regularly swap between Nvidia and AMD GPUs with my four-yearly builds. I am not brainwashed into thinking that Blue > Red, and I value competition, so I vote with my wallet, supporting AMD and Nvidia if *I* think they're good value.

    The R9 290X has been a superb performer, at a cost. I could have gone down the much more expensive GTX 980 path, had I been so inclined. My experience with owning the GTX 570 was okay, but my short experiment with SLI of two GTX 570s was very unappealing (heat problems), and the R9 290X blew them away performance-wise.

    The RX 480 is an option I looked at, but it's a marginal improvement on the R9 290X which I already own.

    One thing I will definitely do if I buy a new system early next year is -defer- my new video card choice until 2018 or so. I don't envisage actually getting a 1070 immediately; it was just an upgrade path from my current card.

    In the past I've always deferred video cards; that is, my new system will have my existing video card transplanted, and down the track I'll buy the latest and greatest that I can justify. That's a quirky thing that I do, because I can, and not because I need to.

    I do hope you understand that I'm talking about consumer choice here, and that your lack of respect for other people's choices is inherently distasteful.

    I don't question your allegiance to Intel or Nvidia, I do question your choice to not even -consider- those options.
     
  9. Flamin Joe

    Flamin Joe Member

    Joined:
    Jun 28, 2001
    Messages:
    5,386
    Location:
    4300
    No it doesn't. Just get an NVMe to PCI Express 3.0 adapter like this, for example. Not the best looking solution compared to it lying flat on the motherboard with newer boards, but it does the job.
     
  10. OP
    ExcessionOz

    ExcessionOz Member

    Joined:
    Mar 5, 2014
    Messages:
    34
    Just posting what I recently read with regards to faster SSDs:

    Tech Report on NVMe performance. Page 5 shows boot times, and the next pages show game load times, which are all virtually indistinguishable from each other.

    http://techreport.com/review/30993/samsung-960-evo-ssd-reviewed/5

    Very informative, and it shows how general performance isn't much changed between old (2013) and new (2016) tech, and that paying top dollar won't get you much except in edge cases: specific load situations where speed is required AND delivered by a faster SSD solution. For games in general this doesn't happen; NVMe SSDs offer much faster speeds on paper, but they're not realised in normal usage.
     
  11. OP
    ExcessionOz

    ExcessionOz Member

    Joined:
    Mar 5, 2014
    Messages:
    34
    Yay for sensibility. Well said.

    <-- Back to Factorio for me ...
     
  12. shintemaster

    shintemaster Member

    Joined:
    Jan 7, 2003
    Messages:
    749
    I agree with pretty much everything you said - which is kind of ironic, because RDR on PS3 was a game where I found the graphical "blurriness" to be a borderline impediment at times. A tragedy they never put it on PC; there was a game screaming out for hardware that could just give it that extra 10%.

    BTW the new one looks absolutely stunning. Going to sell a massive amount of copies IMHO.
     
  13. demiurge3141

    demiurge3141 Member

    Joined:
    Aug 19, 2005
    Messages:
    2,351
    Location:
    Melbourne 3073
    But you are really barking up the wrong tree if you're asking for gaming performance. The CPU hasn't been a bottleneck for gaming for a long time. Intel could release a 10-core 6GHz part and your 4K gaming would still be limited by your GPU.
     
  14. luke o

    luke o Member

    Joined:
    Jun 15, 2003
    Messages:
    3,790
    Location:
    WA
    Processor tech has stalled in the desktop market for the last few years. The upgrades have been barely incremental and hardly worth the money. The 4000 series to the 7000 series is a joke from Intel. Why is this though? Processor design is incredibly difficult; it takes the best brains we have on the planet to make things just a teeny tiny bit faster! Also, the focus in the enterprise space is on slower cores but more of them (think 48+ cores per CPU, but all running at 1.1GHz) for massively parallel cloud-based computing.

    Since desktop parts are traditionally the trickle-down of tech from the server market, it kind of makes sense we are seeing bugger all traditional innovation for home users. The push to offload everything to the cloud means businesses can get away with mediocre machines for most users. The last holdouts are CAD/dev and graphic design, where faster individual components actually make a difference. Even in these niches the graphics card has really become the master device, not the CPU.
     
  15. AbRASiON

    AbRASiON Member

    Joined:
    Dec 20, 2002
    Messages:
    1,124
    Location:
    Melbourne


    You're right for the most part, but at some point Intel could've done 6 or even 8 cores for mainstream.
    The fact that it's taking AMD pushing Ryzen to get this Coffee Lake 6-core mainstream part to come out is pretty sad.

    We know multiple cores don't help THAT much either, but if the cores are there, over time Microsoft will make Windows use them better, and developers in general can 'count on them' being there, so they'll do what they can.

    It's been a slow slow 5 years.

    My mate has an i7-920 I built him with triple-channel RAM 8 years ago, and with video card upgrades he can still run most games at 1080p... 8 years!
     
  16. clonex

    clonex Member

    Joined:
    Jun 30, 2001
    Messages:
    24,892
    Location:
    north pole

    If none of the things mentioned in mAJORD's posts existed, do you think the push would have been the same? Adding new details to games has really plateaued.
     
  17. sTeeLzor

    sTeeLzor Member

    Joined:
    Dec 12, 2005
    Messages:
    2,778
    Yeah, that's probably the position I would recommend for you. Invest in a better video card and monitor :) No need for a new base system at this point. Unless you have excess funds, there's just not a huge return on it.

    If you separate it out:

    Base system - $1500 to replace everything would net you maybe a 10% speed boost in FPS, and maybe slightly faster loading compared to your current system.

    Video card upgrade - $500 gives you +80% frames and no additional boosts, but it's $1000 cheaper. Get yourself a high-end display or save the cash for another time.
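    Putting those two options side by side as uplift-per-dollar makes the gap obvious (the $1500/10% and $500/80% figures are the estimates above, not benchmarks):

```python
# sTeeLzor's estimates, taken at face value: (cost in AUD, FPS uplift)
options = {
    "full base-system rebuild": (1500, 0.10),
    "video card upgrade":       (500, 0.80),
}

for name, (cost, uplift) in options.items():
    # uplift-per-dollar, expressed as percentage points of FPS per AUD
    value = uplift / cost * 100
    print(f"{name}: {uplift:.0%} uplift for ${cost} "
          f"= {value:.3f} %/AUD")
```

    On those numbers the GPU upgrade delivers roughly 24 times the FPS uplift per dollar.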
     
  18. SimonT

    SimonT Member

    Joined:
    Jul 27, 2001
    Messages:
    70
    Location:
    Mudgee, NSW
    I can vouch for the fact that CPU upgrades have really meant very little compared to graphics card upgrades. Still rocking an i5-750, albeit overclocked to a whopping 3.2GHz over the stock 2.67GHz.

    Hadn't played games in ages because my old GeForce GTX260 wasn't quite cutting the mustard any more, and with kids etc, I didn't really have the time.

    Decided to start my upgrade process by getting an MSI GTX-1080 Gaming-X card - BOOM - suddenly any game I play just works seamlessly at my old monitor's native res of 1920x1200 (yeah I know, also old-school!) on the highest settings. This is current games I'm talking about. Witcher 3, GOW4, Doom, whatever....

    GeForce Experience said my CPU wasn't up to the task of VR - but the Steam VR test showed no dropped frames at 90fps, and all VR games I played using the HTC Vive seemed flawless.

    Pretty keen to upgrade CPU and motherboard soon (had been holding out for Kaby Lake but that was a waste of time) but really there's no need to upgrade my CPU, because the 1080 just powers through everything!
     
  19. sTeeLzor

    sTeeLzor Member

    Joined:
    Dec 12, 2005
    Messages:
    2,778
    Yeah, solid real-world experience. I'd maybe upgrade that monitor though mate :)
     
  20. Annihilator69

    Annihilator69 Member

    Joined:
    Feb 17, 2003
    Messages:
    6,094
    Location:
    Perth
    I have an i5-2500K at 4.5GHz

    I just splashed out on a 35" ultrawide 144Hz monitor and a GTX 1080, and upgraded to 16GB of RAM and a couple of SSDs (OS & games).

    I love the fact that I just spent $2k and got the best you can get again, after essentially 7 years.
    It basically meant I got the monitor and SSDs for free. Woooo.
     
    Last edited: Jan 5, 2017
