Discussion in 'AMD x86 CPUs and chipsets' started by Skramit, Nov 7, 2019.
1900 MHz max Infinity Fabric clock.
Looks like CTR 2.1 (RC5) has just been released (13/5).
I played with one of the earlier versions briefly a while ago; I wonder if this version is going to be significantly different? Reading through the guide notes now... One thing I think it didn't have before (been a while since I looked) was four main modes: one for idle, PX High for the two best cores, PX Mid for the best four, and PX Low for the best 6/8 cores in the system (model dependent). CTR now gates the switching between profiles so that CTR's own CPU usage doesn't spike too much on Zen 3 processors. It's claimed to be better (easier?) than Curve Optimizer through better voltage management; actually, 1usmus seems to be claiming better power consumption overall by way of that lower voltage usage. Anyway, the accompanying vid:
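To illustrate the gating idea (purely my own sketch, not CTR's actual code - the profile names, thresholds, and "hold" count are all assumptions based on the guide notes), switching only after the target profile has been stable for a few samples would look something like:

```python
# Hypothetical profiles mirroring CTR's four modes (names/thresholds assumed)
def pick_profile(busy_cores):
    """Map the number of heavily loaded cores to a profile."""
    if busy_cores == 0:
        return "IDLE"
    if busy_cores <= 2:
        return "PX_HIGH"   # 2 best cores
    if busy_cores <= 4:
        return "PX_MID"    # best 4 cores
    return "PX_LOW"        # best 6/8 cores

class GatedSwitcher:
    """Only switch profiles once the target has been stable for `hold`
    consecutive samples, so the tool itself doesn't thrash (and spike
    its own CPU usage) on bursty loads."""
    def __init__(self, hold=3):
        self.hold = hold
        self.current = "IDLE"
        self.candidate = "IDLE"
        self.streak = 0

    def sample(self, busy_cores):
        target = pick_profile(busy_cores)
        if target == self.current:
            self.streak = 0           # load matches active profile; reset
            return self.current
        if target == self.candidate:
            self.streak += 1          # same pending target, count it
        else:
            self.candidate, self.streak = target, 1
        if self.streak >= self.hold:
            self.current, self.streak = target, 0
        return self.current
```

With `hold=3`, a one-sample load burst never flips the profile; the load has to persist before the switch happens, which is the point of the gating.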
I posted this in the Intel thread as well, but am posting it here mainly because of the comments about temperature and power made towards the end of the video.
Also, it is the first comparison I have seen using 3800C14 RAM.
Power consumption aside, it's great that there's some real competition again.
Pricing trending down is partly due to supply opening up, but it's also a direct outcome of that competition.
What a time to be a silicon gearhead.
Yep, great to have two competitors again.
The last AMD I had was an Athlon 64 in 2003, which I vaguely remember being OK with but didn't love. I've been through a tonne of Intel systems since then. May have missed a few as I forgot the names, but the last few years I went from a 4690K (it literally died after about 5-6 years; I had it overclocked and lost interest in PCs for a while).
Then I got the bug again and went from a 9600K > 9900K > 10900K (average OC) > 10900KF, and all up about four motherboards. Keeping my water-cooled 10900KF/APEX XII/1080 Ti as my PC in the loungeroom. It's a good overclocker and I plan to run that till it dies... hopefully not lol.
I've been very tempted to get an AMD system, so over the last few months I built an ITX Ryzen 3600 (cooled with a Scythe Fuma 2)/B450I/2080 Ti system to use in my office for most of my gaming; it's small, so sometimes I move it around as a media centre PC for another room. First impressions: it seemed a tiny bit slower until all the drivers/Windows updates etc. were loaded, then it was just as fast, so I'm happy in that area. In games it was good, but sometimes I felt it had slight stuttering, which was annoying, and I was so close to buying a 5600X or higher CPU to see if that would fix it. I wasn't planning to overclock this in any way like my Intel system and just wanted it to run well/quiet, but in the last two days I enabled PBO, with no other extra tuning besides XMP on the 3200CL16. PBO fixed my stuttering in games, so now I've lost the itch to add a better CPU... for now lol.
The 3600 seems like a great gaming chip. When I had the 9600K it also had stuttering which I could not fix, and it annoyed the crap out of me (the 9900K upgrade fixed it). Glad I found a solution on the 3600; I'm assuming 12 threads vs 6 made a difference in helping get rid of it, though.
Overall I'm very happy with the new AMD setup, and if I was building a new over-the-top rig I think I'd try AMD next for high end. Saw a few RAM kits in the forum for sale (3600CL14); do you guys think it is worth the upgrade from 3200CL16? I'm gaming at 1440p on the AMD. Google results seem to show pretty low improvements in FPS, but you know the feeling of just wanting better parts... Might see if I can get a 3080 Ti to stick in the ITX to play with when they come out (I don't have high hopes based on current GPU shortages though...).
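For what it's worth, the raw CAS latency difference between those two kits is easy to work out (a quick back-of-the-envelope sketch only; real-world 1440p gaming gains are usually small, as those Google results suggest):

```python
# Rough first-word latency: latency_ns = CAS cycles / memory clock (MHz) * 1000.
# DDR data rate is in MT/s, so the actual memory clock is data_rate / 2.
def cas_latency_ns(data_rate_mts, cas):
    return cas / (data_rate_mts / 2) * 1000

old = cas_latency_ns(3200, 16)   # 10.0 ns
new = cas_latency_ns(3600, 14)   # ~7.78 ns
print(f"3200CL16: {old:.2f} ns, 3600CL14: {new:.2f} ns")
```

So 3600CL14 is roughly 22% lower first-word latency on paper, and 3600 also runs 1:1 with an 1800 MHz fabric clock on Ryzen, which is the usual sweet spot; whether that translates into noticeable FPS at 1440p is another matter.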
ye be added to olde FAQ
Got 30k in Cinebench R23 with the default OC from my mobo manufacturer, but toned it down a bit with a PBO undervolt of -10 on my mobo's curve; runs cooler and hits 25.5k in R23.
Pretty happy with the 5950X. Everything is faster. Every. Single. Thing.
Currently running 3200 RAM (4 slots, 64GB); gonna try to swap to 2x32GB 3700 RAM to maximise performance, but dunno if it'll be worth it.
Anyone here moved from a 3900X to a 5900X whilst using an RTX 3090 GPU?
Just want to know if you saw any benefit, especially when gaming at 4K.
My current specs are in my sig below.
About to rebuild my system into a new open-frame style case, so it's a good time to maybe look at a CPU change?
I moved from a 9900K and I noticed some legit frame rate gains; I'm sure a 3900X will be a bigger jump, because the 5000 series' per-core and memory management microcode, much like the 9900K and later, has less convolution in how the data flows, so upgrading should cause less in-game micro-stutter.
I too am on a 3090 - everything is CPU-bound except RTX.
Granted, is it worth it per $? Probably not - you might want to wait for DDR5 instead - but if you've got the cash to splash and you're gonna upgrade anyway, and you run 144Hz+, yeah, I'd upgrade personally.
The raw compute performance ain't that impressive - Threadripper is still king for that. It's the lower latency per frame processed that's impressive with the 5000 series; it's starting to match Intel.
Thanks for the advice; might hold off and spend a few dollars on a PCIe 4 NVMe addition to my rig instead.
My monitor only goes to 120Hz; mind you, I doubt I hit that rate in Cyberpunk 2077 when settings are set to ultra.
Man, I was lucky to hit 60fps in 2077 on ultra -_-
If 2077 is your game with RTX on, your CPU is bored to death playing that game LOL
I went from a 3900X to a 5800X. It was a solid improvement. RTX 3080.
This past weekend I swapped in a 5900X; I have not had a chance to test it in gaming yet, but in workloads it has noticeably improved again.
So from the CPU upgrade alone I got a 10fps improvement in Warzone. I went from a 3950X to a 5800X on a ROG Strix 2080 Super OC @ 3440x1440.
I found this WAY too aggressive for my CPU. Cinebench was about the only app not crashing after applying the recommended profiles. Yes, I set the BIOS as recommended. I snuck up voltages here and there, then wound back clocks here and there. In the end I would still get crashing in most games after 15 minutes or so, despite temps being less than 60 degrees. Prime95 would instantly fail on low-thread-count workloads, and after 5 minutes or so on all-thread workloads. I know the author says Prime95 is too hard a workload and unrealistic, but I would rather have an OC that is 100% crash-proof.
Additionally, my single-core scores in other benchmark apps also went down with CTR 2.1 (RC5). Happy to try out a future release; perhaps he just needs more data?
So I am back to just letting PBO do its thing, living crash-free again.
I am getting into crypto, so I ended up getting a 5950X for Chia plotting. It is a monster. 4.5GHz on 16 cores is a sight to behold. The two cores getting to 5.05GHz is nice too. Can't wait to game on this after I have done the work I need to do.
Apparently a B2 stepping will be coming.
Just the kick in the guts Intel needed. While already on the floor clutching their guts...
XT models incoming?
Idk... don't new steppings usually have bug fixes as well? Would they make and ship B0s and B2s at the same time? That would be weird, I would think. The Ryzen 3000 series XTs had the same stepping as the non-XTs.
The whole Warhol/XT/next-CPU-release situation is super confusing.
Edit: from another forum... "OPN is the same so they are likely not distinct products in the market."
Original B2 tweet - https://twitter.com/patrickschur_/status/1394433467233021966
I only just got a 5900X so please don't do this to me Lisa.