This isn't a problem, I just don't understand what's going on.

I have a 4K video with a bitrate of 99.1 Mbps, which works out to about 12.5 MB/s. Now, in CrystalDiskMark my HDD only gets a 4K read speed of 0.5 MB/s (sequential reads of 120 MB/s), and yet in any media player the video plays flawlessly, both on my 4K TV and my 1080p monitor. How does that even work? Am I actually losing quality somewhere? (It looks amazing.)

Also, I have a much lower bitrate 4K movie that will only run smoothly in MPC-HC, whereas the other players (VLC and the built-in Windows 10 player) pin my CPU at 100%, even if I set VLC to hardware decoding. Is MPC-HC just better at decoding on the GPU? Or does it skimp on quality somewhere?

Like I said, I don't actually have an issue, I just don't understand what's going on here.
Not sure if you're confusing the video's bitrate with the drive's transfer rate; they're not the same thing.
Using CrystalDiskMark, what does your HDD get for 1080p?

Edit: this thread needs to go in the pool room. Surely you know that 1080p is a resolution, as in 1920x1080 pixels. So if you say 1080p video... guess what 4K video means? That's right, it refers to the resolution (number of pixels) of the video. '4K' in CDM, on the other hand, is a transfer size: random reads in 4 KiB blocks, kilo as in kilobytes, the same way you have megabytes/bits, gigabytes/bits, etc.

Soooo... number of pixels != transfer size. The two are completely unrelated.
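For anyone curious why those two CDM numbers are so far apart: the 4K test reads random 4 KiB blocks scattered across the disk, so an HDD spends nearly all its time seeking, while the sequential test streams one contiguous run. A minimal Python sketch of the two access patterns, assuming a large existing file at the hypothetical path `testfile.bin` (note this goes through the OS cache, which CDM bypasses, so it only illustrates the pattern, not exact numbers):

```python
import os, random, time

PATH = "testfile.bin"  # hypothetical path: any large existing file works
BLOCK = 4096           # 4 KiB, the block size CDM's "4K" test uses
COUNT = 2000           # number of reads to time

size = os.path.getsize(PATH)

# Sequential: read COUNT blocks back to back from the start of the file.
with open(PATH, "rb") as f:
    t0 = time.perf_counter()
    for _ in range(COUNT):
        f.read(BLOCK)
    seq = COUNT * BLOCK / (time.perf_counter() - t0)

# Random: seek to a random offset before every 4 KiB read.
with open(PATH, "rb") as f:
    t0 = time.perf_counter()
    for _ in range(COUNT):
        f.seek(random.randrange(0, size - BLOCK))
        f.read(BLOCK)
    rnd = COUNT * BLOCK / (time.perf_counter() - t0)

print(f"sequential: {seq / 1e6:.1f} MB/s, random 4K: {rnd / 1e6:.1f} MB/s")
```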
Yes, I know it's a resolution, haha. But if the bitrate is higher than the read speed of the drive, how can it be playing smoothly? Shouldn't it be dropping quality somewhere? Also, I don't get how a lower bitrate 4K video can put more stress on the CPU than a higher one?
Ooooh right, I thought it related to file size, like transfer speed for a 4K video vs a 1080p one. It all makes sense then, except for the lower bitrate video being harder to play. Could that be codec related? I guess I could check what codec each file actually uses, see below.
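One way to check: ffprobe, which ships with ffmpeg, can print the codec of the first video stream. A minimal sketch wrapping it in Python, assuming ffprobe is on your PATH and `low_bitrate.mkv` is a stand-in for the actual file:

```python
import subprocess

# Hypothetical filename; substitute the actual movie file.
path = "low_bitrate.mkv"

# Ask ffprobe (part of ffmpeg) for the codec name of the first video stream.
codec = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name",
     "-of", "default=noprint_wrappers=1:nokey=1", path],
    capture_output=True, text=True, check=True,
).stdout.strip()

print(codec)  # e.g. "hevc" for h.265, "h264" for h.264
```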
The lower bitrate 4K file is probably h.265. h.265 is most effective at the lower bitrate ranges, but hardware decoding support for it is a lot pickier, and if playback falls back to the CPU, the hit is quite substantial.

As for the disk read speed: playing a video file is essentially a sequential read, so the sequential figure is the one that matters. Your disk reads at 120 MB/s and the 4K file only needs 12.5 MB/s. The disk is nearly ten times faster than the file's bitrate; it can read it very easily.
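The unit conversion is where this usually trips people up: bitrates are quoted in megabits per second, disk speeds in megabytes per second. A quick worked check in Python, using the numbers from this thread:

```python
# Video bitrate is quoted in megabits/s; disk throughput in megabytes/s.
video_bitrate_mbps = 99.1      # Mb/s, from the OP's file
disk_seq_read_mb_s = 120.0     # MB/s, CrystalDiskMark sequential read

required_mb_s = video_bitrate_mbps / 8  # 8 bits per byte -> ~12.4 MB/s

print(f"needed  : {required_mb_s:.1f} MB/s")
print(f"disk    : {disk_seq_read_mb_s:.1f} MB/s")
print(f"headroom: {disk_seq_read_mb_s / required_mb_s:.1f}x")  # ~9.7x
```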
Highly likely. So, to reiterate:

FHD (Full High Definition): '1080p' = 1920x1080
UHD (Ultra High Definition): '4K' = 3840x2160
Haha, yes, I'm very aware of this, I just didn't phrase my question very well. I thought '4K' in CDM was a transfer of a 4K file, but it's not; that's where the confusion came from.

As a note, 4K (DCI) is actually 4096x2160 and UHD is 3840x2160, but the distinction is irrelevant here. Thanks all for the knowledge dropping.