Right, it isn't currently as incredible as it sounds. It isn't actually decoding the enormous amount of information it takes to display a proper 4K picture, because current tech can't really handle it yet. Most of the media is severely bitrate-starved garbage, which is how they "solve" the problem of processing that much information: just don't process that much information. But go ahead and display it at that resolution anyway and hope everyone is too oblivious to know or care otherwise. It's just like YouTube, where the actual quality is several levels below the resolution it decodes to; 4K on YouTube is worth playing back at 360p on a good day. This is the direction everything will continue to go: market stuff as bigger and better, but ship a horrible pile of shite because the braindead rubes won't know any better.
H.265 was supposed to help solve this enormous problem of getting a remotely respectable amount of quality into the bit-starved streams that reasonably affordable current technology can handle. Like most things, it is taking much longer than hoped or anticipated to become a mature technology itself. In the meantime, most if not all of this media will continue to be garbage.
Where's a "like" button when I need it?
Compression issues first became apparent to me during the DTV transition from analog between 2006 and 2009. I walked the aisles at department stores and looked at all the LCD HDTV sets. Many store displays were still running analog cable content on the hi-def sets, and the ones that did get digital feeds to the TVs showed visible compression artifacts.
My mom and I resisted subscribing to cable for as long as we could, but we live 2 miles south of the airport, and Murphy's Law dictated that the air traffic control tower sit directly in the path of the DTV over-the-air signals. The analog signal would develop ghosting whenever planes took off and landed, resulting in a fuzzy but watchable picture. But DTV is extremely sensitive to multipath distortion: the picture would go black for 10-15 seconds, making OTA signals unwatchable even with a properly aimed high-gain antenna. It didn't matter that the signal strength within city limits was strong; multipath distortion scrambled the signal and made reception all-or-nothing.

OTA DTV broadcasts cram roughly 19.4 Mbit/s of payload into a 6 MHz channel by transmitting 3 bits per symbol, using 8 discrete signal levels, at a high symbol rate. As a result, multipath destroys the signal. The "bonus content" sub-channels are almost exclusively broadcast in 480i, no better than analog cable, and arguably worse once compression is factored in.
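For anyone curious, the 8VSB payload math works out roughly like this (a back-of-the-envelope sketch in Python; the symbol rate and FEC overhead are standard ATSC 1.0 figures, not anything specific to my setup):

```python
# Rough arithmetic for ATSC 1.0 (8VSB) over-the-air DTV in a 6 MHz channel.
symbol_rate = 10.76e6      # symbols per second in a 6 MHz channel
bits_per_symbol = 3        # 8 amplitude levels -> 3 bits per symbol

gross_rate = symbol_rate * bits_per_symbol            # ~32.3 Mbit/s raw
# FEC overhead: rate-2/3 trellis coding plus 187/207 Reed-Solomon
payload_rate = gross_rate * (2 / 3) * (187 / 207)     # ~19.4 Mbit/s usable

print(f"gross:   {gross_rate / 1e6:.1f} Mbit/s")
print(f"payload: {payload_rate / 1e6:.1f} Mbit/s")
# The commonly quoted net payload is ~19.39 Mbit/s once sync overhead is included.
```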
DVD is good.
BluRay is great.
4k UHD is placebo.
Here's why:
DVD has a maximum bitrate of 9.8Mb/s (720x480i 60Hz or 720x480p 24Hz NTSC, or 720x576 i50/p25 PAL) MPEG2.
BluRay has a maximum bitrate of 40Mb/s (up to 1280x720p 50/60, 1920x1080p24, or 1920x1080 i50/60) AVC.
The vast majority of BluRay movie content is 1080p24 progressive. Compared to (properly mastered) 720x480p24 NTSC widescreen DVDs, that is exactly 6x the pixel count per movie frame. BluRay's 40Mb/s maximum bitrate against DVD's 9.8Mb/s means content can be encoded at roughly 4x the bandwidth, and because AVC has roughly twice the compression efficiency of MPEG2, that translates to roughly 8x the effective video quality for 6x the pixel count. This is good, because it means BluRay content can spend more bits per pixel per frame than DVD ever could.
Additionally:
DVD dual layer capacity: 8.5GB
BluRay dual layer capacity: 50GB
That gives BluRay nearly 6x the storage space of DVD, so it is possible to store more content with less compression, even after accounting for the higher pixel count.
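Sanity-checking those ratios with the figures quoted above (the only assumption is the rough 2x efficiency edge of AVC over MPEG2):

```python
# DVD vs BluRay, using the figures quoted above.
dvd_pixels = 720 * 480      # NTSC widescreen DVD frame
bd_pixels  = 1920 * 1080    # 1080p BluRay frame
dvd_bitrate = 9.8           # Mb/s max
bd_bitrate  = 40.0          # Mb/s max
avc_vs_mpeg2 = 2.0          # assumed: AVC ~2x the efficiency of MPEG2

print("pixel ratio:    ", bd_pixels / dvd_pixels)                      # 6.0x
print("bitrate ratio:  ", bd_bitrate / dvd_bitrate)                    # ~4.1x
print("effective quality:", (bd_bitrate / dvd_bitrate) * avc_vs_mpeg2) # ~8x
print("capacity ratio: ", 50 / 8.5)                                    # ~5.9x (dual layer, GB)
```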
Moving on to UHD BluRay:
Screen Resolution: 3840x2160p, typically 24fps for movies, H.265, at 128Mb/s max bitrate.
100GB triple layer discs; 66GB dual layer.
First and foremost, I don't care what anybody says: 4K resolution is a placebo, and it is only slightly more detail than a pair of 20/20 eyes can resolve at an optimal 60-degree viewing angle. And yes, if you look hard enough, you can still see the individual pixels from six inches away from the screen. Curved screens are also a stupid gimmick: great if you're sitting in the one seat in the home theater that forms an equilateral triangle with the screen edges, but move even a little bit to the side and the curve distorts the view. I am extremely sensitive to shapes, and the way the curvature bends straight lines at non-optimal viewing angles would drive me nuts. Flat panels are reasonably viewable from all angles. Secondly, OLED tech is immature at this point, and current sets will suffer from burn-in and be junk in a few years. HDTVs are an investment; my first flat panel from 2006 is still kicking...
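For what it's worth, that "slightly more than the eye can resolve" claim pencils out like this, assuming the usual 1-arcminute figure for 20/20 acuity (an approximation, not a hard limit):

```python
# How many pixels can 20/20 vision distinguish across a 60-degree field of view?
acuity_arcmin = 1.0      # assumed: 20/20 acuity resolves ~1 arcminute of detail
view_angle_deg = 60      # optimal viewing angle across the screen width

resolvable_pixels = view_angle_deg * 60 / acuity_arcmin   # 60 arcminutes per degree
print(resolvable_pixels)  # 3600 -- a 3840-pixel-wide UHD panel is just past that limit
```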
Statistically, you have a codec that is only marginally more efficient than AVC, compared to the quantum leap AVC was over MPEG2. The 128Mb/s bitrate allows for just over 3x the bandwidth of BluRay, with 4x the pixel count. This is good. However, UHD-BDs can only hold 2x the data, so expect higher compression rates, especially for releases packed with hours of bonus content. UHD BluRay does allow frame rates higher than 24p or 30i, and some newer films, like the Hobbit trilogy, were shot at 48fps; again, quality will suffer when long movies at higher frame rates are crammed onto discs with only double the capacity. The real benefit will come from the higher bit depths compared to the 8 bits per channel of older DVD and BluRay tech. This allows a richer color gamut, with blacker blacks and brighter whites than standard HDTVs. But you'll need a TV with newer display technology, as LCD will not cut the mustard. OLED is currently being pushed by manufacturers, which, as I mentioned before, can develop serious burn-in issues within just a few weeks or months of constant use.
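Running the same sanity check for UHD BluRay against regular BluRay, using only the figures quoted above:

```python
# UHD BluRay vs BluRay, using the figures quoted above.
bd_pixels   = 1920 * 1080
uhd_pixels  = 3840 * 2160
bd_bitrate  = 40.0      # Mb/s max
uhd_bitrate = 128.0     # Mb/s max
bd_capacity  = 50       # GB, dual layer
uhd_capacity = 100      # GB, triple layer

print("pixel ratio:    ", uhd_pixels / bd_pixels)       # 4.0x
print("bitrate ratio:  ", uhd_bitrate / bd_bitrate)     # 3.2x
print("capacity ratio: ", uhd_capacity / bd_capacity)   # 2.0x
# Bits spent per pixel actually drop unless HEVC's efficiency gain closes the gap.
print("bits/pixel ratio:", (uhd_bitrate / uhd_pixels) / (bd_bitrate / bd_pixels))  # 0.8x
```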
But for most movies from the mid 90s through the end of the first decade of the new millennium, even when mastered in HD, the special effects were not rendered at UHD or higher and did not take advantage of the greater bit depth either. As a result, 4K conversions of recent movies, some only a few years old, will be placebo: material originally mastered in HD can be upscaled to 4K, but you cannot add detail where none existed. However, just as with BluRay, old film movies contain far more detail and dynamic range than SD and benefited immensely from being remastered in HD for BluRay reissue. A similar effect will occur for classic film movies on UHD, since old photographic negatives hold color depth and resolution far beyond even what BluRay can carry.
A bigger issue with the transition to 4K UHD is streaming and broadcast media. The RG-6 coax used by cable companies has a total usable bandwidth of roughly 2GHz (0-2000MHz), which has to be subdivided among hundreds or thousands of HD channels, and a significant portion of that bandwidth also has to carry broadband internet access shared by hundreds of customers, at least for the last mile or so until the signal is converted to fiber upstream.
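To get a rough sense of how far that coax bandwidth stretches (a sketch only; the 6MHz / 256-QAM / ~38.8Mb/s per-channel figures are standard North American cable values, and real plants rarely use the full 2GHz):

```python
# Rough capacity of a cable plant, assuming 6 MHz channels carrying 256-QAM
# (~38.8 Mbit/s usable per channel, the standard North American figure).
total_bandwidth_mhz = 2000   # the RG-6 figure quoted above; deployed plants often use less
channel_width_mhz = 6
mbps_per_channel = 38.8

channels = total_bandwidth_mhz // channel_width_mhz
total_mbps = channels * mbps_per_channel
print(f"{channels} channels, ~{total_mbps / 1000:.1f} Gbit/s total")
# That total has to cover video AND broadband for a whole service group,
# so per-stream video bitrates end up far below what a disc can afford.
```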
Satellite is even worse. Not only are there hundreds or thousands of HD channels beamed down from geostationary satellites, but every single local OTA station in the United States has to be transmitted as well, because each customer needs access to local channels. As a result, most local channels available over satellite are heavily compressed 480i. I have seen compression artifacts in broadcast media, and at times those artifacts have been plainly visible even through digital-to-analog RF adapters feeding old-school CRTs! Ever since Comcast stopped transmitting analog channels over cable, overall quality has improved with fewer artifacts, largely because that freed up a lot of bandwidth.
I fear similar issues as cable and satellite systems upgrade their equipment to broadcast UHD content: the compression artifacts will get worse, not better. Disc-based media obviously trumps broadcast media here (prerecorded media can also use variable bitrate, with peaks far above the average when a scene demands it, something that is simply not possible with the constant bitrate required by broadcast or streaming), but squeezing UHD content into networks already overloaded with current HD content will only make the artifacts worse. If you allocate the same bandwidth to a UHD stream as to an existing HD stream, the picture will look worse, sometimes much worse. Artifacting in broadcast or streamed UHD content will be plainly visible even when downsampled to sub-HD resolutions.
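In rough numbers, the "same slot, four times the pixels" problem looks like this (a sketch; the 8Mb/s HD stream and the ~40% HEVC saving are assumptions for illustration, not measurements):

```python
# If a broadcaster reuses an existing HD slot for a UHD stream:
hd_pixels  = 1920 * 1080
uhd_pixels = 3840 * 2160
stream_mbps = 8.0     # assumed typical broadcast/streaming HD bitrate
hevc_gain = 0.4       # assumed ~40% bitrate saving from HEVC over AVC
fps = 24

hd_bits_per_pixel  = stream_mbps * 1e6 / (hd_pixels * fps)
uhd_bits_per_pixel = stream_mbps * 1e6 / (uhd_pixels * fps)
effective_uhd = uhd_bits_per_pixel / (1 - hevc_gain)   # credit HEVC's efficiency

print(f"HD:  {hd_bits_per_pixel:.3f} bits/pixel")
print(f"UHD: {uhd_bits_per_pixel:.3f} bits/pixel raw, "
      f"{effective_uhd:.3f} AVC-equivalent")  # still well below the HD figure
```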
Eventually the bandwidth capability of the equipment will catch up to UHD, but much more slowly. Moore's law is breaking down rapidly, as hard drive capacity and CPU speed begin to reach the limits of what the laws of physics will allow. There may even be a global "crash" affecting servers and content providers worldwide, as technology and existing infrastructure fail to keep up with the rising global demand for bandwidth, storage, and computational resources.
Forgive me for my extremely off-topic rant...