I'd point out though that with ordinary display and print systems, saturated reds and blues really are darker than greens. The exact formula depends on your color space but
Grayscale = 0.299R + 0.587G + 0.114B [1]
is commonly quoted (these are actually the Rec. 601 luma coefficients, though often applied to sRGB data; sRGB/Rec. 709 proper uses 0.2126/0.7152/0.0722) and in that case the brightest pure red is only about 30% as bright as white and the brightest pure blue about 11%, which makes "bright red" an oxymoron in most cases.
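As a quick sanity check, here's a minimal sketch plugging the primaries into the coefficients quoted above (names are mine):

```python
# Rec. 601 luma coefficients, as quoted above
R_COEF, G_COEF, B_COEF = 0.299, 0.587, 0.114

def luma(r, g, b):
    """Grayscale value of an RGB triple with channels in [0, 1]."""
    return R_COEF * r + G_COEF * g + B_COEF * b

print(luma(1, 0, 0))  # brightest pure red   -> 0.299 (~30% of white)
print(luma(0, 0, 1))  # brightest pure blue  -> 0.114 (~11% of white)
print(luma(0, 1, 0))  # brightest pure green -> 0.587
```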
You can certainly use those colors, but they are always going to be dark. Simply applying contrast rules will make your color choices accessible, but if you want something that is both accessible and good-looking, the techniques in that article will make you a winner.
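The "contrast rules" in question are typically the WCAG ones, which use a slightly different luminance: the Rec. 709 coefficients applied to linearized sRGB. A sketch of the WCAG 2.x contrast-ratio calculation (the example colors are my own, not from the article):

```python
def srgb_to_linear(c):
    """Undo the sRGB transfer function for one channel in [0, 1] (WCAG 2.x formula)."""
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r, g, b):
    """WCAG relative luminance: Rec. 709 coefficients on linear light."""
    rl, gl, bl = (srgb_to_linear(c) for c in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio between two sRGB colors, from 1:1 up to 21:1."""
    l1, l2 = relative_luminance(*rgb1), relative_luminance(*rgb2)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Pure red on white comes out around 4.0:1, which fails
# WCAG AA for normal text (that needs at least 4.5:1).
print(round(contrast_ratio((1, 0, 0), (1, 1, 1)), 2))  # -> 4.0
```

Which neatly confirms the point: even the brightest pure red is dark enough that it barely works as a text color on white.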
For that matter, saturated screen greens are nowhere near as saturated as is physically possible, but they are more saturated than most greens you see in real life. I make red-cyan stereograms (https://en.wikipedia.org/wiki/Anaglyph_3D)
and one rule of thumb is that trees and grass look great in stereograms: even though they are green, they actually contain a lot of red, so the balance between the channels is good and you get both good stereo and good color.
This brings back thoughts of the NTSC days and broadcast-safe limits, and the horrible time of clients that loved, loved, loved red. Explaining to them how their beautiful red artwork would be anything but beautiful on TV was never fun, especially if they wanted it for broadcast. Even when it wasn't for broadcast, an illegal red could still be seen frames later, and would bleed like it had just had its throat slit.
As someone who also worked within the arcane limitations of analog video, at both the broadcast and prosumer levels, today's UHD video standards and colorspaces can be incredible when correctly applied in a maximal high-end workflow such as native 4k 10-bit HDR.
Yet when I look at today's typical "top quality" live broadcast content such as the 4K Super Bowl as delivered by mass consumer distribution such as Comcast Xfinity (via their latest high-end decoder box), it's a visual mess compared to what the signal chain should be capable of delivering.
Even though I have top notch viewing gear properly configured and calibrated (with local video processing 'enhancements' disabled), it looks terrible. Unfortunately, due to the layers of compression, conversion and DRM slathered on the signal before I receive it, it's extremely difficult to analyze what's going wrong. All I can determine is that it is a video feed being decompressed into a 4K-resolution, 4:2:2, 60fps frame buffer. However, examining still frame sequences reveals extensive motion, color and resolution artifacts.
The net effect conveys a sense of "sharpness" in the frequency domain at first glance but on critical viewing over time it's a weird kind of digital abomination of macro-blocked chroma splotches, lagged temporal artifacts and bizarre over-saturation of primary hues. While some pre-compressed streamed film content looks quite good when delivered via a streaming service willing to devote sufficient encoding time and delivery bitrate, it's still hit and miss. Live broadcast content, especially high-motion sports, is almost always a mess. We've come so far in standards and specifications yet still have so far to go in the actual delivered result to most households.
Years ago when deciding to cut the cord, I had to convince my roommate that an OTA DTV antenna would provide a better image. We had clear line of sight to the broadcast towers, so I knew it was a no-brainer, but I'm in the video side of things and he's not. This makes him a great analog for the vast majority of viewers. I set the inputs on the TV to the same channel for the Comcast cable box and the OTA antenna, and then A/B tested the inputs for him. Even he could see how bad the image from cable was. Their push-a-button-get-a-prize approach of one set of encoding settings for all content will always mean their low bitrates look bad.
My favorite cable-box sports example was a PGA tournament showing a golfer putting on the green. The shot was an extremely tight close-up of the ball sitting there as the golfer addressed it. All of the dimples in the ball were clear, and every blade of grass was visible, until the golfer swung and made contact with the ball. As the camera panned to follow, the ball became a white roundish shape with no detail and the grass turned into a blurry green smear, again with no detail. As soon as the ball went into the cup and the camera stopped moving, at least one GOP later, the grass snapped back into full detail.
Their predictive model is tuned for low-motion, static content because that's what 90%+ of their content is. Even something like ESPN is now primarily talking heads discussing sports rather than actual sports. Any sports show in replay isn't live, so who cares? Looking back at crappy SD tape captures, it's obvious that anything was better than nothing. Much like YouTube: people just want something; it doesn't have to be amazing. If it looks like Picasso instead of Monet, they don't care, as long as their minds don't have to think.
And to think the US government gave anyone who wanted one a free DTV antenna. By that point pretty much nobody used a terrestrial antenna any more, so very few people took them up on the offer. I can only imagine the cable companies being very pleased with that.
Also, the signal was meant to have even more bandwidth. When the broadcasters decided to bring out the fractional channels, it didn't exactly fit the idea that Congress had when allocating the frequencies. Yet another example of how Congress can be behind the times in pretty much everything.
When we bought our place we put up a roof-mounted antenna and a distribution amp (essentially a zero-loss splitter), and then pulled RG-6 quad-shield to each room.
Interesting comment, thanks. Two questions out of curiosity:
> Even though I have top notch viewing gear properly configured and calibrated
Any chance you'd be willing to share a few details about this?
> While some pre-compressed streamed film content looks quite good when delivered via a streaming service willing to devote sufficient encoding time and delivery bitrate, it's still hit and miss.
Which streaming services are doing things right in your view?
> Any chance you'd be willing to share a few details about this?
I have several viewing devices in different rooms including an LG C2 OLED, a high-end Samsung QLED and in my dedicated, fully light-controlled home theater room a native 4K 10-bit HDR+ laser projector and 150-inch screen. Each of these displays has been professionally calibrated. To objectively evaluate an input source these days it's important to try multiple different display technologies because flat screens can vary between OLED, QLED, mini-LED, LCD and VA which all have different trade-offs in contrast, peak brightness, viewing angles, color spaces, gamma response curves, etc. And that's before getting into various front projector technologies.
Most consumer TVs these days come with a pile of post-processing algorithms which claim to deliver various "enhancements." In almost all cases you'll want to turn these options off in the setup menus. For critical viewing, objective calibration with a suitable colorimeter is ideal, especially when considering HDR sources which should be normalized to each display's native capabilities in Nits. If you don't want to dive down the rabbit hole of evaluating all this yourself (which can admittedly get complex), I suggest the TV reviews at https://www.rtings.com which are credible, thorough and yet still relatively accessible to non-experts. Unfortunately, RTings doesn't evaluate front projectors. For that the best bet is an expert forum like AV Science (https://www.avsforum.com).
> Which streaming services are doing things right in your view?
Currently, I don't think there's any service I would say is universally "doing it right." It still varies depending on the individual piece of content. Amazon, Netflix, AppleTV and even YouTube each have some extremely well-encoded, high bitrate content. But I've also seen examples on each service that aren't great.
The highest-quality home source will typically be a UHD Blu-Ray disc player. If you have such a player I highly recommend the Spears and Munsil UHD Benchmark Discs (https://spearsandmunsil.com/uhd-hdr-benchmark-2023/). Just because a disc is UHD format doesn't mean the media on it has been encoded correctly, from a high-fidelity source and in appropriate quality. The Spears and Munsil disc features a comprehensive suite of custom-designed test signals and specially sourced demonstration content identically encoded in HD, UHD, HDR, HDR10, HDR10+ and DolbyVision, including moving-window split screens allowing you to compare formats. It's extremely impressive and, as a video engineering geek, I found it fascinating to explore for hours on my various displays - while my wife had zero interest in it :-).
Yes, the visual quality of a sports game can vary a lot and is frequently a disappointment. I can get an ATSC 3.0 multiplex from Syracuse, and it is sad that it is not really better than the ATSC 1.0 signal.
That's disappointing to hear because my current residence came with a large digital-capable antenna installed in the attic which I've never connected. When more local stations in my market start ATSC 3 broadcasts next year I was thinking of hooking up an OTA feed just to see if it's better than the Comcast XFinity cable-delivered mess.
I don't even watch that much TV content but when I do, I want it to not look like crap. It's frustrating because I'm sure the four national broadcast networks and top cable channels (eg ESPN, CNN, etc) are providing pristine source feeds at their head-end distribution points which look amazing. Is there even any meaningfully better-quality alternative these days? Maybe some over-the-top streaming provider of broadcast and cable channels who actually delivers 4k sources with guaranteed high-quality encoding and decent bitrates? If so, I'd cut the Comcast cord even if it costs more. It's not like Comcast is cheap but I also hate the idea of paying top dollar for such a substandard product simply because there are no better alternatives.
BTW: I'd be delighted to learn of any viable US-based content alternatives (eg streaming, direct satellite, etc). Back in the analog SDTV days I had a C-band sat dish, and the direct network feeds looked amazing in pure 6 MHz analog component compared to local cable and even local OTA broadcast.
Red is still a particularly hard color to represent accurately, just not as hard as it used to be. The bleeding and chroma crawl that was most visible in NTSC red has been replaced with, at best, half chroma resolution, and depending on how the viewer's decoder works, red edges may still look especially harsh.
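The "half chroma resolution" point can be shown with a toy 1-D example: take a hard red-to-black edge, convert to YCbCr, average the chroma over pixel pairs (as 4:2:2 subsampling does horizontally), and convert back. The conversion is BT.601 full-range and the tiny test pattern is my own, just to illustrate the mechanism:

```python
# Toy 4:2:2-style horizontal chroma subsampling on a hard red-to-black edge.
# BT.601 full-range RGB <-> YCbCr, pure Python.

def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) / 1.772
    cr = (r - y) / 1.402
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * cr
    b = y + 1.772 * cb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

# A row of pixels: three pure-red pixels then five black pixels,
# so the edge falls in the middle of a chroma pair.
row = [(1.0, 0.0, 0.0)] * 3 + [(0.0, 0.0, 0.0)] * 5
ycc = [rgb_to_ycbcr(*px) for px in row]

# Keep luma at full resolution; replace each pair's chroma with its average.
sub = []
for i in range(0, len(ycc), 2):
    (y0, cb0, cr0), (y1, cb1, cr1) = ycc[i], ycc[i + 1]
    cb, cr = (cb0 + cb1) / 2, (cr0 + cr1) / 2
    sub += [(y0, cb, cr), (y1, cb, cr)]

decoded = [ycbcr_to_rgb(y, cb, cr) for y, cb, cr in sub]
for dec in decoded:
    print(tuple(round(c, 3) for c in dec))
```

Pixels away from the edge decode back exactly, but the two pixels straddling it come out wrong: the last red pixel turns into a desaturated pink, and the first black pixel picks up a dark red fringe (with an out-of-range blue channel that a real decoder would clip). That fringe is exactly the harsh red edge described above.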
It's definitely better than it used to be, though.
The better monitors can be reconfigured to use the DCI-P3 primary colors instead of the default Rec. 709 primary colors (a.k.a. sRGB primaries).
(sRGB combines the Rec. 709 primaries with a certain nonlinear transfer function, while Display P3 combines the DCI-P3 primaries with the sRGB nonlinear transfer function and the D65 white point, which is also used by Rec. 709, sRGB and PAL/SECAM.)
With the DCI-P3 primaries, it is very noticeable that the red is much better, allowing the display of reddish colors that are simultaneously more saturated and brighter than what can be achieved with the Rec. 709 red.
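One way to see the extra headroom is to express the brightest sRGB red in Display P3 coordinates. This is a sketch using the commonly published D65 conversion matrices (the matrix values and helper names are my additions, not from the comment above):

```python
# Express linear-light sRGB pure red in linear-light Display P3.
# Matrices: published sRGB->XYZ and Display-P3->XYZ (both D65).

SRGB_TO_XYZ = [
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
]
P3_TO_XYZ = [
    [0.4865709, 0.2656677, 0.1982173],
    [0.2289746, 0.6917385, 0.0792869],
    [0.0000000, 0.0451134, 1.0439444],
]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def inverse3(m):
    """Inverse of a 3x3 matrix via the adjugate."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)
    adj = [
        [e*i - f*h, c*h - b*i, b*f - c*e],
        [f*g - d*i, a*i - c*g, c*d - a*f],
        [d*h - e*g, b*g - a*h, a*e - b*d],
    ]
    return [[x / det for x in row] for row in adj]

XYZ_TO_P3 = inverse3(P3_TO_XYZ)

# linear sRGB red -> XYZ -> linear Display P3
p3_red = mat_vec(XYZ_TO_P3, mat_vec(SRGB_TO_XYZ, [1.0, 0.0, 0.0]))
print([round(c, 3) for c in p3_red])  # -> [0.823, 0.033, 0.017]
```

The red channel only reaches about 0.82 in P3, i.e. the sRGB red sits well inside the P3 gamut, so a P3 display has room left for reds that are both more saturated and brighter.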
While DCI-P3 also has a better green than Rec. 709, there the improvement is much less obvious than in the red area.
The monitor has a fixed set of primaries, but it can treat the R, G and B signals as belonging to a particular color space with certain primaries and do its best to simulate the appearance specified by the signal.
For most monitors, as a user you cannot know the true colors of the screen's pixels, and this is completely irrelevant.
What matters is which colors will be reproduced on the display when you send the digital codes corresponding to pure red, green and blue through the monitor's DisplayPort or HDMI interfaces.
All good monitors have a menu for selecting the color space used over DisplayPort and HDMI, typically offering a choice between sRGB and Display P3 or DCI-P3. Even when the menu says DCI-P3, what is meant is Display P3, i.e. the setting changes only the primaries, not the white point or the nonlinear transfer function.
All monitors will process the digital codes corresponding to standard color spaces to generate the appropriate values needed to command their specific pixels in order to reproduce a color as close as possible to what is specified by the standard color space.
The cheapest monitors are able to display only a color space close to Rec. 709 a.k.a. sRGB, those of medium price are normally able to display a color space close to DCI-P3 and a few very expensive monitors and many expensive TV-sets, which use either quantum dots or OLED, are able to display a larger fraction of the Rec. 2020 color space (laser projectors can display the complete Rec. 2020 color space).
Even when a monitor can display bright, saturated reds, as long as it remains in the default configuration of using sRGB over DisplayPort and HDMI, you cannot command it to display those colors. For that, you have to switch the color space used over DisplayPort and HDMI to one with a wider gamut.
Some monitors, typically those advertised to support HDR, allow the use of the Rec. 2020 color space over DisplayPort and HDMI, but most such monitors cannot display the full Rec. 2020 gamut, so the very saturated colors will either be clipped to maximum saturation or mapped to less saturated colors.
> All monitors will process the digital codes corresponding to standard color spaces to generate the appropriate values needed to command their specific pixels in order to reproduce a color as close as possible to what is specified by the standard color space.
This is overly optimistic. It has gotten better lately, but most monitors aren't calibrated as well as they could be. And not that long ago, the RGB signals were directly mapped to the monitor's native colors.
[1] https://www.dynamsoft.com/blog/insights/image-processing/ima...