
Trading 1.4ms for sharper fonts and better colors


While trying to debug an issue with my external display (unrelated to this post), I ended up googling “AGDCDiagnose”—a seemingly obscure tool that apparently spits out a bunch of useful properties and diagnostics from attached displays. Unfortunately, the tool no longer works on Apple silicon Macs, so I’m not sure what its output is supposed to look like.

$ /System/Library/Extensions/AppleGraphicsControl.kext/Contents/MacOS/AGDCDiagnose
AGDCDiagnose Version: 8.1.9 (AGDC node count: 0)
AGDC node not found, bailing ...

Surveying the Google search results, I found this Stack Exchange question, which catalogues a treasure trove of information related to debugging external display usage on macOS. Oh, how I wish every request for help on the internet was written with this much care and detail.

Following the link to a blog post written by the same author, I learned how he was able to force 4:4:4 chroma (i.e., no subsampling) by using a 120Hz refresh rate and fiddling with HDR.

I remember testing my display with the exact same chroma subsampling “torture test” in the past, being greeted with a blurry mess of pixels in certain sections, and accepting that I was simply going to have to live with it. But out of curiosity, I tried lowering my refresh rate, and:

A comparison of the "red on blue" and "blue on red" text sections of the chroma subsampling torture test, taken with a macro lens, with macOS set to 120Hz and 144Hz. The text is visibly sharper when macOS is set to 120Hz.
The text is sharper when a refresh rate of 120Hz is selected.

Ah. Sure enough, I’ve verified through my display’s OSD that macOS sends YCbCr to my display at 144Hz and RGB at 120Hz. When the former is active, colors in typical scenarios appear slightly washed out and almost undetectably “blurred”. It’s a subtle change that I’m not sure most people would notice unless they were looking at the torture test image.

I remember the degraded color rendering posing a problem in chat apps like Discord, though—roles with vivid colors such as pink or magenta would often appear slightly distorted on the cool gray background of the chat window.
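As a rough sketch of why subsampling smears exactly this kind of content: with 4:2:2, each pixel keeps its own luma, but every horizontal pair of pixels shares a single chroma sample. Run the standard full-range BT.601 conversion (chosen purely for illustration; macOS may well use different coefficients) on a red pixel sitting next to a blue one, and both come back purplish:

# Toy 4:2:2 demo on two adjacent pixels (full-range BT.601 / JPEG-style conversion).

def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return tuple(max(0, min(255, round(v))) for v in (r, g, b))

red, blue = (255, 0, 0), (0, 0, 255)
(y1, cb1, cr1), (y2, cb2, cr2) = rgb_to_ycbcr(*red), rgb_to_ycbcr(*blue)

# 4:2:2: per-pixel luma is kept, but the pair shares one averaged chroma sample.
cb, cr = (cb1 + cb2) / 2, (cr1 + cr2) / 2

print(ycbcr_to_rgb(y1, cb, cr))  # the "red" pixel comes back as a muddy magenta
print(ycbcr_to_rgb(y2, cb, cr))  # the "blue" pixel comes back as a dark purple

That kind of fringing between saturated neighbors is exactly what the torture test is designed to expose.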

Here’s an alternate version of that comparison in black and white, which might make the difference more visible:

The same comparison as the previous image, but converted to grayscale.

I’m ultimately not sure why macOS does this (something to do with bandwidth?), or whether it’s forced to, but it wasn’t obvious at all. There was literally no indication that a “worse” level of chroma subsampling would be picked, and no way for me to know that the refresh rate could affect it.
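As a sanity check on the bandwidth theory, here’s some back-of-the-envelope math. I’m assuming a 3840x2160 panel, 8 bits per channel, and a DisplayPort 1.4-class link purely for illustration; I haven’t confirmed any of those details for my setup, and this ignores blanking overhead entirely:

# Rough uncompressed video bandwidth at each refresh rate (no blanking, no DSC).
def gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

for hz in (120, 144):
    rgb = gbps(3840, 2160, hz, 24)  # RGB / YCbCr 4:4:4, 8 bits per channel
    sub = gbps(3840, 2160, hz, 16)  # YCbCr 4:2:2, chroma halved horizontally
    print(f"{hz}Hz: full chroma ~{rgb:.1f} Gbit/s, 4:2:2 ~{sub:.1f} Gbit/s")

Under those assumptions, full-chroma 4K comes out to roughly 23.9 Gbit/s at 120Hz and 28.7 Gbit/s at 144Hz, while DisplayPort 1.4 tops out at about 25.9 Gbit/s of payload without DSC. If something like that is going on here, dropping to YCbCr would be the only way to hit 144Hz.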

…Is this even chroma subsampling? I’m not 100% sure—after all, the original Ask Different question was trying to find a way to know for sure on the software side. Anyhow, the image on my display was subtly bad for the longest time until I stumbled upon a random blog post.

I’m more than willing to live with 120Hz for good colors, though. Highlighted tokens in my text editor are a bit more vivid, and all text (even monochrome) has become a tad sharper. I’ll take it.
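For the record, the 1.4ms in the title is just the difference in per-frame time between the two refresh rates:

# Where the title's "1.4ms" comes from: time per frame at each refresh rate.
for hz in (144, 120):
    print(f"{hz}Hz: {1000 / hz:.2f} ms per frame")
print(f"difference: {1000 / 120 - 1000 / 144:.2f} ms")  # ~1.39 ms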
