Hey fellas, could you help me understand a bit more about HDR?
- I understand that it’s an absolute brightness standard, not like the relative levels in SDR
- But why does it end up washing out colors unless I amplify them in KWin? Is it just the brightness that’s absolute in nits, but not the color?
- Why does my screen block the brightness control in HDR mode but not contrast? And why does the contrast increase the brightness of highlights, instead of just splitting midtones towards brighter and darker shades?
- Why is truehdr400 supposed to be better in dark rooms than peak1000 mode?
- Why is my average emission capped at 270 nits? That seems ridiculously low even compared to normal SDR screens.
Cheers 😊
Edit: It’s a QD OLED
Does your display specifically support HDR? If not, it doesn’t matter whether the codecs/media/metadata support it; you won’t gain from it.
I understand that it’s an absolute brightness standard, not like the relative levels in SDR
The standard is actually also relative brightness, though displays (luckily) don’t implement it that way.
why does it end up washing out colors unless I amplify them in KWin? Is it just the brightness that’s absolute in nits, but not the color?
It depends. You might
- have a driver bug. Right now only AMD has correct color space communication with the display; that doesn’t work correctly on Intel and Nvidia yet
- have a display that does a terrible job at mapping the rec.2020 color space to the display
- just be used to the oversaturated colors you get with the display in SDR mode
Why does my screen block the brightness control in HDR mode but not contrast?
Because displays are stupid; don’t assume there’s always a logical reason behind what display manufacturers do. Mine only blocks the brightness setting through DDC/CI, but not through the monitor OSD…
Why is my average emission capped at 270 nits? That seems ridiculously low even compared to normal SDR screens.
OLED simply gets very hot when you make it bright over the whole area; the display technology is inherently limited when it comes to high brightness on big displays.
Hey there, thanks for the comprehensive reply, I learned a lot. Also, your blog is fantastic, I’m always happy when there’s a new post =)
Question about the last point: I feel like in SDR mode, the OLED is pushing brighter images. I almost feel like it’s underselling the capabilities at 270, but does so to give pixels a rest every now and then, in the hope that the bright spots don’t stay stationary on the screen. It’s a wild guess, I have no idea.
Also, your blog is fantastic, I’m always happy when there’s a new post =)
Thank you, I’m glad you like it!
I feel like in SDR mode, the OLED is pushing brighter images. I almost feel like it’s underselling the capabilities at 270, but does so to give pixels a rest every now and then, in the hope that the bright spots don’t stay stationary on the screen. It’s a wild guess, I have no idea.
It’s certainly possible; displays do wacky stuff sometimes. For example, if the maximum brightness in the HDR metadata matches exactly what the display says would be ideal to use, my (LCD!) HDR monitor dims down a lot, making everything far, far less bright than it actually should be.
KWin has a workaround for that, but it might be that your display does the same thing with the reported average brightness.
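I don’t know what that workaround looks like internally, but the general idea could be sketched like this (purely hypothetical Python; the field names follow the HDR10/CTA-861.3 static metadata, and the 1-nit nudge is made up):

```python
# Hypothetical sketch of that kind of workaround; this is NOT KWin's actual
# code. Field names follow HDR10 static metadata (CTA-861.3), and the
# 1-nit nudge is an invented placeholder.

from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    max_cll: float   # Maximum Content Light Level, in nits
    max_fall: float  # Maximum Frame-Average Light Level, in nits

def prepare_metadata(content: Hdr10StaticMetadata,
                     display_max_nits: float) -> Hdr10StaticMetadata:
    """Avoid advertising a max luminance that exactly matches the display's,
    so an over-eager dimming heuristic in the display doesn't trigger."""
    max_cll = content.max_cll
    if max_cll == display_max_nits:
        max_cll -= 1.0  # arbitrary nudge below the display's reported maximum
    return Hdr10StaticMetadata(max_cll, content.max_fall)
```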
But why does it end up washing out colors unless I amplify them in KWin? Is it just the brightness that’s absolute in nits, but not the color?
The desktop runs in SDR, and the color space differs between SDR and HDR, meaning you will end up with washed-out colors when you display SDR content in HDR mode as-is.
When you increase the slider in KDE, you change the tone mapping, but no tone mapping is perfect, so you might want to leave it at the default 0% and use the HDR mode only for HDR content. In KDE, for example, colors are blown out when you set the color intensity to 100%.
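To make the gamut mismatch concrete, here’s a rough sketch of the conversion a compositor has to apply to SDR colors before putting them into a rec.2020 signal (the matrix is the standard linear-light BT.709 → BT.2020 conversion from ITU-R BT.2087; the rest is illustrative Python):

```python
# Standard linear-light BT.709 (sRGB primaries) -> BT.2020 conversion matrix
# (ITU-R BT.2087). SDR colors land well inside the BT.2020 gamut.
BT709_TO_BT2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def bt709_to_bt2020(rgb):
    """Map a linear-light BT.709 RGB triple into BT.2020 coordinates."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in BT709_TO_BT2020)

print(bt709_to_bt2020((1.0, 0.0, 0.0)))  # ~(0.627, 0.069, 0.016)
# If these converted values are then decoded with the wrong (narrower) gamut
# somewhere in the chain, that muted-looking red is exactly the washed-out
# look; if unconverted sRGB is decoded as BT.2020 instead, it oversaturates.
```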
Why does my screen block the brightness control in HDR mode but not contrast? And why does the contrast increase the brightness of highlights, instead of just splitting midtones towards brighter and darker shades?
In SDR, your display is not sent an absolute value, meaning you can pick what 100% means; that’s your usual brightness slider.
In HDR, your display is sent absolute values. If the content you’re displaying requests a pixel with 1000 nits your display should display exactly 1000 nits if it can.
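For the “absolute values” part, this is what the PQ (SMPTE ST 2084) transfer function used by HDR10 actually does; the constants are from the standard, the code is just an illustrative Python sketch:

```python
# Sketch of the PQ (SMPTE ST 2084) EOTF used by HDR10: it maps a normalized
# 0..1 code value to an absolute luminance in nits. SDR gamma, by contrast,
# is relative: 1.0 just means "whatever the brightness setting makes it".

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code: float) -> float:
    """Decode a normalized PQ code value to absolute luminance in nits."""
    e = code ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

for code in (0.5, 0.75, 1.0):
    print(f"PQ {code:.2f} -> {pq_eotf(code):7.1f} nits")
# -> roughly 92, 983, and 10000 nits
```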
Not sure about the contrast slider, I never really use it.
Why is truehdr400 supposed to be better in dark rooms than peak1000 mode?
Because 1000 nits is absurdly bright, almost painful to watch in the dark. I still usually use the 1000 mode and turn on a light in the room to compensate.
Why is my average emission capped at 270 nits? That seems ridiculously low even compared to normal SDR screens.
Display technology limitations. OLED screens can only display full brightness over a limited portion of the screen (e.g. 10% of the area for 400 nits and 1% for 1000 nits) before having to dim. That makes HDR mode mostly unusable for desktop usage, since your screen will dim/brighten when you move large white or black areas around the screen.
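As a toy model of that behavior (the breakpoints below are just the figures from this thread; real firmware curves are proprietary and will differ):

```python
# Toy model of OLED automatic brightness limiting (ABL): the peak luminance
# the panel sustains depends on how much of the screen is lit. Breakpoints
# use the figures from this thread (1% window -> 1000 nits, 10% -> 400 nits,
# full screen -> 270 nits); actual firmware curves are proprietary.

import bisect

ABL_CURVE = [(0.01, 1000.0), (0.10, 400.0), (1.00, 270.0)]
WINDOWS = [w for w, _ in ABL_CURVE]
NITS = [n for _, n in ABL_CURVE]

def max_nits(window: float) -> float:
    """Interpolate the sustainable peak luminance for a lit window fraction."""
    if window <= WINDOWS[0]:
        return NITS[0]
    if window >= WINDOWS[-1]:
        return NITS[-1]
    i = bisect.bisect_left(WINDOWS, window)
    t = (window - WINDOWS[i - 1]) / (WINDOWS[i] - WINDOWS[i - 1])
    return NITS[i - 1] + t * (NITS[i] - NITS[i - 1])

for w in (0.01, 0.05, 0.25, 1.00):
    print(f"{w:4.0%} lit -> {max_nits(w):6.1f} nits")
# Moving a large white window around changes the lit fraction, which is why
# the whole screen visibly dims and brightens in HDR mode on the desktop.
```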
OLED screens simply can’t deliver the brightness of other display technologies but their benefits easily make it worth it.
Ah cool, I didn’t know that there are layers of capabilities for different requested brightnesses. Thanks for your in-depth reply! I’m also a 1000 nits enjoyer, but I don’t switch on any lights; I like when my eyeballs get blasted with colors. 😂
For the washed-out colors, are you using an Nvidia, Intel, or AMD GPU? If you’re using AMD you need to run kernel 6.8 or later I believe, if you’re using DisplayPort.
I’m not sure why your display lets you adjust contrast in HDR mode, I would just leave it at the default imo.
I’m using all of them sometimes. ^^ Washed-out colors are not an issue on AMD anymore, as you said, but on Nvidia I can’t seem to fix it. I wonder if this is happening to absolutely everyone, as the Arch Wiki makes it sound like Nvidia 545+ has been reported working…
About the contrast: I wish I could, but I found that the factory default was 70%, and it often seemed to cause noticeable dimming because the image was too bright for the max average luminance. It felt weird, and I think it’s because Alienware, like many manufacturers, just can’t resist blasting the consumer with overtuned contrast to get a purchase out of it.
HDR content looks washed out on my HDR TV and my work Mac. At this point I’m pretty sure “washed out” is just the HDR look. I just turn it off in anything I can now.
Why you need HDR enabled on your desktop at all may be another question you want to ask, though only if it’s causing you problems, of course.
Well, if everything’s working correctly, you’d want the desktop itself to stay close to the SDR values but have HDR-capable applications make use of it. Otherwise you’re limited to full-screen apps making use of it.
My monitor (Acer XV275K P3) has a better MiniLED local dimming algorithm in HDR mode than in SDR, so even SDR content looks better that way. Also it’s annoying having to switch it back and forth, it’s way easier to just leave it in HDR mode and not worry about it.