I have Interstellar on 4K Ultra HD Blu-ray that features HDR on the cover, a Sony 4K Blu-ray player (UBP-X700), and an LG G4 OLED television. I also have an AVR (Denon AVR-S760H 7.2 Ch) connecting both the Blu-ray player and a PC running Linux with an RTX 3060 12GB graphics card to the television. I've been meaning to compare HDR on Linux with the Blu-ray. I guess now is better than never. I'll reply to my post after I'm done.


Try it with the different monitors you have. The current nVidia Linux drivers only have BGR output for 10bpp, which works on TVs and OLEDs but not on most LCD monitors.

My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means they're limited to 8bpp even with HDR enabled on Wayland. There's another commenter below who uses a Dell monitor and manages to get BGR input working with full HDR on nVidia/Linux.
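
If you want to check what the compositor is actually negotiating per output, Plasma 6's kscreen-doctor can list outputs and toggle HDR from a terminal. A rough sketch, assuming a Plasma 6 Wayland session; "DP-1" is a placeholder for whatever output name the listing reports, and the hdr/wcg subcommands may differ on older Plasma versions:

  # List connected outputs and their current settings
  kscreen-doctor -o

  # Toggle HDR (and wide color gamut) on one output; "DP-1" is a
  # placeholder, substitute the output name reported above
  kscreen-doctor output.DP-1.hdr.enable
  kscreen-doctor output.DP-1.wcg.enable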


I connected two portable LCDs I have that support HDR. Neither LCD automatically detected HDR, and both looked washed out initially; I had to switch them to HDR manually. The signal according to the AVR was...

  Resolution: 4K60

  HDR: HDR10

  Color Space: YCbCr 4:4:4/BT.2020

  Pixel Depth: 12bits

  FRL Rate: ---

...for both LCDs.

Here are their specs:

  https://www.amazon.com/dp/B09Q5L245X 

  https://www.amazon.com/dp/B08131MVGT

With HDR off for both the desktop and the LCD, the YouTube HDR video at 19s looks flat. I could increase the monitor's brightness to match the planet's brightness when HDR is on, but then space would be washed out. Conversely, without HDR, lowering the brightness for darker space makes the planet darker too.

Even with HDR off for the LCD and the desktop, I still see a difference between YouTube's HDR and SDR videos. For example, at the 19s mark I cannot see most of the debris scattered between the viewer and the planet in the SDR video. That should be the case for you too.

*Edit: Strange... one of the monitors claims 10-bit color in the link even though the AVR reported a 12-bit signal. Not sure what to make of that!


I'll look into that tomorrow. See my other comment for Linux vs Blu-ray.

Television HDR mode is set to FILMMAKER, OLED brightness is 100%, and Energy Saving Mode is off. Connected to the AVR with an HDMI cable labeled 8K.

  PC: Manjaro Linux with RTX 3060 12GB

  Graphics card driver: Nvidia 580.119.02

  KDE Plasma Version: 6.5.4

  KDE Frameworks Version: 6.21.0

  Qt Version: 6.10.1

  Kernel Version: 6.12.63-1-MANJARO

  Graphics Platform: Wayland

Display Configuration

  High Dynamic Range: Enable HDR is checked

  There is a button for brightness calibration that I used for adjustment.

  Color accuracy: Prefer color accuracy

  sRGB color intensity: This seems to do nothing (even after Apply). I've set it to 0%.

  Brightness: 100%

The TV is reporting an HDR signal.

AVR is reporting...

  Resolution: 4K VRR

  HDR: HDR10

  Color Space: RGB/BT.2020

  Pixel Depth: 10bits

  FRL Rate: 24Gbps

I compared the Interstellar scene at 19s into the YouTube video, played three different ways on Linux, against 2:07:26 on the Blu-ray.

With Firefox 146.0.1 there is no HDR option on YouTube by default, and the 4K video clearly doesn't have HDR. I enabled HDR in Firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled. Colors then looked completely washed out.
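
For anyone who would rather not flip these in about:config by hand, the same three prefs (exactly the ones named above) can be set persistently in a user.js file in the Firefox profile directory, using the standard user_pref syntax:

  // user.js in the Firefox profile directory -- same prefs as above
  user_pref("gfx.wayland.hdr", true);
  user_pref("gfx.wayland.hdr.force-enabled", true);
  user_pref("gfx.webrender.compositor.force-enabled", true);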

With Chromium 143.0.7499.169, HDR is enabled by default. This looks like HDR.

I downloaded the HDR video from YouTube and played it using MPV v0.40.0-dirty with the settings --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright, like the Chromium playback. This was the best playback of the three on Linux.
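
In case anyone wants to reproduce this, here is a rough sketch of the download-and-play step. The yt-dlp format filter is my assumption for grabbing the HDR stream (YouTube serves HDR as VP9 Profile 2), VIDEO_ID is a placeholder, and the mpv flags are the ones quoted above:

  # Download the HDR stream plus audio; the vp09.02 filter targets
  # VP9 Profile 2 (YouTube's HDR codec) and is an assumption -- verify
  # the result with e.g. mediainfo. VIDEO_ID is a placeholder.
  yt-dlp -f "bv*[vcodec^=vp09.02]+ba" --merge-output-format mkv \
    -o interstellar-hdr.mkv "https://www.youtube.com/watch?v=VIDEO_ID"

  # Play it back with the settings quoted above
  mpv --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk interstellar-hdr.mkv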

On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting...

  Resolution: 4K24

  HDR: Dolby Vision

  Color Space: RGB

  Pixel Depth: 8bits

  FRL Rate: no info

...I looked into this, and apparently Dolby Vision tunnels its high-bit-depth (12-bit) YCbCr 4:2:2 data inside an 8-bit RGB signal, which would explain the AVR's readout. The Blu-ray looks like it has the same brightness range, but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s).

I would say the colors overall look better on the Blu-ray.

I might be able to calibrate it better if the sRGB color intensity setting worked in the display configuration. Also, I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB setting is fixed.

*Edit: Sorry, Hacker News has completely changed the formatting of my text.


Thank you, this is very valuable.

I don't think the Interstellar Blu-ray has Dolby Vision (or Dolby Atmos), just regular HDR10. If the TV/AVR says it's Dolby Vision, something in your setup might be doing some kind of upconversion.

You're right! It looks like the Sony UBP-X700 doesn't automatically detect the HDR type and was set to Dolby Vision. I turned that off, and the TV now displays the same HDR logo it shows when connected to the PC. The AVR says...

  Resolution: 4K24

  HDR: HDR10

  Color Space: YCbCr 4:4:4/BT.2020

  Pixel Depth: 12bits

  FRL Rate: ---

...colors are now more aligned with the PC. The Blu-ray video seems to show more detail in the explosion. I thought this extra detail was because more color is being shown, but I now think it might have something to do with YouTube's HDR video being more heavily compressed.


