AdrianH
Super Moderator
Picked up an issue with Google TV and HDR.
I watch several streaming services that have a variety of SD, HD, and UHD content with HDR or Dolby Vision. The Google TV is set to 4K@60, Dolby Vision (Low Latency). Setting the Google TV to Dolby Vision also means it will switch to HDR instead if the video doesn't carry Dolby Vision but does carry HDR. So basically the pecking order is: Dolby Vision, then HDR, then SDR.
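To make the pecking order concrete, here's a minimal sketch of that fallback logic in Python. This is purely illustrative (the function and format names are my own invention, not Google TV's actual code): pick Dolby Vision if the video carries it and the output mode allows it, otherwise fall back to HDR, otherwise SDR.

```python
def pick_output_format(video_formats, dv_enabled=True):
    """Illustrative fallback: Dolby Vision > HDR > SDR.

    video_formats: set of formats the video stream carries.
    dv_enabled: whether the box's output mode permits Dolby Vision.
    """
    if dv_enabled and "dolby_vision" in video_formats:
        return "dolby_vision"
    if "hdr10" in video_formats:
        return "hdr10"
    return "sdr"

# A Dolby Vision title triggers DV; an HDR-only title should trigger HDR.
print(pick_output_format({"dolby_vision", "hdr10"}))  # dolby_vision
print(pick_output_format({"hdr10"}))                  # hdr10
print(pick_output_format({"sdr"}))                    # sdr
```

The bug described below behaves as if the HDR fallback branch never fires when the box is in Dolby Vision (Standard) mode.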
A few times now, when watching a UHD program with HDR support but no Dolby Vision support, the TV never switches to HDR. This happens in both Amazon Prime and YouTube, so it's not an app problem. I would then start rebooting and fiddling with display settings on the Google TV, and it would normally start working again after a while.
Tonight, though, the same thing happened again. After much messing around, I found that if I change from Dolby Vision (Standard) to Dolby Vision (Low Latency), then HDR is triggered correctly on the TV when an HDR video is played. So I can actually reproduce the issue. Googling around, it seems the Google TV has a bug where the setting sometimes reverts from Dolby Vision (Low Latency) to Dolby Vision (Standard) after a reboot, which explains why it happens every now and again.
Can anybody explain the difference between DV (Standard) and DV (Low Latency), and why DV (Standard) would stop HDR from triggering?
@KenMasters maybe you can shed some light on these DV options?