This is an automated archive made by the Lemmit Bot.
The original was posted on /r/tifu by /u/ptsrr734 on 2025-08-11 05:13:52+00:00.
I upgraded to my first smart TV a few years ago, a nice 75" Samsung. I don't have cable; I just connect my PC to the TV and essentially use it as a monitor for gaming, movies, and TV.
Lately, as I've collected more 4K media, I've noticed that 4K HDR video looks dark compared to SDR, but I didn't think much of it. Well, today I decided to fully investigate.
Turns out the TV recognizes the type of device connected to each input and defaults to different picture quality settings depending on the device type. Not every device type gets access to the full range of picture settings the TV offers. However, as I found out today, you can manually change the input's device type in the source settings to something else.
So I realized that because my TV had identified my PC as a PC, it wasn't offering the full breadth of picture quality settings it's capable of. I changed the input type to "game console" and the difference was immediate and dramatic. Now I'm getting the full picture quality my TV has to offer, with a 120Hz refresh rate. It's like getting a new TV all over again.
TL;DR: I didn't realize that the type of device connected to my TV determined the picture quality it would show. I've unknowingly been watching all of my visual media at lower quality for several years.