I absolutely hate "smart" TVs! You can't even buy a quality "dumb" panel anymore. I can't convince the rest of my family and friends that the only things those smarts bring are built-in obsolescence, ads, and privacy issues.
I make it a point to NEVER connect my new 2022 LG C2 to the Internet, as any possible improvements from firmware updates will be overshadowed by garbage like ads in the UI, removal of existing features (warning: reddit link), privacy violations, possible attack vectors, non-existent security, and constant data breaches of the manufacturers that threaten to expose every bit of personal data that they suck up. Not to mention increased sluggishness after tons of unwanted "improvements" are stuffed into it over the years, as the chipset ages and can no longer cope.
I'd much rather spend a tenth of the price of my TV on a streaming box (Roku, Shield TV, etc.) and replace those after similar things happen to them in a few years. For example, the display of my OG 32-inch Sony Google TV from 2010 ($500) still works fine, but the OS has long been abandoned by both Sony and Google, and since 2015-16 even basic things like the YouTube and Chrome apps don't work anymore. Thank goodness I can set the HDMI port as the default start-up input, so I never need to see the TV's native UI, and a new Roku Streaming Stick ($45) does just fine on this 720p panel. Plus, I'm not locked into the Roku ecosystem. If they begin (continue?) enshittifying their products, there are tons of other options available at a similar price.
Most people don't replace their TVs every couple of years. Hell, my decade-old 60-inch Sharp Aquos 1080p LCD TV that I bought for $2200 back in 2011 still works fine, and in all that time I've only had to replace the streamer driving it twice: Sony Google TV Box -> Nvidia Shield TV 2015 -> Nvidia Shield TV 2019. I plan to keep it in my basement until it dies completely before replacing it. The Shield TV goes to the LG C2 so that I never have to see LG's craptastic UI.
Sorry, just felt the need to vent. I'd be very interested in reading the community's opinions on this topic.
This is one of the downsides of the widespread adoption of HDMI; it has quite a few. Something like DisplayPort would be better, but it's far less common. Such is life.
How is this a downside of HDMI?
It sounds to me like the user's TV or streaming box is configured incorrectly. DisplayPort doesn't magically remove judder from 24 fps content being rendered to a 60 Hz signal.
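The judder mentioned here is easy to see with a quick sketch (hypothetical illustration, not from any real player's code): 60/24 = 2.5, so each 24 fps frame can't be held for the same whole number of 60 Hz refreshes. The classic 3:2 pulldown alternates holding frames for 3 and then 2 refreshes, which is exactly the uneven cadence perceived as judder. When the output rate matches (or evenly divides into) the refresh rate, every frame is held equally long and the motion is smooth.

```python
# Hypothetical sketch: how long each content frame is held on screen,
# in display refresh cycles, when content fps and display Hz don't match.

def pulldown_pattern(content_fps: int, display_hz: int, frames: int) -> list[int]:
    """Return how many display refreshes each content frame is held for."""
    pattern = []
    shown = 0  # total refreshes emitted so far
    for i in range(1, frames + 1):
        # Frame i should end by refresh floor(i * display_hz / content_fps)
        target = i * display_hz // content_fps
        pattern.append(target - shown)
        shown = target
    return pattern

print(pulldown_pattern(24, 60, 8))  # uneven 2:3 cadence -> judder
print(pulldown_pattern(24, 24, 8))  # matched refresh -> every frame held equally
print(pulldown_pattern(30, 60, 8))  # 30 divides 60 evenly -> also smooth
```

Running it shows `[2, 3, 2, 3, ...]` for 24-on-60 but a constant hold time in the matched cases, which is why "match frame rate" settings on streaming boxes fix the problem regardless of the cable standard.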
DisplayPort never saw widespread adoption in the home theater space because it never tried to. The standard is missing a ton of features that are critical to complex home theater setups but largely useless in a computer/monitor setup. They aren't competing standards; they are built for different applications, and their feature sets reflect that.
Newer revisions of HDMI are perfectly good, I think. I was surprised and dismayed by how slow adoption was. I saw so many devices with only HDMI 1.4 support for years after HDMI 2.0 and 2.1 were in production (probably still to this day, even). It's the biggest problem I have with my current display, which I bought in 2019.
GP's problem probably isn't even bandwidth, but rather needs to enable their TV's de-judder feature or configure their streaming box to set the refresh rate to match that of the content being played.
VRR support came with HDMI 2.1.
You could still have your player device set to a static 24 or 30 Hz without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.
VRR is really meant for video games.
That's interesting. Pretty much every Blu-ray player should support this. I can confirm from firsthand experience that Apple TV, Roku, and Android TV devices all support this as well. I can't speak for Amazon's Fire Stick thingy, though.
The feature you are looking for is not to manually set the refresh rate, but instead for the device to set it automatically based on the framerate of the content being displayed. On Apple TV it’s called “match frame rate”.