The big "How do I choose an HDTV?" thread.

ccoates wrote:

What's weird is that on the Xbox Series X, the calibration Blu-ray has no effect. As in, no changes I make to the TV's settings seem to have any effect on the test patterns. Adjusting the contrast or brightness seems to just change the overall picture brightness, which doesn't make any sense.

Maybe the XSX or the Blu-ray app from the Microsoft Store applies some sort of automated adjustment behind the scenes? If I use the Xbox's built-in calibration from Settings, the picture can be tuned as expected.

On the U7G you can tweak the settings per source, so that's what I'm doing for now, but it's really puzzling.

It has to do with FreeSync/G-Sync being enabled, from what I've read about Hisense models.

trueheart78 wrote:

It has to do with FreeSync/G-Sync being enabled, from what I've read about Hisense models.

I'll have to check that! I swear I had FreeSync toggled off for every source though. It only seems to happen on the Xbox.

Edit: It's not FreeSync. I didn't have that enabled on any source or video mode.

I've tried toggling off every possible setting that could be to blame, and I think it's just something Xbox-specific.

Calibration Blu-rays work fine with the PS5, and I copied some of the calibration test videos to my Nvidia Shield TV and they work as expected there too.

Edit: Edit:

I toggled off every single video mode/advanced option on the Xbox, then *rebooted* it. And now everything works as expected. So one of them (I'm not sure which) must apply some sort of adjustment on the fly that makes using a calibration disc impossible (other than the dashboard one, so maybe it knows not to run that extra processing during the built-in test?).

I'm going to re-enable all those video options/modes now that I'm done, but still kinda interesting.

So, tuning the TV with the Disney WoW disc, I ended up with... values really close to what I had from the Xbox tool, honestly.

Using the WoW disc, my contrast/brightness per source:

- XSX: Contrast 50 / Brightness 52 (the Xbox tool gave Contrast 48 / Brightness 54)
- Shield TV: Contrast 49 / Brightness 52
- PS5: Contrast 50 / Brightness 50

So honestly I could've probably just left everything at 50 or stuck with one of the middle-of-the-road default picture modes and the difference wouldn't be THAT drastic.

Now I can stop tinkering and see how all this looks with some 4K games!

ccoates wrote:

Now I can stop tinkering and see how all this looks with some 4K games!

Glad you got everything sorted! 4K HDR on these sets is a treat!

https://wccftech.com/hdmi-licensing-...

You can’t make up this kind of dumb.

LG has announced their 2022 OLED lineup at CES

https://www.cnet.com/tech/home-enter...

The good news is that they have the much-requested 42" size this year, which should appeal to some gamers who were still on the fence about the idea of a 48" "monitor".

Sony has also announced an OLED TV that uses Samsung's fancy QD-OLED tech, which boosts brightness and color accuracy beyond what LG can currently do with their panels. It remains to be seen whether that will be worth the premium that Sony and Samsung will no doubt attach to those sets.

Also LG has a 98" OLED this year which most mortals won't be able to afford but man that would be an amazing set and a real alternative to a projector setup.

Imagine trying to move it though or even the mechanics of delivery and setup.

Let's see if this really is a good alternative to OLED.

slazev wrote:

Let's see if this really is a good alternative to OLED.

it is OLED.

Yeah, it's just tweaked OLED. It'll be brighter and have a wider color gamut, but it's otherwise still an OLED.

Agree with the others. It is evolutionary, not revolutionary.

I still want a new TV for gaming, but I am probably going to hold out for either OLED to get more affordable or MicroLED. I am glad to see just about every new TV announced this year is claiming to support HDMI 2.1, with many of them including call-outs for supporting VRR and other gaming-focused features. Of course, I bought my current TV because I came home one night to my old TV doing this (and it completely died soon after), so here's hoping that doesn't happen again.

Rykin wrote:

I am glad to see just about every new TV announced this year is claiming to support HDMI 2.1

Well, now that they just rebranded HDMI 2.0 as HDMI 2.1, of course everything is gonna be labeled as HDMI 2.1. You have to look closely to make sure any of them actually are what 2.1 was previously supposed to mean. All the new 2.1 features are now considered "optional" for 2.1 compliance.
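If you're curious (and comfortable poking at raw bytes), the display's EDID will tell you what a port can actually do: real 2.1 bandwidth shows up as a non-zero FRL rate in the HDMI Forum vendor block. Here's a rough Python sketch of that check; it assumes you've already dumped the EDID to a file, and the Linux sysfs path in the comment is just an example (your connector name will differ):

    import sys

    # Max_FRL_Rate values from the HDMI 2.1 spec -> max link bandwidth
    FRL_RATES = {0: "none (TMDS only, i.e. 2.0-class)", 1: "9 Gbps", 2: "18 Gbps",
                 3: "24 Gbps", 4: "32 Gbps", 5: "40 Gbps", 6: "48 Gbps"}

    def max_frl_rate(edid: bytes):
        """Return Max_FRL_Rate from the HDMI Forum VSDB, or None if absent."""
        for ext in range(edid[126]):                  # extension block count
            block = edid[128 + 128 * ext : 256 + 128 * ext]
            if not block or block[0] != 0x02:         # CTA-861 extension tag
                continue
            dtd_start, i = block[2], 4
            while i < dtd_start:                      # walk the data blocks
                tag, length = block[i] >> 5, block[i] & 0x1F
                payload = block[i + 1 : i + 1 + length]
                # Vendor-specific block (tag 3) with the HDMI Forum OUI
                # C4-5D-D8 (stored little-endian) carries the FRL info.
                if tag == 3 and payload[:3] == bytes([0xD8, 0x5D, 0xC4]):
                    return payload[6] >> 4 if length >= 7 else 0
                i += 1 + length
        return None

    if __name__ == "__main__":
        # e.g. on Linux: cat /sys/class/drm/card0-HDMI-A-1/edid > tv.bin
        edid = open(sys.argv[1], "rb").read()
        rate = max_frl_rate(edid)
        if rate is None:
            print("No HDMI Forum block: the port only advertises 2.0-class modes")
        else:
            print("Max FRL rate:", FRL_RATES.get(rate, rate))

A set that markets a port as "2.1" but reports FRL rate 0 (or no HDMI Forum block at all) is exactly the rebadged-2.0 situation that article is complaining about.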

Which is why I put "claiming" in there

Any thoughts on Samsung's The Frame TV? I was debating between a thin LG OLED TV and that for my new house. But damn, that Frame TV looks really good as a piece of art. The TV will be mounted above the fireplace mantel.

IMAGE(https://i.imgur.com/2UTfoV1.png)

Seems kind of gimmicky. As someone who runs seasonal picture screensavers on their TV, I can say there is a big difference between a picture on a wall and a TV showing a picture, due to the amount of light the TV puts out. If it is a nice TV without a big premium and you like it, go for it; it will probably look nice in that spot. Just be clear that it will not look like a picture on the wall, it will look like a TV showing a picture.

I'd also be curious how it works if you hook up another box like a Roku or AppleTV instead of using the built-in apps. Does the art function still work seamlessly?

The idea of the breakout box, so you only have a single cable running to the TV, is nice, but the purpose of that is so you can mount it directly to the wall, as close as possible, like a painting. In your case, where it is over a fireplace, you are probably going to want a regular adjustable mount anyway, so you can tilt it down to be in line with your seating, and the slim mount loses some of its usefulness.

LeapingGnome wrote:

Seems kind of gimmicky. As someone who runs seasonal picture screensavers on their TV, I can say there is a big difference between a picture on a wall and a TV showing a picture, due to the amount of light the TV puts out. If it is a nice TV without a big premium and you like it, go for it; it will probably look nice in that spot. Just be clear that it will not look like a picture on the wall, it will look like a TV showing a picture.

I'd also be curious how it works if you hook up another box like a Roku or AppleTV instead of using the built-in apps. Does the art function still work seamlessly?

The idea of the breakout box, so you only have a single cable running to the TV, is nice, but the purpose of that is so you can mount it directly to the wall, as close as possible, like a painting. In your case, where it is over a fireplace, you are probably going to want a regular adjustable mount anyway, so you can tilt it down to be in line with your seating, and the slim mount loses some of its usefulness.

Thanks for the input. I will not be tilting this TV down. I am mounting it as close to the wall as possible. Was planning to do that with the OLED.

TheGameguru wrote:

https://wccftech.com/hdmi-licensing-...

You can’t make up this kind of dumb.

I used to believe the standardization of physical ports would put an end to certain tech-averse relatives' issues. Now I miss the days when I could just say "plug the cable in where it fits, matching colors where applicable".

dejanzie wrote:
TheGameguru wrote:

https://wccftech.com/hdmi-licensing-...

You can’t make up this kind of dumb.

I used to believe the standardization of physical ports would put an end to certain tech-averse relatives' issues. Now I miss the days when I could just say "plug the cable in where it fits, matching colors where applicable".

As someone who works in AV/computer tech support, I'd say USB-C has been one of the worst things to come along in years, just because it is being used with multiple connection protocols, and for some things that don't comply with any spec and would have used a proprietary connector in the past (the dock connector on the Switch and the Analogue Pocket, for instance). You see a USB-C port and think "great, it is USB-C," but then you have to wonder: is it USB 3? USB 2? Thunderbolt? USB 4? Just a power port with no data? Something else?

HDMI is at least still mostly just an AV standard (even though they added support for networking over HDMI, I have yet to see anything really use it), but the recent 2.0/2.1 shenanigans have muddied the waters. You plug an HDMI device into a monitor or TV and you usually at least get something. Maybe you were hoping for 4K/120Hz and can only get 4K/60Hz, but at least it usually works.

Just a word of caution on Vizio: I purchased a higher-end 65" Vizio four years ago and programmed it to work with the rest of my equipment through the Logitech Harmony app. It worked OK, with hiccups, until recently. In the last six months the TV has had an update, and now it defaults to their crappy media center interface no matter what workarounds I have tried. It refuses to use the last input used, or to do what the Harmony remote tells it to on boot. Vizio just doesn't have those options in their interface.

EvilDead wrote:

Just a word of caution on Vizio: I purchased a higher-end 65" Vizio four years ago and programmed it to work with the rest of my equipment through the Logitech Harmony app. It worked OK, with hiccups, until recently. In the last six months the TV has had an update, and now it defaults to their crappy media center interface no matter what workarounds I have tried. It refuses to use the last input used, or to do what the Harmony remote tells it to on boot. Vizio just doesn't have those options in their interface.

Glad my Vizio of about the same era has an option to default to the previous input. It does jump to the media center if it doesn't detect anything, or something like that, but since I have an always-on Roku and cable hooked up, that very rarely happens.

I bought a 55" Sony X900E a handful of years ago, I remember being disappointed to find that it only had 2 HDMI inputs that supported HDR (one of those being an in input for Arc, so I used that to connect to my sound bar, which of course doesn't have HDR passthrough, negating that input for any use other than Arc). It's cause an annoyance on my end, because since I have only one input for HDR, I have a switcher that my AppleTV, PS5, and UHD player go into.

My question is: is the number of HDR inputs something that is generally changing? It's just a major pain being limited to one input and a switcher that can get wonky and confused about all the HDMI signals it is dealing with. Also, if there is a better setup, I'd be stoked to hear it. I may just not be thinking about things right.

Garth wrote:

I bought a 55" Sony X900E a handful of years ago, I remember being disappointed to find that it only had 2 HDMI inputs that supported HDR (one of those being an in input for Arc, so I used that to connect to my sound bar, which of course doesn't have HDR passthrough, negating that input for any use other than Arc). It's cause an annoyance on my end, because since I have only one input for HDR, I have a switcher that my AppleTV, PS5, and UHD player go into.

My question is: is the number of HDR inputs something that is generally changing? It's just a major pain being limited to one input and a switcher that can get wonky and confused about all the HDMI signals it is dealing with. Also, if there is a better setup, I'd be stoked to hear it. I may just not be thinking about things right.

I checked rtings.com, and even the lousiest model of 2021 has HDR support on all its inputs, so you're probably safe buying a new TV.

Chairman_Mao wrote:
Garth wrote:

I bought a 55" Sony X900E a handful of years ago, I remember being disappointed to find that it only had 2 HDMI inputs that supported HDR (one of those being an in input for Arc, so I used that to connect to my sound bar, which of course doesn't have HDR passthrough, negating that input for any use other than Arc). It's cause an annoyance on my end, because since I have only one input for HDR, I have a switcher that my AppleTV, PS5, and UHD player go into.

My question is: is the number of HDR inputs something that is generally changing? It's just a major pain being limited to one input and a switcher that can get wonky and confused about all the HDMI signals it is dealing with. Also, if there is a better setup, I'd be stoked to hear it. I may just not be thinking about things right.

I checked rtings.com, and even the lousiest model of 2021 has HDR support on all its inputs, so you're probably safe buying a new TV.

Thanks for that. I was looking on rtings as well, but must have just missed it or something. Thanks again.

Chairman_Mao wrote:
Garth wrote:

I bought a 55" Sony X900E a handful of years ago, I remember being disappointed to find that it only had 2 HDMI inputs that supported HDR (one of those being an in input for Arc, so I used that to connect to my sound bar, which of course doesn't have HDR passthrough, negating that input for any use other than Arc). It's cause an annoyance on my end, because since I have only one input for HDR, I have a switcher that my AppleTV, PS5, and UHD player go into.

My question is: is the number of HDR inputs something that is generally changing? It's just a major pain being limited to one input and a switcher that can get wonky and confused about all the HDMI signals it is dealing with. Also, if there is a better setup, I'd be stoked to hear it. I may just not be thinking about things right.

I checked rtings.com, and even the lousiest model of 2021 has HDR support on all its inputs, so you're probably safe buying a new TV.

Although, on a similar topic that came up earlier in the thread: if you want to future-proof/avoid future annoyances, you may want to check for sets that have HDMI 2.1 on all ports.

If you want 120Hz on your game consoles, you'll need an HDMI 2.1 port (see the napkin math at the end of this post), and if you're using a soundbar with eARC, that's going to take up one of them. So on lower- to mid-end TVs (like my U7G) you only get two HDMI 2.1 ports, and one of those is going to be used up by eARC.

Whereas an LG C1 has HDMI 2.1 on every port, but... it also costs twice as much.

Some soundbars can pass through 4K, but only like three or four can pass through 120Hz right now. And a name-brand HDMI 2.1 switcher costs like $200 right now. (But you could roll the dice on an AliExpress knockoff for a fraction of that.)
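As for why 120Hz specifically needs a 2.1 port, here's the promised napkin math, in Python so you can tweak the numbers yourself. The timing values are the standard CTA 4K/120 format, so treat it as a ballpark rather than gospel:

    # Napkin math: why 4K/120 doesn't fit through an HDMI 2.0 port.
    h_total, v_total = 4400, 2250   # 3840x2160 active plus blanking
    refresh_hz = 120
    bits_per_pixel = 30             # 10-bit RGB / 4:4:4

    pixel_clock_hz = h_total * v_total * refresh_hz
    raw_gbps = pixel_clock_hz * bits_per_pixel / 1e9

    print(f"Pixel clock: {pixel_clock_hz / 1e6:.0f} MHz")  # ~1188 MHz
    print(f"Raw video:   {raw_gbps:.1f} Gbps")             # ~35.6 Gbps

    # HDMI 2.0 TMDS tops out at 18 Gbps on the wire (about 14.4 Gbps of
    # actual video data after 8b/10b encoding), so 4K/120 needs the
    # HDMI 2.1 FRL link modes, which scale up to 48 Gbps.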

Okay, good to know. Thank you!

On a different note, I think I managed to remedy an issue that I mentioned in my previous post. I noticed that my TV would randomly change inputs (usually at really inopportune gaming times) to my 3rd input (the ARC input where my sound bar is connected). I thought it was an issue with my ancient Vizio sound bar, but then I was reminded that I have my Switch plugged into the Vizio's single HDMI input. After switching it to a different HDMI input on my TV, it continued to happen. My Switch is randomly trying to take control of the TV (even when it's not in use). I suspect turning off CEC will remedy this.
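For the curious, that input-grab is literally a single CEC broadcast. A little illustrative Python showing the kind of frame a source device puts on the bus (the addresses below are made-up examples, not sniffed from an actual Switch):

    # "Active Source" (opcode 0x82) is the CEC broadcast that tells every
    # display on the bus to switch to the sender's HDMI input.
    initiator = 0x4          # logical address 4 = "Playback Device 1"
    destination = 0xF        # broadcast to everything on the bus
    opcode = 0x82            # Active Source
    phys_addr = 0x1000       # physical address 1.0.0.0 = HDMI input 1

    frame = bytes([(initiator << 4) | destination, opcode,
                   phys_addr >> 8, phys_addr & 0xFF])
    print(frame.hex())       # 4f821000

    # Disabling CEC on either end means this frame is never sent (or is
    # ignored), which is why it should stop the random input switching.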

ccoates wrote:

Some soundbars can passthrough 4K, but only like three or four can passthrough 120hz right now. And a name brand HDMI 2.1 switcher costs like $200 right now. (But you could roll the dice on an Ali Express knockoff for a fraction of that.)

Can anyone confirm whether this thing from the article, where every 2.0 port can now just be called 2.1, has actually happened with TVs? That could cause some issues.

mrtomaytohead wrote:
ccoates wrote:

Some soundbars can passthrough 4K, but only like three or four can passthrough 120hz right now. And a name brand HDMI 2.1 switcher costs like $200 right now. (But you could roll the dice on an Ali Express knockoff for a fraction of that.)

Can anyone confirm whether this thing from the article, where every 2.0 port can now just be called 2.1, has actually happened with TVs? That could cause some issues.

Yup

Yes, and yes. Make sure any TV you are looking at has an rtings.com review, as they will let you know exactly how far that TV goes in its 2.1 implementation.

Even LG OLEDs are just now finally getting the full 48Gbps bandwidth.

There is currently a local estate auction that has a 65" LG smart TV, model OLED65B6P, up for bid. From the pictures, it was manufactured in mid-2016, and judging from the pictures of the house, and where this TV is wall-mounted, I'd guess the LG was not the daily TV; a smaller LCD near the kitchen/dining area likely was instead. The owners of the house were clearly older, based on their decor and furnishings.

So I'd guess that the TV hasn't had a ton of use, but I also can't rule out that they left the screen on FOX 24/7.

Sadly, an in-person preview is not possible, so besides knowing the set works, that's all I have to go on.

Anyone care to guess at the overall reliability of an LG OLED of that vintage?

I'm not going to pay 500 bucks for it if the bidding gets that high, but I'd be really tempted in the $250 to $300 range.

Even with only rare use, the panels from that generation of OLED don't age all that well.

Mine is in my bedroom now and only gets very rare use, but I would not recommend buying it at any price really.

Decent TVs are relatively cheap now; I just can't see paying for one like this that almost definitely has a news logo forever burned into the corner.

Mine still looks perfect when I first turn it on, and as long as I keep watching full-screen content, but if I watch something letterboxed and then switch to full-screen content, there are two very clear hard lines and shifts in brightness between where the widescreen picture was displayed and where the letterbox bars were.

And while it has no burn-in because I don't watch news channels or abuse my screens, it is susceptible to image retention after fairly brief periods of time. Playing games on it would be terrible, for example.

Awesome! That is exactly the kind of anti-enabling I was looking for. I'm perfectly happy with my budget 4K set, plus the auction is already at $120 with a week to go, pretty much past the nothing-to-lose price point.