(Sorry for the length - skip to the bottom for the actual question.)
I recently picked out one of the new 50-inch 1080p Samsung DLP HDTVs for my mom because apparently my brother (who is 24 and still lives at home) needs to have it to play Halo 3.
It got delivered, and after hooking everything up we ran into problems. My brother would plug in the Xbox, and after a couple of minutes of playtime the screen would freeze - the game was still running and the sound still worked, but the picture was stuck as pixelated static, no matter which input I switched to.
If I unplugged the TV, waited a minute or so, and hooked everything back up, it would work again. So I went into the Xbox display settings to play around, and noticed it was defaulting to 1080i. I was about to switch it to 720p, but figured "hey, what the hell" when I noticed 1080p was one of the options. Fired up Halo 3, and it worked like a charm for the 30 minutes or so I played it, and presumably will continue to do so.
So I figured the problem was with the Xbox's output, and turned on the DVD/receiver combo we picked up with the TV. I read through the manual and noticed there's an HDTV setting on the DVD player that lets you cycle through output resolutions, up to 1080i. I kicked it up to 1080i, popped in a DVD, and after about 2 minutes the picture froze. Exact same problem. I unplugged everything, plugged it back in, switched the DVD player down to 720p, and it worked fine.
So... why is 1080i breaking my new TV?