Originally Posted by phelings:
“If it was as simple as just reducing flicker, 100Hz WOULD look better. Unfortunately most 100Hz sets demonstrate motion blur, lack of detail etc.
This is picked up in most hardware mag reviews, and is also why many 100Hz sets have many picture processing tricks to counteract the problems, and why some 100Hz sets, incredibly, allow you to revert to 50Hz. There's a good advert for 50Hz if ever I saw one”
No, you're missing my point... 100Hz doesn't change the picture in any way, but a lot of sets come bundled with processing tricks... it's those that cause the motion blur and lack of detail. A lot of the time these can be turned off, but people don't. In fact on my set 50Hz can only be selected with them on, and aside from that, sharpness, noise reduction and automatic image contrast are separate settings available in all modes. Noise reduction alone is a BIG culprit for bad image quality: people turn it on because their reception is below par, and it can totally mush a picture, especially when combined with a sub-par digital source like Sky, which is heavily compressed already.
50Hz isn't difficult to achieve seeing as it's native, and it's required for some equipment to work with TVs, like some peripherals for games for example... there's no conspiracy behind its inclusion.
So if the problem is simply jumping up the Hz, are you saying NTSC pictures are 20% worse on an NTSC DVD because the screen has switched to 60Hz? Are your NTSC DVDs all blurry? Does my computer look worse when viewed on screen at different refresh rates? No...
Forget magazines: visit a store, ask them to play one of your favourite DVDs on the same player across a few sets, turn the sound down, fiddle with the controls and pick the one you like. Don't rely on magazines... there are plenty of reasons why they shouldn't make the choice for you.