Originally Posted by mwardy:
“But surely you are not saying that an encoder tuned for high bitrates will produce a worse picture than one tuned for low bitrates? I'd have thought that having more bandwidth would (when handled appropriately) produce better results every time, wouldn't it?”
YES - a higher bitrate can sometimes mean a worse picture. And to "set things up appropriately" means delving into the algorithm to set the points at which toolkit changes should take place - and that is NOT an easy job.....
A leading IPTV codec is abysmal on H.264 at the same rate as good(ish) MPEG-2 ..... it is unwatchable.... Halve the bit rate and it produces a beautiful picture !!!
There is also the matter of keeping the quality more or less constant from one small group of frames to the next (ie. sub-GOP) - because there is nothing so annoying as the picture going from pin sharp to fuzzy .... (perhaps that is putting it too strongly) - but some codecs "breathe" more than others on certain material ....
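That "breathing" can be put on a rough footing by looking at how much a per-frame quality metric swings, not just its average. Here is a minimal sketch: the PSNR formula is standard, but the two encoders and their per-frame error numbers are invented purely for illustration - a real measurement would come from comparing decoded frames against the source.

```python
import math

def psnr(mse, peak=255.0):
    """PSNR in dB for 8-bit video, from a per-frame mean-squared error."""
    return 10.0 * math.log10(peak * peak / mse)

# Hypothetical per-frame MSE values (made-up numbers for illustration).
encoder_a_mse = [4.1, 4.3, 4.0, 4.2, 4.1, 4.3]    # steady quality
encoder_b_mse = [2.0, 2.1, 9.5, 10.2, 2.2, 9.8]   # "breathing"

def quality_stats(mse_list):
    """Mean PSNR and frame-to-frame spread (max - min) in dB."""
    db = [psnr(m) for m in mse_list]
    return sum(db) / len(db), max(db) - min(db)

for name, mses in (("A", encoder_a_mse), ("B", encoder_b_mse)):
    mean, spread = quality_stats(mses)
    print(f"encoder {name}: mean {mean:.1f} dB, spread {spread:.1f} dB")
```

Encoder B may even have the better *average*, yet the several-dB swing between adjacent frames is what the eye actually notices.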
If all of this stuff was easy many people would be out of a job ....
remember: all the specification of a codec does is define a stream which a compliant decoder can decode - it says NOTHING about quality.. so, say, 10 Mbit/s from one manufacturer is very different from, say, 10 Mbit/s from another.... (and may be 9 or 11 Mbit/s from the same unit)
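Even the "10 Mbit/s" label depends on the window you measure it over. A toy sketch with invented per-frame sizes: the clip averages exactly 10 Mbit/s overall, yet any given one-second window runs at 9 or 11 Mbit/s.

```python
FPS = 25  # assumed frame rate for this toy stream

# Hypothetical per-frame sizes in bits: one second of heavy frames,
# one second of light frames, repeating for ten seconds in total.
frame_bits = ([440_000] * FPS + [360_000] * FPS) * 5

total_bits = sum(frame_bits)
duration_s = len(frame_bits) / FPS
print(f"whole-clip average: {total_bits / duration_s / 1e6:.1f} Mbit/s")

# One-second windows tell a different story.
window_rates = []
for start in range(0, len(frame_bits), FPS):
    window_rates.append(sum(frame_bits[start:start + FPS]) / 1e6)
print("per-second rates (Mbit/s):", window_rates)
```

So two units both badged "10 Mbit/s" can behave quite differently depending on how tightly (and over what window) each holds its rate.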
hence a lot of work needs to be done to select a codec if you are after the best performance... and that is without even thinking about stat muxing ....
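For what stat muxing adds to the problem, here is a minimal sketch of the core idea only: a fixed pool of capacity is shared between channels in proportion to a complexity estimate each encoder reports. The channel names, pool size, and complexity numbers are all invented, and a real multiplexer would also apply per-channel floors and ceilings and react over time.

```python
POOL_MBITS = 36.0  # assumed total multiplex capacity (illustrative)

# Hypothetical instantaneous complexity estimates from each encoder.
complexities = {"sport": 8.0, "news": 2.0, "film": 5.0, "cartoon": 1.0}

total = sum(complexities.values())
allocation = {ch: POOL_MBITS * c / total for ch, c in complexities.items()}

for ch, rate in allocation.items():
    print(f"{ch}: {rate:.2f} Mbit/s")
```

The hard part in practice is not this arithmetic but getting honest, comparable complexity estimates out of different encoders - which loops straight back to the point that two encoders at the same rate are not the same thing.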