I could be wrong, but my thoughts are that unless you had a screen capable of supporting a 1080 native resolution, you probably wouldn't see much difference at all.
It depends on the source. If you were looking at static images there would be no difference; with films (1080p24 original) there would be little difference if the deinterlacing is well done. On the other hand, sport shot at 1080p50 will show a massive difference.
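A toy sketch of why the source matters (hypothetical illustration, not any broadcaster's actual pipeline): with film, both fields of an interlaced frame come from the same instant, so weaving them back together restores the frame exactly; with 50-fields-per-second sport, adjacent fields are different moments in time, so no clean frame exists and you get combing unless the deinterlacer works hard.

```python
# Toy model: a tiny "image" of a moving dot, split into interlaced fields.

def make_frame(position, height=8, width=16):
    """A height x width frame with a single bright dot at column `position`."""
    return [[1 if x == position else 0 for x in range(width)] for _ in range(height)]

def fields_from_frame(frame):
    """Split one progressive frame into odd/even line fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Recombine two fields by interleaving their lines."""
    out = []
    for t, b in zip(top, bottom):
        out.append(t)
        out.append(b)
    return out

# Film: both fields come from the same frame -> weaving restores it exactly.
film = make_frame(position=5)
top, bottom = fields_from_frame(film)
assert weave(top, bottom) == film

# Sport at 50 fields/s: each field is a different instant; the dot has moved.
f1 = make_frame(position=5)
f2 = make_frame(position=7)            # 1/50 s later
top, _ = fields_from_frame(f1)
_, bottom = fields_from_frame(f2)
combed = weave(top, bottom)
assert combed != f1 and combed != f2   # "combing": no single clean frame exists
```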
The biggest sham is that interlacing is used at all; now is a perfect opportunity to be rid of interlacing for everything except maintaining backward compatibility with SD broadcasting.
Watching Sky Sports in HD in a shop the other day put me right off! The interlacing effect seen as people ran was rubbish. It was like watching a poorly converted home video.
Played some HD AVI files from Microsoft's website via a PC, and they looked awesome... but Sky HD wasn't a patch on them.
The biggest sham is that interlacing is used at all; now is a perfect opportunity to be rid of interlacing for everything except maintaining backward compatibility with SD broadcasting.
I absolutely agree with this. Why, when the vast majority of HD displays and HD sources are naturally progressive, have interlacing?
Economics and technology.
It would be great if industries could jump over those pesky little steps required to get R&D onto the market as the ultimate finished product, but in the real world that is rarely possible.
Interlaced is cheaper for the broadcasters and distributors of domestic television all over the world, and it allows legacy hardware to be of some use during the years required to move to 100% HD production.
Your best bet is to wait a few years and let the market mature, then buy into broadcast HD; otherwise you have to put up with products that do a certain job at a certain price point.
An example of such a screen would be 1920x1080.
Played some HD AVI files from Microsoft's website via a PC, and they looked awesome... but Sky HD wasn't a patch on them.
If you have a decent TV, and your regular Sky Box is connected with a quality RGB cable, then the viewing is fantastic.
Given a certain available data rate, you can either do a lower-resolution picture in progressive mode or a higher-resolution picture using interlace.
For example, say it costs £1m per year to carry a 1080p channel; a 1080i channel would cost approximately £0.5m per year, since each interlaced field carries only half the lines of a full frame.
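A rough back-of-envelope check of that halving (hypothetical figures for uncompressed luma only, ignoring chroma, blanking, and the fact that compression doesn't scale linearly with raw data rate):

```python
# Uncompressed luma data rates for 1080p50 vs 1080i50.
width, height, rate, bits = 1920, 1080, 50, 8

p50 = width * height * rate * bits          # 1080p50: 50 full frames per second
i50 = width * (height // 2) * rate * bits   # 1080i50: 50 half-height fields per second

print(p50 / i50)  # -> 2.0: interlace carries half the raw data
```

In practice the cost gap after compression is smaller than 2x, but the basic trade-off stands: interlace buys resolution at a given bit rate by halving the vertical information per field.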
I'm not technical, but the above doesn't sound right?