Still no 1080p? And future 4K
[Deleted User]
Just got Sky again after a couple of years and I'm surprised there is no 1080p option, given that there are 4K tests now happening on Astra 19 and Hotbird. Is it a hardware or bandwidth issue? I have a 4K TV so can't wait for the tests that the BBC are doing. Hopefully we get the hardware and a channel by the end of the year.
Comments
Netflix currently has Breaking Bad in 4K as long as you have a compatible TV and your broadband can support at least 25 Mbps.
Don't get me started on LG using Netflix to mis-sell this 2013 model TV, saying it does support Netflix 4K when it doesn't. Hopefully we get it soon; it's only just come out for the 2014 models.
Most things put onto Blu-ray (TV series) are converted from interlaced to progressive too.
And anyway, 4K is entirely different, firstly because it's a higher resolution, whereas 1080p is just a form of image display.
They 'used to'; the BBC absolutely crippled their bitrate a number of years ago, going from the best quality to the poorest overnight (it's widely assumed that this was due to the launch of Freeview HD, to stop satellite looking so much better). After huge numbers of complaints the bitrate was increased slightly, but nowhere near the quality it had been previously.
But as for your original question, there's no 1080P because there's no point: 1080i is the exact same resolution, and if it originates from a 1080P source it could well give exactly the same results.
As what the broadcasters will emit has not been agreed yet, and will not be for a year or two, I would not hold out too much hope.
Things have not changed that much since this was recorded at IBC last year:
http://www.youtube.com/watch?v=QPrNhWcjW4c
BT.2020 has been amended to add 100 and 120 Hz, which are probably the minimum frame rates for TV, but there is still debate on whether 10 bit is enough - because, as shots from Brazil have shown (even in UHD), there is not the dynamic range available... even in 10 bits! So probably 16-bit capture and 12-bit emission may be needed....
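To put rough numbers on the "is 10 bit enough?" question, here's a quick back-of-envelope sketch. It assumes a pure log encoding spread evenly over the captured range, and the 14-stop figure is my own illustrative assumption, not anything from the standards:

```python
# How many code values per stop of dynamic range do you get if
# 2**bits levels are spread evenly (log encoding) over N stops?
# The 14-stop range is an illustrative assumption.

def codes_per_stop(bits: int, stops: float) -> float:
    """Code values available per stop of dynamic range."""
    return 2 ** bits / stops

for bits in (10, 12, 16):
    print(f"{bits}-bit over 14 stops: {codes_per_stop(bits, 14):.0f} codes/stop")
```

Few code values per stop means visible banding in the shadows, which is why 10 bits starts to look thin once you stretch the dynamic range.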
Some of us are old enough to remember the IBC of 1990 "the year of HD" ....
but it was only 12 years later that there were early systems available in the home, and then say 2008 before HD was the norm for new installations, whether in the home or earlier in the broadcast chain.
So taking the first international UHD2 transmission as a good starting point for "the year of UHD" - that was 2008
- so working systems, even if technology is moving faster, are still a few years off -
And the broadcasters want to gain a wow factor and not repeat the short cuts which were made as they moved to HD (like gamma (this was only standardised in 2011), 100/120 Hz and fractional frame rates).
Just two things - is there a TV display that you can buy, even professional, which is UHD1, 12-bit, 100 Hz, BT.2020 gamut???
or a TV camera (not the excellent Sony F55 or similar large-sensor film cameras being used at the moment) which has a similar spec... let alone anything in the middle!!!!
Remember that UHD will come in on premium material - like sport, where vision mixing is required in real time, unlike film/documentary where mixing/rendering is not in real time,
so there will need to be hardware working on video streams of around 100 Gbit/s... somewhat different from the 1.5, 3 or 6 Gbit/s now.
(And sport requires a higher frame rate - so stadium illumination may need to be changed!)
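For anyone wondering where figures like 100 Gbit/s come from, here's a rough uncompressed bit-rate calculation. The J:a:b subsampling arithmetic is standard; real SDI payloads differ a little because of blanking and ancillary data, so treat these as estimates:

```python
# Approximate uncompressed Y'CbCr bit rates for various formats.

def uncompressed_gbps(width, height, fps, bits, subsampling=(4, 2, 2)):
    """Approximate uncompressed bit rate in Gbit/s for J:a:b sampled video."""
    j, a, b = subsampling
    # In a J-wide, 2-row block there are 2*J luma samples, plus (a + b)
    # chroma samples in each of the two chroma planes.
    samples_per_pixel = (2 * j + 2 * (a + b)) / (2 * j)
    return width * height * fps * bits * samples_per_pixel / 1e9

hd_1080p50 = uncompressed_gbps(1920, 1080, 50, 10)    # ~2.1 Gbit/s (3G-SDI territory)
uhd2_100hz = uncompressed_gbps(7680, 4320, 100, 12)   # ~80 Gbit/s uncompressed
```

So a 12-bit, 100 Hz UHD2 stream really is in the region of 100 Gbit/s once you add overheads - a long way from today's 1.5/3/6 Gbit/s SDI links.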
And what sound system is to be used??? It needs to be immersive!
and adding objects to a stereo mix may be a good way of delivering it, as it means that the home loudspeakers can be put anywhere and still give you the sound field the director seeks (and it integrates AD and speaker enhancement).
All of these reports show where the industries are going to get things standardised so that there is interoperability: https://www.smpte.org/standards/reports
Bit rate is not a very good measure now -
and I think that you will find that the distortion introduced by the coders that the BBC use now is less than it was with their first high bit rate on very early coders.
(yes, they did push things a bit as they made the progression to better coders - the PSNR measurements, which did not reflect what happened on crossfades, were one example)
And Siemens/Atos have been upgrading the algorithms with no public announcements or comment - as the pictures have got better.
Remember that some of the issues were down to naff material - usually material which had had a decode/recode to finish it... but the BBC sorted that out
and as FD day comes there is a lot more knowledge about getting the workflow to retain the image and sound quality. See this: http://www.bbc.co.uk/academy/technology/article/art20140416102447779
We've been through this before and you still keep repeating this incorrect statement. After de-interlacing the resolution of a 1080i source is reduced and you do have to de-interlace to display on any modern TV.
Probably most people will not notice the difference between 1080i/25 and 1080p25 transmissions and that is the most important thing but repeating something which is incorrect is not helpful. 1080p50 of course is a different matter.
Of course if the source is not interlaced and the TV detects this and acts accordingly then the whole thing is progressive whatever the transport mechanism so it doesn't matter if it is i or p.
Is this correct? I'm not saying it isn't because I don't know so I'm asking the question.
Well they should, since one has twice the frame rate of the other once deinterlaced. Hugely obvious difference between the two formats.
Correct.
We've been through it before, and you continuing to claim otherwise doesn't mean it's true
Only 'incorrect' in your opinion, and as you're fully aware, people can't see the difference between 1080i and 1080P on the Freeview broadcasts - no one would even know about them if the OSD didn't show it
But it's 'helpful' in any case, as it's informing the confused who have this strange idea that 1080P is going to be better, when it may well be no different whatsoever.
I fully agree that 1080P50 is a different story, and 1080P25 is a waste of time - just confusing people.
My point exactly - i or P IS the transport mechanism; depending on the exact method of interlacing it could well be identical. But in any case, there's no practical evidence that 1080P25 is any better than 1080i50.
That is true but the fact is that 1080i sourced material will lose resolution* when de-interlaced, not because I say so but because that is the fact. Anyone can find this out by researching de-interlacing methods on the web.
*Of course it will end up as 1920x1080 on a full HD screen so the pedant can say it is the same resolution after all processing but that same argument could also be applied to 576i.
De-interlacing 576i also gives you the same result. Yes you have to up-scale but then you also have to up-scale 1080i. Clearly I was referring to the intermediate processing that reduces the resolution before it is then up-scaled to fit the screen size (obviously it's more complex than simple up-scaling but the principle is there).
Agreed, 1080p25 would not be very suitable for much viewing, such as football, which is why we have 1080i/25. Really, because of the limitations in bandwidth preventing early adoption of 1080p50, we should have had 720p50 as the standard - but of course 720 is a smaller number than 1080, so we couldn't have that.
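A quick sum shows why 720p50 was a sensible alternative to 1080i - the raw sample throughputs are close (chroma ignored here for simplicity):

```python
# Raw luma-sample throughput of the broadcast formats discussed above.
# 1080i/25 delivers 50 fields/s, i.e. the line count of 25 full frames.

def pixels_per_second(width, height, frames_per_second):
    return width * height * frames_per_second

rates = {
    "720p50":  pixels_per_second(1280, 720, 50),    # 46.08 Mpixel/s
    "1080i25": pixels_per_second(1920, 1080, 25),   # 51.84 Mpixel/s
    "1080p50": pixels_per_second(1920, 1080, 50),   # 103.68 Mpixel/s
}
```

720p50 and 1080i/25 are within about 12% of each other in raw samples, while 1080p50 needs exactly double what 1080i/25 does - which is the bandwidth problem that kept it out of the broadcast standards.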
As all sensors, like displays, are now inherently progressive:
So do you expose every 20 ms and throw away the other half of the data each time... which gives you 1080i with correct temporal resolution at 25 frames a second.
This can be linearly spatially interpolated to get 1080p25... with output frames offset by at least a frame.
To add spatial and temporal interpolation to get 1080p50 you require an offset of at least one and a half frames, but to get the motion vectors right it may be more frames... all of which takes processing resources and may well not get it as good as material captured progressively from the sensor.
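The field capture and deinterlacing trade-offs being argued over can be sketched in a few lines. This is shapes only - real deinterlacers are motion-compensated and far more sophisticated, and the function names are just illustrative:

```python
import numpy as np

def capture_field(progressive_frame, field_index):
    """Keep alternate lines: even lines for even fields, odd for odd
    (interlaced capture from a progressive sensor)."""
    return progressive_frame[field_index % 2 :: 2, :]

def weave(top_field, bottom_field):
    """Interleave two fields into one frame: full vertical resolution on
    static scenes, combing artefacts on motion; 50 fields -> 25 frames."""
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]),
                     dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

def bob(field):
    """Line-double a single field: keeps the 50 Hz motion, but vertical
    resolution is halved - the 'lost resolution' discussed above."""
    return np.repeat(field, 2, axis=0)
```

Weaving the two fields of a static frame reconstructs it exactly; bobbing keeps the temporal rate but throws away half the lines. That's exactly the i/p argument in miniature.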
Some say that the set manufacturer bullied the broadcasters into going with interlaced HD ....
But some broadcasters of a fractional-frame-rate persuasion were wanting interlace in UHD!!!!!!!
I've just got a new one that does support it but I'd have been very annoyed if I'd bought one of the first generation of these. Not to rub it in but stuff does look great in 4k - but the upscaling from 1080 is also out of this world.
And this doesn't even take into account the fact that the chroma channel always needs upscaling even with a 1080p signal...
Yes, as a lot is filmed in interlaced, and you'd be hard pushed to find a Blu-ray that is interlaced and not progressive. Happy to be corrected though.
What chroma channel?, it's all digital.
No that's nonsense, there's plenty of 1080i stuff on BD. Hell, BD doesn't even support 1080p/25 natively, so anything that's 1080i/25 or 1080p/25 will be encoded as 1080i/25 (as with satellite). BD also doesn't support 1080p/50.
Films are obviously 1080p/24 but they are not filmed as interlaced.
Luma and chroma are still encoded separately in digital. Almost all consumer video is YCbCr 4:2:0, meaning that for a 1920x1080 video the luma channel is 1920x1080 but the chroma is only encoded at 960x540 resolution. It therefore has to be upscaled to 1920x1080 at some point before your screen displays it (since screens need RGB rather than luma/chroma). On a PC this conversion to RGB usually takes place before the signal is sent via HDMI to the display; other devices usually let the TV do this final conversion.
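That chroma upscaling step looks something like this. I've used nearest-neighbour repetition for simplicity - real decoders normally use bilinear or better filters - and the plane names are just illustrative:

```python
import numpy as np

def upsample_chroma_420(cb, cr):
    """Double each half-resolution 4:2:0 chroma plane in both directions
    (e.g. 960x540 -> 1920x1080) to match the luma plane, using simple
    nearest-neighbour repetition."""
    double = lambda plane: np.repeat(np.repeat(plane, 2, axis=0), 2, axis=1)
    return double(cb), double(cr)

# Half-resolution chroma planes for a 1920x1080 picture:
cb = np.zeros((540, 960), dtype=np.uint8)
cr = np.zeros((540, 960), dtype=np.uint8)
cb_full, cr_full = upsample_chroma_420(cb, cr)   # both now 1080x1920
```

Only after this do you have three full-resolution planes to convert to RGB, which is why even a "1080p" signal always involves some upscaling under the hood.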
Like i said, happy to be corrected. Guess you missed that part!!
I got a new second-generation 4K TV as well and it makes Sky HD really stand out. It upscales everything, so I am not losing out by watching HD and not 4K. Got Netflix, and Breaking Bad just looks superb in 4K, and it's streaming over WiFi with no buffering. Sorry to sound like I'm gloating, but it does make a difference.
More than happy, just zero need for the attitude, but anyway......
On good TVs or equipment, 1080i25 and 1080p25 are identical regardless of the box's output mode.
It's a shame we never got 1080p50 or 1080p60 in a broadcast form. It looks like official UHD standards will include a frame rate boost as well as an increase in colour depth, which will be very nice and needed; after all, on resolution alone a good percentage of people will see little or no difference in the picture, bar a reduction in size or absence of encoding artefacts.
On the few rare unmolested bits of 4K demo I've seen, it looks great. However, most demo material looks unnatural: over-sharpened, colour-manipulated, fake crap. Not helped by 4K TVs being LCD, and LCD being inherently unable to hold motion detail anywhere close to the screen's resolution. 4K or UHD DLP projectors look like the only consumer-level technology that may be able to bring all the available motion detail in 4K to the home.
I guess Panasonic baulked at the potential costs of 4K plasma TVs - potentially making high-end TVs that are either hideously expensive, or lose excessive money, in order to actually sell any.