Full HD or HD Ready: Is there any difference?


Comments

  • bobcar Posts: 19,424
    Forum Member
    ✭✭
    Mythica wrote: »
    Yes it can. The resolution of 1080i is 1920 x 1080, so the full image is shown.

    You clearly don't understand what is going on here. The TV cannot display 1080i as filmed any more than the HD Ready set can. After de-interlacing the resolution can go down to as low as 540 vertical pixels; with a decent TV it will be better than this, but it is not pixel by pixel.

    Now of course it can display the picture, but it is not in its original form, so your picky point in reply to the earlier post "Which it can't. Yes it can input at 1080i/p. But it is not showing at 1080i/p." also applies to a full HD set displaying 1080i; only an interlaced display can do this.

    The moral is that if you want to be picky, be sure of your ground first. I would add that when you get caught out you should just hold your hands up, admit it and move on.
  • Mythica Posts: 3,808
    Forum Member
    ✭✭✭
    Take a look at the table here.

    http://en.wikipedia.org/wiki/HD_ready

    Do I take it you can't watch 576i on your full-HD TV, as it doesn't have 576 lines? Can an HD-Ready display watch a 720p signal? (TVs with 720 lines are as rare as hen's teeth.)

    You clearly know what I mean. A TV with a resolution of 720p (1280 x 720) cannot and never will be able to display 1080i. 1080i has a resolution of 1920 x 1080.
  • Mythica Posts: 3,808
    Forum Member
    ✭✭✭
    bobcar wrote: »
    You clearly don't understand what is going on here. The TV cannot display 1080i as filmed any more than the HD Ready set can. After de-interlacing the resolution can go down to as low as 540 vertical pixels; with a decent TV it will be better than this, but it is not pixel by pixel.

    Now of course it can display the picture, but it is not in its original form, so your picky point in reply to the earlier post "Which it can't. Yes it can input at 1080i/p. But it is not showing at 1080i/p." also applies to a full HD set displaying 1080i; only an interlaced display can do this.

    The moral is that if you want to be picky, be sure of your ground first. I would add that when you get caught out you should just hold your hands up, admit it and move on.

    I clearly do.

    I haven't been caught out as you put it.

    The resolution of 1080i and 1080p is the same, which is 1920 x 1080; they are just delivered to the screen differently. If the resolution is the same then the image is going to be there regardless of how it is delivered to the TV; the end outcome is always going to be 1920 x 1080, as it has to be for the image to be on the screen. So of course a 1080p TV will do this, and of course a 720p TV will not do this, as it does not have enough pixels.
  • bobcar Posts: 19,424
    Forum Member
    ✭✭
    Mythica wrote: »
    I clearly do.

    I haven't been caught out as you put it.

    The resolution of 1080i and 1080p is the same, which is 1920 x 1080; they are just delivered to the screen differently. If the resolution is the same then the image is going to be there regardless of how it is delivered to the TV; the end outcome is always going to be 1920 x 1080, as it has to be for the image to be on the screen. So of course a 1080p TV will do this, and of course a 720p TV will not do this, as it does not have enough pixels.

    Let's be clear what was said.

    Some people said that a plain HD ready TV could display a 1080i input. You said they couldn't because the resolution wasn't there.

    Basically you were being picky so I in turn pointed out that by the same token a full HD TV couldn't display a 1080i image (taken from an interlaced source) either. By your picky interpretation of display it cannot.

    The bit I highlighted in your post is incorrect and I think shows where your lack of understanding arises. A TV does not take the interlaced halves and put them together to form a 1920x1080 image as per the signal (unless the source was naturally progressive, when it may do, which is why I put the proviso of naturally interlaced in there). At its worst the TV may actually display 1920x540 (vertical pixels doubled up) after de-interlacing, whereas an actual interlaced display can show 1920x1080.

    To summarise: an HD Ready set will take the 1080i input signal and process it to display it, as it cannot display exactly what is in the signal; a full HD set will take the 1080i signal and process it to display it, as it cannot display exactly what is in the signal (well, it could, but it doesn't because the results would be horrendous).
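    As a rough illustration of that de-interlacing point, here is a minimal Python sketch of "weave" versus "bob" de-interlacing; it is not how any particular TV's processing actually works, just the idea of where the vertical detail goes:

    ```python
    import numpy as np

    # Toy 1080i frame: two 540-line fields, captured at different instants in time.
    HEIGHT, WIDTH = 1080, 1920
    top_field = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)     # lines 0, 2, 4, ...
    bottom_field = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)  # lines 1, 3, 5, ...

    def weave(top, bottom):
        """Interleave the two fields into one 1920x1080 frame.
        Only faithful if both fields came from the same instant (a progressive source)."""
        frame = np.empty((HEIGHT, WIDTH), dtype=np.uint8)
        frame[0::2] = top
        frame[1::2] = bottom
        return frame

    def bob(field):
        """Line-double a single 540-line field to 1080 lines.
        This is the worst case described above: 1920x1080 pixels on screen,
        but only 540 lines of real picture information."""
        return np.repeat(field, 2, axis=0)

    print(weave(top_field, bottom_field).shape)  # (1080, 1920)
    print(bob(top_field).shape)                  # (1080, 1920) - same pixel count, half the vertical detail
    ```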
  • Mythica Posts: 3,808
    Forum Member
    ✭✭✭
    bobcar wrote: »
    Let's be clear what was said.

    Some people said that a plain HD ready TV could display a 1080i input. You said they couldn't because the resolution wasn't there.

    Basically you were being picky so I in turn pointed out that by the same token a full HD TV couldn't display a 1080i image (taken from an interlaced source) either. By your picky interpretation of display it cannot.

    The bit I highlighted in your post is incorrect and I think shows where your lack of understanding arises. A TV does not take the interlaced halves and put them together to form a 1920x1080 image as per the signal (unless the source was naturally progressive, when it may do, which is why I put the proviso of naturally interlaced in there). At its worst the TV may actually display 1920x540 (vertical pixels doubled up) after de-interlacing, whereas an actual interlaced display can show 1920x1080.

    To summarise: an HD Ready set will take the 1080i input signal and process it to display it, as it cannot display exactly what is in the signal; a full HD set will take the 1080i signal and process it to display it, as it cannot display exactly what is in the signal (well, it could, but it doesn't because the results would be horrendous).

    Or, 1080i has a resolution of 1920 x 1080 so it fits onto a 1080p TV. The ins and outs of how it gets there are not what I'm arguing; the fact I'm arguing is that the resolution will always be 1920 x 1080. This cannot happen for an HD Ready set, as there aren't enough pixels to display 1080i.
  • bobcar Posts: 19,424
    Forum Member
    ✭✭
    Mythica wrote: »
    Or, 1080i has a resolution of 1920 x 1080 so it fits onto a 1080p TV. The ins and outs of how it gets there are not what I'm arguing; the fact I'm arguing is that the resolution will always be 1920 x 1080. This cannot happen for an HD Ready set, as there aren't enough pixels to display 1080i.

    There are enough pixels to display 1080i but it can't display it because it's not an interlaced display. The pixels on the screen are not all the same as the pixels in the signal.

    Yes you can say the output resolution is 1920x1080 because it is but it would be the same if you fed the TV 576i.
  • Mythica Posts: 3,808
    Forum Member
    ✭✭✭
    bobcar wrote: »
    There are enough pixels to display 1080i but it can't display it because it's not an interlaced display. The pixels on the screen are not all the same as the pixels in the signal.

    Yes you can say the output resolution is 1920x1080 because it is but it would be the same if you fed the TV 576i.

    No there isn't, quite clearly.
  • grahamlthompson Posts: 18,486
    Forum Member
    ✭✭
    Mythica wrote: »
    No there isn't, quite clearly.

    You have an unbelievably naive belief that every single pixel recorded by the original camera is transmitted exactly as it was recorded. Nothing could be further from the truth. High definition TV is only possible within the limits of available broadcasting bandwidth by using a lossy compression system (H.264/AVC for HD broadcasting). This relies on a system that transmits one full frame (an I frame) and subsequent frames with only difference information, in a related data structure known as a Group Of Pictures (GOP). The accuracy of the reconstruction of the original data depends largely on the data rate used (normally measured in millions of bits/second, Mbps) and the efficiency of the encoder used to encode the data. A lower resolution transmission within the bitrate available frequently has a superior picture, whatever the pixels you have on the screen.

    You also ignore the relation between the pixels on screen and the screen size versus viewing distance. Any digital photographer will tell you that printing photographs at more than 300dpi results in a negligible increase in picture quality.

    If every single pixel were transmitted you would have the following:

    Each pixel requires 3 bytes (24 bits); each byte represents 256 variations of Red, Green or Blue. Every pixel on your screen has 3 subpixels: one Red, one Green and one Blue. On a 1920 x 1080 display the subpixels are rectangular, creating a square pixel in total.

    To transmit 1920 x 1080 pixels at 25 frames per second thus requires

    1920 x 1080 x 25 x 24 bits/second = 1,244,160,000 bits/second

    Divide by 8 to get bytes/second = 155,520,000

    In other words about 155 MB/second. A 1hr recording of uncompressed Full HD at 25fps would need a PVR with a storage capability of 3600 x 155.52 MB, or 559.872 GB (a 1TB HDD could hold less than 2hrs of recording).

    And you talk about the trivial difference in the final display format. There are much more significant variables in the path from camera to display, the number of pixels the display has being amongst the least of them.
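    (For anyone who wants to check that arithmetic, a short Python sketch using the same assumptions as the post above, i.e. square 8-bit-per-channel RGB pixels at 25 frames per second:)

    ```python
    # Back-of-envelope check of the uncompressed full-HD figures quoted above.
    width, height = 1920, 1080
    fps = 25
    bits_per_pixel = 24  # 8 bits each for red, green and blue

    bits_per_second = width * height * fps * bits_per_pixel
    bytes_per_second = bits_per_second / 8

    print(f"{bits_per_second:,} bits/s")                   # 1,244,160,000 bits/s
    print(f"{bytes_per_second / 1e6:.2f} MB/s")            # 155.52 MB/s
    print(f"{bytes_per_second * 3600 / 1e9:.3f} GB/hour")  # 559.872 GB for one hour
    ```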
  • Deacon1972 Posts: 8,171
    Forum Member
    Mythica wrote: »
    Of course I consider it one image, it's what I can see on my screen.

    An HD set by definition can never display 1080i, it doesn't have enough pixels to be able to display 1080i, I'm sorry but that's a fact you can't argue with. It's not my fault EICTA can't use the correct use of English when describing something. Quite why you would think a HD Ready TV could display 720p and 1080i is beyond me, they both have a different resolution.

    What's wrong with the wording? It's self-explanatory to me - a display that is HD Ready has to have a minimum resolution of 720 lines and should be able to accept, process and display an HD signal (720/1080).

    My mate has a 42" plasma that was sold as HD ready, it has 1920x1080 resolution, it accepts 720/1080i but won't accept 1080p, so technically speaking a HD ready TV can display 1080. ;)
  • bobcar Posts: 19,424
    Forum Member
    ✭✭
    Mythica wrote: »
    No there isn't, quite clearly.

    I was referring to the 1920x1080 display having enough pixels, but it still can't display the interlaced image exactly as broadcast and, as Graham pointed out, certainly not as the camera got it. Not that that matters, because it can take the signal and display something after processing, just not the original signal; the same as a standard HD Ready set of whatever resolution does.
  • emptybox Posts: 13,917
    Forum Member
    ✭✭
    Nobody has suggested it will display at that resolution - but why would you want it to?

    A decent quality HD Ready TV will give a BETTER HD picture than a cheap Full HD set - why are you pointlessly obsessed with numbers?

    As others have said - most Full HD sets overscan the picture anyway, although some have the capability to set them to 1:1 mapping (but not all).

    According to the link that grahamlthompson posted, in order to use the label "HD Ready 1080p" a TV must be able to offer 1:1 pixel mapping.
    http://en.wikipedia.org/wiki/HD_ready

    On my Samsung it's not buried away, but one of the options on the 'P size' button on the remote.

    Indeed, a TV would be fairly useless as a computer monitor if it couldn't display 1:1 pixel mapping.
  • Mythica Posts: 3,808
    Forum Member
    ✭✭✭
    Deacon1972 wrote: »
    What's wrong with the wording? It's self-explanatory to me - a display that is HD Ready has to have a minimum resolution of 720 lines and should be able to accept, process and display an HD signal (720/1080).

    My mate has a 42" plasma that was sold as HD ready, it has 1920x1080 resolution, it accepts 720/1080i but won't accept 1080p, so technically speaking a HD ready TV can display 1080. ;)

    Because it can't display 1080i, it just inputs it. What that TV was sold as and what that TV actually is are two different things.
  • Mythica Posts: 3,808
    Forum Member
    ✭✭✭
    You have an unbelievably naive belief that every single pixel recorded by the original camera is transmitted exactly as it was recorded. Nothing could be further from the truth. High definition TV is only possible within the limits of available broadcasting bandwidth by using a lossy compression system (H.264/AVC for HD broadcasting). This relies on a system that transmits one full frame (an I frame) and subsequent frames with only difference information, in a related data structure known as a Group Of Pictures (GOP). The accuracy of the reconstruction of the original data depends largely on the data rate used (normally measured in millions of bits/second, Mbps) and the efficiency of the encoder used to encode the data. A lower resolution transmission within the bitrate available frequently has a superior picture, whatever the pixels you have on the screen.

    You also ignore the relation between the pixels on screen and the screen size versus viewing distance. Any digital photographer will tell you that printing photographs at more than 300dpi results in a negligible increase in picture quality.

    If every single pixel were transmitted you would have the following:

    Each pixel requires 3 bytes (24 bits); each byte represents 256 variations of Red, Green or Blue. Every pixel on your screen has 3 subpixels: one Red, one Green and one Blue. On a 1920 x 1080 display the subpixels are rectangular, creating a square pixel in total.

    To transmit 1920 x 1080 pixels at 25 frames per second thus requires

    1920 x 1080 x 25 x 24 bits/second = 1,244,160,000 bits/second

    Divide by 8 to get bytes/second = 155,520,000

    In other words about 155 MB/second. A 1hr recording of uncompressed Full HD at 25fps would need a PVR with a storage capability of 3600 x 155.52 MB, or 559.872 GB (a 1TB HDD could hold less than 2hrs of recording).

    And you talk about the trivial difference in the final display format. There are much more significant variables in the path from camera to display, the number of pixels the display has being amongst the least of them.

    No I don't. I just know that 1080i is 1920 x 1080, thus in the end the full information is there, which can't be said for an HD Ready TV.
  • Mythica Posts: 3,808
    Forum Member
    ✭✭✭
    bobcar wrote: »
    I was referring to the 1920x1080 display having enough pixels, but it still can't display the interlaced image exactly as broadcast and, as Graham pointed out, certainly not as the camera got it. Not that that matters, because it can take the signal and display something after processing, just not the original signal; the same as a standard HD Ready set of whatever resolution does.

    The whole point of the debate is that if you feed 1080i to a HD Ready display you do not get 1920 x 1080. You do when you feed it to a 1080p HD TV.
  • bobcar Posts: 19,424
    Forum Member
    ✭✭
    Mythica wrote: »
    The whole point of the debate is that if you feed 1080i to a HD Ready display you do not get 1920 x 1080. You do when you feed it to a 1080p HD TV.

    No, the point is (and remember this was in answer to your picky point in the first place) that when you feed 1080i to a 1920x1080 display you do not get the same pixels out as you get in.

    This means that if, according to your definition, you cannot "display" 1080i on an HD Ready TV because you don't get out what you put in, then you cannot "display" it on a full HD set either.

    Of course, according to most people's use of the word "display", you can display on either set; it's just that one has more pixels than the other. They have both processed the signal so that it will work on their particular screen before displaying the modified version.
  • Deacon1972 Posts: 8,171
    Forum Member
    Mythica wrote: »
    Because it can't display 1080i, it just inputs it.
    It doesn't say it will display 1080 lines; it says accept, process and display. In layman's terms that means it will accept a 1080 signal from a Sky HD box, process it by downscaling, then display the end result.
    Mythica wrote: »

    What that TV was sold as and what that TV actually is are two different things.
    It was badged as HD Ready and conforms to that standard; it will not accept 1080p, only 720/1080i. It's an early Panasonic plasma which was one of their first 1920x1080 panels; pretty sure it was before Blu-ray arrived, so there was no 1080p source. I believe there were other manufacturers with identical sets.
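    (To make "accept, process and display" concrete, here's a minimal Python sketch of the chain for an HD Ready set. The 1366x768 native panel resolution and the nearest-neighbour scaler are illustrative assumptions only, not how any particular TV does its processing:)

    ```python
    import numpy as np

    def scale_nearest(frame, out_h, out_w):
        """Nearest-neighbour rescale - a stand-in for whatever scaler a real TV uses."""
        in_h, in_w = frame.shape
        rows = np.arange(out_h) * in_h // out_h
        cols = np.arange(out_w) * in_w // out_w
        return frame[rows][:, cols]

    # "Accept": a de-interlaced 1920x1080 frame derived from a 1080i input.
    frame_1080 = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

    # "Process": downscale to the (assumed) 1366x768 HD Ready panel.
    panel_image = scale_nearest(frame_1080, 768, 1366)

    # "Display": the panel shows its own native resolution, not 1920x1080.
    print(panel_image.shape)  # (768, 1366)
    ```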
  • grahamlthompson Posts: 18,486
    Forum Member
    ✭✭
    Deacon1972 wrote: »
    It doesn't say it will display 1080 lines; it says accept, process and display. In layman's terms that means it will accept a 1080 signal from a Sky HD box, process it by downscaling, then display the end result.


    It was badged as HD Ready and conforms to that standard; it will not accept 1080p, only 720/1080i. It's an early Panasonic plasma which was one of their first 1920x1080 panels; pretty sure it was before Blu-ray arrived, so there was no 1080p source. I believe there were other manufacturers with identical sets.

    Yes, Hitachi had an HD Ready TV with a 1920 x 1080 panel that wouldn't work with progressive. There are plenty of 1920 x 1080 TVs around that will work with 1080p50 or 1080p60 but won't do 1080p24 (from Blu-ray).
  • Nigel Goodwin Posts: 58,498
    Forum Member
    Deacon1972 wrote: »
    It was badged as HD Ready and conforms to that standard; it will not accept 1080p, only 720/1080i. It's an early Panasonic plasma which was one of their first 1920x1080 panels; pretty sure it was before Blu-ray arrived, so there was no 1080p source. I believe there were other manufacturers with identical sets.

    Almost all the original Full HD sets wouldn't accept 1080p signals, which is why there was a third 'standard' name added later (Full HD 1080p).

    But as there are no 1080p-only sources it doesn't really matter; simply set the source to output 1080i - it makes sod all difference anyway.
  • 2Bdecided Posts: 4,416
    Forum Member
    ✭✭✭
    Mythica wrote: »
    Of course there is a full image, unless you're telling me when I pause my TV that the image is only half there. Now I know how 1080i is delivered to the screen, but clearly the full image is there.
    Your PVR (or whatever you are using to pause the interlaced content) will discard one field, and upscale the remaining field (1920x540 pixels) to 1920x1080 pixels. The result is sent to the TV: 1920x1080 pixels, containing information from only 1920x540 pixels of the source.

    Cheers,
    David.
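    (A small Python sketch of the pause behaviour described above; which field a given box keeps and how it upscales will vary, so this only shows the information content:)

    ```python
    import numpy as np

    # On pause, the PVR keeps a single 1920x540 field from the 1080i stream...
    kept_field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)

    # ...and upscales it to 1920x1080 (here by simple line-doubling) before output.
    paused_frame = np.repeat(kept_field, 2, axis=0)

    print(paused_frame.shape)                    # (1080, 1920) pixels sent to the TV
    print(len(np.unique(paused_frame, axis=0)))  # at most 540 distinct picture lines
    ```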