Blu-ray or HD DVD

Collector Freaks Forum

ok, so speaking of interlaced vs progressive - what is better - 720p or 1080i?

I say 720p because we're talking about a moving picture (a movie), so imagine that the lines are drawn all together rather than being drawn in two alternating passes.

Basically, imagine this. Say you have four lines going from top to bottom. When you have i, the lines are drawn like this:

_________

_________


then the missing ones are added in between:

_________
_________
_________
_________


When you have a p, all four lines are drawn at the same time:

_________
_________
_________
_________
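
if it helps, here's a toy sketch of the same idea in Python (purely illustrative, not how a TV actually works inside) - it just prints the order the lines get drawn in for i vs p:

# Toy illustration of interlaced vs progressive drawing order.
# "Drawing" a line here just means printing its index.

LINES = 4    # pretend the screen is only 4 lines tall

def draw_interlaced(total_lines):
    # first pass (field 1): the even-numbered lines
    for i in range(0, total_lines, 2):
        print(f"field 1: drawing line {i}")
    # second pass (field 2): the missing odd lines get filled in between
    for i in range(1, total_lines, 2):
        print(f"field 2: drawing line {i}")

def draw_progressive(total_lines):
    # every line drawn top to bottom in a single pass
    for i in range(total_lines):
        print(f"frame: drawing line {i}")

draw_interlaced(LINES)
print("---")
draw_progressive(LINES)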
 
https://techdigs.net/content/view/53/42/

In addition to progressive scan and interlaced video display methods, you have probably heard of 480i, 480p, 720p, 1080i and 1080p. What's the difference between progressive scan (the p) and interlaced (the i) and why should you care? Read more to find out.

In U.S. broadcast television, there are two basic TV video display methods: progressive scan and interlaced.

Interlaced means the lines that make up the picture on your TV screen are drawn in an alternating fashion. In the U.S., first the even lines appear on the screen, then the odd lines appear. Standard TV is interlaced. This is the TV format we've all been watching since television was invented.

Progressive scan means the lines that make up the TV picture are displayed all at once in sequence. HDTVs are capable of at least 1280x720p ('p' for progressive scan). While they can also accept an interlaced signal, they natively display progressive scan video.

Progressive scan DVD players are capable of 720x480p, and the newer upscaling DVD players and high def DVD players are capable of up to 1080p depending on the model (see the TechDigs.net article Want Better DVD Movies? Buy an Upconverting Player!).

Without getting into the gory details, the interlaced method was originally used because CRT technology in early TVs wasn't fast enough to keep up with a progressive scan of approximately 480 lines. It could, however, keep up with 240 lines. Therefore, standard U.S. TV broadcasts first send the even 240 lines, and then the odd 240 lines.

The problem with interlacing technology is this alternating line drawing tends to cause the eyes to see a flicker. Unfortunately, interlacing reared its ugly head again in the 1990s when the newly established HDTV standards included both 720p and 1080i options. Some broadcast networks use 1280x720p (720 lines progressively displayed) and some use 1920x1080i (540 even lines drawn, then 540 odd lines drawn). As of 2006, no U.S. broadcast network uses 1080p, or 'full HD' (1920x1080p).

The problem with 1080i is that despite having more total lines, it generally doesn't look as good as 720p. This is especially true for high-motion video such as sports. If you have a large (over 46") HDTV hooked up properly and want to see an example of this, watch a punt return on HDTV NBC Sunday Night Football, and then watch a punt return on HDTV Monday Night ESPN Football. The difference is significant. With far less aliasing (visible chunky pixels), ESPN's 1280x720p looks substantially better than NBC's 1920x1080i. While some of this may be due to the compression used by NBC or the cable outlet, most of it is due to interlacing.
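
to put some rough numbers on the 720p vs 1080i comparison above (back-of-the-envelope arithmetic, not from the article):

# Pixel counts per pass for the broadcast formats mentioned above (illustrative only).

formats = {
    "720p  (one progressive frame)": 1280 * 720,     # all 720 lines drawn at once
    "1080i (one field, 540 lines)":  1920 * 540,     # only half the lines per pass
    "1080p (one 'full HD' frame)":   1920 * 1080,    # full 1080-line frame
}

for name, pixels in formats.items():
    print(f"{name}: {pixels:,} pixels")

# 720p  (one progressive frame): 921,600 pixels
# 1080i (one field, 540 lines):  1,036,800 pixels
# 1080p (one 'full HD' frame):   2,073,600 pixels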
 
i knew I was asking this question at the right forum!!! thanks guys.

Isn't interlaced generally better for non fast moving images though?

my TV is an old one, a 2001 Panasonic widescreen model. It does 480p and 1080i, but not 720p!!

now i'm kicking myself for even asking the question!
 
It could be, since it's using less data. I've noticed some fast-moving stuff (at least on my computer) sort of flash through each frame when something is going fast, but an HD DVD or Blu-ray player hooked up to a TV shouldn't have any problems, since those are built to display movies. The computer problem is only because I was on a slow computer or running a lot of programs (Vista messes with my movie playback. And iTunes).
 
Isn't interlaced generally better for non fast moving images though?

no. once again, look at the lines i drew a few posts back. for a moving image, would you rather the lines be drawn separately or drawn all at once? people may say that the human eye mostly can't tell the difference, but when it comes to frame rate and a fast-moving film scene, like a shoot-em-up action scene, you will be able to notice the difference between i and p, IMO. no ghosting.

now if you're looking at a still picture that is in 720p next to one in 1080i, they'll look the same. why? because nothing is moving, so the two interlaced fields line up perfectly and there's nothing to give the interlacing away.
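
if it helps, here's a toy sketch of that idea in Python (purely illustrative, not how a real deinterlacer works): weave the even-line field and the odd-line field back together, and if nothing moved between the two fields you get the original picture back exactly; if something moved, the two halves disagree and you get that combing/ghosting look.

# Toy model: a "frame" is just a list of line values.
# Interlacing splits it into an even-line field and an odd-line field.

def split_fields(frame):
    return frame[0::2], frame[1::2]       # even lines, odd lines

def weave(even_field, odd_field):
    # interleave the two fields back into one frame
    frame = []
    for e, o in zip(even_field, odd_field):
        frame.extend([e, o])
    return frame

still = ["A", "A", "A", "A"]              # nothing moves between fields
even, odd = split_fields(still)
print(weave(even, odd) == still)          # True: a still image comes back perfectly

frame1 = ["A", "A", "A", "A"]             # scene when field 1 is grabbed
frame2 = ["B", "B", "B", "B"]             # scene has moved by the time field 2 is grabbed
even, _ = split_fields(frame1)
_, odd = split_fields(frame2)
print(weave(even, odd))                   # ['A', 'B', 'A', 'B'] - the "comb" you see on motion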
 
good to know. i MUST get a new tv now... i've got some really good mileage out of mine so far.

 
Yes--native resolution is what matters.

Yes, my 62" DLP is 1920x1080 :rock but my 42" plasma is 1024x768 :cuss. Which brings me to another question...

I had a 44" LG DLP (1280x720 native resolution) which died (very nice TV, by the way), so I had to get it replaced under warranty. I was disappointed that I couldn't get a direct replacement, but they offered me a 46" Sony LCD rear projection or a 42" Panasonic plasma, and I took the 42" because I didn't really like the look of the Sony. Since the Sony's native resolution was 1280x720 but the plasma I got is only 1024x768, did I make the right decision?
 
I say 720p because we're talking about a moving picture (a movie), so imagine that the lines are drawn all together rather than being drawn in two alternating passes....

This I understand, but if the newer technology is really fast and can display the interlaced image quickly, what difference does it make? For example, if the TV takes the interlaced signal and deinterlaces it quickly before showing it on the screen, what difference does it make, assuming both images have the same compression? Really, from what I've read it's not the interlaced signal that's the problem, it's actually the frame rate and compression, correct? Also, how come there is very little talk about compression with the 1080i image? Most guides/references just talk about the number of lines.
 
This I understand, but if the newer technology is really fast and can display the interlaced image quickly, what difference does it make? For example, if the TV takes the interlaced signal and deinterlaces it quickly before showing it on the screen, what difference does it make, assuming both images have the same compression? Really, from what I've read it's not the interlaced signal that's the problem, it's actually the frame rate and compression, correct? Also, how come there is very little talk about compression with the 1080i image? Most guides/references just talk about the number of lines.

720p is fewer pixels per frame, which equals faster: a full frame every 1/60th of a second.

1080i is two interlaced fields that make up a frame: each field takes 1/60th of a second, so a complete frame takes 1/30th.
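
rough numbers on that, if it helps (just back-of-the-envelope arithmetic, assuming the standard 60 frames/fields per second for U.S. broadcast):

# Rough throughput comparison at U.S. broadcast rates (illustrative only).

frames_per_sec = 60                      # 720p: 60 full frames per second
fields_per_sec = 60                      # 1080i: 60 half-height fields per second (30 full frames)

p720_pixels_per_sec = 1280 * 720 * frames_per_sec     # every line refreshed 60x/sec
i1080_pixels_per_sec = 1920 * 540 * fields_per_sec    # only half the lines sent each 1/60th

print(f"720p60 : {p720_pixels_per_sec:,} pixels/sec, full frame every 1/60 s")
print(f"1080i60: {i1080_pixels_per_sec:,} pixels/sec, full frame only every 1/30 s")

# 720p60 : 55,296,000 pixels/sec, full frame every 1/60 s
# 1080i60: 62,208,000 pixels/sec, full frame only every 1/30 s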
 
Sure, if I can get it to work. Live has been kind of odd. I got into a Free-for-all game yesterday for the first time but I think it was lagging a bit.

I've got work study till 9 (central) though so it'll be later
 
720p is fewer pixels per frame, which equals faster: a full frame every 1/60th of a second.

1080i is two interlaced fields that make up a frame: each field takes 1/60th of a second, so a complete frame takes 1/30th.

Bear with me as I try to understand all of this...

Ok, but can't I get a better, faster TV that processes the 1080i signal at 1/60th of a second per frame, which would be just as good as the TV that processes the 720p signal at 1/60th of a second per frame? Wouldn't that also mean the 720p signal on this better, faster TV would be 1/120th of a second per frame?
 
Interlacing is also a form of compression, so it won't have as much quality as progressive scan, where each frame is completely full.

I get this, but where can I get more information that explains the compression? No one seems to talk about this in the information I have been reading up on, just the interlacing, etc.
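
one way to see the 'compression' part in plain numbers (just a sketch, same kind of arithmetic as above): compare 1080i to what a hypothetical 1080p signal at the same 60 Hz rate would need.

# Interlacing as "compression": at the same line count and refresh rate,
# an interlaced signal carries exactly half the pixels per second.

p1080_pixels_per_sec = 1920 * 1080 * 60   # hypothetical 1080p at 60 full frames/sec
i1080_pixels_per_sec = 1920 * 540 * 60    # 1080i: 60 half-height fields per second

print(f"1080p60: {p1080_pixels_per_sec:,} pixels/sec")
print(f"1080i60: {i1080_pixels_per_sec:,} pixels/sec (exactly half)")

# 1080p60: 124,416,000 pixels/sec
# 1080i60: 62,208,000 pixels/sec (exactly half)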
 
Bear with me as I try to understand all of this...

Ok, but can't I get a better, faster TV that processes the 1080i signal at 1/60th of a second per frame, which would be just as good as the TV that processes the 720p signal at 1/60th of a second per frame? Wouldn't that also mean the 720p signal on this better, faster TV would be 1/120th of a second per frame?

Hm, not sure what you mean by better/faster TV. If I were you, and I was buying, i'd shoot for something with at LEAST the 720p option (which will also have 1080i), but would go for a 1080p. That's what I have, so i'm covered!
 