Is 1080i that different from 1080p?
I have a Pioneer projector that goes up to 1080i and am wondering if I will be gimped when I go Blu-ray?
6 Answers
They’re both spectacular. The “i” indicates “interlaced”: only every other horizontal line is drawn at any given moment, and the two fields alternate at roughly 60 fields per second, which is theoretically imperceptible.
But put a 1080i and 1080p display next to each other and you may see the difference.
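(Not from any of the posters, just a toy Python sketch of the “every other line” idea described above; the tiny frame size is a made-up stand-in for 1920×1080:)

# Toy illustration of interlacing: a frame is split into two fields
# (odd lines, then even lines), and a simple "weave" deinterlacer
# stitches them back together. Real 1080i fields are 1920x540 each.
FRAME_H, FRAME_W = 8, 4  # tiny stand-in for 1080 x 1920

frame = [[(y, x) for x in range(FRAME_W)] for y in range(FRAME_H)]

top_field = frame[0::2]      # lines 0, 2, 4, ... (one field)
bottom_field = frame[1::2]   # lines 1, 3, 5, ... (the other field)

rebuilt = [None] * FRAME_H
rebuilt[0::2] = top_field    # weave the fields back into a full frame
rebuilt[1::2] = bottom_field

assert rebuilt == frame      # a still image survives weaving intact
print(len(top_field), "lines per field,", FRAME_H, "lines per frame")

With a still image the weave is lossless; the visible difference shows up with motion, because the two fields are captured at different instants and can “comb” when recombined.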
At any given instant, 1080i displays half as many lines as 1080p.
Blu-ray will look amazing either way. You miss out on the resolution that 1080p has to offer; the resolutions are:
1080i = 1366×768
1080p = 1920×1080
It’s more significant for computer use on a monitor, or for games. With Blu-ray you can tell, but they both look really good.
@RandomMrdan: actually, HD video at 1080i can be scaled to displays with native resolutions of either 1366×768 or 1920×1080.
The specifics come down to the source resolution (where the signal is coming from, which 99.9% of the time is either 720 or 1080 lines) versus the display’s native resolution (which in the case of plasmas and LCDs can be 720, 768, or 1080 lines).
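As a rough illustration of source vs. display resolution (this is not how any particular TV’s scaler works; it just assumes a simple nearest-neighbor line mapping):

# Rough sketch: mapping a 1080-line source onto a 768-line panel.
# Real TVs use much smarter scalers; the point is just the ratio.
SRC_LINES, PANEL_LINES = 1080, 768

scale = PANEL_LINES / SRC_LINES                            # ~0.71
line_map = [round(p / scale) for p in range(PANEL_LINES)]  # panel line -> source line

dropped = SRC_LINES - len(set(line_map))   # source lines that never reach the screen
print(f"scale factor {scale:.3f}; {dropped} of {SRC_LINES} source lines dropped")

On a 768-line panel, 312 of the 1080 source lines get dropped (or merged, with a real scaler), which is why those sets can’t show full 1080 detail whether the source is i or p.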
Here’s a good primer article on HD resolutions:
http://www.cnet.com/hdtv-resolution/
@sndfreQ find me a display with a 1920×1080 panel that isn’t 1080p.
A display that supports 1920×1080 natively will of course support both 1080i and 1080p. Both formats have the same number of pixels; it’s just the source signal that determines the i or the p. The issue is not with the receiver (the display), but with the source (transmission) signal.
Remember that legacy projection-type displays and even CRT monitors that were part of the early HD systems (usually the ones that contained both analog and digital HDTV tuners, aka “HDTV ready”) were able to decode 720p and 1080i.
720p and 1080i are the standards for broadcasting HDTV (aka “source” signals). Some networks only broadcast in 720p. The issue was, and is, that before true 1920×1080 displays (plasma and LCD) came out, the old tuners were essentially SD TVs that down-res’d either the 720p or the 1080i signal to the display’s native resolution (480p in most cases).
1080i is legacy tech, as it was the most compatible with CRT displays, which are inherently interlaced. 1920×1080 displays can and always have been able to support either interlaced or progressive scan. The limited-resolution 1366×768 panels were the “middle” tech, displaying 1080i down-res’d to fit their screens.
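Just to put numbers on the thread (quick Python arithmetic, nothing more):

# Pixel-count arithmetic behind the discussion above.
full_1080  = 1920 * 1080   # one complete 1080p or 1080i frame: 2,073,600 px
one_field  = 1920 * 540    # a single 1080i field (every other line): 1,036,800 px
panel_1366 = 1366 * 768    # the common "HD Ready" panel: 1,049,088 px

print(f"1920x1080 frame: {full_1080:,} px")
print(f"1080i field:     {one_field:,} px (two fields make one frame)")
print(f"1366x768 panel:  {panel_1366:,} px (roughly half a 1080 frame)")

So 1080i and 1080p describe the same 1920×1080 grid; 1366×768 is a panel resolution, not what “1080i” means.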