1080i vs 1080p

What's the difference between 1080i and 1080p?

1080i and 1080p are high-definition display formats. The 1080 refers to the pixel resolution (1,920 pixels across the screen by 1,080 pixels down the screen, or about 2 million pixels total). The difference between 1080i and 1080p lies in how the video signal is sent from the source device and displayed on the screen.
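
The "about 2 million pixels" figure is just the product of the two dimensions, as a quick check shows:

```python
# 1080i and 1080p share the same pixel grid: 1,920 columns by 1,080 rows.
width, height = 1920, 1080
total_pixels = width * height
print(f"{total_pixels:,}")  # 2,073,600 -- roughly 2 million pixels
```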

1080i
  • Used for traditional TV broadcasting.

  • Interlaced image display.

  • Video is less sharp.

1080p
  • Used for Blu-ray and online streaming.

  • Progressive image display.

  • Smoother motion rendering.

Although CRT HDTVs can display 1080i natively, LCD, plasma, and OLED TVs display images progressively. This means incoming 1080i video signals must be converted to either 720p or 1080p, a process called deinterlacing.
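
A minimal sketch of the simplest deinterlacing method, often called "weave," which recombines the two fields of an interlaced frame into one progressive frame. Real TV processors use far more sophisticated motion-adaptive techniques; this only illustrates the basic idea:

```python
def weave_deinterlace(odd_field, even_field):
    """Recombine two 540-line fields into one 1,080-line progressive frame.

    odd_field holds lines 1, 3, 5, ... and even_field holds lines 2, 4, 6, ...
    (each field is a list of rows). Weave works well for static images but
    produces "combing" artifacts when there is motion between the two fields.
    """
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # odd-numbered line first...
        frame.append(even_line)  # ...then the even-numbered line below it
    return frame

# Toy example: a 4-line "frame" rebuilt from its two 2-line fields.
odd = ["line1", "line3"]
even = ["line2", "line4"]
print(weave_deinterlace(odd, even))  # ['line1', 'line2', 'line3', 'line4']
```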

How 1080i and 1080p look on your TV screen depends on both the video content and the processing capability of the source device (the Blu-ray disc player, upscaling DVD player, media streamer, or home theater receiver). Many TVs have built-in processors that can handle deinterlacing, often with results similar to, or the same as, those of the processors in many DVD and Blu-ray disc players. Most Blu-ray disc players, however, can also output 1080p/24 (the standard frame rate for traditional film) to a compatible TV.

1080i Pros and Cons

Advantages
  • Requires less bandwidth.

  • Fine for most cable programming.

Disadvantages
  • Prone to aliasing effects.

  • Requires a high screen refresh rate.

The "i" in 1080i refers to interlaced scan. In 1080i, each video frame is sent or displayed as two alternating fields. Each field contains 540 of the frame's 1,080 horizontal lines: the field with the odd-numbered lines is displayed first, followed by the field with the even-numbered lines. Together, the two fields make up a full frame. The 1080i format is most commonly used by TV broadcasters such as CBS, CW, NBC, and many cable channels.
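
The field structure described above can be sketched in a few lines. This splits a progressive frame (modeled as a list of rows, top to bottom) into the two fields an interlaced signal would transmit:

```python
def split_into_fields(frame):
    """Split a progressive frame (a list of rows, top to bottom) into the
    two fields of an interlaced signal: the odd-numbered lines form one
    field, the even-numbered lines the other. For a 1,080-line frame,
    each field carries 540 lines.
    """
    odd_field = frame[0::2]   # lines 1, 3, 5, ... (indices 0, 2, 4, ...)
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

frame = [f"line{n}" for n in range(1, 1081)]  # a 1,080-line frame
odd, even = split_into_fields(frame)
print(len(odd), len(even))  # 540 540
```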

1080p Pros and Cons

Advantages
  • Minimal motion blur and visual distortions.

  • Better for rendering 3D video games.

Disadvantages
  • Requires greater bandwidth.

  • Has been supplanted by 4K.

For 1080p, each video frame is sent or displayed progressively. This means all 1,080 lines of the frame are drawn in sequence, from top to bottom, rather than being split into odd and even fields. The final displayed image looks smoother than 1080i, with less motion blur and fewer jagged edges. 1080p is most commonly used on Blu-ray discs and on selected streaming, cable, and satellite programming.
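
The bandwidth trade-off noted in the pros and cons above follows directly from the line counts. A rough back-of-the-envelope comparison (counting raw lines transmitted per second, ignoring compression):

```python
lines_per_frame = 1080

# 1080i/60: 60 fields per second, each field carrying half the lines.
interlaced_lines_per_sec = 60 * (lines_per_frame // 2)

# 1080p/60: 60 full frames per second.
progressive_lines_per_sec = 60 * lines_per_frame

print(interlaced_lines_per_sec)   # 32400
print(progressive_lines_per_sec)  # 64800 -- twice the raw line rate
```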

There are also differences in how 1080p is displayed with regard to frame rate, which is why the resolution and frame rate are often listed together. A TV advertised as 1080p/60, for example, displays 60 full frames per second.
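
Labels like 1080p/60 pack three facts into one string: line count, scan type, and frame rate. A small parser makes the notation explicit (the exact label grammar here is an assumption for illustration):

```python
import re

def parse_format(label):
    """Parse a display-format label such as '1080p/60' or '1080i' into
    (lines, scan type, frame rate). The label grammar is assumed:
    digits, then 'i' or 'p', then an optional '/rate' suffix.
    """
    m = re.fullmatch(r"(\d+)([ip])(?:/(\d+))?", label)
    if not m:
        raise ValueError(f"unrecognized format label: {label}")
    lines, scan, rate = m.groups()
    scan_type = "progressive" if scan == "p" else "interlaced"
    return int(lines), scan_type, (int(rate) if rate else None)

print(parse_format("1080p/60"))  # (1080, 'progressive', 60)
print(parse_format("1080i"))     # (1080, 'interlaced', None)
```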

1080p/60 and PC Sources

When you connect a PC to an HDTV via DVI or HDMI, the PC's graphics output may be sending 60 discrete frames every second (depending on the source material), rather than repeating the same frame multiple times, as happens with film- or video-based material from a DVD or Blu-ray disc.

In this case, no additional conversion is needed to display the 1080p/60 frame rate. Computer monitors typically accept this type of input signal directly, but some TVs might not.
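
The frame repetition mentioned above is what happens, for example, when 24 fps film is shown on a 60 Hz display: each source frame is alternately held for 3 and then 2 refresh cycles, a scheme commonly called 3:2 pulldown. A rough sketch of the idea:

```python
def three_two_pulldown(film_frames):
    """Map 24 fps film frames onto a 60 Hz display by alternately repeating
    each source frame 3 times and 2 times, so 24 frames fill 60 refreshes.
    """
    display_frames = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # 3, 2, 3, 2, ...
        display_frames.extend([frame] * repeats)
    return display_frames

film = list(range(24))              # one second of 24 fps film
display = three_two_pulldown(film)  # one second of 60 Hz refreshes
print(len(display))  # 60
```

A PC outputting 1080p/60, by contrast, can supply a genuinely distinct frame for every refresh, so no such repetition is needed.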

Final Verdict

If you can't afford a new 4K TV, a 1080p display is still well suited to streaming online video and playing 3D games.