HDTV
An HDTV (High-Definition Television) is a television display that offers a significantly higher picture resolution than older standard-definition (SD) sets. The term is relevant to computing because modern HDTVs can double as large, high-resolution monitors for PCs, and because computer technology is integral to the digital processing and transmission of HD video.
Key concepts of HDTV in computing
Higher resolution and pixel density: HDTV provides a sharper, more detailed image because it packs in far more pixels than SD. The two most common formats, compared in the sketch after this list, are:
720p: A progressive-scan format with a resolution of 1280 x 720 pixels.
1080p: A progressive-scan format with a resolution of 1920 x 1080 pixels, often called "Full HD".
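To put those numbers in perspective, the total pixel count per frame can be computed directly from the dimensions above. The short Python sketch below (the 640 x 480 entry is a typical 4:3 standard-definition figure, included only for comparison) shows that 1080p carries 2.25 times the pixels of 720p:

```python
# Total pixels per frame for SD and HD formats.
formats = {
    "SD (4:3)": (640, 480),        # typical standard-definition size
    "720p": (1280, 720),
    "1080p (Full HD)": (1920, 1080),
}

for name, (width, height) in formats.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")

# 1080p vs. 720p: 2,073,600 / 921,600 = 2.25x the detail per frame.
```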
Progressive vs. interlaced scanning:
Progressive (p): Draws all the lines of a single frame sequentially, creating a smoother image. All modern computer monitors use progressive scanning.
Interlaced (i): Draws the odd and even lines in alternating fields that combine to form a single frame. This approach was common in early HDTV broadcasts because it requires less bandwidth, but it can cause motion artifacts.
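The difference is easy to model if a frame is treated as an ordered stack of scan lines: progressive output sends every line of each frame top to bottom, while interlaced output splits the lines into two alternating fields. A minimal Python sketch (a toy model, not a real video pipeline):

```python
# Toy model: a frame is simply a list of 1080 numbered scan lines.
frame = [f"scan line {n}" for n in range(1080)]

# Progressive (1080p): every line of the frame, drawn in order.
progressive = frame

# Interlaced (1080i): two fields, each holding every other line.
even_field = frame[0::2]  # lines 0, 2, 4, ...
odd_field = frame[1::2]   # lines 1, 3, 5, ...

# Each field is half a frame, which is why interlacing needs less
# bandwidth per pass...
assert len(even_field) == len(odd_field) == len(frame) // 2

# ...but because the two fields are captured at different moments,
# fast motion can produce visible "combing" artifacts.
```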
Digital signal processing: The video and audio sent to an HDTV are compressed digital data streams (typically encoded with codecs such as MPEG-2 or H.264), handled much like any other computer data. To produce the best picture, the computer must output a high-definition signal that the HDTV can accept.
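A little arithmetic shows why compression is unavoidable: an uncompressed 1080p stream works out to roughly 3 gigabits per second, far more than a broadcast channel or typical internet connection can carry. The frame rate and bit depth below are common values chosen for illustration:

```python
# Back-of-the-envelope bit rate for uncompressed 1080p video.
width, height = 1920, 1080
bits_per_pixel = 24       # 8 bits each for red, green, and blue
frames_per_second = 60

bits_per_second = width * height * bits_per_pixel * frames_per_second
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s

# Compressed HD broadcasts typically fit in under 20 Mbit/s,
# a reduction of more than two orders of magnitude.
```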
Wide aspect ratio: HDTV uses a widescreen aspect ratio of 16:9, which is much closer to the human field of vision than the older 4:3 ratio. This requires computer graphics and displays to be configured properly to fill the wider screen.
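The ratio follows directly from the pixel dimensions: dividing width and height by their greatest common divisor reduces them to lowest terms. A quick Python check confirms the two ratios mentioned above:

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce pixel dimensions to their simplest width:height ratio."""
    divisor = gcd(width, height)
    return f"{width // divisor}:{height // divisor}"

print(aspect_ratio(1920, 1080))  # 16:9 (HDTV widescreen)
print(aspect_ratio(640, 480))    # 4:3  (older SD displays)
```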
Advanced connections: Modern HDTVs use HDMI (High-Definition Multimedia Interface) ports, which carry high-quality digital audio and video through a single cable. A computer can be connected to an HDTV through an HDMI port on its graphics card, or with an adapter for other video outputs such as DisplayPort.
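On a Linux desktop running X11, the standard xrandr utility reports which video outputs are present and connected; HDMI ports typically appear with names like HDMI-1, though exact names vary by graphics driver. A short Python sketch that wraps it (this assumes xrandr is installed, and it will not work on Windows or macOS):

```python
import subprocess

# Ask xrandr for the current output configuration (Linux/X11 only).
result = subprocess.run(
    ["xrandr", "--query"], capture_output=True, text=True, check=True
)

# Connected outputs look like "HDMI-1 connected 1920x1080+0+0 ...".
# Matching " connected" (with the leading space) skips "disconnected".
for line in result.stdout.splitlines():
    if " connected" in line:
        print(line.split()[0])  # e.g. HDMI-1, DP-1, eDP-1
```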