High Definition capabilities have become truly ubiquitous, but understanding HD has remained the domain of elite geeks. Until now.
High Definition Means High Resolution
The first step in understanding high definition is understanding resolution, because where there's high resolution there's high definition. Resolution is simply a description of the number of pixels being viewed, and pixels are the smallest parts that make up a full video picture. High Definition generally starts at 1280 x 720 (aka 720p) and goes up from there. This simply means that there are 1280 pixels if you count them from left to right and 720 pixels if you count from top to bottom, for a total of 921,600 viewable pixels. These days, the largest resolution available to consumers is 1920 x 1080, which is also referred to as 1080i or 1080p.
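If you like to check the math yourself, here's a quick sketch in Python that multiplies width by height for the resolutions above (the figures are the ones from this article, nothing more):

    # Total pixel counts for the resolutions discussed above.
    resolutions = {
        "720p (1280 x 720)": (1280, 720),
        "1080p (1920 x 1080)": (1920, 1080),
    }

    for name, (width, height) in resolutions.items():
        print(f"{name}: {width * height:,} pixels")

    # 720p (1280 x 720): 921,600 pixels
    # 1080p (1920 x 1080): 2,073,600 pixels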
1080i vs. 1080p and Hz Explained
The 'i' in 1080i stands for 'interlaced scan' and the 'p' in 1080p stands for 'progressive scan'. The Hz specification that goes along with these indicates how many times per second the video image refreshes. So a sample format would be 1080p30, or 1080p at 30 Hz. There is a big difference between 1080i and 1080p, but which is better?
The quick answer is that 1080p is much better than 1080i. With progressive scan (1080p), your TV refreshes every single pixel on every refresh, which commonly happens 60 times per second (60 Hz). With interlaced scan (1080i), your TV takes the 1080 lines of pixels and refreshes only the even-numbered lines on one pass and the odd-numbered lines on the next. So on each refresh only half of the pixels are up to date, while the other half is roughly a sixtieth of a second old. This can cause a subtle blurring effect to the naked eye when viewing fast-moving video.
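To make the difference concrete, here's a toy sketch (hypothetical Python, not how a TV works internally) showing which of the 1080 rows get updated on each refresh pass under the two schemes:

    ROWS = 1080

    def progressive_pass(pass_number):
        # 1080p: every row is refreshed on every pass.
        return list(range(ROWS))

    def interlaced_pass(pass_number):
        # 1080i: even-numbered rows on one pass, odd-numbered rows on the next.
        start = 0 if pass_number % 2 == 0 else 1
        return list(range(start, ROWS, 2))

    print(len(progressive_pass(0)))  # 1080 rows updated every pass
    print(len(interlaced_pass(0)))   # 540 rows updated (the even field)
    print(len(interlaced_pass(1)))   # 540 rows updated (the odd field)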
Generally speaking, if your TV cannot do 1080p and you have to choose between 1080i and 720p, we'd probably recommend 720p. This is especially true if you have an HDTV that is 32" or smaller; chances are you can't tell the difference between 720p and 1080p on a screen of that size.
Interfaces
There are four main High Definition interfaces that the majority of the public use: Component (RCA), VGA, DVI, and HDMI.
Component
Component video generally uses three separate RCA connectors for a single High Definition video signal. It is called component because it splits the video into three components, one per connector. This is in contrast to its counterpart, composite, which uses only one RCA connector (typically yellow) to carry the whole video signal. Because component splits the signal across three connectors, it has more bandwidth available than its single-connector sibling.
Component is an analog signal that essentially splits the picture into a brightness (luma) signal and two color-difference signals, commonly labeled Y, Pb, and Pr; the black-and-white picture lives in the Y signal, and the color information rides on the other two. Component can achieve resolutions from 480i up to 1080p, but few HDTVs accept resolutions that high through their component ports.
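For the curious, here's a rough sketch of the math behind that split, using the standard-definition BT.601 coefficients as an assumption; it's an illustration of the idea, not a description of any particular device:

    def rgb_to_ypbpr(r, g, b):
        """Convert normalized RGB (0.0-1.0) to Y, Pb, Pr using BT.601 coefficients."""
        y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: the black-and-white picture
        pb = (b - y) / 1.772                    # blue color-difference signal
        pr = (r - y) / 1.402                    # red color-difference signal
        return y, pb, pr

    print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # pure white -> (1.0, 0.0, 0.0): all luma, no color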
VGA
VGA is a computer connector that carries an analog signal similar to component, just in a single connector with 15 pins. VGA, or Video Graphics Array, has been used in computers since 1987. It is a common mistake to assume that digital signals are always better than analog; one advantage of analog is a much more lenient limitation on cable length. The VGA connector can also carry resolutions well beyond the original standard, up to QXGA at 2048 x 1536.
Although VGA is an interface used mainly for computers and monitors, it is also fairly common to see VGA inputs on select HDTVs these days. Converting from VGA to component is also relatively straightforward, since both are analog signals.
DVI
DVI, or Digital Visual Interface, takes a completely different approach to carrying video than its analog predecessors. It is a digital signal that transmits a discrete color and brightness value for each individual pixel, rather than a continuous analog waveform.
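As a simple illustration of what "discrete values per pixel" means (this is not the actual encoding DVI uses on the wire, just the general idea), a single pixel's color can be expressed as three numbers:

    # One pixel's color as three 8-bit numbers (0-255 per channel).
    r, g, b = 200, 120, 30
    pixel_word = (r << 16) | (g << 8) | b   # packed into a single 24-bit value
    print(f"{pixel_word:06x}")              # c8781e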
Because an analog video system and a digital video system render images so differently, the DVI interface keeps the two completely separate. Some versions of DVI are also analog-friendly: DVI-I connectors, for instance, are 'integrated,' meaning they can carry both the analog signal used by VGA and a digital signal.
Since DVI carries video only (no audio), it is typically used as the digital video connector for computers.
HDMI
HDMI, or High Definition Multimedia Interface, basically took DVI and added audio, making an all-in-one connector that is convenient for home theater systems. The older version of HDMI still in wide use, HDMI v1.2, combines the DVI video signal with 8-channel surround sound audio in one compact connector. A recent revision, HDMI v1.3, upgraded the bandwidth to a 340 MHz clock, allowing transfer rates of 10.2 Gbit/s. This also allows for Deep Color support, high definition audio formats, and more.
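As a back-of-the-envelope check on that figure (assuming HDMI's three data channels at 10 bits per channel per clock, which is how the 10.2 Gbit/s number is usually derived):

    # Rough HDMI v1.3 bandwidth arithmetic.
    clock_hz = 340_000_000        # 340 MHz clock
    channels = 3                  # three TMDS data channels
    bits_per_channel_per_clock = 10

    bits_per_second = clock_hz * channels * bits_per_channel_per_clock
    print(f"{bits_per_second / 1e9:.1f} Gbit/s")  # 10.2 Gbit/s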
Is "Full-HD" HD Enough?
Not yet.
Full HD is 1080p, which has been regarded over the past few years as the best definition available. When I bought a 1080p TV, I felt like my eyes had been opened for the first time. I could no longer be satisfied with DVD or VHS video; that low-quality media felt like looking at the world without my prescription glasses, while 1080p made everything clear. I knew technology would keep advancing, but I thought this was the best quality video would get.
Television has become a part of everyone’s lives, so why would we not work to make it better? Our track record shows that we always have. TV started out as a luxury, a small box emitting black-and-white images in the late 1920s. The first color TVs in America were sold in 1953. By 1981, HDTV was being shown off to investors. It took a while for the size of televisions to catch up to HDTV resolution: in 1997, 42” 480p displays became available to consumers, and by 2006, Blu-ray, HDMI, and 1080p televisions were released for everyone to enjoy. Ever since, we have kept increasing the size of TVs while “Full HD” has seemed to be the limit, and video sources have been working to catch up to 1080p quality.
Not so surprisingly, advancements in technology are what brought video quality to the amazing detail it has today. But what more could we do to make video more lifelike? Quite a few things, actually. The solutions range from increasing the frames per second, to packing more pixels into the display, to enhancing the video signal with a video purifier.
You may have heard about 48fps because of the controversy over showings of “The Hobbit”. Standard film is shot at 24fps (frames per second), but the human eye can perceive more than this. 48fps makes the video smoother and closer to what our eyes naturally perceive. For some viewers it looks so similar to real-life motion that they dislike the effect; even so, it makes the video more lifelike, and arguably better overall.
The “1080” in 1080p refers to the number of pixels in the height of a display; the full resolution is actually 1920 x 1080. To get more detail, manufacturers have shrunk the size of each pixel and packed more of them into a single display. Electronics shows are now showing off 4K and 8K televisions (the 4K label refers to the approximate width of the display in pixels). 4K has four times as many pixels as a 1080p display, and 8K contains sixteen times as many pixels as our current “Full HD” displays. It is astonishing how much more detail will be available when these “Ultravision” televisions are released for retail.

This is a little misleading, however. Blu-rays currently only have 1080p burned onto them, so we will need 4K media in order to take advantage of these displays. Also, the human eye can only perceive so much detail at a distance. The iPhone Retina display is supposed to have more pixels than the eye can perceive from a distance greater than 10.5” (face-to-phone distance), and by the same logic a 42” 1080p TV will be “retina” if the viewer is about 10’ away. But who wants to be that far away? To get a better view and more detail at closer range, manufacturers have increased the number of pixels in consumer televisions.
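To put those multipliers in perspective, here's a quick sketch comparing total pixel counts, assuming the common 3840 x 2160 figure for 4K and 7680 x 4320 for 8K:

    # Pixel counts relative to 1080p "Full HD".
    full_hd = 1920 * 1080   # 2,073,600 pixels
    uhd_4k = 3840 * 2160    # 8,294,400 pixels
    uhd_8k = 7680 * 4320    # 33,177,600 pixels

    print(f"4K has {uhd_4k / full_hd:.0f}x the pixels of Full HD")  # 4x
    print(f"8K has {uhd_8k / full_hd:.0f}x the pixels of Full HD")  # 16x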
Manufacturers have also created video purifiers, such as the Darblet by Darbeevision. These units take an already high-definition video signal, run their algorithms on it, and clean up the picture. To my eye, there does seem to be a difference in most videos: the contrast increases and the image looks even sharper. A video purifier attempts to restore “depth cues” that are lost when a scene is filmed. Depth cues are what tell the eye how far objects are from one another, tricking it into thinking it isn’t looking at a flat 2D display. A purifier does not add more detail to the video signal the way putting more pixels on a Blu-ray disc would, but it does clean up the picture so it looks even better on an HDTV.
Whether you have the latest technology, are waiting for the release of the next great TV, or are simply content with what you have in your house, television quality is going to keep increasing. “Full HD” is no longer the end of the road, and it will be obsolete in the near future. Make sure you are ready for the high definition you want.
By Trent Crawford
Seeing Pixels? Video Resolution and Scaling Explained
All monitors, TVs, and other video displays are physically built with a set number of pixels, and a screen's resolution is the number of pixels in its display. For example, common widescreen computer monitors have a resolution of 1680x1050 and common HDTVs have a resolution of 1920x1080. The first number is the number of columns of pixels and the second is the number of rows.
Here are some common video resolutions:
Standard definition video:
640x480 or 720x480
High definition video:
1280x720
1920x1080
Scaling is what happens when your video source's resolution doesn't match your display's native (physical) resolution. If you have a video running in standard definition, something like 640x480, and you want to display it on a higher-resolution screen like 1920x1080, the image needs to be scaled to fit the display: the image goes from having 480 rows of pixels to having 1080 rows, and the number of columns is adjusted proportionally.
If scaling weren't done, the video would only fill a small portion of the larger screen, taking up just 480 of the 1080 rows of pixels. Scaling happens all the time without us even thinking about it: computer monitors and HDTVs have built-in scalers to handle any signal that isn't sent at the display's native (generally maximum) resolution. If they didn't, we'd run into problems like the one I just mentioned.
For example, say you are viewing a regular DVD or a standard definition TV signal on an HDTV. Both are going to be 480-line signals (640x480, or 720x480 for a widescreen DVD). If your TV didn't have a scaler, that signal wouldn't fill up your nice big HDTV. What your TV does is take that lower-resolution signal and scale it, effectively stretching it to 1920x1080 (or whatever resolution your TV runs) so that it fills the entire screen.
If you've ever changed the resolution on your computer to a lower setting, your monitor does the same thing: it scales the image to fill the entire screen.
This has to be done because there are a set physical number of pixels in the display and the image must be scaled to that resolution to be displayed properly.
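Here's a minimal sketch of what a scaler does conceptually, using nearest-neighbor sampling (real scalers use much more sophisticated filtering, and this hypothetical function ignores aspect-ratio handling entirely):

    def scale_nearest(frame, new_width, new_height):
        """Resize a frame (a list of rows of pixel values) by nearest-neighbor sampling."""
        old_height, old_width = len(frame), len(frame[0])
        scaled = []
        for y in range(new_height):
            src_y = y * old_height // new_height    # closest source row
            row = [frame[src_y][x * old_width // new_width] for x in range(new_width)]
            scaled.append(row)
        return scaled

    # A tiny 2x2 "frame" stretched to 4x4: each source pixel now covers a 2x2 block.
    tiny = [[1, 2],
            [3, 4]]
    print(scale_nearest(tiny, 4, 4))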
Scalers can be very helpful if you have several video sources and you want to run them all through a single connection on a display. Also, they are often used when your source signal resolution doesn't match a resolution supported by your display.
A scaler by itself only does this; it is not designed to improve the picture quality of a lower-resolution image or enhance it in any other way. When many people talk about a scaler, they are referring to a device that scales the resolution as we've just discussed and also has an image processor in it to try to clean up the image.
The first thing to be aware of is that if you start with a low-resolution image, you can never make it look as good as an image that started at a high resolution. You cannot make your standard TV channels come through as sharp and with as much detail as an HD channel.
A video processor runs the video signal through what are often many complex algorithms that reduce noise and try to sharpen the image. This will often help the quality of the picture, but it will never be a substitute for true HD sources. Many HDTVs these days have some form of image processor built in to perform these functions, which is why the same standard definition TV signal may look better on one TV than another.
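As a simplified illustration of one such technique, here's a sketch of unsharp masking applied to a single row of brightness values (real video processors work on full frames with far more elaborate, often proprietary, filtering):

    def unsharp_mask_row(pixels, amount=0.5):
        """Sharpen a row of brightness values by adding back the detail lost to a blur."""
        blurred = []
        for i in range(len(pixels)):
            left = pixels[max(i - 1, 0)]
            right = pixels[min(i + 1, len(pixels) - 1)]
            blurred.append((left + pixels[i] + right) / 3)   # simple 3-tap blur
        # Add a fraction of (original - blurred) back onto the original.
        return [p + amount * (p - b) for p, b in zip(pixels, blurred)]

    edge = [10, 10, 10, 200, 200, 200]   # a soft edge in brightness values
    print(unsharp_mask_row(edge))        # the transition becomes more pronounced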
If you want to squeeze the best quality you can out of your standard definition sources, you'll want a good video processor and scaler that can clean up the image before sending it to your TV.