What Happens When You Watch SDTV on HDTV?

Article Details
  • Written By: G. Wiesen
  • Edited By: Heather Bailey
  • Last Modified Date: 18 October 2019
  • Copyright Protected:
    Conjecture Corporation
When you watch an SDTV signal on HDTV equipment, the signal is typically “upscaled” in some way to make it look acceptable, though the result falls well short of a true HD picture. The difference stems from the large gap in resolution between standard definition television (SDTV) and high definition television (HDTV) displays. Because the resolution, and overall size, of each type of display is quite different, a signal meant for SDTV is effectively displayed improperly on an HDTV. As a result, SDTV on HDTV lacks the quality of an HD image and may even look worse than it would on an SD display.

There are inherent differences between SD and HD images, so when you view SDTV on HDTV, you are essentially viewing media in the “wrong” format. The issue comes down to how SD and HD displays are designed and built, specifically the number of pixels used in each system. A pixel, or picture element, is one of the individual points of colored light that make up the overall image displayed on a television screen. SDTVs typically have about 480 rows of pixels from top to bottom, and either 640 or 704 columns of pixels from left to right on the screen.
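As a rough illustration of how large the gap is, the total pixel counts for a typical 640x480 SD frame and a 1920x1080 HD frame can be compared with simple arithmetic (a quick sketch; actual broadcast formats vary slightly):

```python
# Rough pixel-count comparison between common SD and HD resolutions.
sd_width, sd_height = 640, 480      # typical SD frame
hd_width, hd_height = 1920, 1080    # 1080p HD frame

sd_pixels = sd_width * sd_height    # total pixels in the SD image
hd_pixels = hd_width * hd_height    # total pixels in the HD image

print(f"SD: {sd_pixels:,} pixels")          # 307,200
print(f"HD: {hd_pixels:,} pixels")          # 2,073,600
print(f"HD has {hd_pixels / sd_pixels}x as many pixels")  # 6.75x
```

In other words, the HD screen has nearly seven pixels for every one pixel of picture information the SD signal actually carries.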


In contrast, an HDTV can have many more pixels; 1,080 rows of pixels from top to bottom and 1,920 columns across the screen has become a common standard. Even this much larger number is being dwarfed by newer displays with thousands more pixels in each direction, creating TVs and displays with incredibly sharp and realistic images. Formats such as “ultra high definition television” (UHDTV) can produce screens with 16 times as many pixels as current HD standards.

The problem with watching SDTV on HDTV occurs when the SD image is transferred between the two formats. An SDTV signal, even a digital one, is meant for display on a screen with a resolution of about 640x480. When that signal is shown on a screen with a resolution of 1920x1080, the image must either appear very small on the larger screen or be enlarged to fill it.
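The simplest way a scaler can enlarge an SD frame is nearest-neighbor scaling, where each output pixel simply copies the closest source pixel. The sketch below illustrates the idea on a tiny frame; real TV scalers use more sophisticated interpolation, but the principle of filling many output pixels from one source pixel is the same:

```python
def nearest_neighbor_upscale(frame, new_w, new_h):
    """Enlarge a 2D frame (a list of rows of pixel values) by copying,
    for each output pixel, the nearest pixel from the source frame."""
    old_h = len(frame)
    old_w = len(frame[0])
    return [
        [frame[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# A tiny 2x2 "frame" scaled to 4x4: each source pixel becomes a 2x2 block.
small = [[1, 2],
         [3, 4]]
big = nearest_neighbor_upscale(small, 4, 4)
for row in big:
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```

Notice that no new detail appears; each source pixel is just repeated to cover a larger block of the screen, which is why a blown-up SD image looks soft or blocky.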

This enlargement is what ultimately produces the lower picture quality on the HDTV. Many TVs “upscale” the SD image to make it look better, but the result still fails to reach the quality of a true HD signal. Because of this, an SD signal may actually look better on an SDTV than on an HDTV, though an HD signal viewed on an HDTV is far superior to any image viewed on SDTV.
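Smarter upscalers interpolate between neighboring pixels rather than simply repeating them. The one-dimensional sketch below shows linear interpolation on a single row of pixel values; it smooths the enlarged image, but it cannot recover detail that was never in the SD signal, so sharp edges turn into gradual ramps:

```python
def linear_upscale_row(row, new_w):
    """Upscale a single row of pixel values by linear interpolation."""
    old_w = len(row)
    out = []
    for x in range(new_w):
        # Map the output position back into source coordinates.
        pos = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0
        left = int(pos)
        right = min(left + 1, old_w - 1)
        frac = pos - left
        out.append(row[left] * (1 - frac) + row[right] * frac)
    return out

# A sharp edge (0 -> 100) becomes a gradual ramp after upscaling:
print(linear_upscale_row([0, 0, 100, 100], 7))
# [0.0, 0.0, 0.0, 50.0, 100.0, 100.0, 100.0]
```

The interpolated value of 50 between the dark and bright pixels is exactly the kind of blurring viewers notice on upscaled SD material.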


Discuss this Article

Post 2

An SD signal on an SD TV doesn't just appear better, it is much better than what is displayed on an HDTV. The HDTV processor does a horrible job of expanding the signal, creating a lot of distortion and delayed imaging. No HDTV that I have tested has a good enough graphics processor to effectively expand a view in the way that just about any computer graphics processor does.

If you have a good multimedia player on your computer, try watching that same SD DVD with the computer connected to the DVI or VGA port of the TV and see the difference.

Post 1

All the posts I view seem to imply that the observed lower quality on an HDTV is due to the stretching of the picture on a larger screen. This is not the case. I have a 39" CRT SDTV and a 47" HDTV, and the picture from a standard definition source is actually slightly smaller on the HDTV. But the same signal that looks clear and sharp on the SDTV (sharp lines and fine details) looks cartoonish on the HDTV.

The problem is that the converter does an extremely poor job of determining how to map the picture onto the different number of pixels available. I have found this to be the case on every 1080p TV. Standard definition just looks awful on them. Everyone has the misconception that it is just the size difference, or that now that they are spoiled by HD the old display looks worse, but it always was like that. It is very easy to see that this just is not true if you have both side by side. Is it just that a decent upconverter wasn't considered important? If you expand a 640x480 graphic to fill the screen using a computer input, you will see none of this blurring. The graphics engine in a computer does what an HDTV processor does not.
