How Does Ultra High Definition Television Work?

Article Details
  • Written By: Paul Reed
  • Edited By: Shereen Skola
  • Last Modified Date: 31 August 2016

Late in the 20th century, broadcasters began introducing high definition (HD) television programming to provide better picture clarity and resolution than standard analog or digital televisions. These improvements arrived alongside flat-screen light-emitting diode (LED) and liquid crystal display (LCD) televisions, which provided the technology needed to display high definition signals. Ultra high definition television (UHDTV) screens provide up to 16 times more picture elements, or pixels, than HD screens, permitting high definition picture quality on much larger screens.
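To put that figure in perspective, here is a rough arithmetic sketch of the pixel counts involved. The 1920 x 1080 HD and 7680 x 4320 UHDTV resolutions are assumptions used only for illustration; the article does not state them.

    # Rough pixel-count comparison; the 1920x1080 and 7680x4320 resolutions
    # are assumed here for illustration.
    hd_pixels = 1920 * 1080            # about 2.1 million pixels
    uhd_pixels = 7680 * 4320           # about 33.2 million pixels
    print(uhd_pixels / hd_pixels)      # prints 16.0, the "16 times" figure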

In the 20th century, the first high definition signals used analog technology, transmitting the picture and sound in much the same way as a radio broadcast. Analog high definition required up to four times the signal bandwidth of standard television, which restricted its use. The development of digital signals, in which the picture and sound are converted to binary zeros and ones and then converted back to the television format at the receiver, allowed much more data to be carried on a single signal band.
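As a minimal sketch of what "converted to binary zeros and ones" means, the short Python example below encodes a single brightness sample as eight binary digits and decodes it again; the 0-255 brightness scale is an assumption used only for illustration.

    # Encode one brightness sample as binary digits, then decode it back,
    # assuming an illustrative 0-255 brightness scale.
    sample = 178                       # one pixel's brightness level
    bits = format(sample, "08b")       # transmitted as eight zeros and ones
    print(bits)                        # '10110010'
    print(int(bits, 2))                # the receiver recovers 178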

Television signals use transmission frequencies similar to those of radio, marine, and telephone communications. The development of high definition and ultra high definition television broadcasting required new digital compression technologies, which take a standard digital signal and compress it electronically so that more data can be carried on an existing signal. These improvements permitted high definition signals to be transmitted to customers beginning in the 1990s.
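Broadcast systems rely on sophisticated compression standards; the toy Python run-length encoder below is not one of them, but it sketches the basic idea of removing repetition so that fewer numbers have to be transmitted.

    # A toy run-length encoder: runs of identical values collapse into
    # [value, count] pairs. Real broadcast codecs are far more elaborate;
    # this only illustrates the general idea of squeezing out redundancy.
    def run_length_encode(samples):
        encoded = []
        for value in samples:
            if encoded and encoded[-1][0] == value:
                encoded[-1][1] += 1
            else:
                encoded.append([value, 1])
        return encoded

    scan_line = [12, 12, 12, 12, 200, 200, 12, 12, 12]
    print(run_length_encode(scan_line))   # [[12, 4], [200, 2], [12, 3]]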

As customers demanded larger televisions, manufacturers improved high definition signals and electronics so that screens 50 inches (127 cm) and larger could be produced. There are limits to screen size with high definition signals, however, because at some point the picture quality degrades and the screen refresh process, called scanning, becomes visible. These limitations led to the development of ultra high definition television technology, which allows high definition picture quality on larger screens. UHDTV was first demonstrated in 2002 by researchers at Japan's state-owned broadcaster NHK.

Initial development of UHDTV was limited to laboratory testing of signal transmission and digital compression, because the UHDTV signal carries a very large amount of data. This additional data required new digital compression and transmission technologies to be developed, because the ultra high definition television signal could not be sent over existing television frequencies. Early tests of UHDTV involved signals transmitted from the United Kingdom to Japan, requiring very high bandwidth over a dedicated frequency.
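A back-of-the-envelope calculation suggests why the raw signal overwhelms ordinary channels. The sketch below assumes an 8K (7680 x 4320) picture, 24-bit color, 60 frames per second, and a roughly 20 Mbit/s digital HD broadcast channel; all of these figures are illustrative assumptions rather than numbers from the article.

    # Rough uncompressed data rate for an assumed 8K UHDTV picture.
    width, height = 7680, 4320          # assumed UHDTV resolution
    bits_per_pixel = 24                 # assumed color depth
    frames_per_second = 60              # assumed refresh rate
    raw_rate = width * height * bits_per_pixel * frames_per_second
    print(raw_rate / 1e9)               # about 47.8 Gbit/s uncompressed

    hd_channel = 20e6                   # assumed capacity of one HD channel, bit/s
    print(raw_rate / hd_channel)        # roughly 2,400:1 compression would be needed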

One concern with ultra high definition television technology is motion-induced nausea caused by the movement of large images on the screen. Initial testing with consumers showed that some viewers had symptoms similar to motion sickness when viewing UHDTV images. Sitting farther from the screen, for example by limiting UHDTV use to larger rooms, may reduce these symptoms.
