History of high-definition television
The term high definition once described a series of television systems originating
from the late 1930s; however, these systems were only high definition when compared
with earlier mechanical systems offering as few as 30 lines of resolution.
The British high definition TV service started trials in August 1936 and a regular
service in November 1936 using both the (mechanical) Baird 240 line and (electronic)
Marconi-EMI 405 line (377i) systems. The Baird system was discontinued in February
1937. In 1938 France followed with their own 441 line system, variants of which were
also used by a number of other countries. The US NTSC system joined in 1941. In 1949
France introduced an even higher resolution standard at 819 lines (768i), a system
that would be high definition even by today's standards, but it was monochrome only.
All of these systems used interlacing and a 4:3 aspect ratio except the 240 line
system, which was progressive (actually described at the time by the technically
correct term 'sequential'), and the 405 line system, which started as 5:4 and later
changed to 4:3. The 405 line system adopted the (at that time) revolutionary idea
of interlaced scanning to overcome the flicker problem of the 240 line system with
its 25 Hz frame rate.
The 240 line system could have doubled its frame rate but this would have meant that
the transmitted signal would have doubled in bandwidth, an unacceptable option.
Color broadcasts started at similarly higher resolutions, first with the US NTSC
color system in 1953, which was compatible with the earlier B&W systems and therefore
had the same 525 lines (480i) of resolution. European standards did not follow until
the 1960s, when the PAL and SECAM colour systems were added to the monochrome 625
line (576i) broadcasts.
Since the formal adoption of Digital Video Broadcasting's (DVB) widescreen HDTV transmission
modes in the early 2000s, the 525-line NTSC (and PAL-M) systems, as well as the European
625-line PAL and SECAM systems, have been regarded as standard definition television
systems. In Australia, the 625-line digital progressive system (with 576 active lines)
is officially recognized as high definition.
Analog systems
In 1949 France started its transmissions with an 819 lines system (768i). It was
monochrome only, it was used only on VHF for the first French TV channel, and it
was discontinued in 1985.
In 1958, the Soviet Union developed Тransformator (Russian: Трансформатор, Transformer),
the first high-resolution (definition) television system, capable of producing an
image composed of 1,125 lines of resolution and aimed at providing teleconferencing
for military command. It was a research project, and the system was never deployed
in either the military or broadcasting.
In 1969, the Japanese state broadcaster NHK first developed consumer high-definition
television with a 5:3 aspect ratio, a slightly wider screen format than the usual
4:3 standard.[3] The system, known as Hi-Vision or MUSE after the Multiple sub-Nyquist
Sampling Encoding used to compress the signal, required about twice the bandwidth
of the existing NTSC system but provided about four times the resolution (1080i/1125
lines). Satellite test broadcasts started in 1989, regular testing followed in 1991,
and regular broadcasting on the BS-9ch channel commenced on 25 November 1994, featuring
commercial and NHK programming.
In 1981, the MUSE system was demonstrated for the first time in the United States.
It had the same 5:3 aspect ratio as the Japanese system. Upon visiting a demonstration
of MUSE in Washington, US President Ronald Reagan was most impressed and officially
declared it "a matter of national interest" to introduce HDTV to the USA.
Several systems were proposed as the new standard for the USA, including the Japanese
MUSE system, but all were rejected by the FCC because of their higher bandwidth requirements.
At the same time that the high definition systems were being studied, the number
of television channels was growing rapidly and bandwidth was already a problem. A
new standard had to be radically efficient, needing less bandwidth for HDTV than
the existing NTSC standard for SDTV.
Rise of digital compression
Since 1972, the International Telecommunication Union's Radiocommunication Sector
(ITU-R) had been working on creating a global recommendation for analogue HDTV.
These recommendations, however, did not fit in the broadcasting bands that could reach
home users. The standardization of MPEG-1 in 1993 also led to the acceptance of recommendation
ITU-R BT.709.[6] In anticipation of these standards the DVB organization was formed,
an alliance of broadcasters, consumer electronics manufacturers and regulatory bodies.
The DVB develops and agrees on specifications which are formally standardized by
ETSI.
DVB first created the standards for DVB-S digital satellite TV, DVB-C digital cable
TV and DVB-T digital terrestrial TV. These broadcasting systems can be used for both
SDTV and HDTV. In the USA the Grand Alliance proposed ATSC as the new standard for
SDTV and HDTV. Both ATSC and DVB were based on the MPEG-2 standard; the DVB-S2 standard
is based on the newer and more efficient H.264/MPEG-4 AVC compression standard.
Common for all DVB standards is the use of highly efficient modulation techniques
for further reducing bandwidth, and foremost for reducing receiver-hardware and antenna
requirements.
In 1983, the International Telecommunication Union's Radiocommunication Sector
(ITU-R) set up a working party (IWP11/6) with the aim of setting a single international
HDTV standard. One of the thornier issues concerned a suitable frame/field refresh
rate, with the world already strongly demarcated into two camps, 25/50 Hz and 30/60 Hz,
related by reasons of picture stability to the frequency of their mains electrical
supplies.
The WP considered many views and through the 1980s served to encourage development
in a number of video digital processing areas, not least conversion between the two
main frame/field rates using motion vectors, which led to further developments in
other areas. While a comprehensive HDTV standard was not in the end established,
agreement on the aspect ratio was achieved.
Initially the existing 5:3 aspect ratio had been the main candidate, but due to the
influence of widescreen cinema, the aspect ratio 16:9 (1.78) eventually emerged as
being a reasonable compromise between 5:3 (1.67) and the common 1.85 widescreen cinema
format. (It has been suggested that the 16:9 ratio was chosen as being the geometric
mean of 4:3, Academy ratio, and 2.35:1, the widest cinema format in common use, in
order to minimize wasted screen space when displaying content with a variety of aspect
ratios.)
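The geometric-mean suggestion above can be checked with a few lines of arithmetic (a quick sketch of the claim, not a statement about how the ratio was actually chosen):

```python
import math

# Hypothetical check: geometric mean of the Academy ratio (4:3)
# and the 2.35:1 widescreen cinema format.
academy = 4 / 3        # ~1.333
scope = 2.35
geometric_mean = math.sqrt(academy * scope)

print(round(geometric_mean, 3))  # ~1.770
print(round(16 / 9, 3))          # 1.778, i.e. 16:9 is very close
```

The two values differ by well under one percent, which is consistent with the suggestion that 16:9 approximates this geometric mean.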
An aspect ratio of 16:9 was duly agreed at the first meeting of the WP at the BBC's
R & D establishment in Kingswood Warren. The resulting ITU-R Recommendation ITU-R
BT.709-2 ("Rec. 709") includes the 16:9 aspect ratio, a specified colorimetry, and
the scan modes 1080i (1,080 actively-interlaced lines of resolution) and 1080p (1,080
progressively-scanned lines). The current Freeview HD trials use MBAFF, which contains
both progressive and interlaced content in the same encoding.
It also includes the alternative 1440×1152 HD-MAC scan format. (According to some
reports, a mooted 750 line (720p) format (720 progressively-scanned lines) was viewed
by some at the ITU as an enhanced television format rather than a true HDTV format,[8]
and so was not included, although 1920×1080i and 1280×720p systems for a range of
frame and field rates were defined by several US SMPTE standards.)
Demise of analog HD systems
However, even that limited standardization of HDTV did not lead to its adoption,
principally for technical and economic reasons. Early HDTV commercial experiments
such as NHK's MUSE required over four times the bandwidth of a standard-definition
broadcast, and despite efforts made to shrink the required bandwidth down to about
two times that of SDTV, it was still only distributable by satellite with one channel
shared on a daily basis between seven broadcasters. In addition, recording and reproducing
an HDTV signal was a significant technical challenge in the early years of HDTV.
Japan remained the only country with successful public broadcast analog HDTV. Digital
HDTV broadcasting started in 2000 in Japan, and the analog service ended in the early
hours of 1 October 2007.
In Europe, analogue 1,250-line HD-MAC test broadcasts were performed in the early
1990s, but did not lead to any established public broadcast service.
Inaugural HDTV broadcast in the United States
HDTV technology was introduced in the United States in the 1990s by the Digital HDTV
Grand Alliance, a group of television companies and MIT.[9][10] Field testing of
HDTV at 199 sites in the United States was completed August 14, 1994.[11] The first
public HDTV broadcast in the United States occurred on July 23, 1996 when the Raleigh,
North Carolina television station WRAL-HD began broadcasting from the existing tower
of WRAL-TV south-east of Raleigh, winning a race to be first with the HD Model Station
in Washington, D.C., which began broadcasting July 31, 1996.[12][13][14] The American
Advanced Television Systems Committee (ATSC) HDTV system had its public launch on
October 29, 1998, during the live coverage of astronaut John Glenn's return mission
to space on board the Space Shuttle Discovery.[15] The signal was transmitted coast-to-coast,
and was seen by the public in science centers, and other public theaters specially
equipped to receive and display the broadcast.
First regular European HDTV broadcasts
Although HDTV broadcasts had been demonstrated in Europe since the early 1990s, the
first regular broadcasts started on January 1, 2004 when Euro1080 launched the HD1
channel with the traditional Vienna New Year's Concert. Test transmissions had been
active since the IBC exhibition in September 2003, but the New Year's Day broadcast
marked the official start of the HD1 channel, and the start of HDTV in Europe.
Euro1080, a division of the Belgian TV services company Alfacam, broadcast HDTV channels
to break the pan-European stalemate of "no HD broadcasts mean no HD TVs bought means
no HD broadcasts..." and kick-start HDTV interest in Europe.[18]
The HD1 channel was initially free-to-air and mainly comprised sporting, dramatic,
musical and other cultural events broadcast with a multi-lingual soundtrack on a
rolling schedule of 4 or 5 hours per day.
These first European HDTV broadcasts used the 1080i format with MPEG-2 compression
on a DVB-S signal from SES Astra's 1H satellite at Europe's main DTH Astra 19.2°E
position. Euro1080 transmissions later changed to MPEG-4/AVC compression on a DVB-S2
signal in line with subsequent broadcast channels in Europe.
Notation
HDTV broadcast systems are identified with three major parameters:
* Frame size in pixels is defined as number of horizontal pixels × number of
vertical pixels, for example 1280 × 720 or 1920 × 1080. Often the number of horizontal
pixels is implied from context and is omitted.
* Scanning system is identified with the letter p for progressive scanning or
i for interlaced scanning.
* Frame rate is identified as number of video frames per second. For interlaced
systems an alternative form of specifying number of fields per second is often used.
If all three parameters are used, they are specified in the following form: [frame
size][scanning system][frame or field rate] or [frame size]/[frame or field rate][scanning
system]. Often, frame size or frame rate can be dropped if its value is implied from
context. In this case the remaining numeric parameter is specified first, followed
by the scanning system.
For example, 1920×1080p25 identifies progressive scanning format with 25 frames per
second, each frame being 1,920 pixels wide and 1,080 pixels high. The 1080i25 or
1080i50 notation identifies interlaced scanning format with 25 frames (50 fields)
per second, each frame being 1,920 pixels wide and 1,080 pixels high. The 1080i30
or 1080i60 notation identifies interlaced scanning format with 30 frames (60 fields)
per second, each frame being 1,920 pixels wide and 1,080 pixels high. The 720p60
notation identifies progressive scanning format with 60 frames per second, each frame
being 720 pixels high; 1,280 pixels horizontally are implied.
50 Hz systems allow for only three scanning rates: 25i, 25p and 50p. 60 Hz systems
operate with a much wider set of frame rates: 23.976p, 24p, 29.97i/59.94i, 29.97p,
30p, 59.94p and 60p. In the days of standard definition television, the fractional
rates were often rounded to whole numbers; for example, 23.98p was often called 24p,
and 59.94i was often called 60i. High definition television allows both fractional
and whole rates, so strict usage of the notation is required. Nevertheless, 29.97i/59.94i
is almost universally called 60i, and likewise 23.98p is called 24p.
For commercial naming of a product, the frame rate is often dropped and is implied
from context (e.g., a 1080i television set). A frame rate can also be specified without
a resolution. For example, 24p means 24 progressive scan frames per second, and 50i
means 25 interlaced frames per second.
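The notation rules above can be sketched as a small parser (a hypothetical helper written for illustration; the function and field names are not part of any standard):

```python
import re

def parse_hdtv_notation(s):
    """Parse strings like '1080i50', '720p60' or '1920×1080p25' into
    (width, height, scan, rate). Width and rate are None when the
    notation leaves them implied by context."""
    m = re.fullmatch(r'(?:(\d+)[x×])?(\d+)([pi])(\d+(?:\.\d+)?)?', s)
    if not m:
        raise ValueError(f"not a recognized HDTV notation: {s!r}")
    width, height, scan, rate = m.groups()
    return (int(width) if width else None,
            int(height),
            'progressive' if scan == 'p' else 'interlaced',
            float(rate) if rate else None)

print(parse_hdtv_notation('1080i50'))       # (None, 1080, 'interlaced', 50.0)
print(parse_hdtv_notation('1920×1080p25'))  # (1920, 1080, 'progressive', 25.0)
```

Note that the parser returns the number as written, so '1080i50' yields a rate of 50; whether that figure means fields or frames per second still depends on the convention in use, as described above.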
One aspect of HDTV that has not received a naming standard is color depth. Until recently,
the color of each pixel was specified by three 8-bit values, each representing
the level of red, green, and blue in the pixel. Together the 24 total
bits defining color yielded just under 17 million possible pixel colors. Recently,
some manufacturers have designed systems that employ 10 bits for each color (30
bits total), which provides a palette of over 1 billion colors. They contend that
this provides a much richer picture. Until the naming of this criterion is standardized,
consumers will have to do research to ensure that a piece of equipment supports this
feature.
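The palette sizes quoted above follow directly from the bit depths (a quick arithmetic check; nothing here is specific to any manufacturer):

```python
# Each pixel stores one value per channel: red, green and blue.
# The number of distinct colors is 2 raised to the total bit count.
colors_24bit = 2 ** (3 * 8)    # 16,777,216 — just under 17 million
colors_30bit = 2 ** (3 * 10)   # 1,073,741,824 — just over 1 billion

print(colors_24bit)  # 16777216
print(colors_30bit)  # 1073741824
```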
Most HDTV systems support resolutions and frame rates defined either in the ATSC
table 3, or in EBU specification. The most common are noted below.