Are you sure you really need to know this stuff, Rod?
I just accept it as if it's magic.
There is a great explanation with diagrams of it in a third party book put out for DVX / HVX Users.
Check over on the DVX Users Site. The author is a mod over there. (Barry Green)
Do a search on color sampling
Shooternz, thanks!!!! I did a quick search over there and didn't find
it... found Barry, though, and a list of his stuff. Went through it but nothing
about color sampling... will check again later, am in the middle of making a huge
pot of beef stew....
Will leave this message here till I find it.. thanks
This article is almost what I need...but it sure would be nice to have some graphics, illustrations...
The advent of video technology, driven by a mass consumer market, has brought it into the price range of just about everybody. This has resulted in affordable video single-frame recorders and controllers.
Video technology is based on a raster scan display refresh format. Raster scan refers to the pattern used to scan out the image: top-to-bottom, a line at a time, left-to-right along each line. A line is called a scanline. The image is drawn by an electron beam which strikes a phosphor-coated screen, which in turn emits photons in the form of light. The intensity of the electron beam is controlled by the image being scanned out, whether that image is stored in the digital memory of a computer or generated by the similar raster scanning of a video camera. After a scan of an individual scanline, the electron beam is turned off and is positioned at the beginning of the next scanline. The time it takes to do this is called the horizontal retrace interval, and the signal which notifies the electronics of this is called horizontal blanking or horizontal sync. When the beam gets to the bottom of the image, it is turned off and is returned to the top left of the screen. The time it takes to do this is called the vertical retrace interval, and the signal which notifies the electronics of this is called vertical blanking or vertical sync. A complete scan of all the scanlines of an image is called a frame. In some video formats, all of the scanlines are drawn in one pass (progressive scan). In other video formats, every odd-numbered scanline is drawn on one pass and every even-numbered scanline on the next pass (interlaced scan). In interlaced scan, each pass is called a field (two fields per frame).
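The difference between the two scan patterns can be sketched in a few lines of Python (the line counts here are arbitrary, just for illustration):

```python
def interlaced_scan_order(num_lines):
    """Scanline indices in the order an interlaced display draws them:
    one field covers every other line, the next field fills in the rest."""
    field1 = list(range(0, num_lines, 2))   # first pass: alternate lines
    field2 = list(range(1, num_lines, 2))   # second pass: the lines skipped
    return field1 + field2                  # two fields make one frame

def progressive_scan_order(num_lines):
    """Progressive scan draws every line top to bottom in a single pass."""
    return list(range(num_lines))

print(interlaced_scan_order(8))   # [0, 2, 4, 6, 1, 3, 5, 7]
print(progressive_scan_order(8))  # [0, 1, 2, 3, 4, 5, 6, 7]
```

Note that both orders visit every scanline exactly once per frame; interlacing only changes the order, which is why each field can be displayed (and, from a camera, captured) at twice the frame rate.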
The NTSC Standard
The National Television Systems Committee (NTSC) in 1941 established 525-line, 60.00 Hz field rate, 2:1 interlaced monochrome television in the United States. In 1953, the 525-line, 59.94 Hz field rate, 2:1 interlaced, composite color television signal was established. Broadcast video must conform to this specific standard. The standard sets specific times for the horizontal scanline time, the frame time, the amplitude and duration of the vertical sync pulse, etc. Home video units typically generate much sloppier signals that would not qualify for broadcast. There are encoders that can strip the old sync signals, etc. off a video signal and re-encode it so that it does conform to broadcast-quality standards. The specific pieces of video equipment will be mentioned later in this section.
There are 525 total scanline-times per frame-time in NTSC format. 29.97 frames are transmitted per second. There is a 2:1 interlace of the scanlines in alternate fields. Of the 525 total raster lines, 480 contain picture information; the remainder comprise vertical scanning overhead. The aspect ratio of a 525-line television picture is 4:3, so equal vertical and horizontal resolution are obtained at a horizontal resolution of 480 times 4/3, or 640 pixels per scanline. PAL and SECAM are the other two standards in use around the world. They differ from NTSC in specifics, like the number of scanlines per frame and the refresh rate, but both are interlaced raster formats. One of the reasons television technology uses interlaced scanning is that, when a camera is providing the image, the motion is updated every field, producing smoother motion.
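The numbers quoted above fit together with a little arithmetic, sketched here using only figures from the text:

```python
# Deriving the square-pixel horizontal resolution and frame rate for NTSC.
TOTAL_LINES = 525        # total scanline-times per frame
ACTIVE_LINES = 480       # lines carrying picture information
ASPECT_W, ASPECT_H = 4, 3
FIELD_RATE_HZ = 59.94    # fields per second, two fields per frame

frame_rate = FIELD_RATE_HZ / 2                         # 29.97 frames/s
pixels_per_line = ACTIVE_LINES * ASPECT_W // ASPECT_H  # 480 * 4/3 = 640

print(f"{frame_rate:.2f} frames/s, {pixels_per_line} pixels per scanline")
```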
Black and White Signal
A black and white video signal is basically a single line that has the sync information and the intensity signal superimposed on one signal. The vertical and horizontal sync pulses are negative with respect to a reference level, with vertical sync being a much longer pulse than horizontal sync. On either side of the sync pulses are reference levels called the front porch and back porch. Between horizontal sync pulses, which mark the period between scanlines, is the active scanline interval. During the active scanline interval, the intensity of the signal controls the intensity of the electron beam of the monitor as it scans out the image (see Figure X).
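As a rough sketch of how one scanline period divides up, here are approximate textbook timings for NTSC (the microsecond values are common reference figures, not taken from this article):

```python
# One NTSC horizontal scanline period, split into the intervals named above.
# Timings are approximate textbook values in microseconds.
LINE_US = 63.6          # one horizontal scanline period, about 1/15734 s
FRONT_PORCH_US = 1.5    # reference level just before the sync pulse
HSYNC_US = 4.7          # horizontal sync pulse, below the reference level
BACK_PORCH_US = 4.7     # reference level after sync (color burst sits here)

active_us = LINE_US - (FRONT_PORCH_US + HSYNC_US + BACK_PORCH_US)
print(f"active scanline interval: {active_us:.1f} us of {LINE_US} us")
```

The point of the sketch is simply that most of each scanline period carries picture, with the sync pulse and porches taking up the rest.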
Color Monitors and Gamma Correction
A color monitor has three electron guns, each of which can be focused on one of three phosphor coatings on the screen. These phosphors are almost always some shade of red, green and blue. One way to drive a color monitor is to have four lines going into it: red, green, blue, and sync. Sometimes green and sync are superimposed onto one line in which case it resembles a black and white TV signal. In this case a monitor would have three lines going into it: red, green/sync, and blue.
It is often the case that a doubling of the input value on one of the lines does not result in a doubling of the light emitted from the screen. Gamma correction is a modulation of the input signal used to compensate for the non-linear response of the display screen. In graphics systems this is often done by a look-up table which converts the input value to a new value such that a linear response is produced at the screen output.
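A look-up table of the kind described can be sketched as follows, assuming 8-bit values and a display whose light output follows a power law; gamma = 2.2 is a common assumed value, not one given in this article:

```python
# Gamma-correction look-up table: maps a linear 8-bit intensity to the
# value sent to the display, so the displayed light comes out linear.
GAMMA = 2.2

# Pre-compute the corrected output for each of the 256 possible inputs.
lut = [round(255 * (v / 255) ** (1 / GAMMA)) for v in range(256)]

def gamma_correct(value):
    """Look up the display value for a linear intensity in 0-255."""
    return lut[value]

print(gamma_correct(0), gamma_correct(255))  # endpoints map to 0 and 255
```

Because the table is computed once, the per-pixel cost at scan-out is a single array index, which is why graphics hardware implements gamma correction this way.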
Incorporating Color into the B&W Signal
When color came on the scene in broadcast television, the engineers were faced with incorporating the color information in such a way that black & white TVs could still display a color signal and color TVs could still display black & white signals. The solution was to encode color into a high-frequency component that was superimposed on the intensity signal of the black and white video. A reference signal for the color component, called the color burst, was added to the back porch of each scanline. The color was encoded as an amplitude and phase shift with respect to this reference signal.
A signal that has separate lines for the color signals is referred to as a component signal. A signal such as the color TV signal, with all of the information superimposed on one line, is referred to as a composite signal.
Because of the limited room for color information in the composite signal, the TV engineers optimized the color information for a particular hue which they considered most important: Caucasian skin tone. Because of that, the RGB information had to be converted into a different color space: YIQ. Y is luminance and is essentially the intensity information found in the black and white signal. It is computed as:
Y= 0.299*R + 0.587*G + 0.114*B
The YIQ television signal is similar to the YUV color space in that the Y's (luminance) are the same. U and V are color difference signals and are scaled versions of B-Y (by 0.5/0.886) and R-Y (by 0.5/0.701) respectively. The I and Q chrominance signals used in television pick up the remaining two degrees of freedom of the UV space. I and Q are the signals used to modulate the amplitude and phase shift of the 3.58 MHz color frequency reference signal. The phase of this chroma signal, C, conveys a quantity related to hue, and its amplitude conveys a quantity related to color saturation. In fact, I and Q stand for "in phase" and "quadrature" respectively. The NTSC system mixes Y and C together and conveys the result on one piece of wire. The result of this addition operation is not theoretically reversible; the process of separating luminance and color often confuses one for the other (e.g., the appearance of color patterns seen on TV shots of people wearing black and white seersucker suits).
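The conversion can be sketched in Python using the commonly quoted NTSC YIQ matrix (the Y row matches the formula above; the I and Q coefficients are standard reference values not given in this article), along with the chroma amplitude and phase that modulate the subcarrier:

```python
import math

def rgb_to_yiq(r, g, b):
    """Convert RGB (each 0-1) to YIQ using the usual NTSC coefficients."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

def chroma(i, q):
    """Amplitude (saturation-related) and phase in degrees (hue-related)
    of the chroma signal formed from I and Q."""
    return math.hypot(i, q), math.degrees(math.atan2(q, i))

y, i, q = rgb_to_yiq(1.0, 0.0, 0.0)   # pure red
amp, phase = chroma(i, q)
print(f"Y={y:.3f} I={i:.3f} Q={q:.3f} amp={amp:.3f} phase={phase:.1f} deg")
```

Note that for any gray (R = G = B), I and Q both come out zero, so the chroma amplitude is zero; that is exactly the compatibility trick, since a black and white scene adds nothing to the subcarrier.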
Notes on Video Standards
composite video format
IDTV, EDTV, HDTV
525 line, 59.94 Hz, interlaced (525/59.94/2:1)
4:3 aspect ratio
480 scanlines; w/aspect => 640 pixels
b&w is sync and amplitude with specifics for timings
Color NTSC - add color while maintaining compatibility with the b&w signal
color burst - 3.58MHz subcarrier
YUV (similar to YIQ actually used; In-phase & Quadrature)
Y = luminance = .299R + .587G + .114B
U = B-Y
V = R-Y
I,Q are U,V space coordinate system to actually encode color info
I,Q carry color info; use less bandwidth for these
I,Q modulate 3.58MHz subcarrier;
phase of the modulated subcarrier carries hue, amplitude carries saturation (I = "in phase", Q = "quadrature")
SVHS and ED-Beta format connectors - Y&C on separate wires
SVHS has severely limited bandwidth for chroma
PAL & SECAM
625/50/2:1 (still 4:3 aspect ratio)
PAL and SECAM are actually the color modulation methods. PAL similar to NTSC
576 lines have picture info
IDTV - improved definition TV
processing that involves use of field store and/or frame store (memory) techniques at the receiver
e.g., de-interlacing at the receiver
involve no change to picture origination equipment and no change to emission standards
EDTV - extended (or enhanced) definition television
employs techniques at transmitter and receiver that are transparent to existing receivers
e.g., separation of luminance and color components by pre-combing the signals prior to
transmission (reduces NTSC artifacts such as dot crawl)
e.g., use of progressive scan at camera and de-interlacing at receiver
requires changes in picture origination equipment but complies with existing emission standards
HDTV - High Definition TV
approx. twice horizontal and twice vertical resolution, component color coding,
aspect ratio 16:9 and frame rate of at least 24 Hz.