How Analogue Colour Television Works: the Coding and Decoding Process
Thank you for a fine explanation of this complex process. Well done! NTSC originally proposed the phase alternating process. RCA called it PLA. This was dropped due to the already massive cost of a color TV receiver in 1953 dollars. By eliminating the additional delay line, a very expensive component back then, they cut some of the final cost from the product. I observed a PAL receiver at the NAB convention in the early 1990s that was suffering severe phase distortion, and looking closely at the screen I saw that it had the same color shift issues as NTSC. However, with the phase alternation, one line was green and the next purple. Stepping back a short distance, the eye blended the two lines and the picture was acceptable. At one of my jobs, over a long career in video product engineering, I had the opportunity to connect a three-tube camera to a video monitor with direct RGB (525/30i). The result was astounding. That was how I learned just how “bad” chroma encoding schemes could be.
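To make the line-averaging effect described in the comment above concrete, here is a minimal sketch (Python; the 60-degree hue and 20-degree path error are illustrative assumptions, not figures from the comment) of how PAL's phase alternation turns a constant phase error into equal-and-opposite hue shifts on successive lines, so that averaging them, whether in a delay line or by the eye, restores the hue at the cost of a little saturation:

```python
# Minimal sketch: why averaging two PAL lines cancels a phase (hue) error.
# The V component is inverted on alternate lines, so a constant path phase
# error shifts the decoded hue in opposite directions on successive lines.
import cmath
import math

true_chroma = cmath.rect(1.0, math.radians(60))   # hypothetical hue of 60 degrees
phase_error = math.radians(20)                    # assumed transmission-path phase error

# Line n: chroma arrives with a +20 degree error.
line_n = true_chroma * cmath.rect(1.0, phase_error)

# Line n+1: V is inverted before transmission and re-inverted in the receiver,
# so the same path error appears as -20 degrees after decoding.
line_n1 = true_chroma * cmath.rect(1.0, -phase_error)

averaged = (line_n + line_n1) / 2                 # delay-line PAL (or the eye) averages

print(math.degrees(cmath.phase(line_n)))    # ~80 deg: hue shifted one way
print(math.degrees(cmath.phase(line_n1)))   # ~40 deg: hue shifted the other way
print(math.degrees(cmath.phase(averaged)))  # ~60 deg: correct hue restored
print(abs(averaged))                        # ~0.94: small saturation loss instead
```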
A further detail of the NTSC I & Q signals is that they were not equal bandwidth. The I signal, which carried the reds, oranges and yellows, was 1.5 MHz wide; the Q signal, which carried the blues and violets, was 0.6 MHz. This was done to give more detail in the colors the eye resolves best, particularly flesh tones. The problem was a more costly receiver decoder design: two different bandwidth filters were needed, and because they had different passbands they also had different delays, so yet another delay line was required in the I channel to keep it registered with the Q channel. Only a few early color receivers, like the RCA CT-100, used the true I & Q system, which is why the color reproduction on a restored CT-100 is so good. But all US TV manufacturers, including RCA, as well as the Japanese, soon went with the equal-bandwidth R-Y, B-Y system to reduce consumer receiver costs. Most broadcast NTSC encoders continued to use the I & Q system, but the advantage was lost in the later receivers that demodulated equal-bandwidth, roughly 500 kHz, R-Y and B-Y.
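For anyone curious how I and Q relate to the equal-bandwidth color-difference signals, here is a minimal sketch (Python; the coefficients are the standard NTSC values, while the sample pixel is made up): I and Q are simply the scaled (B-Y, R-Y) plane rotated by 33 degrees, which points the wideband I axis along the orange/cyan, flesh-tone direction:

```python
# Minimal sketch of the standard NTSC I/Q axes as a 33-degree rotation of the
# scaled colour-difference pair (B-Y, R-Y).  RGB values are gamma-corrected 0..1.
import math

def rgb_to_yiq(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b       # luminance
    r_y, b_y = r - y, b - y                     # equal-bandwidth colour-difference pair
    theta = math.radians(33)                    # I/Q axes sit 33 degrees from (B-Y, R-Y)
    u, v = b_y / 2.03, r_y / 1.14               # standard NTSC scaling of B-Y and R-Y
    i = -u * math.sin(theta) + v * math.cos(theta)
    q = u * math.cos(theta) + v * math.sin(theta)
    return y, i, q

# In a full encoder, I would then be low-pass filtered to ~1.5 MHz and Q to
# ~0.6 MHz before quadrature modulation onto the colour subcarrier.
print(rgb_to_yiq(0.9, 0.6, 0.5))                # a rough flesh-tone colour (illustrative)
```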
A very nice explanation, but some points: NTSC did not propose the subcarrier based on an fh of 15.734 kHz. In fact, the fh at the time of the proposal was 15.750 kHz. The fh and fv were later changed from 15.750 kHz / 30 Hz to 15.734 kHz / 29.97 Hz to reduce sound buzz. Regarding the 25 Hz subcarrier offset, having viewed PAL images without any offset (PAL system M), I would argue that it was totally unnecessary. In fact, it caused enormous complications in editing videotape, where four frames had to pass before the subcarrier was in a suitable phase to make a join. If you look at some of the early pop videos with fast cuts, you often see the picture move left, and even move up or down by two lines, as the playback circuitry tries to match the phase. Another aspect of PAL was the burst blanking that Dr Bruch specified around the vertical blanking interval. It made no sense to a PAL receiver, though it did give Japanese set makers a “hook” to determine the phase of the following burst, allowing them to bypass the import restrictions governed by the patent holder Telefunken.
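As a quick check on the frequency relationships discussed above, here is a minimal sketch (Python) using the standard published formulas: the NTSC colour line rate is derived from the 4.5 MHz sound intercarrier, the NTSC subcarrier sits at 455/2 times the line rate, and the PAL subcarrier includes the 25 Hz offset mentioned in the comment:

```python
# Minimal sketch of the NTSC/PAL frequency relationships (standard values).
sound_intercarrier = 4_500_000.0        # Hz: sound/vision carrier spacing in System M

fh_mono = 15_750.0                      # original 525-line monochrome line rate
fh_ntsc = sound_intercarrier / 286      # colour line rate, ~15,734.27 Hz
frame_ntsc = fh_ntsc / 525              # ~29.97 Hz frame rate (59.94 Hz field rate)
fsc_ntsc = fh_ntsc * 455 / 2            # colour subcarrier, ~3.579545 MHz

fh_pal = 15_625.0                       # 625/50 line rate
fsc_pal = fh_pal * 283.75 + 25          # ~4.43361875 MHz, including the 25 Hz offset

print(f"NTSC: fh = {fh_ntsc:.2f} Hz, frame = {frame_ntsc:.4f} Hz, fsc = {fsc_ntsc:.2f} Hz")
print(f"PAL : fh = {fh_pal:.2f} Hz, fsc = {fsc_pal:.2f} Hz")
```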
Excellent video! I’m surprised at how similar PAL and NTSC were. I was stationed overseas with the US Army and was dismayed by how much better the local PAL broadcasts were than NTSC (until I learned how recently they got color TV compared to the US). Specifically, blacks were blacker, and I think reds were a deeper red. PAL also seemed to lack a lot of the artifacting NTSC had: with white text on a black background, the white text created a band, or ghost image, to its right where it should have been black. Colors didn’t bleed as much either, though that could be down to greater bandwidth. The 50 Hz flicker was a problem for me, though.
It’s a problem even if you grow up with it. Having a 50 Hz TV in your peripheral vision is incredibly distracting. The 60 Hz default for 640×480 and above on computers isn’t much better, so I am really glad CRTs are dead. I was constantly going into the settings in Windows on other people’s computers and turning up the refresh rate (usually to 85 Hz) because I found the thing unusable at 60 Hz.
Simple PAL was not the first version of PAL but a way for cheap manufacturers to avoid paying royalties for the PAL patents to Telefunken, most noticeably the first line of Trinitrons from Sony. A simple PAL receiver has a hue control like an NTSC set, but no other brand ever made simple PAL receivers again. And PAL “D” does not exist as a system name; broadcast systems are identified by the CCIR letters B/G, I, D/K, M or N, which denote line count/frame rate, channel bandwidth, sound offset and channel spacing.
NTSC has been jokingly, and deservedly, called “Never Twice the Same Colour” because the chroma phase tends to vary and the user adjustment provides no absolute reference. The Magnavox console I grew up with had both colour and tint buttons on the remote, and the knobs on the set were motorised.
That would likely have been the technical explanation offered, had a large number of people noticed that every major network’s 9/11 footage appeared in its own hue. Presumably this “color coding” was done for internal purposes in the overall production, since by 2001 white balance was a perfected art. Yes, this means that 9/11 was likely a controlled, staged event, even from a deep technical perspective.