Consumer prices and manufacturing costs are not necessarily related. Consumer prices include the costs of R&D, marketing, logistics, management, profit margins and so on. Moreover, saving $0.50 here and there adds up to a real reduction in manufacturing cost, and often these little savings make the difference between taking a loss and making a nice profit.
The $.50 savings probably refers to eliminating an extra clock generator on the CGA.
As for the difference between a TV and a proper monitor, if indeed they were both simple NTSC displays with no fancy connectors or scan rates, I suspect the increased cost was a combination of better frequency response and/or phosphor resolution (i.e. sharper picture), lower demand, and brand image.
Wait, what? The "color burst" is only relevant to analog television signals. It hasn't been used in computers since we switched away from TV monitors in the late 1980s. It's definitely not present in today's computers outside of a few video-out circuits (not even in VGA output, only composite / S-Video).
IBM sold Real Computers to Real Businesses and Real People. That costs money, if only to look good to the CEO who doesn't look at cars that go for less than $300k. Money means quality, right?
Could it also be that good engineers simply love to optimise a design?
Eliminating an extra clock, or avoiding a more complicated divider chain to reach the required video clock rate, would be a damn good thing to accomplish in its own right. Sure, money was also saved, and a few such savings multiplied across millions of units add up to quite a lot of money.
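To put rough numbers on that (a sketch using the commonly cited figures for the original IBM PC, not anything from the article itself): everything hangs off a single 14.31818 MHz crystal, which is 4x the NTSC colorburst, so the CGA gets its subcarrier with a trivial divide-by-4 and no second clock generator is needed.

    # Sketch of the clock arithmetic being discussed. The 14.31818 MHz master
    # crystal and the divider ratios below are the commonly documented values
    # for the original IBM PC; the variable names are just for illustration.

    MASTER_HZ = 14_318_180          # 14.31818 MHz master crystal

    colorburst_hz = MASTER_HZ / 4   # 3.579545 MHz NTSC color subcarrier (CGA)
    cpu_clock_hz  = MASTER_HZ / 3   # ~4.77 MHz 8088 CPU clock
    timer_hz      = MASTER_HZ / 12  # ~1.193 MHz 8253 timer input

    print(f"colorburst: {colorburst_hz / 1e6:.6f} MHz")  # 3.579545
    print(f"cpu clock : {cpu_clock_hz / 1e6:.6f} MHz")   # 4.772727
    print(f"timer     : {timer_hz / 1e6:.6f} MHz")       # 1.193182

One crystal, three cheap dividers; that's the $0.50 being talked about.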
>The color subcarrier is a signal with a frequency of 3.579545 MHz.
So he explained how one choice depends on another seemingly arbitrary choice but didn't follow up that branch ... anyone want to do one better and explain why this frequency was chosen? Is it a multiple of how fast Tesla's dog chased its tail?
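Not Tesla's dog, as far as I know. The textbook account (my understanding, not from the article): the 4.5 MHz sound carrier was fixed by the existing black-and-white standard, so the line rate was nudged from 15750 Hz down to an exact submultiple of it, and the color subcarrier was placed at an odd multiple (455) of half the line rate so its spectrum interleaves with luma and beats minimally against the sound carrier.

    # Sketch of the usual derivation of 3.579545 MHz (textbook account,
    # hedged accordingly, not taken from the article).

    sound_carrier_hz = 4_500_000             # fixed by the B&W NTSC standard
    line_rate_hz = sound_carrier_hz / 286    # ~15734.27 Hz (was 15750 Hz in B&W)
    subcarrier_hz = 455 * line_rate_hz / 2   # odd multiple (455) of half the line rate

    print(f"line rate : {line_rate_hz:.4f} Hz")    # 15734.2657
    print(f"subcarrier: {subcarrier_hz:.1f} Hz")   # 3579545.5, i.e. exactly 315/88 MHz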