NTSC (Re: [ProgSoc] Microsoft's ClearType)
> > > of the two to use. It was really the NTSC output or PAL Colour Card that
> > > did all the interpretation, and if you had only B&W output then you got a
> > > lot sharper resolution and all your colours came out as vertical stripes.
> > Those stripes ARE the colour for NTSC systems... NTSC filters out the high
> > frequency sub-band of the video stream and extracts colour information from it.
> That would imply that if you tried doing 128 vertical stripes (each with a
> width of 1/256 of the screen, and with just as much gap between them), you
> would just get a plain colour block. But NTSC broadcasts can have a
> horizontal (not vertical) resolution of over 700 lines, eg live telecasts.
Are you telling me that an NTSC broadcast can transmit 350 vertical white
stripes with black between them and that each stripe is individually visible
on close inspection? Given that the quality of NTSC is worse than PAL and
that PAL is pretty poor, I would have to see this with my own eyes to believe it.
I could well believe that NTSC is FILMED at high resolution, but how much of
that gets through to the actual TV set is quite another matter.
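Quick back-of-envelope in Python (the figures are the usual textbook ones
for broadcast NTSC, so treat this as a sketch, not gospel):

    # Rough NTSC horizontal resolution from first principles.
    luma_bandwidth_hz = 4.2e6      # broadcast luminance channel limit
    active_line_s     = 52.6e-6    # visible portion of one scan line
    aspect_ratio      = 4.0 / 3.0  # "TV lines" count per picture height

    cycles_per_line = luma_bandwidth_hz * active_line_s  # ~221 cycles
    alternations    = 2 * cycles_per_line                # ~442 strips
    tv_lines        = alternations / aspect_ratio        # ~331 lines

    print(round(tv_lines))  # ~331 -- a long way short of 700

Call it 330-odd lines for an ideal broadcast chain, which is why I am
sceptical about the 700 figure.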
I still remember the days of C= 64 and ZX81 machines that sent their output
to the TV set, and 40 columns of text was about the limit of what was
readable (and even that was getting fuzzy).
> However if you draw 128 vertical stripes on an Apple //'s HGR mode, it
> makes a big block of magenta or whatever. This is the way the NTSC output
> interprets the graphic matrix. I don't think the stripes "are" NTSC's
Try sending an NTSC signal to a PAL display monitor, or to a multisync
monitor set to sync-on-green: neither of these displays will recognise the
NTSC colour (a PAL decoder is looking for a 4.43 MHz subcarrier, not NTSC's
3.58 MHz), and you will see the vertical stripes.
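For the curious, here is a toy numpy model of why the Apple ][ stripes read
as solid colour. I am assuming the usual figures (HGR pixel clock at twice
the 3.58 MHz subcarrier); the sums are mine, not Apple's:

    import numpy as np

    F_SC = 3.579545e6   # NTSC colour subcarrier
    FS   = 4 * F_SC     # sample rate for this toy model

    # Alternating on/off pixels at ~2x the subcarrier form a square
    # wave AT the subcarrier frequency (two samples per pixel here).
    t = np.arange(4096) / FS
    pixels = np.floor(t * 2 * F_SC) % 2

    spectrum = np.abs(np.fft.rfft(pixels - pixels.mean()))
    freqs = np.fft.rfftfreq(len(pixels), 1 / FS)
    print(freqs[spectrum.argmax()] / 1e6)  # ~3.58: the energy sits on
                                           # the subcarrier, so a colour
                                           # decoder sees solid chroma,
                                           # not stripes

Shift the pattern by one pixel and the subcarrier phase flips, which is why
HGR gives you one colour or another depending on which column a stripe
starts in.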
> > I had an old video camera that used a striped mask of colour filters and as
> > it scanned the back of the striped mask it generated the sub-band which
> > contained the colour information.
> Sounds more like a bad generation of colour. Test patterns always include
> stripes that are very close together, and the quality of the image is (in
> part) determined by how much/little colour bleeds out between the stripes.
This is precisely because those test patterns are attempting to confuse the
colour mechanism. They use stripes at frequencies just a bit different from
the colour subcarrier to prove that the TV can correctly filter the colour
information out of the intensity information. Colour sometimes bleeds
because the TV is having difficulty with exactly this filtering.
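Roughly what that filtering looks like, as a numpy sketch of a one-line
comb filter (my toy model, not any particular set's circuitry). The NTSC
subcarrier flips phase 180 degrees between successive scan lines, so:

    import numpy as np

    def comb_separate(line_a, line_b):
        # Adjacent scan lines carry chroma in opposite phase, so the
        # sum keeps luma and the difference keeps chroma -- provided
        # the picture changes slowly from line to line, which is
        # exactly the assumption the test patterns try to break.
        luma   = (line_a + line_b) / 2.0
        chroma = (line_a - line_b) / 2.0
        return luma, chroma

In hardware that is one line of delay and a couple of adders, and when the
picture content violates the assumption you get the bleed.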
> > It was a way of hacking colour into a black and white system but the hack
> > was done when black and white TV went over to colour, Apple just followed
> > the same technique. The designers were following an existing standard.
> Er nup, NTSC was introduced at least 10 years before the Apple ][+ got
> built. Anyway I think we can safely say that when computer users are
> developing high-res graphics for NTSC output these days, they don't have
> to worry about which pixel changes to what colour depending on position.
That is exactly what I am saying!
Step *1*, black and white TV.
Step *2*, the NTSC standard is written as a hack to lay colour onto the B&W signal.
Step *3*, Apple makes a monochrome machine.
Step *4*, Apple adds colour by following the EXISTING hack known as NTSC.
These days, we keep the RGB signal as RGB for the longest possible time
and only mix it into composite video at the last moment. We also have very
high quality filters cheaply available so that splitting up composite video
back into RGB is quite easy to do.
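To make Step *2* concrete: the whole hack is just a matrix and a bit of
modulation. A simplified Python sketch using the textbook numbers (no sync,
no blanking, no colour burst):

    import numpy as np

    F_SC = 3.579545e6   # NTSC colour subcarrier (Hz)

    def ntsc_encode(r, g, b, t):
        # Textbook RGB -> YIQ; Y on its own IS the old B&W picture.
        y = 0.299 * r + 0.587 * g + 0.114 * b
        i = 0.596 * r - 0.274 * g - 0.322 * b
        q = 0.211 * r - 0.523 * g + 0.312 * b
        # Chroma rides on top of Y as quadrature modulation of the
        # subcarrier -- colour laid straight over the B&W signal.
        w = 2 * np.pi * F_SC * t
        return y + i * np.cos(w) + q * np.sin(w)

A B&W set just displays the lot and the chroma averages out as fine dither;
a colour set has to filter it all back apart.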
> Yeah, mostly on cheap sets too. A "real" TV like a Grundig or Loewe won't
> produce these artefacts, which is all they really are. The theory behind
> PAL, and NTSC too, says it "should" be possible to get any 640x480 image
> off a computer and place it on TV with the same quality. You need to
> eliminate contaminating factors like cheap TV sets, and use something
> better than a $200 VGA->TV converter, naturally. But a difficult B&W
> matrix, with lots of high-contrast stripes, should in essence be possible,
> given the $$$ ... and it's =still= NTSC or PAL in the end.
I suspect that this is POSSIBLE, but only because of the availability of
digital filters and other similarly expensive techniques. When the standard
was written, no one was thinking about computers, nor did they care one
iota about transmitting high-contrast stripes. They wanted to transmit
things like faces and scenery for the mass audience; computers were for
space ships and had nothing to do with TV sets.
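For instance, even a crude digital notch at the subcarrier is a few lines
of scipy today (the bandwidth is my guess, and real sets use fancier
adaptive comb filters, but it makes the point):

    from scipy.signal import firwin, lfilter

    F_SC = 3.579545e6
    FS   = 4 * F_SC   # sample at 4x the subcarrier

    # A sharp FIR band-stop around the subcarrier: trivial in DSP,
    # science fiction when the standard was drafted.
    taps = firwin(255, [F_SC - 0.6e6, F_SC + 0.6e6],
                  pass_zero=True, fs=FS)

    def strip_chroma(composite):
        # Knock the chroma band out of composite video, leaving
        # clean (if slightly softened) luma for the stripes.
        return lfilter(taps, 1.0, composite)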
> After all what do you think they've been up to in the TV engineering
> departments for the last 30 years, improving things? The specs haven't
> changed, but the implementation sure has :-)
I'll admit that I have never spent the money on a really top-notch TV set.
The old C= 1084 monitors had a PAL decoder in them and you could hook them
up to a VCR to get a TV; they generally looked pretty clean compared with
your average lounge-room set. The effect of digital filters in the modem
scene has been awesome, so I wouldn't be surprised if similar improvements
have occurred in TVs as well, though TVs are probably more price-sensitive
than modems.