At 4:33 AM +0000 3/17/03, olympus-digest wrote:
>Date: Sun, 16 Mar 2003 14:22:53 -0800 (PST)
>From: AG Schnozz <agschnozz@xxxxxxxxx>
>Subject: [OM] Sampling rates
>
>[snip]
>How much of my digital audio background is applicable to
>imaging? Not much. Why? Because Nyquist is not being applied
>to imaging in the same manner as it is applied to audio. With
>audio it was determined that you needed two samples to define a
>waveform. Partially true. You only need one sample to define a
>"voltage" at that moment in time. It's through multiple samples
>that you get an application of the voltage readings.
The Nyquist theorem applies to any situation in which a bandlimited signal is
replaced by a series of samples. It works equally well in 1-D (audio) and 2-D
(images).
What the Nyquist theorem says is that if one samples a bandlimited signal
densely enough, no information is lost in the act of sampling, and the original
signal can be regenerated perfectly from the samples alone. In theory, "densely
enough" means at least twice the highest frequency in the signal. (A
bandlimited signal by definition has a highest frequency.) In practice, filters
are not perfect, so one must sample more densely than that, say at least three
times the highest frequency.
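Purely as illustration (no code appeared in the original message), the folding
of an out-of-band frequency back into the sampled band can be sketched in
Python; `alias_frequency` is a made-up helper name, not anything standard:

```python
def alias_frequency(f_signal, f_sample):
    """Frequency at which f_signal appears after sampling at f_sample.
    Any input frequency folds into the band [0, f_sample / 2]."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

# A 30 kHz tone sampled at the CD rate of 44,100/s folds into the audio band:
print(alias_frequency(30_000, 44_100))  # 14100 -- appears as a 14.1 kHz tone
# The same tone sampled at 96,000/s is below Nyquist (48 kHz) and survives:
print(alias_frequency(30_000, 96_000))  # 30000 -- reproduced correctly
```

This is why the anti-alias filter must remove everything above half the sample
rate before sampling, not after.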
In audio, the 3 dB passband is something like 20 Hz to 20 kHz, so the highest
frequency is somewhat above 20 kHz. CDs sample at 44,100 samples per second,
which is cutting it close.
In imaging, the modulation transfer function (MTF) is analogous to the audio
passband, and the 50% modulation point is analogous to the 3 dB point in
audio. If the lens has 50% modulation transfer at 100 line pairs per
millimeter, that's 200 pixels per millimeter, so 400 samples per millimeter or
more will yield perfect sampling without aliases. In practice, somewhat denser
sampling will be required, as filters are not perfect.
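The arithmetic in that paragraph can be written out as a tiny sketch (the
function name and the 2x margin parameter are mine, chosen to match the
numbers above):

```python
def min_samples_per_mm(lp_per_mm, margin=2.0):
    """Line pairs per mm -> required sampling density in samples per mm.
    One line pair is one dark/light cycle, so the bare Nyquist minimum is
    2 samples per line pair; 'margin' adds the practical oversampling
    headroom discussed in the text."""
    nyquist = 2 * lp_per_mm       # samples/mm just to reach the Nyquist limit
    return nyquist * margin

print(min_samples_per_mm(100))    # 400.0 -- matches the figure in the text
```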
>With imaging, it isn't necessary to have the 2:1 sampling to
>define an object. Each pixel stands alone in its ability to
>define a quality and quantity of light striking it.
>Anti-aliasing filters of various forms must be used to
>counteract heterodyning and other oddball things that will kick
>in with a 1:1 sampling.
Depends what you mean by "define an object". The problem arises when the lens
can reproduce a pattern in the subject whose natural pitch is about the same as
the pitch of the CCD pixels. In that case there will be strong beats (color
fringes) between the pattern and the pixel grid. The same problem (moiré
fringes) happens with halftone prints if the halftone screens are not correctly
mis-aligned, that is, not rotated to the proper screen angles. The standard
solution is to make sure the optical system passes no detail finer than half
the CCD sampling pitch, so the beats cannot happen.
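The beat effect is easy to demonstrate numerically (a minimal sketch, with an
assumed sensor density of 100 pixels/mm, so Nyquist is 50 cycles/mm): a fine
95-cycle/mm subject pattern, sampled at the pixel centers, is indistinguishable
from a coarse 5-cycle/mm fringe pattern.

```python
import math

pixels_per_mm = 100  # assumed sensor sampling density; Nyquist = 50 cycles/mm

# Sample a 95-cycle/mm sinusoidal subject pattern at the pixel centers:
fine = [math.sin(2 * math.pi * 95 * i / pixels_per_mm) for i in range(8)]
# Sample a 5-cycle/mm pattern the same way:
coarse = [math.sin(2 * math.pi * 5 * i / pixels_per_mm) for i in range(8)]

# The sampled values agree up to sign: the fine pattern masquerades as
# coarse 5-cycle/mm beats (the moiré fringes described above).
for a, b in zip(fine, coarse):
    assert abs(a + b) < 1e-9
```

Nothing downstream of the sensor can tell the two apart, which is why the
blurring has to happen optically, before sampling.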
>I totally disagree with the assessment that a slightly "off"
>lens yields better digital pictures. If that was true, you
>would get a far better image when resizing a digital file if the
>original is blurry.
No, it's really true. Resizing after the fact is beside the point: Nyquist
applies only when going from the analog domain to the digital domain. Once one
is in the digital domain, resizing to a finer pitch doesn't cause beats.
Resampling to a coarser pitch can cause beats unless the image is pre-blurred,
though. Resamplers often do that filtering on the fly as part of the resampling
process, so it may not be obvious what's going on.
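A one-dimensional sketch of that last point (function names are illustrative;
real resamplers use much better low-pass filters than a neighbor average):

```python
def decimate(samples, factor=2):
    """Keep every factor-th sample. High frequencies can alias."""
    return samples[::factor]

def blur_then_decimate(samples, factor=2):
    """Average each sample with its right neighbor (a crude pre-blur /
    low-pass filter), then decimate."""
    n = len(samples)
    blurred = [(samples[i] + samples[min(i + 1, n - 1)]) / 2 for i in range(n)]
    return blurred[::factor]

# A signal oscillating at the old Nyquist rate: +1, -1, +1, -1, ...
sig = [1.0, -1.0] * 8
print(decimate(sig))            # [1.0, 1.0, ...] -- oscillation aliases to DC
print(blur_then_decimate(sig))  # [0.0, 0.0, ...] -- pre-blur removes it first
```

Naive decimation turns the fine oscillation into a bogus constant; blurring
first removes the unrepresentable frequency before the coarser sampling.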
Joe Gwinn
< This message was delivered via the Olympus Mailing List >
< For questions, mailto:owner-olympus@xxxxxxxxxxxxxxx >
< Web Page: http://Zuiko.sls.bc.ca/swright/olympuslist.html >