Dear OMers,
Here are some comments on the ongoing pixel debate.
chris@xxxxxxxxxxxxxx writes:
<< 100 lines per millimetre is 200 pixels per millimetre.
200px times 25.4mm is 5080 ppi.
cjb. >>
In this and previous threads on the number of pixels required to resolve
a given number of lines per mm, a *theoretical* minimum of 2 pixels per
line is being used. In signal processing this theoretical limit is often
termed the Nyquist limit: to sample and be able to reconstruct a sine
wave you need at least two samples per cycle. In reality you need more
samples unless you want to introduce artifacts. To sample and
reconstruct a grating of lines (a square wave) and produce sharp, high
contrast edges (not turn them into a lower contrast sine wave) you need
a whole lot more. Without going into a lot of technical detail, you can
get a feel for why you need more pixels from three hypothetical
examples:
1) The scanner sensor pixels line up exactly with the lines of our
grating test negative.
Result: we get a perfect picture of the grating.
2) The scanner pixels are offset by half a pixel from the grating test
negative lines.
Result: the grating vanishes and we get a uniform grey with no picture
of the lines at all, as each sensor pixel averages over half a black
line and half a white "line".
3) The number of pixels per mm of our sensor is just slightly more than
twice the lines per mm of the grating test negative we are trying to
image.
Result: the scanned image shows wide bands of grey (many pixels in
width) amongst clearer sections of better-imaged grating, as the sensor
pixels first line up exactly with the black and white lines and then
drift out of alignment, where they produce wide grey bars much like in
example two.
This "beating" effect is often termed aliasing: It results from too few
samples for the
frequency content (lines /mm) of the signal (image) we are trying to scan.
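To make the three cases above concrete, here is a minimal sketch in
plain Python with NumPy (the box-average pixel model and the figure of
210 pixels/mm are my own assumptions, not measurements from any real
scanner). It "scans" a 100 line/mm grating and prints how the local
contrast swings between sharp bars and grey beat bands:

import numpy as np

def scan_grating(lines_per_mm=100.0, pixels_per_mm=210.0,
                 width_mm=1.0, oversample=200, phase_mm=0.0):
    """Integrate a square-wave grating over each sensor pixel (box average)."""
    # Fine grid representing the film image: 1.0 = clear line, 0.0 = black line.
    n_fine = int(width_mm * lines_per_mm * 2 * oversample)
    x = np.linspace(0.0, width_mm, n_fine, endpoint=False)
    grating = (np.floor((x + phase_mm) * lines_per_mm * 2) % 2).astype(float)

    # Each sensor pixel averages the fine-grid samples falling inside it.
    n_pix = int(width_mm * pixels_per_mm)
    edges = np.linspace(0.0, width_mm, n_pix + 1)
    idx = np.searchsorted(edges, x, side="right") - 1
    sums = np.bincount(idx, weights=grating, minlength=n_pix)
    counts = np.bincount(idx, minlength=n_pix)
    return sums / np.maximum(counts, 1)

if __name__ == "__main__":
    # Case 3: 210 pixels/mm against a 100 line/mm grating -- just over 2x.
    pixels = scan_grating(lines_per_mm=100.0, pixels_per_mm=210.0)
    # Local peak-to-peak contrast in short windows: it swings between
    # near 1.0 (pixels aligned with the lines) and near 0.0 (grey bands).
    window = 10
    for start in range(0, len(pixels), 40):
        chunk = pixels[start:start + window]
        print(f"pixels {start:4d}-{start+window-1:4d}: "
              f"local contrast = {chunk.max() - chunk.min():.2f}")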
Thus a lower resolution camera lens might actually look better after
scanning than a better lens, because it will have smoothed the grating
image before it could be aliased by the sampling! Obvious beating may
not occur in many images, but aliasing may still degrade the image by
adding less obvious low frequency noise.
The film resolution specification of maximum lines per mm will be
quoted at some level of reduced contrast, where the edges of the test
image lines are becoming blurred into a more sinusoidal shape and the
darkest black of the line is no longer completely black. This probably
approximates something between a sine and a square wave, relieving us
somewhat of trying to image a true square wave, which would require
many more than 2 samples per line.
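For those who like to see the numbers, here is a small sketch (NumPy
again; the choice of scanning resolutions is arbitrary) of why a sharp
edged grating needs well over 2 samples per cycle: a square wave of
frequency f contains odd harmonics at 3f, 5f, 7f, ..., and any harmonic
above the Nyquist frequency (half the sampling rate) is lost or aliased
rather than contributing to sharp edges.

import numpy as np

def harmonics_kept(lines_per_mm, pixels_per_mm, n_harmonics=20):
    """Return which odd harmonics of the grating survive sampling."""
    nyquist = pixels_per_mm / 2.0            # cycles/mm the samples can represent
    odd = np.arange(1, 2 * n_harmonics, 2)   # 1, 3, 5, ...
    freqs = odd * lines_per_mm               # harmonic frequencies of the grating
    return odd[freqs < nyquist]

if __name__ == "__main__":
    for ppmm in (200, 400, 800, 1600):
        kept = harmonics_kept(100, ppmm)
        if len(kept) == 0:
            note = "fundamental sits right at Nyquist -- contrast depends on phase"
        else:
            note = f"keeps harmonics {kept.tolist()} (more = sharper edges)"
        print(f"{ppmm:5d} pixels/mm on a 100 line/mm grating: {note}")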
Depending on grain size, the film grain could well be aliased by the
pixel sampling process, so its effect could be magnified unless the
scanning optics blur it before it reaches the sensor. This is a case
where worse scanner optics might improve the image noise. Aliasing
could be why, subjectively, people seem to object to grain more in
scanned images than in film. It is also probably why digitising at high
resolution and then reducing resolution in Photoshop (which probably
averages adjacent pixels to downsample) often gives better results than
sampling at low resolution to start with.
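That last point is easy to demonstrate with a rough sketch (NumPy; the
grain is modelled as plain random noise, the 8:1 reduction factor is
arbitrary, and real resampling filters are more sophisticated than a
box average): averaging blocks of high resolution pixels down to the
final size knocks the grain noise down, while sampling at the low
resolution directly does not.

import numpy as np

rng = np.random.default_rng(0)
fine = rng.normal(0.5, 0.2, size=8192)       # "film grain" on a fine grid

# Option A: sample the fine grid directly at 1/8 the rate (no filtering).
direct = fine[::8]

# Option B: scan at full rate, then average blocks of 8 pixels down to the
# same final resolution (roughly what a resize-with-averaging does).
averaged = fine.reshape(-1, 8).mean(axis=1)

print("noise std, direct low-res sampling :", direct.std().round(3))
print("noise std, high-res scan + average :", averaged.std().round(3))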
Something else that degrades the image is that the CCD imaging array
bleeds charge between adjacent pixels, reducing the contrast between
them when they are at very different light levels. This is different
from film, where adjacency effects from depleted developer increase
edge contrast and hence apparent sharpness. (You could always tweak
this in PS to try to correct it, but the scanner effect is only in one
direction.)
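As a very rough illustration (a toy 1-D model with made-up kernel
numbers, nothing measured from a real sensor or emulsion), the two
effects pull a step edge in opposite directions: charge bleed smears
the edge and lowers contrast, while film adjacency effects overshoot
and exaggerate it.

import numpy as np

edge = np.concatenate([np.zeros(10), np.ones(10)])   # black-to-white step

bleed_kernel = np.array([0.1, 0.8, 0.1])             # CCD: charge leaks sideways
adjacency_kernel = np.array([-0.1, 1.2, -0.1])       # film: edge enhancement

ccd = np.convolve(edge, bleed_kernel, mode="same")
film = np.convolve(edge, adjacency_kernel, mode="same")

print("original edge  :", edge[8:12])
print("CCD bleed      :", ccd[8:12].round(2))        # softened edge
print("film adjacency :", film[8:12].round(2))       # over/undershoot at edge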
Whoever said digitizing pictures is easy and that choosing a scanner just
requires comparing specifications!
No wonder subjective evaluation of results is important.
Regards,
Tim Hughes
Hi100@xxxxxxx
< This message was delivered via the Olympus Mailing List >
< For questions, mailto:owner-olympus@xxxxxxxxxxxxxxx >
< Web Page: http://Zuiko.sls.bc.ca/swright/olympuslist.html >