> From: Jan Steinman <Jan@xxxxxxxxxxxxxx>
>
> There is no inherent "look" in RAW format, which is NOT what is captured
> by a digital camera anyway. By the time it gets to RAW format, it has
> already had a whole bunch of software processing, so I really cannot
> comprehend your aversion to software for modifying the "look" of a
> digital image.
I'm a fan of the following way of thinking: when you enlarge a picture taken
on film, there's a certain threshold where the grain starts showing up. If the
film is any good, this looks pleasing to the eye because the grain is
irregular. Do the same with digital and it looks unpleasant because of the
regular raster. No in-camera or post-processing can change this.
Another really bad thing is that some cameras store their pictures in (lossy)
JPEG format.
Then again, all these discussions are probably the same as CD vs. records. No,
you don't get the cracks and ticks with digital. No, the superior "quality"
(what a subjective term) cannot be proven despite all the numbers thrown at
the job.
I play CDs, and yes, they sound very good on *my* stereo to *my* ears.
> For most digicams (with Bayer filter sensors), an image starts out with
> twice as much green information as red or blue, and each pixel senses
> only primary transmissive colors. Then what you perceive as the "look"
> of the sensor is generated by software, as it synthesizes this
> asymmetric matrix into full-color pixels. This is where most color
> balance algorithms operate, thus the "look" is truly variable, even with
> RAW mode.
No, the "look" is more than just color balance.
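(Purely as an aside, and not part of Jan's argument: the demosaicing step he
describes can be sketched in a few lines. The RGGB layout and the crude
box-average interpolation below are my own assumptions for illustration;
real in-camera algorithms are far more sophisticated, which is exactly why
the resulting "look" varies between cameras.)

```python
import numpy as np

def _box_sum(a):
    """Sum over each pixel's 3x3 neighborhood (zero-padded edges)."""
    h, w = a.shape
    p = np.pad(a, 1)
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def demosaic_rggb(raw):
    """Very naive demosaic of an RGGB Bayer mosaic.

    raw: 2-D array where each photosite sensed only one primary:
        R G R G ...    (half the sites green, a quarter each
        G B G B ...     red and blue, as Jan describes)
    Returns an (h, w, 3) full-color image, filling each missing
    sample with the average of nearby sites of the same color.
    """
    h, w = raw.shape
    r_mask = np.zeros((h, w), dtype=bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), dtype=bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)
    rgb = np.empty((h, w, 3))
    for c, mask in enumerate((r_mask, g_mask, b_mask)):
        sensed = np.where(mask, raw, 0.0)
        avg = _box_sum(sensed) / _box_sum(mask.astype(float))
        # Keep the directly sensed values; interpolate the rest.
        rgb[..., c] = np.where(mask, raw, avg)
    return rgb
```

Even this toy version makes the point: the full-color image is entirely a
software reconstruction from the asymmetric sensor matrix.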
In the meantime, the quality of digital *prints* has already equalled that of
classic prints. I particularly remember the Helmut Newton exhibit in NYC. Or
(for anyone in the neighbourhood) the current Dave McKean exhibit in Leuven.
The latter is extremely impressive. Anybody who doesn't know Dave McKean
should urgently go out and buy some of the stories he's illustrated. Recently
it's been all digital prints for him, too, but I'm guessing they shoot on film
and scan it to get the resolution.
> >Film still allows more flexibility in how the
> >original image is recorded...
>
> Again, we must agree to disagree.
>
> Any given film has ONE way it reacts to light. A digicam has an infinite
> number of ways it reacts to light. You haven't convinced me otherwise.
I think the actual sensor has only one way it reacts to light. What happens
afterwards (in the camera or on a PC) is like what happens in the darkroom.
> This is not rocket science. It is not my opinion. It is "Moore's Law,"
> which has held true for 50 years of semiconductor technology advancement.
I'm not entirely convinced Moore's Law holds for CCDs. I've heard the story
that we'll all be using digicams within a few months far too many times. In
the meantime, the estimates keep getting longer (2-3 years?). Reminds me of
the debate on how quickly computers will be able to think just like human
beings.
Still, I do agree that for *practical* purposes (like my amateur travel
photography), digital will get there some day. But for my 2-month trip, I'd
hate to have to take along a pile of memory cards, or a PC. My second-hand OM
gear is expensive enough to risk having stolen, let alone lots of digital
stuff. OM gear is far smaller, too (and the user interface is better, and it
doesn't feel like goddamn plastic). :-)
Peter.
< This message was delivered via the Olympus Mailing List >
< For questions, mailto:owner-olympus@xxxxxxxxxxxxxxx >
< Web Page: http://Zuiko.sls.bc.ca/swright/olympuslist.html >