“Taking pictures is like tiptoeing into the kitchen late at night and stealing Oreo cookies.” — Diane Arbus
Today’s Post by Joe Farace
I think we can all agree that digital imaging is more convenient than film photography. Selecting a white balance, choosing a favorable ISO setting for each shot and reviewing images on the fly are just a few of the reasons. And then there’s the inevitable question: JPEG or RAW?
In case you didn’t already know, JPEG is an acronym for the Joint Photographic Experts Group, the committee, formed in 1986, that created the standard for still-image compression. The JPEG standard defines how an image is compressed and then decompressed back into a viewable photograph. Your camera discards color information the eye is least sensitive to and then compresses what remains to save the image as a JPEG file. Because this process throws away whatever it decides is redundant, JPEG is a lossy (not lousy) format. Keep in mind, however, that when the file is opened on a computer the image is reconstructed as a close approximation of the original, especially if a low compression ratio was used.
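If you’re curious what “lossy” means in practice, here is a minimal toy sketch in Python of the quantization idea at the heart of lossy compression. This is not JPEG’s actual algorithm (real JPEG quantizes frequency coefficients after a DCT, not raw pixel values); it’s only a hypothetical illustration of how rounding during compression discards data that decompression can only approximate.

```python
# Toy illustration of lossy compression: quantization throws away precision,
# and decompression can only rebuild an approximation of the original.
# (Simplified for illustration; real JPEG quantizes DCT coefficients.)

def quantize(values, step):
    """'Compress' by keeping only a coarse multiple of `step` per value."""
    return [round(v / step) for v in values]

def dequantize(quantized, step):
    """'Decompress' by rebuilding an approximation of the originals."""
    return [q * step for q in quantized]

original = [52, 55, 61, 66, 70, 61, 64, 73]  # e.g. pixel brightness values
step = 10

compressed = quantize(original, step)   # smaller numbers take fewer bits to store
rebuilt = dequantize(compressed, step)  # close to, but not exactly, the original

print(original)  # [52, 55, 61, 66, 70, 61, 64, 73]
print(rebuilt)   # [50, 60, 60, 70, 70, 60, 60, 70]
```

A low compression ratio corresponds to a small quantization step, so the rebuilt values land very close to the originals; crank the step (or the compression) up and the approximation gets visibly rougher.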
Unlike JPEG, capturing a RAW file requires little or no internal processing by the camera: all of the data from the camera’s imaging chip is saved as-is. These files contain more color information, but that data now requires processing on your computer. Image settings such as Contrast, Saturation and Sharpness are not baked into the file. Perhaps some food analogies will help explain the difference between a RAW capture and a compressed (JPEG) capture.
When should you use RAW and when should you use JPEG? There are lots of discussions on this subject here on the blog, including what Mark has written about it. Read what we both have to say and make up your own mind.
Many thanks to master photographer Barry Staver for the cooking analogies.
Along with Barry Staver, Joe is co-author of Better Available Light Digital Photography, which is currently out of print. New copies are available at collector (high) prices, while used copies go for giveaway prices, less than five bucks, from Amazon.