Visualizing an image is as fundamental to photography as cameras, lenses, and film. Even modern technology hasn't diminished the need for your eye the way it has so many other aspects of photography.
Seeing an image, however, is not a simple task. It can be aided, in some ways, by old standards like learning with a 50mm prime, but that only takes you so far. There's more to an image than just what fits in the frame. Of particular relevance here is the way light is reflected from surfaces.
That brings me to, of all things, sunglasses.
I wear sunglasses almost religiously. Fortunately, or unfortunately, as the case may be, the sunglasses I wear are polarized. As a result, the way I normally perceive the world is fundamentally different from how my camera does.
That difference was driven home, again, while I was photographing a scene at the Magic Kingdom. What was readily visible and obvious to me wasn't to my camera (at least initially), and it took a quick review of the images to realize my folly; chimping isn't always a bad thing. Worse, I find that this is becoming a more frequent experience for me. Later, while attaching my polarizer, I found myself wondering whether photographers should make it a point not to wear polarized sunglasses.
I don’t know, maybe that’s a good path to follow. Wearing only neutral-density sunglasses certainly would keep your normal perception in line with your camera's. However, polarized sunglasses can be useful tools in their own right. The motivation for putting polarizing foils in sunglasses is the same as for using a polarizing filter on a camera lens: cut glare, intensify color, and increase contrast.
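All three effects come down to one piece of textbook physics (nothing specific to any filter brand): light reflected off non-metallic surfaces like water, glass, and foliage is partially polarized, and an idealized polarizing foil transmits polarized light according to Malus's law,

$$I = I_0 \cos^2 \theta,$$

where $\theta$ is the angle between the light's polarization and the filter's transmission axis. Rotate the filter so $\theta$ approaches 90° for the glare's polarization and the reflection drops toward zero, while the rest of the (unpolarized) scene loses only about half its intensity.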
The distance scale has long been a useful tool for photographers, since it provides a good deal of information about depth of field and focusing, quickly and easily. Though it could be argued that autofocus has diminished the need for a distance scale, it continues to grace most mid- and higher-end lenses, though in its current form I have no idea why.
The manual focus distance scale is a thing of beauty. It conveys a ton of information about the state of focus and depth of field on the lens.
For example, from the distance scale shown, one can immediately see that this lens is focused at approximately 12 feet. One can quickly estimate the depth of field at any full-stop aperture. Finally, if one were to shoot at f/22, this lens would be focused at the hyperfocal distance, maximizing depth of field. Oh yes, and the red dot near the center mark? That's the IR focus point.
That’s quite a lot of info that can be gleaned from some simple markings on a lens.
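If you want to sanity-check a scale, or work one out for a lens that lacks it, the arithmetic is simple. Here's a minimal Python sketch using the standard thin-lens depth of field approximations; the 50mm focal length and the 0.03mm circle of confusion are my assumptions for illustration, not details of the lens pictured:

```python
# Depth of field arithmetic behind a manual focus distance scale.
# Standard thin-lens approximations; all distances in millimeters.
# The 0.03 mm circle of confusion is the usual 35mm-format figure.

def hyperfocal(focal_mm: float, aperture: float, coc_mm: float = 0.03) -> float:
    """Hyperfocal distance: H = f^2 / (N * c) + f."""
    return focal_mm ** 2 / (aperture * coc_mm) + focal_mm

def dof_limits(focal_mm: float, aperture: float, subject_mm: float,
               coc_mm: float = 0.03) -> tuple[float, float]:
    """Near and far limits of acceptable focus for a given subject distance."""
    h = hyperfocal(focal_mm, aperture, coc_mm)
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    if subject_mm >= h:
        return near, float("inf")  # at or past hyperfocal: sharp to infinity
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return near, far

# A hypothetical 50mm lens focused at ~12 ft (3657.6 mm) and stopped to f/22:
print(f"hyperfocal: {hyperfocal(50, 22) / 1000:.2f} m")  # ~3.84 m, about 12.6 ft
near, far = dof_limits(50, 22, 3657.6)
far_text = "infinity" if far == float("inf") else f"{far / 1000:.1f} m"
print(f"DoF: {near / 1000:.2f} m to {far_text}")  # ~1.87 m (about 6 ft) to ~77 m
```

Run with those assumed numbers, it tells the same story the scale does: focus near 12 feet at f/22 and essentially everything from about six feet out is acceptably sharp.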
I’m not a huge fan of HDR images; most of the time they look obviously overprocessed, though when they're well executed and the processing is understated they come off very nicely. I think the trick to a good HDR is to use the larger capture range simply as a mechanism to get enough data to put together a lower-noise image with slightly better shadow detail. Perhaps I'm just a traditionalist, but I think it's best to think of an HDR as a split ND filter without being forced to have a straight line for the split.
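In code terms, that philosophy is nothing fancier than a weighted average of the bracketed frames. A rough sketch, assuming the frames are already aligned and in linear (pre-gamma) space; the function and the clip threshold are my own illustration, not any particular tool's pipeline:

```python
import numpy as np

def blend_brackets(frames: list, stops: list, clip: float = 0.98) -> np.ndarray:
    """Merge aligned, linear bracketed frames into one low-noise image.

    frames: float arrays in [0, 1], all the same shape.
    stops:  each frame's EV offset from the base exposure, e.g. [-2, 0, +2].
    """
    acc = np.zeros_like(frames[0], dtype=np.float64)
    weight = np.zeros_like(acc)
    for frame, ev in zip(frames, stops):
        valid = (frame < clip).astype(np.float64)  # ignore blown highlights
        acc += valid * frame / (2.0 ** ev)         # rescale to base exposure
        weight += valid
    # Averaging n usable frames cuts shot noise by roughly sqrt(n),
    # which is where the cleaner shadows come from.
    return acc / np.maximum(weight, 1e-6)
```

Tone-map the result as gently as you like; the point is that the brackets buy you data, not a look.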
Recently, Vincent Laforet commented on RED's cameras versus the competition, suggesting that camera companies should act more like Apple and say nothing about their upcoming products. He argues that by keeping product development secret, Apple can ensure that a product is ready without having to meet announced deadlines.
I disagree; secrecy is bad for the buyer and not necessarily bad for the manufacturer to forgo. At best, secrecy makes our lives harder; at worst, it hurts our ability to make good decisions on purchases that can cost as much as a new car.