The Eyes Have It: The Marvel of Viewing Filters
Various Filters and Applications
by Ira Tiffen

Reference: StudentFilmmakers Magazine, January 2007, pages 8 and 9.

Next time you venture out into the sun, take a few seconds to notice how bright it is. Almost blinding, isn’t it? But before long, your eyes will adjust, the pupils shrinking, reducing the amount of light entering the eye, and making it all appear normal again. What, you say, you’ve done this before? Many times?

Then you are already familiar with the core of what we’ll be discussing this month…the difference between what your eye sees and what the camera records, and the need to reconcile the two if we are to obtain the images we want.

When George Eastman set up his filmmaking establishment, he knew early on that he’d need optical filters to get the most from the emulsions. He brought in Frederick Wratten to help him in his quest to make his film perform as needed. While this would make a good story for another time, it suffices here to note that even then it was important to adjust the realities of the world to the characteristics of the recording medium.

Your eyes have a unique advantage over the camera – your brain. It has been shown that the images your eye forms on its own really aren’t as clear as we’d suppose. In fact, the brain performs a significant amount of image processing to make the world around us adequately visible.

When you took in some sun, as earlier, you went from the expectedly dimmer interior to the relatively brighter exterior. Initially, your eye’s iris was open enough to provide visibility at the dimmer level; once outside, it began ‘automatically’ closing down to handle the increase in luminance. Camera shutter and lens diaphragm adjustments accomplish something similar in terms of luminance. Automatic metering, with its varied means of measuring light at different points within the scene and then using those measurements to determine an exposure setting, of course brings us even closer.
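For reference, the camera’s side of that adjustment can be put in numbers (standard exposure arithmetic, offered here as context rather than anything from the original article): exposure value is defined as

EV = log2(N^2 / t)

where N is the f-number and t is the shutter time in seconds. Each one-stop increase in EV halves the light reaching the film or sensor; stopping down from f/2.8 to f/4, or shortening the shutter from 1/60 to 1/125, is the camera’s counterpart to your pupil contracting in the sun.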

Given enough time, upwards of about twenty minutes, your eyes can adjust to a very wide range of luminance levels, allowing us to see both in the brilliance of snow-clad mountains and in the dim shadows of the forest at night, without the aid of exterior optics. Sunglasses do make it more comfortable, and visibility can be enhanced with optics, but the point is that the eyes can handle a lot of variation on their own.

The eye/brain team accomplishes a good deal more for us than luminance compensation. The brain uses visible known references to evaluate and render color. When you see a white object in different kinds of light, it still looks white. The brain compares it to the other objects in the scene and makes a relative determination of how to portray it. If everything else looks ‘un-white’ next to it, then white it shall be. It has been shown that you can peer through a color as spectrally biased as a Wratten #25 Red filter and, given enough time, still discern the colors in the scene, white included.

Cameras, on the other hand, have only recently begun developing ‘brains’ of their own to deal with color as well as luminance. Digital white balance, for instance, comes closest. Cameras also have presets to handle known color temperatures for known types of lighting. There are many ways to maximize the results of these capabilities, a substantial subject on its own. For now, we’re interested in the fact that the eye sees the world differently than the camera does, and if we are to record images of greatest value, we need not only to understand how this works, but also how to turn that understanding to our best advantage.

So what’s the biggest difference between the eye and the camera that we need to manage? When you look at a scene to evaluate its lighting, to determine whether you have lit it properly for the effect you desire, your eye almost immediately begins to compensate for what it sees. Look into the shadows and the iris enlarges; shift your gaze slightly to peer into a bright reflective area and the iris contracts accordingly. You will have a tough time trying to see the scene as the camera will, because the camera can make no such accommodations; it’s stuck with whatever you give it. Balancing the luminance levels of both highlights and shadows to provide proper levels of detail (or no detail, if that’s what you want) within each is therefore hard to gauge by eye.

Sure, experience helps, but why not use a simple tool to give you the edge? That tool is the viewing filter. Available in several versions for various applications, viewing filters all have one key characteristic in common – they alter the perception of the eye to more closely match the less-adaptive characteristics of the recording medium. They optically remove the ‘advantages’ of the eye/brain team to make your eye see more like the camera.

The primary use of viewing filters is to evaluate lighting contrast within the scene. You want to know whether there is the right amount of light to show what you want seen and obscure what you want hidden. These filters are best used by looking at the scene for only a few seconds at a time, before your eye begins to adjust, noting what’s visible and what’s not. The trick is to use the filter often enough that, by comparing what you see through it with what you actually record, you learn to interpret it properly. So experience is still a factor, but the filter makes it easier and the results are more consistent.

Lighting contrast viewers are generally smallish filters, just large enough to fit into an eyeshade-style mount that covers one eye; you close the other eye in use. The choice of filter, then, adapts the eye to the particular type of evaluation at hand.

The initial use for viewing filters was black-and-white imaging. The filter developed for this is a deep amber color. Its density makes it hard for the eye to see too far into shadows, a common trait among lighting contrast viewers. Its color reduces the world to a monochromatic rendition that simulates the gray-scale rendering of black-and-white recording media.

There is more than one type of color viewer. The darker versions are for slower EI or ISO-equivalent ‘speeds’, the measure of how much light a recording medium needs to render a proper ‘normal’ exposure. The greater the camera’s ability to record in dim light, thanks to a higher EI or ISO-equivalent speed, the greater the need to let the eye see into darker areas, and so the filter becomes necessarily lighter.
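The arithmetic behind those speeds is simple (standard sensitometry, noted here as context): each doubling of the EI or ISO number is one stop, meaning the medium needs half as much light for the same ‘normal’ exposure. In other words,

stops = log2(EI_new / EI_old)

so a move from EI 100 to EI 400 is two stops, or one-quarter the light.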

For many years, the standard color viewer was an ND 2.0, meaning that it was a neutral density filter, essentially a dark gray, that transmitted only about 1% of the overall light. This worked fine for the slower-speed media. Years ago, people at Kodak approached me with a request to produce a new filter for the faster speeds becoming available. Their testing had shown that a lighter filter, one that transmitted about 4% of the light, seemed to work best. That collaboration yielded the first lighting contrast viewer for what were to become the now-ubiquitous faster speeds.
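Those percentages follow from the standard relationship between density and transmission (a general optical formula, not a spec from any one maker):

T = 10^(-d)

An ND 2.0 therefore transmits 10^(-2.0) = 0.01, or 1%, while a filter passing about 4% works out to d = -log10(0.04), roughly ND 1.4. The fourfold difference between 1% and 4% is two stops, consistent with film stocks a couple of stops faster than the older ones.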

Viewers also come in other colors to accommodate, for instance, process imaging. Green-, blue-, and even red-screen work involves determining both that the lighting is adequate for the effect and that spill is not going to be problematic. Primary green, blue, and red filters are used for these purposes.

Another application is to use a contrasting color filter to inspect a rear-illuminated screen for holes. A red filter will readily highlight pinholes in a blue or green screen: the hole will appear bright red while the rest of the screen will be almost black.

The blue filter has an additional use in adjusting video monitors using color bars. Viewed through the blue filter, the bars become monochromatic; adjust the monitor until the darker and lighter bars render with properly even brightness.

Finally, one additional type of filter, the ‘gaffer glass,’ provides sufficient protection from the damaging rays of the sun to let you judge the timing of cloud cover more effectively. This filter can also be used to check the condition and position of a filament in a light, whether to determine proper placement when installing a new bulb or to see whether a replacement is imminently needed.

The manufacturers that are the prime sources of these filters provide important safety information regarding their use. Please read, understand, and apply this information accordingly.

The offerings from each maker will vary, but the intention is the same – to give your eye the ability to see more like the camera does, and to use that ability to get the lighting right. Succeed in this, and you’ll agree that the eyes really do ‘have it.’

In over 30 years of making optical filters, Ira Tiffen created the Pro-Mist, Soft/FX, Ultra Contrast, GlimmerGlass, and others, earning him both a Technical Achievement Award from the Academy of Motion Picture Arts and Sciences and a Primetime Emmy Award. Elected a Fellow of the SMPTE in 2002, he is also an Associate member of the ASC and the author of the filter section of the “American Cinematographer Manual.”
