This section provides a quick look at how a digital camera works. You don't really need to understand quantum physics to operate a digital camera, but a basic comprehension of what's going on inside your picture box can help you troubleshoot vexing photographic challenges later on and, if you're the typical serious photographer, satisfy your curiosity. Although this explanation is greatly simplified, you'll find the concepts apply to virtually all the digital cameras on the market today.
Capturing the Image
The birth of a digital picture begins when illumination from a light source bounces off a subject (or is transmitted through a backlit translucent subject like a stained glass window). Each portion of the subject absorbs some of the wavelengths of light while allowing others to find their way to your camera's lens ([1] on the diagram shown in Figure 2.7). In the illustration, I show only two big fat "beams" of rainbow-colored light passing through the front of the lens, when in truth there are zillions, all composed of photons (light particles) that behave as if they were waves. (Wave/particle duality is one of those quantum physics puzzles we're going to steer clear of!)
Figure 2.7. A lot goes on inside your digital camera when a picture is taken, as represented by this diagram.
The light strikes the lens' glass elements, represented by [2] in the diagram. I show just a single convex element in the illustration, but in real life lenses consist of anywhere from 4 to 20 or more elements of varying shapes, which move in unison or individually, depending on how the lens is focused or zoomed. To make things even more interesting, lens elements can shift to compensate for camera shake, countering the photographer's tendency toward unsteadiness at slow shutter speeds!
Fixed-focus, non-zooming lenses are the simplest: They are designed to focus an image on the sensor in a single way, so no provision has to be made for moving the elements. As the lens' functions become more complex, additional elements are required to correct the image at particular magnifications or focus positions. The goal in all cases is to direct the light beams ([3] in the diagram) to a sharply focused position on the camera's sensor, marked with a [4] in the illustration.
The sensor serves as the camera's "film" and, like film, contains substances that are sensitive to light. Most digital cameras today use either CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) sensors. We'll look at sensor types in a little more detail shortly. For now, all you need to know is that a sensor is an array (a layout of rows and columns) of very tiny light-sensitive diodes. Electrons are created when enough photons strike one of the diodes. The greater the number of photons that reach a single photosite, the more electrons that accumulate and the brighter that pixel becomes in your final image.
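If you think of each photosite as a tiny bucket collecting light, the idea can be sketched in a few lines of Python. Everything here is illustrative: the FULL_WELL capacity and QUANTUM_EFFICIENCY figures are made-up stand-ins, not specifications of any real sensor.

```python
import random

random.seed(42)             # deterministic for the example

FULL_WELL = 40_000          # hypothetical photosite capacity, in electrons
QUANTUM_EFFICIENCY = 0.5    # assumed chance a photon frees an electron

def photosite_charge(photons_arriving):
    """Return the electron count accumulated at one photosite."""
    # Each arriving photon has some chance of freeing one electron;
    # more light means more electrons, until the "well" fills up.
    electrons = sum(1 for _ in range(photons_arriving)
                    if random.random() < QUANTUM_EFFICIENCY)
    return min(electrons, FULL_WELL)

# A brightly lit photosite accumulates far more charge than a dim one,
# so its pixel renders brighter in the final image.
bright = photosite_charge(20_000)
dim = photosite_charge(500)
assert bright > dim
```

The clipping at FULL_WELL mirrors what happens when a highlight "blows out": once the well is full, extra photons can't make the pixel any brighter.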
The minimum number of photons required to register an image determines the "sensitivity" of the sensor; very sensitive sensors require fewer photons and thus can record an image with less light. When you crank up the ISO setting of your digital camera (say, from ISO 100 to ISO 800), you're effectively changing this threshold and telling the sensor to require fewer photons before recording an image for a particular photosite or pixel. That's why you get a grainy "noise" effect at higher ISOs: The sensor may record electrical interference or other non-picture information as a pixel at these higher sensitivities. In general, the larger the sensor, the less noise produced.
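A toy signal-to-noise calculation shows why the grain appears. The numbers here (photon counts, read noise) are invented for illustration; the point is only that working from fewer captured photons means noise makes up a larger share of each pixel's recorded value.

```python
import math

def snr_at_iso(iso, base_iso=100, full_exposure_photons=10_000,
               read_noise=30):
    """Toy signal-to-noise model: raising the ISO means the camera
    works from fewer captured photons, so random photon (shot) noise
    and the sensor's electrical read noise loom larger."""
    photons = full_exposure_photons * base_iso / iso  # fewer at high ISO
    shot_noise = math.sqrt(photons)                   # photon arrival is random
    total_noise = math.sqrt(shot_noise**2 + read_noise**2)
    return photons / total_noise

# The signal-to-noise ratio drops as ISO climbs -- the "grain"
# you see at ISO 800 that was invisible at ISO 100.
assert snr_at_iso(100) > snr_at_iso(800)
```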
Traditional CMOS chips are inherently less sensitive to light, and so more susceptible to noise. Yet, they require up to 100 times less power to operate (which translates into longer battery life) and are much cheaper to produce than CCD chips, which is why they have become popular in digital cameras (and scanners). CMOS sensors have become sophisticated enough that they're seeing use in more advanced cameras (costing $1,000 or more) and, impressively, boast some of the best low-noise characteristics among digital cameras.
With a CCD sensor, the electrical charge is transported to a corner of the pixel array and converted from an analog signal to a digital value. CMOS chips include transistors at each position in the array to amplify the signal and conduct it to the analog-to-digital converter through tiny wires. Even though CCD chips predominated a few years ago, CMOS technology is improving all the time, and is now found in the most sophisticated digital cameras, including 12- and 16-megapixel models from Nikon and Canon. They're also used in the simplest devices, including camera phones, web cameras, and toy cameras.
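The two readout styles can be caricatured in a few lines of Python. This is a conceptual sketch only, not how any real chip is driven: the "converter" is a stand-in for the analog-to-digital converter, and the per-pixel gain figure is arbitrary.

```python
def convert(charge):
    """Stand-in analog-to-digital converter."""
    return int(charge)

def ccd_readout(array):
    """CCD-style readout: charge is marched row by row toward one
    corner of the array, where a single converter digitizes every
    pixel in turn."""
    return [convert(charge) for row in array for charge in row]

def cmos_readout(array, per_pixel_gain=2.0):
    """CMOS-style readout: a transistor at each photosite amplifies
    its own charge before it travels to the converter over tiny wires."""
    return [convert(charge * per_pixel_gain)
            for row in array for charge in row]
```

The practical difference the sketch hints at: the CCD funnels everything through one output stage, while the CMOS chip does amplification locally at each photosite, which is part of why CMOS designs can run on so much less power.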
Viewing the Image
Once the light from your subject reaches the sensor, a lot of things start happening. The most important event is your opportunity to view or preview the image, either through a color LCD display panel on the back of the camera or through a viewfinder ([5] in the diagram). The electronic nature of a digital camera provides many viewing options. You may use one or more of the following viewing choices, depending on your camera model.
View on the LCD display. These viewing panels, which operate like miniature laptop display screens, show virtually the exact image seen by the sensor. The LCDs measure roughly 1.5 to 2.5 inches diagonally (although some 3.5-inch LCDs should find their way into cameras in the near future). They generally display 98 percent or more of the picture view seen by the lens. An LCD may be difficult to view in direct light, and some sort of backlighting scheme is generally used to make it as bright as possible under such illumination. LCDs can also be difficult to see when shooting dim subjects if the camera doesn't amplify the signal to provide a bright view.
View through an optical viewfinder. Many digital cameras have a glass direct-view system called an optical viewfinder that you can use to frame your photo. Optical viewfinders can be simple window-like devices (with low-end, fixed-magnification digital cameras) or more sophisticated systems that zoom in and out to roughly match the view that the sensor sees. The advantage of the optical viewfinder is that you can see the subject at all times (with other systems the view may be blanked out during the exposure). Optical systems may be brighter than electronic viewing, too. A big disadvantage is that an optical viewfinder does not see exactly what the sensor does, so you may end up cutting off someone's head or otherwise do some unintentional trimming of your subject.
View through an electronic viewfinder (EVF). The EVF operates like a little television screen inside the digital camera. You can view an image that closely corresponds to what the sensor sees but is easier to view than the LCD display. The EVF goes blank during exposures, however, and it may have problems displaying images in low light, or produce ghost images when subjects move.
View an optical image through the camera lens (with single-lens reflex models). Another kind of optical viewfinder is the through-the-lens viewing provided by the SLR camera. With such cameras, an additional component (not shown in the diagram) reflects light from the taking lens up through an optical system for direct viewing. Some kinds of cameras use a mirror system. The mirror reflects virtually all the light up to the viewfinder. Then, the mirror swings out of the way during an exposure to allow the light to reach the sensor instead. Sometimes, a beamsplitting device is used instead. A beamsplitter does what you expect: It splits the beam of light, reflecting part to the viewfinder and allowing the rest of the light to strike the sensor.
As you might guess, because a beamsplitter steals some of the illumination for the viewfinder, neither the sensor nor the viewfinder receives the full intensity of the light. However, the system does mean that the image needn't blank out during exposure.
Taking the Picture
When you press the shutter release button ([6] in the diagram), the camera takes the photo. Some cameras have actual mechanical shutters that open and close for a specific period of time (representing the shutter speed), while others perform the same function electronically. Electronic shutters actually "dump" the image from the sensor just prior to the exposure, then make the sensor active again just for the interval when the picture is taken, providing a very good simulation of how a mechanical shutter works.
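That dump-then-integrate cycle is easy to model. In this hypothetical sketch, the photosite names and light rates are invented; the shape of the cycle is what matters: clear the stale charge, then accumulate light only while the "shutter" is open.

```python
def electronic_shutter(scene, shutter_ms):
    """Sketch of an electronic shutter cycle. scene maps each
    photosite to the light arriving per millisecond."""
    charges = {site: 0 for site in scene}    # "dump" any stale charge
    for _ in range(shutter_ms):              # sensor active only while "open"
        for site, rate in scene.items():
            charges[site] += rate
    return charges

# Doubling the shutter interval doubles each photosite's charge,
# just as a mechanical shutter held open twice as long would.
scene = {"sky": 40, "shadow": 5}
assert (electronic_shutter(scene, 10)["sky"]
        == 2 * electronic_shutter(scene, 5)["sky"])
```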
If you partially depress the shutter release before pushing it down all the way, most cameras carry out a few last-second tasks. Your automatic exposure and focus are locked in. If you like, you can reframe your image slightly and the camera will keep the same exposure and focus settings. With autofocus, the focus is adjusted by maximizing the contrast of the main subject, or by more sophisticated means. For example, Sony pioneered an autofocus system using a Class 1 laser that projects a grid of light on your subject. The camera then analyzes the contrast between the subject and the laser pattern. This system is particularly good at low light levels, where the subject's contrast under the existing illumination may not be enough for easy focusing. Other cameras use a simple LED lamp as a focus-assist light.
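Contrast-based focusing boils down to a simple search: try a series of focus positions and keep the one where the measured contrast of the subject peaks. The toy "scene" below is invented for illustration; a real camera measures contrast from the sensor itself.

```python
def autofocus_by_contrast(measure_contrast, focus_positions):
    """Contrast-detect autofocus in miniature: step the lens through
    focus positions and keep the one with the highest contrast."""
    return max(focus_positions, key=measure_contrast)

# A toy scene whose measured contrast peaks when the (arbitrary)
# focus position is 7 -- that's where the subject is sharpest.
toy_contrast = lambda position: -(position - 7) ** 2
assert autofocus_by_contrast(toy_contrast, range(15)) == 7
```

This also hints at why such systems struggle in dim light: if the scene's contrast curve is nearly flat, there's no clear peak to find, which is where the laser grid or LED focus-assist lamp comes in.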
If the existing illumination is not sufficient, the electronic flash ([7] in the illustration) may fire. Most cameras interpret the amount of electronic flash light bouncing back from the subject to calculate the correct exposure. Some use a preflash a moment before the main flash to calculate exposure. The preflash also may cause the irises of living subjects to contract slightly, reducing the chance of red eye. The best systems elevate the on-board flash as high as possible above the lens, as shown in the dSLR example in Figure 2.8, which produces more natural lighting and further reduces red-eye effects.
Figure 2.8. The higher the flash and farther from the lens, the less likely red eye will crop up.
The shutter speed has no effect on the exposure from the electronic flash in most cases, because the electronic flash's duration (1/1,000th to 1/50,000th second or less) is much briefer than the typical shutter speed. While lens openings can adjust exposure within limits, most electronic flash units provide additional exposure flexibility by emitting varying amounts of light: less for subjects at closer distances, more for those farther away.
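The inverse-square falloff behind that behavior is simple arithmetic. In this hypothetical model (the 5-meter full-power range is an invented figure), a subject at half the range needs only a quarter of the light, so an automatic flash can quench its burst that much sooner:

```python
def flash_output_fraction(subject_distance_m, full_power_distance_m=5.0):
    """Toy automatic-flash model: light reaching the subject falls
    off with the square of distance, so closer subjects need a
    smaller fraction of the flash's full output."""
    fraction = (subject_distance_m / full_power_distance_m) ** 2
    return min(fraction, 1.0)   # beyond range, all the flash can do is 100%

assert flash_output_fraction(2.5) == 0.25   # half the distance, 1/4 the light
```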
The electrical signals from the sensor, once converted to digital form within the electronics of the camera, are stored on digital media, such as CompactFlash (CF) or Secure Digital (SD) cards, or some other media, such as the Sony Memory Stick, xD card, or Hitachi MicroDrive mini hard disk. The time needed to store an image can range from a couple of seconds to 30 seconds or longer, depending on the size of the image, the compression method and ratio you've selected, and the "speed" of the storage media. (Some memory cards take significantly longer to store an image than others.) I've located the electronics and storage at [8] on the diagram, but the actual position can vary widely by digital camera vendor and model. The most popular position seems to be the right side of the camera, or in a compartment on the bottom.
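You can estimate that storage time with back-of-the-envelope arithmetic. All the figures below (3 bytes per pixel, a 4:1 compression ratio, the card write speeds) are illustrative assumptions, not specifications of any particular camera or memory card.

```python
def storage_seconds(megapixels, bytes_per_pixel=3,
                    compression_ratio=4, card_mb_per_s=2):
    """Rough storage-time estimate: the raw image shrinks by the
    chosen compression ratio, then moves to the card at the card's
    sustained write speed."""
    raw_mb = megapixels * bytes_per_pixel       # uncompressed size, in MB
    file_mb = raw_mb / compression_ratio        # size after compression
    return file_mb / card_mb_per_s              # seconds to write to the card

# The same 6-megapixel shot: a slow card takes a couple of seconds,
# while a faster card cuts that considerably.
slow = storage_seconds(6, card_mb_per_s=2)
fast = storage_seconds(6, card_mb_per_s=10)
assert fast < slow
```

The same arithmetic shows why shooting uncompressed (compression_ratio=1) or choosing a gentler compression ratio stretches the wait between shots.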