A Simple Model for Sharpness in Digital Cameras – Polychromatic Light

We now know how to calculate the two-dimensional Modulation Transfer Function of a perfect lens affected by diffraction, defocus and third-order spherical aberration under monochromatic light at a given wavelength and f-number.  In digital photography, however, we almost never deal with light of a single wavelength.  So what effect does an illuminant with a wide spectral power distribution, passing through the color filter array (CFA) of a typical digital camera before reaching the sensor, have on the spatial frequency responses discussed so far?

Monochrome vs Polychromatic Light

Not much, it turns out.  Because we are assuming that the system is linear, we can use superposition to simulate such a situation: calculate the monochromatic response of the system at every wavelength in the range of interest with the equations we have used so far, weight each result by the relative intensity of the illuminant and the spectral sensitivity of the CFA at that wavelength, and add them all up to obtain the polychromatic response.
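As a minimal sketch of that superposition in Python/NumPy (the function name polychromatic_mtf and the spd, cfa and mono_mtf arrays are hypothetical placeholders, not code from this series):

```python
import numpy as np

# Wavelength grid matching the data discussed below: 400-720nm in 10nm steps.
wavelengths_um = np.linspace(0.400, 0.720, 33)

def polychromatic_mtf(mono_mtf, spd, cfa):
    """Weighted superposition of monochromatic MTF curves.

    mono_mtf : (n_wavelengths, n_frequencies) array, each row a monochromatic
               MTF computed with the diffraction/defocus/SA equations used so far
    spd      : (n_wavelengths,) relative intensity of the illuminant (e.g. D50)
    cfa      : (n_wavelengths,) spectral sensitivity of the green CFA channel
    """
    weights = spd * cfa                # weight = illuminant x CFA sensitivity
    weights = weights / weights.sum()  # normalize so the weights sum to one
    return weights @ mono_mtf          # weighted sum over the wavelengths
```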

Below you can see the spectral sensitivity of the average green CFA channel from a selection of semi-current digital cameras, measured by Jun Jiang et al. at the Rochester Institute of Technology[1] at 10nm intervals for wavelengths between 400 and 720nm.

[Figure: weights (relative intensity of illuminant D50 and average green CFA spectral sensitivity)]

The blue line represents the relative intensity of ‘Daylight’ illuminant D50.  The weights were computed every 10nm as the product of the relative intensity of that illuminant and the spectral sensitivity of the green color filter.  The following figure shows the MTF of a circular aperture with f-number equal to 5.6: first assuming monochromatic light (the solid line, what we’ve been calculating so far in this series); and then with a polychromatic illuminant after it has gone through the green CFA (the dotted line).

[Figure: effect of polychromatic light on diffraction MTF at f/5.6]
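A sketch of how the dotted curve above could be generated, using the standard diffraction MTF of an aberration-free circular aperture; the equal weights here are only a stand-in for the actual normalized D50 x green-CFA weights shown earlier:

```python
import numpy as np

def diffraction_mtf(freq_cyc_mm, wavelength_um, f_number):
    """Diffraction MTF of an aberration-free circular aperture in incoherent light."""
    cutoff = 1.0 / (wavelength_um * 1e-3 * f_number)   # extinction frequency, cycles/mm
    s = np.clip(freq_cyc_mm / cutoff, 0.0, 1.0)        # normalized frequency, zero MTF beyond cutoff
    return (2.0 / np.pi) * (np.arccos(s) - s * np.sqrt(1.0 - s**2))

freq = np.linspace(0.0, 500.0, 1001)            # spatial frequencies, cycles/mm
wavelengths_um = np.linspace(0.400, 0.720, 33)  # 400-720nm in 10nm steps

# Placeholder: equal weight per wavelength; the real weights are the normalized
# product of the D50 SPD and the measured green CFA sensitivity.
weights = np.full(wavelengths_um.size, 1.0 / wavelengths_um.size)

mono = diffraction_mtf(freq, 0.535, 5.6)                       # solid line
poly = sum(w * diffraction_mtf(freq, wl, 5.6)                  # dotted line
           for w, wl in zip(weights, wavelengths_um))
```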

As you can see there is very little difference.  The reason is that 0.535um is very close to the center of gravity of the filtered polychromatic light wavelengths (that’s why I chose it for these discussions :-).  Because of superposition, MTF curves resulting from wavelengths longer than the weighted mean average out with those from shorter wavelengths.  Note however that diffraction extinction is no longer abrupt: it is more spread out, as expected given the multiple wavelengths present.  Beyond its own extinction frequency a wavelength’s MTF is zero, so it can no longer counteract the still non-zero contribution of shorter wavelengths; as a result the cut-off spreads out.
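To put rough numbers on that spread: the diffraction extinction frequency is 1/(lambda*N), so at f/5.6 it moves from about 446 cycles/mm at 0.40um down to about 248 cycles/mm at 0.72um, with 0.535um cutting off near 334 cycles/mm. A quick check:

```python
import numpy as np

f_number = 5.6
wavelengths_um = np.array([0.400, 0.535, 0.720])
cutoff_cyc_mm = 1.0 / (wavelengths_um * 1e-3 * f_number)  # extinction = 1/(lambda*N)
print(np.round(cutoff_cyc_mm))  # -> [446. 334. 248.] cycles/mm
```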

Next is the model with a mixture of aberrations:

[Figure: effect of polychromatic light on the MTF with a mixture of aberrations]

and below with a variety of f-numbers (ignore the legend there): the solid lines represent results with monochromatic light at 0.535um and the dots the polychromatic version, both with the same aberrations.

[Figure: effect of polychromatic light on the MTF at a variety of f-numbers]

Mono Model Works for Polychromatic Light

Therefore it seems that we can use the simple monochromatic model discussed so far to obtain results valid for polychromatic light, as long as we choose a reference wavelength that is representative of the illuminant and filters used.  In that case the main difference is a slight extension and a softer landing for the diffraction cut-off.  Alan Robinson suggests that ‘The point spread function at large defocus has significant fine structure, and this gets smoothed out too, though not as much as I initially expected’.  Thanks Alan.
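One simple way to pick such a reference wavelength is the weighted mean (the center of gravity mentioned above) of the wavelengths, using the same illuminant-times-CFA weights as before; with the actual D50 and green-channel data that mean lands close to the 0.535um used throughout this series.  A sketch, with placeholder uniform weights:

```python
import numpy as np

wavelengths_um = np.linspace(0.400, 0.720, 33)
# Placeholder weights; in practice use the normalized D50 x green-CFA product,
# which puts the weighted mean near 0.535um as noted in the text above.
weights = np.full(wavelengths_um.size, 1.0 / wavelengths_um.size)

reference_um = np.sum(weights * wavelengths_um)  # weighted mean wavelength
print(f"reference wavelength ~ {reference_um:.3f} um")
```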

Next: using the simple model to glean information about our cameras and lenses.

Notes and References

1. Jun Jiang et al., Space of Spectral Sensitivity Functions for Digital Color Cameras database, Rochester Institute of Technology, 2013
