Greg Beetham wrote: To me it just shows what we already know, that f/22 is soft compared to f/11, etc. I guess another thing is, choosing something that is only two-dimensional doesn't give f/22 a chance to show what it actually has some use for. Probably not so much today with digital, but back in the film era, when you couldn't do any stacking, it had some use.
f/22 finds a lot of use with people who need fill flash for casual shooting outdoors in bright sun. With nothing more than on-board flash and no HSS, f/22 becomes the only option in some cases. And no, I'm not one of those people.
The progressive nature of the softening as the aperture is stopped down is exactly the property that tells you it is not coming from diffraction.
Diffraction effectively creates a low-pass filter with very peculiar properties. First off, diffraction is non-dissipative, which means the energy of the suppressed spatial frequencies is not removed from the image; it is just relocated to the lower spatial frequencies. The frequency that gets most of the boost is exactly half of the diffraction cutoff frequency. That would mean that at f/22 a 16 Mp image would get its 6-pixel features emphasised while all features smaller than about 4 pixels are totally obliterated. It would look exactly like running a USM at a 6 px radius at 200%, plus a 100% low-pass at a 3 px radius. It should actually look sharpened in a downsampled image and would also show strong halos.
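To put rough numbers on that example: the cutoff for a circular aperture is 1/(lambda*N) line pairs per mm. The 550 nm wavelength and the ~4.8 um pixel pitch (a typical 16 Mp APS-C sensor) below are my assumptions, so treat this as a back-of-the-envelope sketch, not exact figures:

```python
# Back-of-the-envelope diffraction numbers for the f/22, 16 Mp example.
# Assumed: green light (550 nm) and a 4.8 um pixel pitch (16 Mp APS-C).
wavelength_mm = 550e-6   # 550 nm expressed in mm
f_number = 22
pixel_pitch_mm = 4.8e-3  # ~4.8 um expressed in mm

# Spatial frequencies above 1/(lambda * N) carry zero contrast.
cutoff_lp_per_mm = 1 / (wavelength_mm * f_number)

# Smallest feature period that survives, in pixels:
cutoff_period_px = (wavelength_mm * f_number) / pixel_pitch_mm

# The relocated energy peaks near half the cutoff frequency,
# i.e. at features twice the cutoff period:
boosted_period_px = 2 * cutoff_period_px

print(f"cutoff: {cutoff_lp_per_mm:.1f} lp/mm")
print(f"smallest surviving feature: ~{cutoff_period_px:.1f} px")
print(f"most-boosted feature size: ~{boosted_period_px:.1f} px")
```

With these assumed numbers the cutoff comes out around 83 lp/mm, the smallest surviving features around 2.5 px, and the boosted features around 5 px, i.e. in the same ballpark as the 4 px / 6 px figures above.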
So the result would be very far from the smooth blurring you observe with your SLR lenses, but it is exactly what you may see in any simple optical microscope.
The softening we see with our lenses progressively smooths out all the higher frequencies in the image, even those well below the half-cutoff, and it also decreases the contrast, both local and global. While the global contrast loss is often caused by diffuse reflections off the edges of the iris blades, the local contrast loss comes from simple haze in the non-perfect optical media used in the lens.
Regular lens softening looks exactly like the result of a simple dissipative filter, much like the effect a simple RC filter circuit has on an electric signal. Microscopic imperfections and refractive-index variations in the optical medium are what create that dissipative blurring. But! Exactly those things also let some image detail from well past the diffraction limit still appear in the captured image, although at deeply attenuated energies. It happens because the original wavefront loses its self-coherence when interacting with those imperfections/variations: different paths through the dissipative optical medium introduce random phase shifts into adjacent parts of the wavefront, so the different parts of the original wavefront lose the ability to create interference patterns at the sensor, which is exactly the diffraction we would otherwise expect there.
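The RC-filter analogy is easy to demonstrate: a one-pole low-pass applied to a hard edge ramps it up smoothly and monotonically, with no overshoot or halos, unlike the brick-wall, ringing behaviour of a pure diffraction cutoff. A minimal sketch (the alpha value is arbitrary, just for illustration):

```python
# One-pole RC-style low-pass filter (dissipative: attenuates high
# frequencies, produces no ringing), applied to a 1-D step edge.
def rc_lowpass(signal, alpha=0.3):
    out = []
    y = signal[0]
    for x in signal:
        y = y + alpha * (x - y)  # y[n] = y[n-1] + alpha * (x[n] - y[n-1])
        out.append(y)
    return out

edge = [0.0] * 10 + [1.0] * 10  # a hard edge in a 1-D "image" row
blurred = rc_lowpass(edge)

# The blurred edge rises smoothly and never overshoots 1.0 -- the
# smooth, halo-free softening described above, as opposed to the
# USM-like halos a diffraction-style filter would produce.
```

This is the qualitative difference the argument rests on: a dissipative filter only ever loses energy at each frequency, whereas diffraction relocates it, which is what produces the halo-like emphasis.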
Thus it's exactly the non-perfect nature of our lenses that prevents diffraction from manifesting in our cameras, and exactly the imperfection of our lenses that lets us see detail past the diffraction limit. It's just up to the lens manufacturer to decide how much haze provides the optimum balance between haze and diffraction suppression, and then to charge us more for the better end result.