I’ve noticed that the majority of my photos have some form of CA, from minor fringes at mid-range focal lengths to major shadowing at full zoom.
I currently use a bridge camera, but I’m curious what causes these artifacts to appear.
Is it a camera’s sensor?
The lenses?
Or is it a case of post-processing mistakes, caused by a lack of understanding of how different modules affect one another?
I got to thinking about this while looking at other photos taken with the same camera that I own, and then looking at photos taken with the camera I wish to get.
In each instance I could still see CA present in many images, but not all of them.
Most of the CA I observed in photos from the camera I own appeared in the same style of shots, at the same focal length, but not at the same ISO.
So, in an effort to understand what exactly chromatic aberration is, I’ve come to ask you fine folks.
But the good news is that DT can handle this problem really well. I can’t speak for RT, but I’m sure it handles it well too. The most common CA problem in my images occurs where there is a transition from light to dark, such as the edge of a building, leaves, or a mountain range against a bright overcast sky. DT’s raw CA module is usually sufficient to handle this.
I think most aberration is due to the divergence between idealized custom lens designs and designs that use more widely available universal parts. Translating an image without aberration is an exacting science requiring great precision. Highly precise glass elements can be manufactured for specific lens designs at great expense. To save money, designers can replace the precision elements with near-enough common catalog parts, and with every substitution the aberrations increase.
To add to that, since the proliferation of mirrorless cameras, camera manufacturers have been dealing with this problem software-wise rather than hardware-wise.
This is because we no longer see the image created by the lens directly through the optical viewfinder.
Chromatic aberration and geometric distortion are actually two different optics problems with a single software solution. Geometric distortion is where the lens doesn’t splay the image on the sensor with the same dimensions as a planar view of the scene; it is radially symmetric about the optical center. Chromatic aberration is where the refraction of the various wavelengths isn’t the same, so each color channel lands at a slightly different magnification; that misalignment is also radially symmetric about the optical center. Accordingly, correction of both is usually incorporated in the same operation, where pixels are moved to correct the alignment: all three channels are moved the same amount for distortion, and each channel is moved slightly differently to align for the chromatic offset. In Adobe’s corrections, the polynomial terms are specified per channel, and the polynomial curves capture both corrections.
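To make that concrete, here’s a rough sketch of that kind of per-channel radial remap in Python, under a simplified model: one shared radial polynomial for distortion, plus a tiny extra scale on R and B for the chromatic offset. All the coefficients (k1, k2, s_r, s_b) are invented for illustration, not taken from any real lens profile:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def remap_channel(channel, k1, k2, s=1.0):
    """Resample one channel radially about the optical center.

    Source radius: r_src = s * (1 + k1*r^2 + k2*r^4) * r,
    with r normalized so the frame corner is roughly 1.
    """
    h, w = channel.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0            # assume center of frame
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
    dy, dx = yy - cy, xx - cx
    r2 = (dx * dx + dy * dy) / (cx * cx + cy * cy)   # normalized radius squared
    scale = s * (1.0 + k1 * r2 + k2 * r2 * r2)       # radial polynomial
    # Inverse mapping: for each output pixel, sample the distorted source.
    coords = np.array([cy + dy * scale, cx + dx * scale])
    return map_coordinates(channel, coords, order=1, mode="nearest")

def correct(img, k1=0.02, k2=-0.005, s_r=1.0004, s_b=0.9996):
    """Same polynomial for all channels (distortion); a tiny per-channel
    scale difference on R and B lines them up with G (transverse CA)."""
    r = remap_channel(img[..., 0], k1, k2, s_r)
    g = remap_channel(img[..., 1], k1, k2)
    b = remap_channel(img[..., 2], k1, k2, s_b)
    return np.stack([r, g, b], axis=-1)
```

Real profiles are more elaborate (Adobe’s model puts the polynomial terms per channel, as noted above), but the mechanics are the same: it’s pure pixel shuffling, which is why it can be deferred to software.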
Edit: For both problems, the lens design can be shaped to mitigate them. However, this usually comes at the expense of resolution. Recently camera manufacturers have decided to include software mitigation in the lens design trade space, letting distortion and CA go in favor of sharpness and requiring the correction to be done in software. It rubs some people the wrong way, but it really is a decent trade, as shifting pixels doesn’t lose much information, whereas resolution that was never captured is impossible to correct…
The Wikipedia page linked by @paperdigits is the best starting point.
The bottom line is something like this:
- Every lens has CA, but its magnitude may be irrelevant if it is close to the pixel size of the sensor, and even much larger CA can be dealt with in post-production.
- You can fix transverse CA in post-production easily; axial CA is trickier, as most algorithms will just smudge it out, so you face a trade-off between CA and detail (see the sketch after this list). This is why lens designers try to minimize this aspect.
- Don’t expect much from a bridge camera lens; they have a lot of design constraints. An interchangeable lens, especially a prime or a pro lens, usually has less CA.
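To illustrate the “smudge it out” point from the list above, here’s a rough sketch of a naive defringe pass: find strong luminance edges, then desaturate purple/green pixels near them. The thresholds and the crude collapse-to-gray step are invented for illustration; real defringe tools are much more careful about preserving legitimate color:

```python
import numpy as np
from scipy.ndimage import sobel, maximum_filter

def defringe(img, edge_thresh=0.1, band=5):
    """img: float RGB in [0, 1]. Desaturate likely fringe pixels near edges."""
    luma = img @ np.array([0.2126, 0.7152, 0.0722])
    # Axial CA fringes cluster around strong luminance edges.
    edges = np.hypot(sobel(luma, axis=0), sobel(luma, axis=1)) > edge_thresh
    near_edge = maximum_filter(edges, size=band)      # widen the edge band
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    # Purple fringe: R and B both above G; green fringe: the reverse.
    purple = np.minimum(r, b) > g + 0.02
    green = g > np.maximum(r, b) + 0.02
    fringe = near_edge & (purple | green)
    out = img.copy()
    out[fringe] = luma[fringe][:, None]               # collapse fringe to gray
    return out
```

Note the trade-off this makes explicit: the transverse fix in the earlier sketch only moves pixels around, while this one throws away color information near edges, trading CA for detail.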