One way to correct those distortions automatically is to collect correction data in a database like PTLens or Lensfun. The photo software then knows which corrections to apply to images taken with a certain camera, a certain lens, at a certain focal length. That approach, however, requires significant effort to maintain the database, and you will always be playing catch-up with the camera manufacturer as new lenses appear. That is not an issue for independent open source efforts, which have been chasing vendor information forever, but it is a bad situation for the camera vendor: unless the vendor chases the software companies to update their databases whenever a new lens is released, images developed from its cameras' RAW files will not be optimal. I would not want that, especially not if I wanted to reduce development cost and retail price for new products.
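Just to illustrate what the database approach looks like from the software side, a lookup roughly works like this. This is only a sketch using the lensfunpy binding to Lensfun; the camera and lens names are placeholders, and the binding is merely my example, nothing prescribed by the database projects:

```python
# Sketch of the database approach: look up correction data for a given
# camera/lens/focal length in the Lensfun database via the lensfunpy binding.
# Camera and lens names below are placeholders.
import lensfunpy

db = lensfunpy.Database()
cam = db.find_cameras('Panasonic', 'DMC-G1')[0]
lens = db.find_lenses(cam, 'Panasonic', 'Lumix G Vario 14-45mm')[0]

# The modifier computes, for every output pixel, where to sample in the
# distorted input image, but only if the database actually knows this lens.
mod = lensfunpy.Modifier(lens, cam.crop_factor, 4000, 3000)
mod.initialize(14.0, 5.6, 10.0)  # focal length, aperture, subject distance
coords = mod.apply_geometry_distortion()  # per-pixel (x, y) sampling map
if coords is not None:
    print(coords.shape)
```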
Panasonic most likely went the second way: have software makers implement the same correction algorithms the cameras use, and embed the necessary correction parameters in each photo's metadata. Doing it that way requires less synchronization between Panasonic and the software vendors and allows faster time to market. With some effort it might be possible to use this correction data in open source software as well.
That being said, there are now a couple of tasks to be done before the correction data can be used:
- find out what information is stored, and where
- reverse engineer the format of the correction parameters
- understand how it's applied to image data
- build a proof-of-concept implementation
Reverse engineering is a black box approach. In our case the camera is the black box, the inputs are controls like aperture, focal length and exposure program, and the output is the generated RW2 files. You cannot observe the inner workings of the black box directly; all you can do is change the inputs and watch how the output differs. It definitely helps if you can make an educated guess about what is going on inside. For geometry distortion I know a bit about how the correction is usually done, so that is my first target.
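In practice that boils down to a bit of scripting: pull the interesting bytes out of a series of RW2 files and compare them. Something along these lines would do; this is only a sketch, the file names are made up, and the tag ID is the one discussed below. It relies on RW2 being a TIFF-based format:

```python
# Black box probing: extract the raw bytes of one maker tag from several RW2
# files and print them side by side, so that changes caused by different
# camera settings become visible. RW2 uses the TIFF container layout, so a
# minimal IFD walk is enough. Only the little-endian case is handled here.
import struct

TAG_ID = 0x0119          # the tag examined below
FILES = ['P1000014.RW2', 'P1000035.RW2', 'P1000045.RW2']  # made-up names

def read_tag(path, tag_id):
    with open(path, 'rb') as f:
        data = f.read()
    # TIFF-style header: byte order, magic, offset of the first IFD
    assert data[:2] == b'II', 'expecting a little-endian TIFF/RW2 file'
    ifd_offset = struct.unpack_from('<I', data, 4)[0]
    count = struct.unpack_from('<H', data, ifd_offset)[0]
    for i in range(count):
        entry = ifd_offset + 2 + 12 * i
        tid, ttype, n, value_or_offset = struct.unpack_from('<HHII', data, entry)
        if tid == tag_id:
            # a 32-byte UNDEFINED tag does not fit into the 4-byte value
            # field, so the field holds an offset to the actual data
            return data[value_or_offset:value_or_offset + n]
    return None

for name in FILES:
    raw = read_tag(name, TAG_ID)
    print(name, ' '.join(str(b) for b in raw))
```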
I know that geometry distortion depends on the focal length, so I took a couple of shots with the kit lens at 14mm and 45mm, two at each. Then I looked at the Exif data and found one tag, "Exif.PanasonicRaw.0x0119", where the data did not change between the images taken at 14mm, but changed significantly when going to 45mm. At 14mm the tag data looks like this:
84 236 201 47 38 0 0 0 43 1 0 0 141 1 1 0 208 14 238 1 86 2 2 252 196 9 226 3 228 72 36 134
Looking at the data, you see something interesting: a lot of zeros appear at 45mm, where the tag looks like this:

152 94 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 86 2 0 0 196 9 0 0 35 234 24 155

What does this tell us? I know that the geometry of this lens is almost perfect at 45mm, so I expect that less correction is necessary there. It is reasonable to assume that some parameters of the correction formula become very small, or zero. The data in the Exif tag apparently matches that expectation. I also assume that the data is not encrypted. At 35mm the data looks like this:

161 54 43 194 51 0 0 0 0 0 0 0 0 0 1 0 42 0 1 0 86 2 1 0 196 9 245 1 176 16 17 65

Still a lot of zeros, but fewer.
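A quick way to spot structure in these dumps is to line them up and check which byte positions never change; constant positions are candidates for fields that do not depend on the focal length. A small sketch over the three dumps shown above:

```python
# Line the three dumps up and report the byte positions that never change.
# The values are copied verbatim from the dumps printed above.
dump_14mm = [84, 236, 201, 47, 38, 0, 0, 0, 43, 1, 0, 0, 141, 1, 1, 0,
             208, 14, 238, 1, 86, 2, 2, 252, 196, 9, 226, 3, 228, 72, 36, 134]
dump_45mm = [152, 94, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
             0, 0, 0, 0, 86, 2, 0, 0, 196, 9, 0, 0, 35, 234, 24, 155]
dump_35mm = [161, 54, 43, 194, 51, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0,
             42, 0, 1, 0, 86, 2, 1, 0, 196, 9, 245, 1, 176, 16, 17, 65]

for pos, (a, b, c) in enumerate(zip(dump_14mm, dump_45mm, dump_35mm)):
    if a == b == c:
        print(f'byte {pos:2d} is constant: {a}')
```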
The next task is to find out how the data is formatted. Most likely these parameters are rational numbers; what I don't know yet is the representation. Is it 16, 32 or 64 bits per coefficient? Am I looking at fixed or floating point numbers, little or big endian? Does this tag contain only correction parameters for geometry, or also for chromatic aberration? The tag contains 32 bytes, so it holds either 16, 8, or 4 coefficients. I'll have to do some research on geometry correction; maybe Panasonic uses a method that is publicly known.
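Rather than guessing, I can decode the same 32 bytes under each plausible interpretation and see which one produces sensible numbers. A purely exploratory sketch, using the 14mm dump from above; none of these interpretations is confirmed yet:

```python
# Decode one 32-byte dump under several candidate representations and print
# the results; whichever interpretation yields plausible, smoothly varying
# numbers across focal lengths is a good candidate.
import struct

raw = bytes([84, 236, 201, 47, 38, 0, 0, 0, 43, 1, 0, 0, 141, 1, 1, 0,
             208, 14, 238, 1, 86, 2, 2, 252, 196, 9, 226, 3, 228, 72, 36, 134])

candidates = {
    'int16 little-endian': '<16h',
    'int16 big-endian': '>16h',
    'uint16 little-endian': '<16H',
    'int32 little-endian': '<8i',
    'float32 little-endian': '<8f',
    'float64 little-endian': '<4d',
}
for name, fmt in candidates.items():
    print(f'{name:22s}', struct.unpack(fmt, raw))
```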
It looks like one prominent method of correcting lens geometry is based on Zernike polynomials. The PanoTools lens model uses polynomials as well. One property of polynomials is that you can use as many parameters as you like; the number just depends on the order of the polynomial, so it is merely a question of available computing power. I could start with a 2nd or 4th order polynomial like the one in the PanoTools model and try to match the tag data against coefficients computed by Hugin.
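For reference, this is roughly what the PanoTools/Hugin radial model looks like written out as code. The coefficients here are made up, and this only illustrates the kind of model I want to match the tag data against, not Panasonic's actual algorithm:

```python
# PanoTools/Hugin radial model: a polynomial in the normalized radius maps
# each destination radius to the source radius that should be sampled.
import math

def panotools_remap(x, y, width, height, a, b, c):
    """Map a destination pixel (x, y) to the source position to sample from."""
    cx, cy = width / 2.0, height / 2.0
    norm = min(cx, cy)                  # radius normalized to the shorter half-axis
    dx, dy = (x - cx) / norm, (y - cy) / norm
    r = math.hypot(dx, dy)
    d = 1.0 - a - b - c                 # conventional choice: keep the scale 1 at the center
    scale = a * r**3 + b * r**2 + c * r + d
    return cx + dx * scale * norm, cy + dy * scale * norm

# Example call with made-up coefficients for a mild barrel distortion
print(panotools_remap(100.0, 200.0, 4000, 3000, 0.0, -0.015, 0.03))
```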