I think I've found an acceptable approximation to an entropy filter. It is a bit fuzzy in its rigor, but it produces results that look good to me.
My goal was that a sample set like (1, 1, 1, 1, 1, 1, 1, 1, 1) would have a value of zero and (0, 32, 64, 96, 128, 160, 192, 224, 256) would have a maximal value.
I treat each interval between each pair of samples as "part" of the distribution. All of the intervals are defined to have the same probability p, so each value within an interval has a probability of p/(interval length). For convenience we let p = 1 (scaling the distribution adds a constant to each entropy calculation, but the relative entropy values should remain constant).
The entropy of an interval, for each ordered sample pair (i, j), is then:

E_ij = -Length_ij * log2(1 / Length_ij)
     = Length_ij * log2(Length_ij)
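Here is a minimal Python sketch of the calculation (my reading of "each ordered sample pair" is the gap between consecutive samples once the window is sorted; zero-length intervals are taken to contribute nothing, since x * log2(x) tends to 0):

import numpy as np

def interval_entropy(samples):
    # Sort the window so each adjacent pair defines one interval.
    s = np.sort(np.asarray(samples, dtype=np.float64))
    lengths = np.diff(s)             # Length_ij for each ordered pair
    lengths = lengths[lengths > 0]   # drop degenerate (zero-length) intervals
    # Per-interval term: E_ij = Length_ij * log2(Length_ij)
    return float(np.sum(lengths * np.log2(lengths)))

print(interval_entropy([1] * 9))                                   # 0.0
print(interval_entropy([0, 32, 64, 96, 128, 160, 192, 224, 256]))  # 1280.0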
This results in values ranging from 0 to around 30k. We can scale these values to [0..1), which is then easy to convert to [0..255]. However, this conversion seems to result in an entropy map that consists largely of only black and white pixels, and I'm not happy with the loss of detail when converting to Gray8 format.
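For reference, the straightforward linear scaling looks something like this (scale_linear is a hypothetical helper; it assumes the raw entropies sit in a NumPy array and treats ~30k as the ceiling):

import numpy as np

def scale_linear(entropy_map, max_value=30000.0):
    # Map [0..max_value] onto [0..1], then onto Gray8 [0..255].
    # Clipping makes anything above the ceiling saturate rather than wrap.
    scaled = np.clip(entropy_map / max_value, 0.0, 1.0) * 255.0
    return scaled.astype(np.uint8)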
I've experimented with other approaches to normalizing the entropy results. The following image shows the results of using the relative position in a sorted list of values.
This shows that small values are heavily represented in the final distribution. Areas that were black in the first picture are now full of static, and the bright areas seem to mix together too much. I don't think this is a very useful metric either.
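A sketch of this rank-based scaling (scale_by_rank is again a hypothetical helper over a NumPy entropy map). It also suggests why the dark areas fill with static: tiny differences among the many low values get stretched across the full rank range.

import numpy as np

def scale_by_rank(entropy_map):
    # Replace each value with its position in the sorted list of all
    # values, then map that position onto [0..255].
    flat = entropy_map.ravel()
    ranks = np.empty(flat.size, dtype=np.float64)
    ranks[np.argsort(flat, kind="stable")] = np.arange(flat.size)
    scaled = ranks.reshape(entropy_map.shape) * 255.0 / max(flat.size - 1, 1)
    return scaled.astype(np.uint8)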
To discount the large number of small values, I find the unique values (when rounded to the nearest integer) in the image. These unique values are placed into a sorted list, and the relative position of each pixel's value in this unique list is used to scale the image.
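A sketch of the unique-value ranking (scale_by_unique_rank is hypothetical, same NumPy assumptions as above):

import numpy as np

def scale_by_unique_rank(entropy_map):
    # Round to the nearest integer, find the sorted unique values, and
    # rank each pixel by where its value falls in that unique list, so
    # repeated values share a single rank.
    rounded = np.rint(entropy_map)
    uniques = np.unique(rounded)
    ranks = np.searchsorted(uniques, rounded)
    scaled = ranks * 255.0 / max(len(uniques) - 1, 1)
    return scaled.astype(np.uint8)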
The resulting image looks much like the rank-scaled one, but it is a bit brighter and seems to have more dynamic range. I can still see static in the dark areas, but it doesn't dominate the image the way it did in the second version. I've completely lost the relative entropy values, but since my technique is quite ad hoc, I have little confidence in their usefulness anyway.
At this point, I think I am ready to move on to new things.


