Sunday, August 29, 2010

Entropy Filter v3

I keep making mistakes.

Let Z be the set of sampled points and Z_i be a specific point in Z.
Let N be the number of values in the interval between Z_i and Z_j
N = abs(Z_i - Z_j) + 1
(The one is added to avoid an infinite PDF when the two sample values are identical)
Let p be the probability of a value in an interval.
p = 1/N
The entropy E_ij of the interval is then:

E_ij = - N * p * log2(p)
E_ij = - N * (1/N) * log2(1/N)
E_ij = - log2(1/N)
E_ij = log2(N)
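
As a quick check of the algebra above, here is a minimal Python sketch of the per-interval entropy. The function name and the sample values are mine for illustration; the post does not include the filter's actual code.

import math

def interval_entropy(z_i, z_j):
    # N = |Z_i - Z_j| + 1 values, each assumed equally likely with p = 1/N.
    n = abs(z_i - z_j) + 1         # the +1 avoids log2(0) for identical samples
    p = 1.0 / n
    return -n * p * math.log2(p)   # simplifies to log2(N)

# Example: two samples 7 apart span N = 8 values, so E_ij = log2(8) = 3 bits.
print(interval_entropy(10, 17))    # prints 3.0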

The resulting distribution of values is much more uniform. I can see no difference between the normalization scaling and unique value rank methods I described in my previous post.



1 comment:

  1. Alan

    I am a photographer and am interested in your Entropy filter. Would it be possible to get a copy to use on my Digital images? Thanks.
