This is an archive copy of the IUCr web site dating from 2008. For current content please visit https://www.iucr.org.

IMAGE ENTROPY



Shannon's source coding theorem shows that optimal coding is achieved when the length of the code assigned to the ith symbol is -log2 P(i), where P(i) is the probability of the occurrence of symbol i.
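As a small illustration of the ideal code lengths -log2 P(i), the sketch below computes them for a hypothetical three-symbol distribution (the symbols and probabilities are made up for the example, not taken from the text):

```python
import math

# Hypothetical symbol probabilities (illustrative only)
probs = {"a": 0.5, "b": 0.25, "c": 0.25}

# Shannon's ideal code length for symbol i is -log2 P(i) bits
code_lengths = {s: -math.log2(p) for s, p in probs.items()}
# "a" -> 1.0 bit, "b" -> 2.0 bits, "c" -> 2.0 bits
```

A prefix code such as a Huffman code can achieve these lengths exactly here because each probability is a power of 1/2.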

The zeroth order entropy H(P) is defined as:

              n
             ---
  H(P) =  -  \  p  log  p
             /   i    2  i
             ---
             i=1

measured in bits per "symbol" (pixel value), where n is the number of distinct symbols and p_i is the probability of symbol i.
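The definition above can be sketched directly from a pixel histogram. This is a minimal illustration, assuming the image has been flattened to a list of pixel values (the function name and sample data are invented for the example):

```python
import math
from collections import Counter

def zeroth_order_entropy(pixels):
    """H(P) = -sum over i of p_i * log2(p_i), in bits per pixel."""
    counts = Counter(pixels)          # histogram of pixel values
    total = len(pixels)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# Example: a tiny "image" flattened to a list of pixel values
pixels = [0, 0, 0, 0, 1, 1, 2, 3]
h = zeroth_order_entropy(pixels)      # 1.75 bits per pixel
```

Here p = {0.5, 0.25, 0.125, 0.125}, so H(P) = 0.5*1 + 0.25*2 + 2*(0.125*3) = 1.75 bits per pixel.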

For a memoryless source, the entropy is a lower bound on the average number of bits per symbol achievable by any lossless code. (Images are not memoryless sources: neighbouring pixel values are correlated.)

Higher-order entropies, which take the context of preceding symbols into account, exist, but they are impractical to apply.