Re: Does good compression = good encryption?

Ryan Shelswell (ryan@nospam.socs.uts.edu.au)
Fri, 17 Nov 1995 17:54:22 +1100 (EST)

Previously, Sammy wrote:

>This is really interesting. Coming from a science background, I was taught
>that entropy in physics is defined as the amount of "order", or lack of
>chaos, in a system. I wonder if different people, even within the same
>field, define it slightly differently? Anyway, I suppose it would also
>depend on how you define the amount of "order" - as I recall this wasn't
>a simple thing to do. In cryptography, is there a clear-cut definition
>for order vs. disorder, by any chance?

Yes, there is. Entropy, as used by knowledgeable people in the field, would
almost certainly be a reference to entropy in the information-theoretic
sense. Information theory was invented mainly by Claude Shannon back in
1948; one of Andrew Tanenbaum's textbooks has material on it if you're (a
bit) interested.
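To make that concrete: Shannon defined the entropy of a source as
H = -sum_i p_i * log2(p_i) bits per symbol, where p_i is the probability
of the i-th symbol. Here's a small Python sketch of my own, purely for
illustration:

    import math

    def shannon_entropy(probs):
        # Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
        # Terms with p == 0 contribute nothing, so they are skipped.
        return -sum(p * math.log(p, 2) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits

The more predictable a source is, the lower its entropy; that link between
predictability and entropy is exactly what makes the term useful in both
compression and cryptography.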

Then, Peter wrote:

> What's the difference between randomness and chaos? (No philosophical
> dissertations please!). I really don't see much difference in the two
> definitions of "entropy" (as seen by Sammy) except that they use
> different descriptive words... "chaos" and "randomness". Order, here,
> is a lack of chaos/randomness. So an increase in entropy means an
> increase in chaos/randomness. Simple...?

Chaos and randomness do indeed have different meanings when used as
mathematical terms: chaos describes deterministic systems whose behaviour
is extremely sensitive to initial conditions, while randomness describes
genuinely non-deterministic (probabilistic) processes. Any textbook on
dynamical systems or probability will give you the formal definitions.
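
As a quick illustration (a sketch of my own, not a formal definition), the
logistic map is the textbook example of deterministic chaos:

    def logistic(x0, r=4.0, n=25):
        # Iterate the logistic map x -> r*x*(1-x) starting from seed x0.
        xs = [x0]
        for _ in range(n):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    # Fully deterministic: the same seed always yields the same orbit.
    print(logistic(0.2)[-1])
    # Yet a nearby seed diverges completely after a couple dozen steps
    # (sensitive dependence on initial conditions) - which is why chaotic
    # output can *look* random while being nothing of the sort.
    print(logistic(0.2000001)[-1])

There is no coin-flipping anywhere in that map, so it is chaotic but not
random; true randomness means the outcome is not determined by the past
at all.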

When used as rhetorical terms, they lack a fixed definition but gain
meaning through association and use.

> There is no real way to quantify order, universally. Whenever a method
> is developed to quantify order/chaos, it can only apply to a subset of
> the possible situations - e.g. quantifying the order/chaos in weather
> patterns and then in an encrypted file would require two vastly
> different sets of parameters.

There is a large body of physics research into the question "How do we
measure the entropy of this system?". It's pretty interesting, but it
requires real study. Beware of assuming that information-theoretic entropy
is the same as any other kind - don't mix a physical model with a
representational one!
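
To see how model-dependent such a measurement is, here is a deliberately
naive sketch of my own that estimates entropy from byte frequencies alone:

    import math

    def byte_entropy(data):
        # First-order estimate: bits per byte, from byte frequencies only.
        # It knows nothing about physics and nothing about patterns longer
        # than a single byte.
        counts = {}
        for b in data:
            counts[b] = counts.get(b, 0) + 1
        n = len(data)
        return -sum((c / n) * math.log(c / n, 2) for c in counts.values())

    print(byte_entropy(b"aaaaaaaa"))        # 0.0 - utterly predictable
    print(byte_entropy(bytes(range(256))))  # 8.0 - flat byte histogram

Note that a well-compressed file and a well-encrypted file will both score
near 8 bits per byte on this test, which is precisely why "looks random to
a simple measure" tells you nothing about whether the data is securely
encrypted.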

Ryan