Information Entropy

Concerning the second law of thermodynamics:

Dr. Luciano Floridi explains in chapter 3 of his book, Information: A Very Short Introduction:

Entropy is a measure of the amount of “mixedupness” in processes and systems bearing energy or information. It can be seen as an indicator of reversibility: if there is no change of entropy then the process is reversible. A highly structured, perfectly organized message contains a lower degree of entropy or randomness, […] and hence it causes a smaller data deficit, which can be close to zero […]. By contrast, the higher the potential randomness of the symbols in the alphabet, the more bits of information can be produced by the device.
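Floridi's observation maps directly onto Shannon's measure of entropy, H = −Σ p(x) log₂ p(x), where p(x) is the probability of each symbol in the alphabet. As a minimal sketch (the function name and sample messages below are illustrative, not drawn from either book), a few lines of Python make the contrast between a perfectly organized message and a maximally varied one concrete:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Estimate entropy in bits per symbol from the message's own symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return sum(-(n / total) * log2(n / total) for n in counts.values())

# A perfectly organized message: one repeated symbol, zero bits per symbol.
print(shannon_entropy("aaaaaaaa"))   # 0.0
# Maximally varied symbols: entropy reaches log2(alphabet size) = 3 bits.
print(shannon_entropy("abcdefgh"))   # 3.0
```

The repeated-symbol message yields zero bits per symbol, while uniformly varied symbols reach the maximum, log₂ of the alphabet size, which is exactly Floridi's point about potential randomness and producible information.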

Similarly, from chapter 8 of Dr. Yunus Çengel’s book, Introduction to Thermodynamics and Heat Transfer (as it relates to information entropy):

Processes can occur in a certain direction only, not in any direction. A process must proceed in the direction that complies with the increase of entropy principle […]. A process that violates this principle is impossible. [….] The performance of [information] systems is degraded by the presence of irreversibilities, and entropy generation is a measure of the magnitudes of the irreversibilities present during that process. The greater the extent of irreversibilities, the greater the entropy generation. Therefore, entropy generation can be used as a quantitative measure of irreversibilities associated with a process.

In light of these passages, it appears prudent to quantify information entropy in order to maximize information dissemination, a matter of importance to intelligence officers and journalists alike, a notion similarly expressed by Dr. Floridi in chapter 5:

Thermodynamics and information theory are often allies sharing one goal: the most efficient use of their resources, energy, and information. [….] The ‘green’ challenge is to use information more and more intelligently in order to reduce that energy input to ever lower targets, while keeping or increasing the output.

To what degree does an informer create information entropy, and to what degree does an informer minimize or maximize that entropy once an informee has synthesized the informer’s information?

Can information entropy be measured in a single piece of semantic media, such as a news article, and can it then be juxtaposed with that of other, related pieces of semantic media?
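One crude way to operationalize this question is to estimate the Shannon entropy of an article's word-frequency distribution and juxtapose articles on that common footing. The sketch below is one possible approach, assuming simple whitespace tokenization; the sample texts and the function name are hypothetical:

```python
from collections import Counter
from math import log2

def word_entropy(text: str) -> float:
    """Entropy in bits per word, estimated from the text's word frequencies."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return sum(-(n / total) * log2(n / total) for n in counts.values())

# Hypothetical article snippets for juxtaposition. A higher bits-per-word
# figure indicates a more varied, less redundant vocabulary, nothing more.
article_a = "the cat sat on the mat and the cat slept"
article_b = "economic indicators signal diverging regional growth trajectories"
print(f"A: {word_entropy(article_a):.2f} bits/word")  # ~2.65
print(f"B: {word_entropy(article_b):.2f} bits/word")  # ~2.81
```

Such a figure captures only statistical redundancy at the symbol level; by itself it says nothing about the semantic content Floridi is concerned with, so any juxtaposition of articles on this basis should be read as a rough proxy rather than a measure of meaning.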
