Information Science – What the Hell Is Information?

Articles from this series


  1. Information science – Introduction
  2. Information science – Uniqueness and essential questions
  3. Information science – Philosophical approaches
  4. Information science – Paradigms
  5. Information science – Epistemologies

Main source


Introduction to Information Science – David Bawden and Lyn Robinson



Information




Lately, terms like “information age” and “information society” are used more and more. But who among us can say, with a clear conscience, that they know what information actually is? Information scientists have been grappling with this term for several decades, and during that time the word has been used in more than 10,000 different ways. In this article I’ll present the two main points of view currently applied to information, along with a number of the definitions that have been created under the banner of information science.

Physical and biological information




The notion of information as an “attribute of the physical world” originated in the study of entropy. Entropy is a measure of the disorder of a physical system, while information is data about how a given aspect of that system works. Entropy arises when we have not gathered enough information about a given aspect of the physical world; in other words, there is a formal mathematical relation between information and entropy. Information is supposedly a fundamental aspect of the physical universe: it is neither energy nor matter. Wheeler held that the “bit” came before energy and matter. If that were the case, the physical world (energy and matter) would be generated by information at the very “core” of everything.
A very similar approach has been taken in biology. The discovery of the genetic code and subsequent advances in molecular biology have led scientists to the view that information is a more fundamental attribute of a living organism than, for example, metabolism, reproduction and other manifestations of life.

Mathematical theory of information




This theory originated from the technical problems of transmitting messages over various communication networks. Whereas the approach above treats information “empirically”, in the sense that transmission happens in the physical world (and so we try to measure it through physics, chemistry and biology), this theory takes a syntactic approach: it studies the languages and codes being used. Both approaches ignore the semantic level (what the message is trying to convey) and the pragmatic level (the context in which the message is relevant to its receiver).
The mathematical theory is also called the Shannon–Weaver theory. The first step was to define a quantitative measure of information, so that different transmission systems could be compared. The amount of information understood in this way is then calculated as a logarithm of all the possible arrangements of the codes (the Σ in the formula below simply means “sum over all symbols”).

H = -K Σᵢ pᵢ log pᵢ

pᵢ is the probability of each symbol and K is a constant defining the unit. The minus sign is there because we want the quantity of information (H) to be positive: each probability lies between 0 and 1, so its logarithm is negative, and without the minus sign the result would come out negative. Shannon called H “entropy”.
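To make the formula a bit more concrete, here is a minimal Python sketch (my own illustration, not taken from Bawden and Robinson) that computes H for a list of symbol probabilities, taking K = 1 and using base-2 logarithms so the result comes out in bits.

    import math

    def shannon_entropy(probabilities, k=1.0):
        """H = -K * sum(p_i * log2(p_i)), in bits when K = 1."""
        # Zero-probability symbols contribute nothing, so skip them.
        return -k * sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin: two equally likely symbols -> 1 bit per symbol.
    print(shannon_entropy([0.5, 0.5]))    # 1.0

    # A heavily biased coin carries less information per symbol.
    print(shannon_entropy([0.9, 0.1]))    # about 0.47

    # Eight equally likely symbols -> log2(8) = 3 bits, i.e. the logarithm
    # of the number of possible arrangements mentioned above.
    print(shannon_entropy([1/8] * 8))     # 3.0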

Numerous definitions of information




  * Brookes: “Scrappy knowledge”
  * Floridi: “Meaningful data”
  * Feather: “A summary of data in a comprehensible form, suitable for communication”
  * Shannon: “Communicated codes”
  * Belkin: “An alteration of structure”
  * Bateson: “A difference that makes a difference” (lol)
  * Bates: “A pattern of organisation of matter and energy to which meaning is given by a living being”

