
Meaning

Meaning is one of the most difficult and complex concepts we have, studied in semantics (a branch of linguistics) and in the philosophy of language (the theory of meaning in particular).

See especially: philosophy of language and the meaning of meaning.

Meaning in a wider sense is also part of the general theory of information.

This theory tries to formulate rules about the meaning of information: the meaning of a piece of information or of a thing is its value or its sense.

Often the meaning of a piece of information or of a thing is only realized when it is lost or about to be lost.

Table of contents
1 How does meaning arise?
2 How can we measure meaning?
3 In which sequences of information do we find no meaning?
4 Information with meaning is always something in between complete randomness and complete uniform order
5 Literature

How does meaning arise?

Many people think that meaningful things can only be made by humans. That, of course, depends on how the word meaning is defined and on how broadly or narrowly you conceive of it. From a scientific point of view, meaning can emerge from nature itself, without the help of humans or any other creator. An important example of the production of new meaning is the biological evolution of new species.

How can we measure meaning?

The quality of a piece of information, its meaning, is not as easily measured as its quantity. Think of a book in which you find only nonsense. On the other hand, think of a book full of mathematical formulas. In the second book every letter and every sign has its meaning. The quantity of information, for example the number of letters or the number of pages, can be the same for both books; their quality is certainly different.

Here we must establish other criteria to measure the quality of the information. For example:

The quantity of meaning can be quite small, whereas the quantity of kilobytes can be quite large. You can fill your computer screen with randomly chosen letters; then you have a maximum of information quantity but a minimum of information quality.
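To make the contrast concrete, here is a minimal sketch of my own (not part of the original article) that builds a screen-sized page of random letters and a page of ordinary text of the same length. Every measure of quantity comes out identical, and nothing in those numbers reflects the difference in meaning. The texts and sizes chosen are arbitrary.

    import random
    import string

    random.seed(0)
    PAGE = 2000  # an arbitrary "screen full" of characters

    # A page of randomly chosen letters: maximal quantity, minimal meaning.
    random_page = "".join(random.choice(string.ascii_lowercase) for _ in range(PAGE))

    # A page of ordinary text of the same length (a repeated sentence as a stand-in).
    sentence = "Meaning is one of the most difficult and complex concepts we have. "
    text_page = (sentence * (PAGE // len(sentence) + 1))[:PAGE]

    # Every measure of quantity is identical for both pages ...
    print(len(random_page), len(text_page))                    # 2000 2000
    print(len(random_page.encode()), len(text_page.encode()))  # 2000 2000

    # ... so nothing in these counts tells the nonsense page from the meaningful one;
    # judging quality (meaning) needs criteria beyond length or byte count.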

In which sequences of information do we find no meaning?

If you take a random sequence of 0s and 1s, you can assign meaning to it only with regard to how long the sequence is or how it was created. Apart from that, you cannot find any meaning in it.
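As a small illustration (my own sketch, not from the article), the following snippet generates such a random sequence of 0s and 1s from a fixed seed. The only statements one can make about the result concern its length and the way it was produced; the seed and generator reproduce it exactly, but no further meaning can be read out of the digits themselves. The function name and seed are arbitrary.

    import random

    def random_bits(length, seed):
        # Reproducible pseudo-random sequence of 0s and 1s.
        rng = random.Random(seed)
        return "".join(rng.choice("01") for _ in range(length))

    sequence = random_bits(64, seed=42)
    print(sequence)       # looks structureless
    print(len(sequence))  # 64 -- its length, one of the few statements we can make about it

    # The short description "random_bits(64, seed=42)" recreates the whole sequence,
    # i.e. the only meaning it carries is how long it is and how it was created.
    print(random_bits(64, seed=42) == sequence)  # True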

Information with meaning is always something in between complete randomness and complete uniform order
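One rough way to see this (a sketch of my own, not a claim from the article) is to compare how well a compressor can shrink three kinds of text: completely uniform repetition, completely random letters, and ordinary prose. Compression measures statistical redundancy rather than meaning itself, but meaningful text typically lands between the two extremes. The texts and lengths below are arbitrary choices.

    import random
    import string
    import zlib

    random.seed(0)
    N = 2000

    uniform = "a" * N                                                         # complete uniform order
    noise = "".join(random.choice(string.ascii_lowercase) for _ in range(N))  # complete randomness
    prose = ("Meaning can emerge out of nature itself, for example through "
             "the biological evolution of new species. " * 40)[:N]            # meaningful text

    for label, text in (("uniform", uniform), ("random", noise), ("meaningful", prose)):
        ratio = len(zlib.compress(text.encode())) / N
        print(f"{label:10s} compresses to {ratio:.2f} of its original size")

    # Typically the uniform text shrinks almost completely, the random text hardly at all,
    # and the meaningful text ends up somewhere in between the two extremes.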

Literature