A New Information Theory
Shannon: information theory, a mathematical theory of communication, grew out of the work of Claude E. Shannon, a 20th-century mathematician, electronic engineer and cryptographer known as "the Father of Information Theory" for his ground-breaking postwar work at Bell Laboratories.
"Information Theory" is a mathematical, statistical approach to analyzing and measuring information in all forms of messaging, including natural language processing, statistical inference, data transmission, molecular codes, economic modeling and so forth. There are various interpretations of the application of Information Theory on natural language that seem to be contradictory.
This is the starting point of our experiment.
Is it possible that a one-dimensional sequence of signs carries a multilevel or multidimensional array of information? If so, can this lead to an expanded new mathematical theory of information? How would this affect the reading, describing and understanding of DNA, music, or extraterrestrial noises?
This study project will examine the various applications and interpretations of information theory to explore the unknown, and should be exciting not only for mathematicians, but also for artists, biologists, linguists, computer scientists, etc.
The techniques used will range from discussion and scientific analysis to art and innovative experimentation with different forms of collaborative intelligence. The process and the end result, whatever it turns out to be, will be determined to a great extent by who joins the project!
Interested? Want to know more?
Please contact FUFF