In 1948, at the age of 32, Claude Shannon published the article ‘A Mathematical Theory of Communication’, introducing the world to the concept of ‘information entropy’ and its now famous mathematical formulation H = -ƩP(x)·log P(x). That paper marked the birth of information theory and earned Shannon worldwide recognition as its father.
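To make the formula concrete, here is a minimal sketch in Python (our choice of language, not Shannon's) that computes the entropy of a discrete distribution in bits, using base-2 logarithms:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p(x) * log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.47 bits
```

Entropy measures the average uncertainty of a source: the more predictable the outcomes, the fewer bits are needed, on average, to describe them.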
Today we want to share one of Shannon’s lessons through his focus on solving a fundamental problem: reliable communication over an unreliable channel. By insisting on solving this problem, Shannon ended up developing information theory.
When we transmit information through a channel, the signal received at the output is the superposition of the transmitted signal and noise. Thanks to information theory and its mathematical formulation, we can make the error rate as low as we want by adding redundancy in transmission: an encoder on the sending side and a decoder on the receiving side.
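The redundancy idea can be sketched with the simplest possible code, a 3× repetition code with majority-vote decoding. This is an illustration of the principle only, not one of the capacity-achieving codes Shannon’s theorem promises; the bit-flip channel and the parameters below are our own assumptions:

```python
import random

def encode(bits, n=3):
    # Repetition code: transmit each bit n times (the redundancy).
    return [b for b in bits for _ in range(n)]

def noisy_channel(bits, flip_prob, rng):
    # Binary symmetric channel: each bit flips independently with flip_prob.
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits, n=3):
    # Majority vote over each group of n received copies.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]
received = decode(noisy_channel(encode(message), flip_prob=0.05, rng=rng))
errors = sum(a != b for a, b in zip(message, received))
print(f"residual error rate: {errors / len(message):.4f}")
```

With a 5% raw flip probability, a decoded bit is wrong only when two or three of its copies flip, so the residual error rate drops to roughly 0.7%. Shannon proved something far stronger: with well-designed codes, the error rate can be driven arbitrarily close to zero while keeping a positive transmission rate.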
By creating, in such an elegant way, the information theory on which computing and communication sciences are built, Shannon shows us the importance of asking good questions as the foundation for obtaining remarkable answers.
At Im3 we want to follow Shannon’s example, so we set ourselves challenges whose answers serve the engineering of the future.