**Abstract:** Information theory has spread from its original application in communication theory to many diverse fields, for instance physics (statistical mechanics), mathematics (probability theory), and computer science (algorithmic complexity). The basic concepts of information theory, such as entropy, mutual information, and entropy rate, can be exploited in time series analysis to measure the information flow between two series and thereby detect possible causality. The information-theoretic approach allows us to detect statistical dependencies of all types, not only the linear coupling captured by standard tools (the auto-/cross-correlation function, the Granger causality test); hence, non-linear systems may be examined as well.
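
The point about non-linear dependencies can be illustrated with a minimal sketch (not the paper's method): a plug-in estimate of mutual information $I(X;Y) = H(X) + H(Y) - H(X,Y)$ for discrete-valued series. In the toy data below, `y` depends on `x` through a non-linear map, a dependency that mutual information detects. The function names and the toy series are illustrative assumptions, not taken from the paper.

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy H(X) in bits, estimated from empirical symbol frequencies."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y) from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy example: y is a deterministic but non-linear function of x,
# so the two series are strongly dependent despite the lack of
# a simple linear relationship.
x = [0, 1, 2, 3] * 25
y = [v * v % 3 for v in x]   # non-linear coupling
print(round(mutual_information(x, y), 3))  # → 1.0
```

Here `y` takes the value 0 for `x` in {0, 3} and 1 for `x` in {1, 2}, so knowing `x` fully determines `y`: the estimate equals $H(Y) = 1$ bit, while a linear correlation measure could report a value near zero for such a relationship.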