$\renewcommand{\vec}[1]{\mathbf{#1}}$ $\newcommand{\tens}[1]{\mathrm{#1}}$ $\newcommand{\R}{\mathbb{R}}$ $\newcommand{\suml}{\sum\limits}$ $\newcommand{\deriv}[1]{\frac{\mathrm{d}}{\mathrm{d}#1}\,}$ $\newcommand{\dd}[1]{\mathrm{d}#1}$

MathJou

Sometimes, when I leave my mind alone with itself, ideas pop up that are both fascinating and completely unsorted. To give them a chance to be explored further, and to treat them with the respect they deserve, I collect some of them here.

  • From self-information to thermodynamic entropy

    The goal of this post is to introduce Shannon entropy as an information-theoretic concept rooted in the self-information of events, and then to link it to the thermodynamic concept of entropy through entropy maximization.
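
    As a rough preview in my own notation (the post may define things differently): the self-information of an event $x$ with probability $p(x)$, and Shannon entropy as its expectation, are
    $$ I(x) = -\log p(x), \qquad H(X) = \mathbb{E}\left[I(X)\right] = -\suml_{x} p(x)\log p(x). $$
    Maximizing $H$ under a constraint on the mean energy yields the Boltzmann distribution, which is presumably where the link to thermodynamic entropy enters.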

  • Information partitioning into synergistic, unique, and redundant components

    I would like to explain how the partitioning of mutual information into synergistic, unique, and redundant components works. This is a short summary of Section 2 in "Temporal information partitioning: Characterizing synergy, uniqueness, and redundancy in interacting environmental variables".
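
    Roughly, and in notation of my own choosing (the paper's symbols may differ), the total information two sources $X_1, X_2$ provide about a target $X_{tar}$ splits as
    $$ I(X_{tar}; X_1, X_2) = U_1 + U_2 + R + S, $$
    where $U_i$ is the information provided uniquely by $X_i$, $R$ the redundant part provided by both, and $S$ the synergistic part available only from the pair jointly; each single-source information then decomposes as $I(X_{tar}; X_i) = U_i + R$.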

  • Equifinality or why $x\neq 5$

    ** THIS ARTICLE IS A STUB **