The first method is based on the Gaussian Mutual Information criterion, which exploits the non-stationarity and the correlation of the sources, and is implemented via a time-frequency analysis and a joint diagonalisation algorithm. Reference papers: "Exploiting Source Non Stationary and Coloration in Blind Source Separation" (D. T. Pham and J.-F. Cardoso); "Blind Separation of Instantaneous Mixtures of Non Stationary Sources"; "Blind Separation of Instantaneous Mixture of Sources via the Gaussian Mutual Information Criterion"; "Joint Approximate Diagonalization of Positive Definite Matrices".
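As a rough illustration of how non-stationarity makes separation possible, the minimal sketch below jointly diagonalises only two segment covariance matrices via a generalised eigendecomposition. The actual method diagonalises many time-frequency covariance matrices under the Gaussian Mutual Information criterion; the data, mixing matrix and variable names here are purely illustrative.

```python
# Minimal sketch (not the toolbox implementation): separate an instantaneous
# mixture of non-stationary sources by jointly diagonalising the covariance
# matrices of two time segments.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
T = 4000

# Two non-stationary sources: their variances swap between the two halves.
s = np.vstack([
    np.concatenate([1.0 * rng.standard_normal(T // 2), 3.0 * rng.standard_normal(T // 2)]),
    np.concatenate([3.0 * rng.standard_normal(T // 2), 1.0 * rng.standard_normal(T // 2)]),
])
A = np.array([[1.0, 0.6], [0.4, 1.0]])     # unknown mixing matrix
x = A @ s                                  # observed instantaneous mixture

# Covariance matrix of each segment.
C1 = np.cov(x[:, : T // 2])
C2 = np.cov(x[:, T // 2 :])

# The generalised eigenvectors of (C1, C2) jointly diagonalise both matrices,
# so their transpose is an estimate of the unmixing matrix.
_, V = eigh(C1, C2)
W = V.T
y = W @ x

# W @ A should be close to a scaled permutation matrix if separation worked.
print(np.round(W @ A, 3))
```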
The second method is based on the minimisation of the Marginal Mutual Information criterion and exploits the non-Gaussianity of the sources. The criterion is expressed in terms of entropies, which are estimated through a kernel method. Reference papers: "Fast Algorithms for Mutual Information Based Independent Component Analysis"; "Blind Separation of Instantaneous Mixture of Sources via an Independent Component Analysis"; "Fast Algorithm for Estimating Mutual Information, Entropies and Score Functions".
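The sketch below shows what this contrast looks like, assuming scipy's gaussian_kde as the kernel density estimator (the toolbox uses its own fast kernel estimator and score functions). The criterion is I(W) = sum_i H(y_i) - log|det W| + const, with y = W x, and each marginal entropy estimated by resubstitution on a kernel density estimate.

```python
# Minimal sketch of the marginal mutual information contrast with KDE-based
# entropy estimates; all names and the toy data are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

def kde_entropy(y):
    """Resubstitution estimate of the differential entropy of a 1-D sample."""
    kde = gaussian_kde(y)
    return -np.mean(kde.logpdf(y))

def marginal_mi_criterion(W, x):
    """sum_i H(y_i) - log|det W| for y = W x (up to an additive constant)."""
    y = W @ x
    return sum(kde_entropy(yi) for yi in y) - np.log(abs(np.linalg.det(W)))

# Toy check on a 2x2 mixture of non-Gaussian (uniform) sources: the criterion
# should be smaller at the true unmixing matrix than at the identity.
rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=(2, 2000))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
x = A @ s
print(marginal_mi_criterion(np.linalg.inv(A), x),
      marginal_mi_criterion(np.eye(2), x))
```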
The third method is also based on the minimisation of the Marginal Mutual Information criterion, but additionally exploits the non-stationarity of the sources. Reference paper: "Blind Separation of Non Stationary Non Gaussian Sources".
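One simple way to combine the two ideas, sketched below, is to evaluate the kernel-entropy contrast block by block so that each block uses its own marginal densities; the block count and the plain averaging are illustrative choices, not the reference paper's exact scheme.

```python
# Minimal sketch: block-wise marginal mutual information contrast for
# non-stationary, non-Gaussian sources (KDE entropies as in the previous sketch).
import numpy as np
from scipy.stats import gaussian_kde

def blockwise_mi_criterion(W, x, n_blocks=4):
    """Average over time blocks of sum_i H(y_i) - log|det W| (up to constants)."""
    y = W @ x
    log_det = np.log(abs(np.linalg.det(W)))
    total = 0.0
    for block in np.array_split(y, n_blocks, axis=1):
        entropies = [-np.mean(gaussian_kde(bi).logpdf(bi)) for bi in block]
        total += sum(entropies) - log_det
    return total / n_blocks

# Toy usage: two uniform sources whose scales swap half-way through.
rng = np.random.default_rng(0)
T = 2000
s = np.vstack([
    np.concatenate([rng.uniform(-1, 1, T // 2), rng.uniform(-3, 3, T // 2)]),
    np.concatenate([rng.uniform(-3, 3, T // 2), rng.uniform(-1, 1, T // 2)]),
])
A = np.array([[1.0, 0.5], [0.3, 1.0]])
x = A @ s
print(blockwise_mi_criterion(np.linalg.inv(A), x),
      blockwise_mi_criterion(np.eye(2), x))
```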
The postnonlinear algorithm is based on the minimisation of the Marginal Mutual Information criterion. Reference paper: "Blind source separation in postnonlinear mixtures".
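In a postnonlinear mixture, x_i = f_i((A s)_i); separation applies componentwise compensating nonlinearities g_i followed by a linear unmixing B, and the criterion gains a term for the Jacobian of g. The sketch below parametrises g_i(u) = sign(u)|u|^p_i, which is an illustrative choice and not the parametrisation of the reference paper.

```python
# Minimal sketch of the marginal mutual information contrast for a
# postnonlinear separating structure y = B g(x), with KDE entropies.
import numpy as np
from scipy.stats import gaussian_kde

def postnonlinear_mi_criterion(B, powers, x):
    """sum_i H(y_i) - log|det B| - sum_i E[log g_i'(x_i)], with y = B g(x)."""
    g = np.sign(x) * np.abs(x) ** powers[:, None]       # compensating nonlinearities
    log_gprime = np.log(powers)[:, None] + (powers[:, None] - 1) * np.log(np.abs(x))
    y = B @ g
    entropies = [-np.mean(gaussian_kde(yi).logpdf(yi)) for yi in y]
    return (sum(entropies) - np.log(abs(np.linalg.det(B)))
            - np.mean(log_gprime.sum(axis=0)))

# Toy postnonlinear mixture: cube-root distortions after a 2x2 linear mixing.
rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=(2, 2000))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
x = np.sign(A @ s) * np.abs(A @ s) ** (1 / 3)           # f_i(u) = sign(u)|u|**(1/3)

# Correct compensation (g_i(u) = u**3 undoes f_i, B = inv(A) unmixes) should
# give a smaller criterion value than the uncompensated identity structure.
print(postnonlinear_mi_criterion(np.linalg.inv(A), np.array([3.0, 3.0]), x),
      postnonlinear_mi_criterion(np.eye(2), np.array([1.0, 1.0]), x))
```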