In this work, we first describe the multivariate linear methods most commonly used in neurophysiology and show that they can be extended to assess the existence of nonlinear interdependences between signals. We then review the concepts of entropy and mutual information, followed by a detailed description of nonlinear methods based on the concepts of phase synchronization, generalized synchronization and event synchronization. In all cases, we show how to apply these methods to study different kinds of neurophysiological data.
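As a minimal sketch of the distinction drawn above, the following Python example (an illustration assumed here, not taken from the text) compares a linear measure, the correlation coefficient, with a histogram-based estimate of mutual information on two signals that are coupled nonlinearly. The signal pair and all parameter choices (bin count, noise level) are hypothetical:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of the mutual information I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()            # joint probability estimate
    px = pxy.sum(axis=1)             # marginal distribution of x
    py = pxy.sum(axis=0)             # marginal distribution of y
    nz = pxy > 0                     # skip empty bins to avoid log(0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz]))

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
x = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(t.size)
# y depends on x through a quadratic (nonlinear) coupling
y = np.sin(2 * np.pi * t) ** 2 + 0.3 * rng.standard_normal(t.size)

r = np.corrcoef(x, y)[0, 1]          # linear correlation stays near zero
mi = mutual_information(x, y)        # mutual information detects the coupling
print(f"correlation = {r:.3f}, mutual information = {mi:.3f} bits")
```

Because the quadratic coupling is symmetric, the linear correlation averages out to nearly zero while the mutual information remains clearly positive, which is the motivation for extending linear methods to nonlinear interdependence measures.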