# Resources

We share code accompanying most of our publications, most of it in Matlab or Python. This listing also includes packages for which a substantial portion of the code was developed by collaborators of the group. Details and references can be found in the repositories. Please refer to the publications page for further resources and code, or to our GitHub page github.com/mackelab.

## Code

## Modelling multivariate binary data with correlations

Website | Code

Multivariate, correlated binary data arise in a variety of applications. In neuroscience, spike trains recorded from populations of neurons are often analyzed after discretizing the data into binary multivariate ‘words’, which often exhibit correlations across neurons. The statistical structure of these ‘spike words’ has attracted substantial interest: What models are well suited for capturing these correlations? What insights can we gain from these statistics about putative underlying mechanisms? What are the functional consequences of these correlations? We have tackled these questions both by developing statistical models for discretized neural population spike trains and by theoretically analyzing the properties of these models. The toolbox CorBinian includes a number of statistical methods for multivariate binary data, including:

- Dichotomized Gaussian models for multivariate binary and count data, as described in Macke et al 2009. The Dichotomized Gaussian provides a very efficient method for sampling multivariate binary data with specified second-order correlations, and with higher-order correlations which are realistic for neural population data.
- Second-order maximum entropy models for multivariate binary data, also known as Ising models in statistical physics (Schwartz et al 2013), and methods for calculating the entropy bias in these models as described in Macke et al 2013 and Macke et al 2012.
- Extended second-order maximum entropy models with population-activity count features, also known as K-pairwise models (Tkacik et al 2014).
- Dichotomized Gaussian and maximum entropy models for homogeneous populations, in which many theoretical quantities of interest (including entropy and specific heat) can be easily calculated, as described in Macke et al 2011.

CorBinian is the successor to our pop_spike repository, which was developed in collaboration with Philipp Berens, Alexander Ecker, Andreas Tolias, Matthias Bethge, Manfred Opper, Iain Murray and Peter Latham. The MCMC methods for fitting maximum entropy models to large neural populations are based on code developed by Tamara Broderick and Michael Berry.
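As a rough illustration of the Dichotomized Gaussian idea (a conceptual sketch only, not the CorBinian implementation, which is in Matlab): correlated binary samples are obtained by thresholding a latent multivariate Gaussian, where the per-neuron thresholds set the marginal firing probabilities and the latent covariance sets the correlations.

```python
import numpy as np
from math import erf, sqrt

def gauss_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def sample_dg(gamma, lam, n_samples, rng):
    """Dichotomized Gaussian sampling sketch: draw a latent Gaussian
    with covariance `lam` and threshold it at `gamma` per dimension.
    The marginal of unit i is P(x_i = 1) = 1 - Phi(gamma_i)."""
    d = len(gamma)
    z = rng.multivariate_normal(np.zeros(d), lam, size=n_samples)
    return (z > gamma).astype(int)

rng = np.random.default_rng(0)
gamma = np.array([0.0, 0.5])                    # thresholds for 2 units
lam = np.array([[1.0, 0.4], [0.4, 1.0]])        # latent correlation 0.4
x = sample_dg(gamma, lam, 200_000, rng)
# marginal means approach 1 - Phi(gamma); binary correlation is positive
```

The positive latent correlation induces a (smaller) positive correlation between the binary variables, which is the mapping the toolbox inverts when matching specified binary correlations.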

## Density estimation likelihood-free inference

GitHub | Docs | Paper

delfi is a Python package for density estimation likelihood-free inference. Different inference algorithms are implemented:

- A basic version of a likelihood-free inference algorithm that uses a mixture-density network to approximate the posterior density
- The algorithm proposed in the paper Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation (Papamakarios & Murray, 2016)
- Sequential Neural Posterior Estimation, as proposed in the paper Flexible statistical inference for mechanistic models of neural dynamics (Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke, 2017)
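For readers new to the setting: likelihood-free inference targets simulators whose likelihood cannot be evaluated, only sampled from. The classical baseline that the neural density estimators in delfi improve upon is rejection ABC, sketched here on a toy Poisson simulator (illustrative only; this is not delfi's algorithm or API):

```python
import numpy as np

def simulator(theta, rng, n=50):
    """Toy simulator: n spike counts drawn at rate theta.
    We can sample from it but pretend we cannot evaluate its likelihood."""
    return rng.poisson(theta, size=n)

def summary(x):
    """Summary statistic of a simulated dataset."""
    return x.mean()

def rejection_abc(x_obs, sample_prior, n_draws, eps, rng):
    """Rejection ABC: keep parameter draws whose simulated summary
    statistic lands within eps of the observed one."""
    s_obs = summary(x_obs)
    accepted = []
    for _ in range(n_draws):
        theta = sample_prior(rng)
        if abs(summary(simulator(theta, rng)) - s_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)

rng = np.random.default_rng(1)
x_obs = simulator(5.0, rng)                      # 'observed' data, true rate 5
posterior = rejection_abc(x_obs, lambda r: r.uniform(0.0, 10.0),
                          20_000, 0.2, rng)
# accepted draws concentrate near the true rate
```

Rejection ABC wastes most simulations; the density-estimation approaches in delfi instead train a conditional density model on (parameter, summary) pairs so every simulation contributes to the posterior approximation.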

## Linear dynamical system models with Poisson observations for modelling neural population spike trains

Code | Code (SSID)

Neural population spike trains can exhibit substantial correlations both across neurons and across time. For data obtained from cortical multi-electrode recordings, these correlations are unlikely to arise solely from direct synaptic interactions, but are more likely to arise from cortical dynamics or shared modulatory influences. A simple model for capturing the structure of correlations in cortical recordings is to assume that they arise from a common coupling to an underlying dynamical system. A simple model for such systems is to assume that the underlying dynamics are low-dimensional and linear. As neural spike trains consist of multivariate point processes (or, in discrete time, count processes), such low-dimensional dynamics need to be combined with a suitable observation model, e.g. a Poisson model. We call the resulting model a Poisson-observation Linear Dynamical System (PLDS). We developed a number of different estimation procedures for PLDS models, including both Laplace and variational approximations for state inference, expectation maximization (EM) for parameter learning, and nonlinear subspace identification for fast parameter estimation (or as initialization for EM). The methods are described in Macke et al 2011 and Buesing et al 2012, reviewed in Macke et al 2015, and were developed in collaboration with Lars Buesing, Maneesh Sahani, John Cunningham, Byron Yu and Krishna Shenoy. Much of this code was written by Lars Buesing (Gatsby Unit & Columbia University, bitbucket). He also provided implementations of nuclear-norm minimisation (as developed in Pfau et al 2013, github) and exponential family PCA. Yuan Gao contributed code for fitting generalized count linear models, as described in Gao, Buesing, Shenoy & Cunningham, NIPS 2015. Code for subspace identification can be found in a separate repository.
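The generative side of the PLDS model is compact. A minimal numpy sketch (the actual toolbox is Matlab; dynamics matrix, loadings and baselines below are made-up illustration values): a low-dimensional latent state evolves linearly with noise, and each neuron emits Poisson counts through an exponential link.

```python
import numpy as np

def sample_plds(A, C, d, x0, T, rng, noise_sd=0.1):
    """Sample from a Poisson-observation linear dynamical system:
    latent x_t = A x_{t-1} + innovation noise,
    counts y_t ~ Poisson(exp(C x_t + d))."""
    n_lat = A.shape[0]
    x = x0.copy()
    counts = []
    for _ in range(T):
        x = A @ x + noise_sd * rng.standard_normal(n_lat)
        counts.append(rng.poisson(np.exp(C @ x + d)))
    return np.array(counts)

rng = np.random.default_rng(0)
A = np.array([[0.95]])               # slow 1-d latent dynamics
C = np.array([[0.5], [1.0]])         # loadings: 2 neurons share the latent
d = np.log(np.array([2.0, 3.0]))     # baseline log-rates
y = sample_plds(A, C, d, np.zeros(1), 2000, rng)
# shared coupling to the latent induces positive noise correlations
```

Because both neurons load on the same latent state, the sampled spike counts are positively correlated even though the Poisson draws are conditionally independent; this is exactly the shared-dynamics explanation of correlations described above.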

## Psignifit 4: Bayesian psychometric function fitting without a pain in the neck

Code

Psychometric functions play a central role in psychology and behavioural neuroscience, and are used extensively to model detection or discrimination behaviour as a function of an independent variable, e.g. the contrast of a visual stimulus. After data collection, researchers frequently fit a psychometric function to their data, relating the independent variable on the abscissa to the observer’s behaviour on the ordinate. Previous Bayesian inference methods for psychometric function estimation were based on MCMC methods, which can be fiddly in practice and require manual interaction and expert knowledge. Together with Heiko Schuett, Stefan Harmeling and Felix Wichmann, we overcame these limitations using numerical integration methods. We also included a beta-binomial observation model, which is important for modelling non-stationary observers. The repository implementing Heiko’s method (Psignifit 4) is available at the GitHub page of the Wichmann lab.
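The core idea of replacing MCMC by numerical integration can be sketched in a few lines (a toy illustration, not the Psignifit 4 implementation: a logistic sigmoid stands in for the sigmoids psignifit offers, with a flat prior and a plain binomial likelihood rather than the beta-binomial model):

```python
import numpy as np

def psychometric(x, m, w, guess=0.5, lapse=0.02):
    """Logistic psychometric function with guess and lapse rates
    (2AFC-style: performance runs from `guess` up to 1 - lapse)."""
    f = 1.0 / (1.0 + np.exp(-(x - m) / w))
    return guess + (1.0 - guess - lapse) * f

def grid_posterior(x, k, n, m_grid, w_grid):
    """Posterior over (threshold m, width w) by direct numerical
    integration on a parameter grid, with a flat prior; no MCMC."""
    logp = np.zeros((len(m_grid), len(w_grid)))
    for i, m in enumerate(m_grid):
        for j, w in enumerate(w_grid):
            p = psychometric(x, m, w)
            logp[i, j] = np.sum(k * np.log(p) + (n - k) * np.log(1.0 - p))
    post = np.exp(logp - logp.max())
    return post / post.sum()

# simulated 2AFC data: 40 trials at each of 7 stimulus levels
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 7)
k = rng.binomial(40, psychometric(x, m=0.5, w=1.0))
m_grid = np.linspace(-2.0, 2.0, 81)
w_grid = np.linspace(0.2, 3.0, 60)
post = grid_posterior(x, k, 40, m_grid, w_grid)
m_hat = (post.sum(axis=1) * m_grid).sum()   # posterior mean threshold
```

Because the parameter space is low-dimensional, exhaustive integration on a grid is cheap, deterministic, and needs no convergence diagnostics, which is what makes this approach less fiddly than MCMC.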

## Gaussian Process methods for modelling cortical maps

Code and data | paper | paper 2

The primary visual cortex of primates and carnivores exhibits a remarkable organization: neurons are organized into topographic maps according to their tuning properties. Most strikingly, neurons are arranged according to their preferred orientation into ‘orientation preference maps’ (OPMs), which include both ‘iso-orientation domains’ and ‘pinwheels’, i.e. singularities around which multiple orientation columns are arranged radially. Estimation of OPMs from neural imaging measurements is a challenging statistical question. We developed Gaussian process methods for modelling orientation preference maps which make it possible to encode prior knowledge about the statistical structure of OPMs as well as about the correlation structure of the noise, with the goal of estimating maps more efficiently and obtaining estimates of uncertainty about the estimated map. The method was developed with Matthias Bethge, Matthias Kaschube and Leonard White, and is described in Macke et al 2011 and Macke et al 2010.
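The basic mechanism, stripped of everything map-specific, is Gaussian process regression: a smoothness prior (here a squared-exponential kernel on a 1-d 'map', chosen purely for illustration; the papers use priors tailored to OPM structure) combines with noisy measurements to give a denoised posterior estimate.

```python
import numpy as np

def sq_exp_kernel(a, b, ell=0.1, var=1.0):
    """Squared-exponential kernel: encodes the prior belief that the
    underlying map varies smoothly in space."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior_mean(x_tr, y_tr, x_te, noise=0.3):
    """GP regression posterior mean: denoises the measurements by
    combining them with the smoothness prior."""
    K = sq_exp_kernel(x_tr, x_tr) + noise ** 2 * np.eye(len(x_tr))
    return sq_exp_kernel(x_te, x_tr) @ np.linalg.solve(K, y_tr)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
truth = np.sin(4 * np.pi * x)              # stand-in for a 1-d 'map'
y = truth + 0.3 * rng.standard_normal(50)  # noisy imaging measurement
estimate = gp_posterior_mean(x, y, x)
```

The same machinery also yields a posterior covariance, which is what provides the uncertainty estimates about the inferred map mentioned above.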

## Gaussian process factor analysis with Poisson observations

Code

Advances in multi-cell recording techniques allow neuroscientists to record neural activity from large populations of neurons. A popular approach to analysing neural population activity is to identify a smooth, low-dimensional summary of the population activity. Gaussian process factor analysis (GPFA) is a powerful technique for neural dimensionality reduction. However, it is based on a simplifying assumption that is not appropriate for neural spike trains: spike counts conditioned on the intensity are Gaussian distributed, which fails to capture the discrete nature of neural spike counts or the mean-variance relationships typically observed in neural spike trains. Here we propose to extend GPFA using a more realistic assumption for spike generation: spike counts conditioned on the intensity are Poisson distributed, and the intensity is a non-linear function of the latent state. In his master’s thesis, Hooram developed and implemented methods for fitting Gaussian process factor analysis models with Poisson observations.
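The generative assumption can be sketched directly (an illustrative sample from the model class, not the fitting code in the repository): a smooth latent trajectory is drawn from a GP prior and mapped through an exponential nonlinearity to Poisson intensities.

```python
import numpy as np

def sample_poisson_gpfa(times, C, d, ell, rng):
    """Draw a smooth latent trajectory from a GP prior (squared-exponential
    kernel, length-scale ell) and emit Poisson counts through an
    exponential link: y_t ~ Poisson(exp(C * x_t + d))."""
    dt = times[:, None] - times[None, :]
    K = np.exp(-0.5 * (dt / ell) ** 2) + 1e-6 * np.eye(len(times))
    x = np.linalg.cholesky(K) @ rng.standard_normal(len(times))
    rates = np.exp(np.outer(x, C) + d)      # (T, n_neurons)
    return rng.poisson(rates), x

rng = np.random.default_rng(0)
times = np.linspace(0.0, 1.0, 100)
counts, latent = sample_poisson_gpfa(times, C=np.array([0.8, 1.2]),
                                     d=np.log(3.0), ell=0.2, rng=rng)
```

Unlike the Gaussian observation model of standard GPFA, the counts here are non-negative integers and their conditional variance equals their conditional mean, matching the mean-variance relationship typical of spike counts.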

## Modelling inter-trial dependence in psychophysics

Code

Psychophysical experiments are used extensively in psychology and behavioural neuroscience. While it is often assumed that different trials of an experiment are statistically independent, it is well known that there are many violations of this assumption, and both human observers and animals often exhibit so-called ‘sequential dependencies’ across multiple trials. Together with Ingo Fruend and Felix Wichmann, we developed statistical methods for modelling psychophysical data with sequential dependencies. The method can be used to estimate psychometric functions in the presence of serial dependence, and also to gain insight into the structure of the dependencies (e.g. how do errors on previous trials influence the decision on the next trial?). Methods are described in detail in Fruend et al 2014.
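One simple way to expose such dependencies (a toy sketch of the general idea, not the Fruend et al model) is to add the previous response as a regressor alongside the current stimulus and fit a logistic model; a reliably non-zero history weight indicates serial dependence.

```python
import numpy as np

def history_design(stim, resp, n_back=1):
    """Design matrix: intercept, current stimulus, and the observer's
    responses on the previous n_back trials."""
    T = len(stim)
    cols = [np.ones(T), stim]
    for lag in range(1, n_back + 1):
        prev = np.zeros(T)
        prev[lag:] = resp[:-lag]
        cols.append(prev)
    return np.column_stack(cols)

def fit_logistic(X, y, n_iter=500, lr=0.2):
    """Plain gradient-ascent logistic regression (illustrative only)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# simulate an observer whose choice depends on the stimulus AND on
# their previous response (history weight 0.8)
rng = np.random.default_rng(0)
T = 5000
stim = rng.standard_normal(T)
resp = np.zeros(T)
for t in range(T):
    logit = 1.5 * stim[t] + (0.8 if t > 0 and resp[t - 1] else 0.0)
    resp[t] = rng.random() < 1.0 / (1.0 + np.exp(-logit))

w = fit_logistic(history_design(stim, resp), resp)
# w[1] recovers the stimulus weight, w[2] the history weight
```

Ignoring the history term in such data biases the estimated psychometric function, which is the failure mode the method above is designed to avoid.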

## Poisson dynamical system models for nonstationary population spike trains

Code | Data (Tolias lab, BCM) | Paper

Neural population activity often exhibits rich variability. This variability can arise from single-neuron stochasticity, neural dynamics on short time-scales, as well as from modulations of neural firing properties on long time-scales, often referred to as neural non-stationarity. To better understand the nature of co-variability in neural circuits and its impact on cortical information processing, we introduced a hierarchical dynamics model that captures both slow inter-trial modulations in firing rates and neural population dynamics, and derived a Bayesian Laplace propagation algorithm for joint inference of parameters and population states. This repository includes a Matlab implementation of our NIPS paper. The method was developed and implemented by Mijung Park and Gergo Bohner (Gatsby Unit, UCL).

## Variational inference on tree-structured hidden Markov models

Code | Paper

Methods for fitting tree-structured hidden Markov models with binary observations to identify hierarchical states.

## Courses

## Machine Learning I, GS Neural Information Processing (Tuebingen, 2012)

repository

Slides, lecture notes and exercises are available for the course ‘Machine Learning I’ which Jakob taught together with Matthias Bethge at the Graduate School for Neural Information Processing Systems in WS 2012.