New paper about estimation bias published

The number of neurons that can be recorded simultaneously keeps growing every year, and with it the complexity of the models needed to describe these data. More complex models have more parameters to estimate from data, which in turn increases the recording time needed to constrain those parameters. If, say, the number of neurons doubles, by how much do we need to increase the recording time? Together with Peter Latham and Iain Murray, we addressed this question in one particular case: for second-order maximum entropy models, a popular class of models for describing neural population data, we calculated the size of the systematic error (‘bias’) in the entropy as a function of recording time: Link to paper
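The bias studied in the paper is an instance of a general phenomenon: entropy estimated from a finite recording is systematically below the true entropy, and the gap shrinks as more data come in. The following toy sketch (not the paper's calculation; the distribution, state count, and sample sizes are arbitrary choices for illustration) shows the plug-in entropy estimate converging from below as the number of samples grows, alongside the classical Miller–Madow first-order bias prediction, -(K-1)/(2N).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" distribution over K discrete states
# (a stand-in for a neural population's pattern distribution).
K = 16
p = rng.dirichlet(np.ones(K))
true_entropy = -np.sum(p * np.log(p))  # entropy in nats

def plugin_entropy(samples, K):
    """Plug-in (maximum likelihood) entropy estimate in nats."""
    counts = np.bincount(samples, minlength=K)
    phat = counts / counts.sum()
    nz = phat > 0
    return -np.sum(phat[nz] * np.log(phat[nz]))

# Average the estimator over many repeats to expose the systematic bias,
# which plays the role of recording time here via the sample count N.
biases = []
for N in [50, 200, 800]:
    estimates = [plugin_entropy(rng.choice(K, size=N, p=p), K)
                 for _ in range(2000)]
    bias = np.mean(estimates) - true_entropy
    biases.append(bias)
    print(f"N={N:4d}  empirical bias={bias:+.4f}  "
          f"Miller-Madow prediction={-(K - 1) / (2 * N):+.4f}")
```

The empirical bias is negative (the plug-in estimator always underestimates entropy, by Jensen's inequality) and roughly halves when the number of samples doubles, which is the kind of scaling-with-recording-time question the paper makes precise for maximum entropy models.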