Computer Science

Efficient approximations for probabilistic synaptic connectivity in large-scale network models

Published at - Annual Meeting of the Society for Neuroscience

Authors: Margaux Calice, Julien Ballbé, Lyle J Graham

Several point-neuron network models combine anatomical and electrophysiological data with probabilistic or deterministic rules to construct realistic synaptic connections. For reviews of these models and general guidelines for network modeling, see (2,3). In anatomically data-driven models, connectivity between areas can be derived from their architecture and laminar composition (2). For local connectivity, neurons close to each other have a greater tendency to connect (3,7). Models strongly informed by these results can be built in which the connection probability for a pair of neurons depends on their intersomatic distance, following a Gaussian or exponential decay (3,6). While this approach is arguably the closest simulacrum of the experimental protocols, configuring large-scale models can become computationally onerous due to the pairwise sampling procedure. Derivatives of the distance-dependent probability rule have been proposed to improve this situation, including The Brain Scaffold Builder (BSB) (4), which limits the number of pairwise iterations according to “a maximum search radius derived from the cumulative probability function of the distance distribution”. However, other large models, e.g. Schmidt et al., 2018 (5), which produces 2.42 ⋅ 10¹⁰ synapses, and Billeh et al., 2020 (8), rely on supercomputers for practical computing times. Here we present preliminary results for an algorithm that promises to extend the capability of more modest hardware by further optimizing the distance-dependent framework, while maintaining a certain degree of realism in the network.
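
As a minimal sketch of the baseline procedure that motivates this work (not the algorithm presented here), the standard pairwise distance-dependent sampling can be written as below. The function name, the Gaussian peak probability p_max, and the space constant sigma are illustrative assumptions rather than values from the cited models.

```python
import numpy as np

def connect_pairwise(positions, p_max=0.1, sigma=100.0, rng=None):
    """Naive pairwise sampling of distance-dependent connectivity (sketch).

    positions : (N, 3) array of somatic coordinates (e.g., in micrometers).
    p_max     : assumed connection probability at zero intersomatic distance.
    sigma     : assumed space constant of the Gaussian decay.
    Returns a list of (pre, post) index pairs.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(positions)
    connections = []
    for i in range(n):              # O(N^2) loop over all ordered pairs
        for j in range(n):
            if i == j:
                continue
            d = np.linalg.norm(positions[i] - positions[j])
            p = p_max * np.exp(-(d / sigma) ** 2)   # Gaussian decay with distance
            if rng.random() < p:
                connections.append((i, j))
    return connections

# Example: 1000 neurons scattered uniformly in a 300 um cube
positions = np.random.default_rng(0).uniform(0.0, 300.0, size=(1000, 3))
synapses = connect_pairwise(positions)
```

The quadratic number of candidate pairs in this loop is the cost that grows prohibitively with network size; approaches such as the BSB search-radius cutoff, and the algorithm outlined here, aim to reduce this pairwise sampling burden.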