Biomimetics
18 Dec 2024

The Connectome of a Fly as a Computational Reservoir

Reservoir computing is a neural network paradigm that, unlike most modern ML models, requires minimal training: the recurrent weights inside the reservoir stay fixed, and only a linear readout layer is fitted to the task. A reservoir is a pool of neurons that acts as a non-linear expansion and a temporal buffer for data that can be represented as a time series.

Example schematics of reservoir computing
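To make this concrete, below is a minimal sketch of an echo-state update and readout fit, assuming a leaky-tanh reservoir and a ridge-regression readout; the function and parameter names are illustrative and not taken from the work itself.

```python
import numpy as np

def esn_step(x, u, W, W_in, leak=1.0):
    # One reservoir update: the state is a non-linear, fading-memory
    # expansion of the input history; W and W_in stay fixed (untrained).
    return (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)

def train_readout(states, targets, ridge=1e-6):
    # Ridge regression on the collected states -- the only trained part of the model.
    S, Y = np.asarray(states), np.asarray(targets)
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ Y)
```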

Reservoir computing can be subdivided into liquid-state machines and echo-state networks, depending on the neuronal model used as the activation function (spiking versus continuous-valued units). Normal practice when creating a reservoir is to randomly initialize a desired number of neurons and synapses, according to a desired sparsity, spectral radius, and distribution of the non-zero weights [1]. Whilst the size of the reservoir, the spectral radius, and the sparsity have been shown to be the predominant factors influencing the capability of the reservoir, the distribution of the non-zero weights and the exact topology of the reservoir have limited impact and are often ignored [2]. In this work, we explore the possibility of using the topology and weight distribution of the recently documented and published connectome of the fly Drosophila to create a reservoir and implement an echo-state network [3].
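As a sketch of this standard practice, the snippet below randomly initializes a sparse reservoir and rescales it to a target spectral radius; the uniform weight distribution and the default values are illustrative assumptions, not the settings used in the study.

```python
import numpy as np

def random_reservoir(n_neurons, sparsity=0.1, spectral_radius=0.9, seed=0):
    # Standard random reservoir: sparse uniform weights rescaled to a target spectral radius.
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(n_neurons, n_neurons))
    W *= rng.random((n_neurons, n_neurons)) < sparsity  # keep ~`sparsity` fraction of weights
    rho = np.max(np.abs(np.linalg.eigvals(W)))           # current spectral radius
    return W * (spectral_radius / rho)
```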

Scatterplot of the connectome

Based on the information taken from the scanned connectome, we can create a connectivity matrix of the entire brain or of a sub-portion of the desired size. When selecting a desired number of neurons, we prioritize the most highly connected ones. We then investigate the performance of such architectures and compare it to state-of-the-art reservoirs to better characterize the potential of the fly connectome. The results show that the fly-connectome architecture is significantly more resilient to overfitting than a randomized topology, particularly in settings already prone to overfitting.
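One possible way to build such a connectome-based reservoir is sketched below. It assumes a hypothetical edge list with pre, post, and weight columns (not the actual data format of [3]), keeps the most highly connected neurons, and rescales the resulting sub-matrix to a target spectral radius.

```python
import numpy as np
import pandas as pd

def connectome_reservoir(edges_csv, n_neurons, spectral_radius=0.9):
    # Hypothetical edge list with columns: pre, post, weight (synaptic strength).
    edges = pd.read_csv(edges_csv)
    # Rank neurons by total (incoming + outgoing) synaptic weight and keep the most connected.
    totals = (edges.groupby("pre")["weight"].sum()
              .add(edges.groupby("post")["weight"].sum(), fill_value=0))
    keep = totals.nlargest(n_neurons).index
    idx = {nid: i for i, nid in enumerate(keep)}
    # Connectivity matrix restricted to the selected neurons.
    sub = edges[edges["pre"].isin(keep) & edges["post"].isin(keep)]
    W = np.zeros((len(keep), len(keep)))
    for pre, post, w in sub[["pre", "post", "weight"]].itertuples(index=False):
        W[idx[post], idx[pre]] += w
    rho = np.max(np.abs(np.linalg.eigvals(W)))
    return W * (spectral_radius / rho) if rho > 0 else W
```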

References

  1. Lukoševičius, M. A practical guide to applying echo state networks. In: Neural Networks: Tricks of the Trade, 2nd ed. Springer, Berlin, Heidelberg, 659–686 (2012). https://www.ai.rug.nl/minds/uploads/PracticalESN.pdf
  2. Viehweg, J., Worthmann, K. & Mäder, P. Parameterizing echo state networks for multi-step time series prediction. Neurocomputing 522, 214–228 (2023). https://www.sciencedirect.com/science/article/abs/pii/S0925231222014291
  3. Schlegel, P., Yin, Y., Bates, A.S. et al. Whole-brain annotation and multi-connectome cell typing of Drosophila. Nature 634, 139–152 (2024). https://doi.org/10.1038/s41586-024-07686-5