Nature journal Communications Physics publishes paper by Dane Taylor and Bengier Ulgen Kilic

Mapping neuron-to-neuron communication: Dane Taylor, PhD, assistant professor of mathematics, and UB PhD candidate Bengier Ulgen Kilic conduct research to better understand the biological and mathematical mechanisms that allow brains to manifest memories as neuron activity patterns that can be stored and reliably called upon later for cognitive processing. The resulting paper, "Simplicial cascades are orchestrated by the multidimensional geometry of neuronal complexes," was published in the Nature journal Communications Physics. Nature describes its mission as publishing "the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions."


UB researchers propose alternative method to map neuron activity


By BARBARA BRANNING

Published December 2, 2022

“We leveraged advanced mathematical modeling to identify a plausible mechanism that can orchestrate such cascades, and the next step will be to team up with neuroscientists that can experimentally test our hypothesized mechanism.”
Dane Taylor, assistant professor
Department of Mathematics

A paper by researchers in the Department of Mathematics that could impact the way neuron-to-neuron communications are mapped was recently published in the Nature journal Communications Physics.

The paper, titled “Simplicial cascades are orchestrated by the multidimensional geometry of neuronal complexes,” was authored by Dane Taylor, assistant professor of mathematics, and UB PhD candidate Bengier Ulgen Kilic. It discusses new methods for the mathematical modeling of neuronal networks. The paper appeared in the Nov. 8 issue of the journal.

The main goal of the research is to better understand the biological and mathematical mechanisms that allow brains to manifest memories as neuron activity patterns that can be stored and reliably called upon later for cognitive processing, Taylor explains.  

Kilic says the work will “lay the pavement for integrating rapidly growing in vivo experimental data with theoretical frameworks, particularly in artificial intelligence and computational neuroscience domains.”

In general, higher-order communication is a hard concept to grasp in the context of the brain network, Kilic says. “However, we know that nervous systems are balls of neurons, and every single neuron in our nervous system is bombarded with synaptic signals at any second.

“Although the manner in which neurons communicate remains a mystery, there is evidence that behavior that lies beneath human cognition and consciousness in part emerges from these higher-order interactions,” Kilic says.

In their paper, Taylor and Kilic study the speed and rate of dissemination of neuronal activity through the mathematical lens of geometry and topology.

The process of modeling neurons — those cells that transmit information throughout the brain and nervous system via electrical impulses — involves representing the neuron’s biophysical and geometrical characteristics with a mathematical structure, such as a graph.

Neuronal networks are believed to encode memory using nonlinear functions that translate a set of inputs — the activities of neurons that influence another neuron — into a neuron response, Taylor says. “Different input patterns give rise to different neuron responses, and a graph-based model is an oversimplification for the possible types of responses.” 

Taylor and Kilic propose expanding these graphs into simplicial complexes — or hypergraphs — that can map connections between vast networks of neurons.
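As a rough sketch of that distinction (the neuron labels and structures below are illustrative, not taken from the paper), a graph records only pairwise links, while a simplicial complex also records group-level interactions together with all of their sub-groups:

```python
from itertools import combinations

# Pairwise graph: only two-way links between neurons.
edges = {("A", "B"), ("B", "C"), ("A", "C")}

# Simplicial complex: the 2-simplex {A, B, C} encodes a genuine three-way
# interaction, not merely its three pairwise edges. Downward closure means
# every face (sub-group) of a simplex also belongs to the complex.
def closure(simplices):
    faces = set()
    for s in simplices:
        for k in range(1, len(s) + 1):
            faces.update(combinations(sorted(s), k))
    return faces

complex_ = closure([("A", "B", "C")])
assert ("A", "B", "C") in complex_          # the three-way interaction itself
assert ("A", "B") in complex_               # ...and each pairwise edge
assert ("A",) in complex_                   # ...and each individual neuron
```

A hypergraph generalizes this further by allowing group interactions without requiring downward closure.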

“Historically, the modeling and analysis of both biological neuronal networks and synthetic neuronal networks (for AI) have relied on building models that are based on the mathematical framework of two-dimensional graphs,” Taylor says.

“This is, in part, due to the added mathematical complexity that arises when trying to keep track of dyadic, triadic and higher-order interactions among groups of neurons.

“For example,” he says, “the activity of three neurons that jointly interact through a three-way relationship can be significantly more complicated than that for three neurons whose interactions are limited to two-way relationships between pairs of neurons.

“The firing of several nearby neurons can lead to an avalanching cascade of neuron firings that spreads across a neuronal network,” Taylor says. “Such neuronal avalanches have been widely observed and have been empirically connected to memory formation and cognitive function, such as human logic and intuition.”

Taylor uses the following analogy to illustrate the difference between traditional models and simplicial models.

“Consider if your friends and family and coworkers all give you different information and recommendations about whether or not to go see a newly released movie. A simplicial complex model, or equivalently, a hypergraph model, would allow you to make a decision by incorporating all the information provided by everyone in a complicated, unconstrained way,” he says. 

“In contrast, a graph-based model for your ultimate decision would involve you considering each person’s recommendation independently, making a separate initial decision for each recommendation. Given a set of initial decisions, you combine them to make your ultimate decision,” he says. “Thus, your final decision is made based on initial, independent sub-decisions, rather than making a fully informed decision that integrates all the available information together from everyone.”
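A minimal way to see the difference, using a made-up three-person movie example in Python (this is an illustration of the analogy, not the authors' model):

```python
# Hypothetical inputs: recommendations from three people (1 = "go", 0 = "skip").

# Graph-style decision: treat each recommendation independently, form a
# separate sub-decision per person, then combine them (here, majority vote).
def pairwise_decision(recs):
    sub_decisions = [1 if r == 1 else 0 for r in recs]
    return int(sum(sub_decisions) > len(sub_decisions) / 2)

# Hypergraph-style decision: a single joint function over the whole group,
# free to respond to the overall *pattern* of inputs. As a contrived example,
# this rule says "go" only when the group is split.
def joint_decision(recs):
    return int(0 < sum(recs) < len(recs))

pairwise_decision([1, 1, 1])  # unanimous recommendations: majority says go
joint_decision([1, 1, 1])     # the joint rule responds differently: skip
```

No scheme built from independent per-person sub-decisions can reproduce a rule like `joint_decision`, which is the kind of extra expressiveness a joint (higher-order) function provides.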

Taylor says it’s well known that neuron activity spreads as a cascade of spiking activity, which he likens to epidemic spreading and cascading blackouts in a power grid. But very little is known about the neuronal mechanisms that can guide the cascades of spiking neurons.
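A toy linear-threshold cascade on a small pairwise network shows how a few initial firings can avalanche (the network, threshold, and update rule here are illustrative assumptions, not the simplicial model studied in the paper):

```python
# Each neuron fires once at least `threshold` of its neighbors have fired.
neighbors = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3, 4], 3: [1, 2, 4], 4: [2, 3]}
threshold = 2

def cascade(seed):
    fired = set(seed)
    changed = True
    while changed:          # keep sweeping until no new neuron fires
        changed = False
        for node, nbrs in neighbors.items():
            if node not in fired and sum(n in fired for n in nbrs) >= threshold:
                fired.add(node)
                changed = True
    return fired

cascade({0, 1})  # firing spreads node by node across the whole network
cascade({4})     # a poorly placed seed fails to trigger any avalanche
```

In the simplicial setting the paper studies, the condition for firing can additionally depend on joint activity of whole groups of neurons, which is what lets the geometry of the complex orchestrate the cascade.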

“We leveraged advanced mathematical modeling to identify a plausible mechanism that can orchestrate such cascades, and the next step will be to team up with neuroscientists that can experimentally test our hypothesized mechanism,” he says.

For Taylor and Kilic, the work’s application to AI is a longer-term goal, although other students in Taylor’s group are already studying it. Taylor notes it will be up to two years before they redirect the current project from studying biological neurons to artificial neural networks.

The project was funded in part by the Simons Foundation, the National Science Foundation and UB’s Julian Park Publication Fund.

UB Mathematics Faculty Profile

  • Dane Taylor, PhD

    Assistant Professor

    Research Interests

    Network-Data Analytics — methodology development for community detection, ranking systems, network inference, and manifold learning; Dynamics on and of Networks — nonlinear and stochastic systems including social contagions, oscillator synchronization, and network evolution.

    Education

    PhD, Applied Mathematics, University of Colorado, Boulder

    Research Summary

    I lead a research group developing theory and methodology for network-based models of data, algorithms and computational infrastructure. My group focuses on data-driven models of complex systems, working along the following pursuits:

    • Developing mathematically and statistically rigorous data-science methodology;
    • Developing neurocomputation theory for the dynamics of biological neuronal networks;
    • Developing theoretical foundations for artificial neural networks, including ResNets, NeuralODEs and graph neural networks;
    • Developing novel models and theory for consensus dynamics to study collective learning and develop new methods for decentralized ML/AI;
    • Developing theory for structural/dynamical mechanisms that facilitate multiscale self-organization;
    • Collaborating with domain experts in the biological, physical, social, and medical sciences and engineering to address domain-driven problems.

    Selected Publications

    • BU Kilic and D Taylor (2022) Simplicial cascades are orchestrated by the multidimensional geometry of neuronal complexes. Communications Physics 5, 278.
    • NB Erichson, D Taylor, Q Wu and MW Mahoney (2021) Noise-response analysis of deep neural networks quantifies robustness and fingerprints structural malware. In Proceedings of the SIAM International Conference on Data Mining, 100-108.
    • D Taylor, MA Porter and PJ Mucha (2021) Tunable eigenvector-based centralities for multiplex and temporal networks. Multiscale Modeling & Simulation 19(1), 113-147.
    • D Taylor (2020) Multiplex Markov chains: Convection cycles and optimality. Physical Review Research 2, 033164.
    • D Taylor, SA Myers, A Clauset, MA Porter and PJ Mucha (2017) Eigenvector-based centrality measures for temporal networks. Multiscale Modeling and Simulation 15(1), 537-574.
    • D Taylor, PS Skardal and J Sun (2016) Synchronization of heterogeneous oscillators under network modifications: Perturbation and optimization of the synchrony alignment function. SIAM Journal on Applied Mathematics 76(5), 1984-2008.
    • D Taylor, S Shai, N Stanley and PJ Mucha (2016) Enhanced detectability of community structure in multilayer networks through layer aggregation. Physical Review Letters 116, 228301.
    • D Taylor et al. (2015) Topological data analysis of contagion maps for examining spreading processes on networks. Nature Communications 6, 7723.
    • J Sun, D Taylor and EM Bollt (2015) Causal network inference by optimal causation entropy. SIAM Journal on Applied Dynamical Systems 14(1), 73-106.
    • PS Skardal, D Taylor and J Sun (2014) Optimal synchronization of complex networks. Physical Review Letters 113, 144101.