Neuro-inspired theory, modeling and applications
- Our theoretical research is concerned with understanding and analytically describing various aspects of neural network dynamics. On the modeling side, we are mainly interested in functional networks (i.e., networks that do something that we consider useful), for which we take inspiration from both biology and AI research. An essential aspect of model functionality concerns robustness, since one of our goals is the embedding of functional networks in neuromorphic substrates and their application to real-world problems.
Our Research Interests
- New article in eLife: how natural-gradient descent enables efficient synaptic plasticity!
- "Natural-gradient learning for spiking neurons".
- DEEP MINDS Podcast (in German)
- In episode 7 of the DEEP MINDS Podcast, Laura Kriener and Julian Göltz talked about "AI Hardware of the Future: What is Neuromorphic Computing?"
- New contribution to eLife: Learning across three different global brain states!
- Our group leader speaks his mind!
- New paper at NeurIPS: learning with slow neurons is not a problem anymore!
This awesome work was selected for an oral presentation at the 35th Conference on Neural Information Processing Systems (NeurIPS 2021)!
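To give a flavor of the natural-gradient idea mentioned above: instead of following the raw Euclidean gradient, natural-gradient descent preconditions the update with the inverse Fisher information metric, so learning speed becomes insensitive to how the loss is curved in different parameter directions. The sketch below is purely illustrative and not from the eLife paper; the toy quadratic loss, the diagonal Fisher approximation, and all constants are assumptions chosen so that the metric coincides with the Hessian.

```python
# Illustrative toy comparison: plain vs. natural-gradient descent on an
# anisotropic quadratic loss L(x, y) = 0.5 * (100*x^2 + y^2).
# For this toy model the Fisher metric is taken to equal the diagonal
# Hessian (an assumption that holds for simple Gaussian models).

CURV = (100.0, 1.0)  # diagonal curvature, used as the Fisher approximation

def loss(theta):
    return 0.5 * sum(c * t * t for c, t in zip(CURV, theta))

def grad(theta):
    return tuple(c * t for c, t in zip(CURV, theta))

def plain_step(theta, lr=0.015):
    # lr must stay below 2/100 for stability in the stiff direction,
    # which makes progress in the shallow direction very slow
    g = grad(theta)
    return tuple(t - lr * gi for t, gi in zip(theta, g))

def natural_step(theta, lr=0.5):
    # precondition the gradient with the inverse Fisher metric:
    # every direction now converges at the same rate
    g = grad(theta)
    return tuple(t - lr * gi / c for t, gi, c in zip(theta, g, CURV))

theta_p = theta_n = (1.0, 1.0)
for _ in range(50):
    theta_p = plain_step(theta_p)
    theta_n = natural_step(theta_n)

print(loss(theta_p), loss(theta_n))  # natural gradient ends far closer to 0
```

After 50 steps the plain-gradient run is still stuck in the shallow direction, while the natural-gradient run has converged in both directions; this curvature-invariance is the property the paper exploits for synaptic plasticity.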