Neuro-inspired Theory, Modeling and Applications
- Our theoretical research is concerned with understanding and analytically describing various aspects of neural network dynamics. On the modeling side, we are mainly interested in functional networks (i.e., networks that do something we consider useful), for which we take inspiration from both biology and AI research. An essential aspect of model functionality is robustness, since one of our goals is to embed functional networks in neuromorphic substrates and apply them to real-world problems.
Our Research Interests
Highlights
- New article in eLife: how natural-gradient descent enables efficient synaptic plasticity!
- DEEP MINDS Podcast (in German) - Info
- In episode 7 of the DEEP MINDS Podcast, Laura Kriener and Julian Göltz talked about "AI Hardware of the Future: What is Neuromorphic Computing?"
- New contribution to eLife: Learning across three different global brain states!
- "Learning cortical representations through perturbed and adversarial dreaming".
See Publications - Press Release - Very Well Health article
- Our group leader speaks his mind!
- New article at NeurIPS: learning with slow neurons is no longer a problem!
This work was selected for an oral presentation at the 35th Conference on Neural Information Processing Systems (NeurIPS 2021)!
- "Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons".
See Publications - Press Release
- More to explore: News and Publications