AIML Mid-Week Seminar: Can biological neurons further improve our understanding of artificial neural networks and vice versa?
In the VPNL, we record from visual neurons in the dragonfly brain. One set of these neurons responds to optic flow, encoding the dragonfly's own motion through the world. This underlies their remarkable ability to hover nearly stationary and then rapidly patrol at over 60 km/h. Other neurons respond to tiny targets amidst cluttered surrounds, even predicting a target's future location following an occlusion. These target-detecting neurons can implement a winner-takes-all network, allowing selection of a single target amidst distractors (e.g. when feasting on midges in a swarm).
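To give a flavour of the winner-takes-all idea mentioned above, here is a minimal, illustrative sketch (not the lab's actual model): rate units representing candidate targets excite themselves and inhibit one another until the response to the strongest target dominates. The function name, gain parameters, and input values are assumptions chosen for the example.

```python
import numpy as np

def winner_takes_all(drive, steps=200, dt=0.05, self_excite=0.5, inhibit=1.0):
    """Iterate a simple competitive rate network (illustrative sketch).

    drive       : feed-forward input per candidate target (assumed values)
    self_excite : recurrent self-excitation gain (hypothetical parameter)
    inhibit     : strength of mutual inhibition (hypothetical parameter)
    """
    r = np.zeros_like(drive, dtype=float)              # firing rates
    for _ in range(steps):
        lateral = inhibit * (r.sum() - r)              # inhibition from all other units
        net_input = drive + self_excite * r - lateral  # excitation minus competition
        dr = -r + np.maximum(net_input, 0.0)           # rectified rate dynamics
        r = r + dt * dr                                # Euler integration
    return r

# Three small targets in a "swarm"; the network settles on the strongest drive.
responses = winner_takes_all(np.array([0.9, 1.0, 0.8]))
print(responses, "-> selected target:", int(np.argmax(responses)))
```

In this toy setting, mutual inhibition suppresses the weaker units while the unit with the largest drive remains active, a simple analogue of selecting one prey target amidst distractors.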
From this neurobiology, we uncover fundamental principles of neuronal coding and apply these to the development of signal processing and neural network models. We observe interesting properties of "real" neurons and relate these to the architectures of artificial neural networks. We then translate these models into hardware, applying them to both ground and aerial autonomous systems.