Differentiable Neural Computers

Dec 15, 2016

Davis Auditorium, CEPSR
Speaker: Greg Wayne, Google DeepMind


I will describe a neural network architecture that interfaces to a large external memory, giving the network the capacity to store large amounts of information (e.g., a database of facts) and to perform goal-directed retrieval. The network combines the learning capabilities of neural networks with the ability of conventional computers to carry out algorithms over complex data structures. Time permitting, I will also present some research on computational motor control.
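As a rough illustration of how such a memory interface can remain trainable end to end, the sketch below shows content-based addressing with soft (differentiable) read weights, one of the mechanisms used in memory-augmented networks of this kind. This is a simplified sketch, not the full architecture from the talk: the slot count, key, and sharpness parameter here are illustrative, and mechanisms such as dynamic allocation and temporal linking are omitted.

```python
import numpy as np

def content_read(memory, key, beta):
    """Soft content-based read from an external memory matrix.

    memory: (N, W) array of N slots, each a W-dimensional word
    key:    (W,) query vector emitted by a controller network
    beta:   scalar sharpness of the attention distribution
    """
    # Cosine similarity between the key and every memory slot.
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    sims = memory @ key / norms
    # Softmax over slots yields differentiable read weights,
    # so gradients flow back through the retrieval step.
    weights = np.exp(beta * sims)
    weights /= weights.sum()
    # The read vector is a weighted sum of the memory slots.
    return weights @ memory

# Toy example: a 3-slot memory of 2-dimensional words.
memory = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
read_vec = content_read(memory, key=np.array([1.0, 0.0]), beta=10.0)
```

Because every step is differentiable, retrieval can be trained jointly with the rest of the network by gradient descent, rather than being hand-programmed.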

Speaker Bio

Greg Wayne received his B.S. in Symbolic Systems from Stanford University, his M.S. in Applied Mathematics from City University of New York, and his Ph.D. in Neuroscience from Columbia University, working in the theoretical neuroscience laboratory of Larry Abbott. Since 2014 he has been at Google DeepMind in London, pursuing research primarily on artificial neural network memory systems and motor control.



Hosted by Aurel A. Lazar.
