Decoupled Neural Interfaces using Synthetic Gradients

Authors: Max Jaderberg, Wojciech Marian Czarnecki, Simon Osindero, Oriol Vinyals, Alex Graves, David Silver, Koray Kavukcuoglu

When training neural networks, the modules (layers) are locked: they can only be updated after backpropagation. We remove this constraint by introducing a learned model of error gradients, synthetic gradients, which means we can update modules without a full backwards pass. We show how this can be applied to feed-forward networks, allowing every layer to be trained asynchronously; to recurrent networks, extending the time over which models can remember; and to multi-network systems, to enable communication between networks.
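
As a rough illustration of the idea, here is a minimal sketch in PyTorch. This is not the paper's code: the layer sizes, learning rates, and the stand-in `true_grad` are all assumptions. It shows the core mechanic only: a layer is updated immediately with a gradient predicted by a small synthetic-gradient model, and that model is later regressed towards the true backpropagated gradient when it becomes available.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of a single decoupled layer. Names (layer,
# sg_model, true_grad) are illustrative, not from the paper's code.

layer = nn.Linear(128, 128)      # module we want to update locally
sg_model = nn.Linear(128, 128)   # predicts dL/dh from the activation h

opt_layer = torch.optim.SGD(layer.parameters(), lr=0.01)
opt_sg = torch.optim.SGD(sg_model.parameters(), lr=0.01)

x = torch.randn(32, 128)         # a batch of inputs to this layer

# Forward through the layer, then update it immediately with a
# synthetic gradient, without waiting for the rest of the network.
h = layer(x)
synthetic_grad = sg_model(h.detach())   # predicted error gradient
opt_layer.zero_grad()
h.backward(synthetic_grad.detach())     # use prediction in place of dL/dh
opt_layer.step()

# Later, when the true gradient dL/dh arrives from downstream,
# train the synthetic-gradient model to match it (L2 regression).
true_grad = torch.randn(32, 128)        # stand-in for the real dL/dh
sg_loss = ((sg_model(h.detach()) - true_grad) ** 2).mean()
opt_sg.zero_grad()
sg_loss.backward()
opt_sg.step()
```

Because the layer's update depends only on its own activation and the synthetic-gradient model, it never has to wait for downstream modules, which is what makes asynchronous, decoupled training possible.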

For more information and related work, see the paper.

See it at ICML:

Monday, 7 August from 10:30 am to 10:48 am @ Darling Harbour Theatre (Talk)

Monday, 7 August from 6:30 pm to 10:00 pm @ Gallery #1 (Poster)

