Research Repository

Deep Learning in Target Space

Fairbank, Michael and Samothrakis, Spyridon and Citi, Luca (2020) Deep Learning in Target Space. Working Paper. arXiv. (Unpublished)

2006.01578v2.pdf - Submitted Version

Abstract

Deep learning uses neural networks which are parameterised by their weights. These networks are usually trained by tuning the weights directly to minimise a given loss function. In this paper we propose to re-parameterise the weights as targets for the firing strengths of the individual nodes in the network. Given a set of targets, it is possible to calculate the weights which make the firing strengths best meet those targets. We argue that training with targets addresses the problem of exploding gradients, through a process we call cascade untangling, and makes the loss-function surface smoother to traverse, leading to easier and faster training and potentially better generalisation of the neural network. It also allows for easier learning of deeper and recurrent network structures. The necessary conversion of targets to weights comes at an extra computational expense, which is in many cases manageable. Learning in target space can be combined with existing neural-network optimisers for extra gain. Experimental results show the speed of using target space and examples of improved generalisation for fully-connected and convolutional networks, as well as the ability to recall and process long time sequences and to perform natural-language processing with recurrent networks.
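The abstract's key step, converting a set of targets for node firing strengths back into weights, can be sketched as an ordinary least-squares solve. The snippet below is a minimal NumPy illustration of that idea only; the array names, shapes, and the choice of plain least squares are assumptions for illustration, not the paper's actual algorithm or API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a single layer receiving inputs A (the firing
# strengths of the previous layer) and given targets T for its own
# pre-activation firing strengths. Shapes are illustrative.
batch, n_in, n_out = 32, 8, 4
A = rng.standard_normal((batch, n_in))   # inputs to the layer
T = rng.standard_normal((batch, n_out))  # targets for the layer's firing strengths

# Least-squares conversion of targets to weights:
# W minimises ||A @ W - T||^2 over the batch.
W, *_ = np.linalg.lstsq(A, T, rcond=None)

achieved = A @ W                         # firing strengths implied by the solved weights
residual = np.linalg.norm(achieved - T)  # how far the targets are missed
print(W.shape, residual)
```

Because the batch here is larger than the layer's input dimension, the targets generally cannot be met exactly; the solve returns the weights that meet them as closely as possible in the least-squares sense, which is the "best meet those targets" notion in the abstract.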

Item Type: Monograph (Working Paper)
Uncontrolled Keywords: cs.NE
Divisions: Faculty of Science and Health > Computer Science and Electronic Engineering, School of
SWORD Depositor: Elements
Depositing User: Elements
Date Deposited: 23 May 2022 11:09
Last Modified: 23 May 2022 11:09
URI: http://repository.essex.ac.uk/id/eprint/29623
