Research Repository

Match memory recurrent networks

Samothrakis, S and Vodopivec, T and Fasli, M and Fairbank, M (2016) Match memory recurrent networks. In: 2016 International Joint Conference on Neural Networks (IJCNN), 2016-07-24 - 2016-07-29, Vancouver, BC, Canada.

wcci-networks(1).pdf - Accepted Version (382kB)

Abstract

Imbuing neural networks with memory and attention mechanisms allows for better generalisation with fewer data samples. By focusing only on the relevant parts of the data, which are encoded in an internal 'memory' format, the network is able to infer better and more reliable patterns. Most neural attention mechanisms are based on internal network structures that impose a similarity metric (e.g., dot-product), followed by some (soft-)max operator. In this paper, we propose a novel attention method based on a function between neuron activities, which we term a 'match function', augmented by a recursive softmax function. We evaluate the algorithm on the bAbI question answering dataset and show that it has stronger performance when only one memory hop is used, both in terms of average score and in terms of the number of solved questions. Furthermore, with three memory hops, our algorithm can solve 12/20 benchmark questions using 1000 training samples per task. This is an improvement on the previous state of the art of 9/20 solved questions, held by end-to-end memory networks.
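For context, the sketch below illustrates the standard attention pipeline the abstract contrasts against (dot-product similarity followed by a softmax and a weighted memory read), alongside a hypothetical variant where the similarity is computed by a generic match function between neuron activities. The paper's exact match function and recursive softmax are not given in this record, so match_fn and the function names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_product_attention(query, memories):
    # Standard attention: dot-product similarity, softmax over slots,
    # then an attention-weighted read of the memory matrix.
    scores = memories @ query
    weights = softmax(scores)
    return weights @ memories

def match_attention(query, memories, match_fn):
    # Hypothetical sketch: replace the fixed dot product with an arbitrary
    # function between the query and each memory slot's activities.
    scores = np.array([match_fn(query, m) for m in memories])
    weights = softmax(scores)
    return weights @ memories

# Toy usage with random activities and a negative-L1 "match" as a stand-in.
rng = np.random.default_rng(0)
q = rng.standard_normal(8)
M = rng.standard_normal((5, 8))  # 5 memory slots, 8-dimensional activities
read = match_attention(q, M, lambda a, b: -np.sum(np.abs(a - b)))
```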

Item Type: Conference or Workshop Item (Paper)
Additional Information: Published proceedings: Proceedings of the International Joint Conference on Neural Networks
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Science and Health > Computer Science and Electronic Engineering, School of
Depositing User: Elements
Date Deposited: 09 Nov 2018 15:48
Last Modified: 09 Nov 2018 16:15
URI: http://repository.essex.ac.uk/id/eprint/21302
