Research Repository

The N-Tuple bandit evolutionary algorithm for automatic game improvement

Kunanusont, K and Gaina, RD and Liu, J and Perez Liebana, D and Lucas, SM (2017) The N-Tuple bandit evolutionary algorithm for automatic game improvement. In: IEEE Congress on Evolutionary Computation (CEC), 2017, 2017-06-05 - 2017-06-08, San Sebastian, Spain.

1705.01080v1.pdf - Accepted Version



This paper describes a new evolutionary algorithm that is especially well suited to AI-Assisted Game Design. The approach adopted in this paper is to use observations of AI agents playing the game to estimate the game's quality. Some of the best agents for this purpose are General Video Game AI agents, since they can be deployed directly on a new game without game-specific tuning; these agents tend to be based on stochastic algorithms, which give robust but noisy results, and tend to be expensive to run. This motivates the main contribution of the paper: the development of the novel N-Tuple Bandit Evolutionary Algorithm, where a model is used to estimate the fitness of unsampled points and a bandit approach is used to balance exploration and exploitation of the search space. Initial results on optimising a Space Battle game variant suggest that the algorithm offers far more robust results than the Random Mutation Hill Climber and a Biased Mutation variant, which are themselves known to offer competitive performance across a range of problems. Subjective observations are also given by human players on the nature of the evolved games, which indicate a preference towards games generated by the N-Tuple algorithm.
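The core idea in the abstract — bandit statistics acting as a cheap model that estimates the fitness of unsampled points, with an exploration bonus guiding the search — can be illustrated with a minimal sketch. This is not the authors' implementation: it uses only 1-tuples (per-parameter statistics) rather than general N-tuples, and the toy fitness function, parameter names, and UCB constants are all illustrative assumptions.

```python
import math
import random

def ntbea_sketch(fitness, dims, n_values, iterations=400, k=2.0, eps=0.5, rng=None):
    """Illustrative 1-tuple sketch of the N-Tuple Bandit EA idea:
    per-(dimension, value) bandit statistics estimate the fitness of
    (possibly unsampled) points; a UCB term balances exploration
    and exploitation when choosing the next point to evaluate."""
    rng = rng or random.Random(0)
    # stats[d][v] = (visit count, total reward) for value v of parameter d
    stats = [[(0, 0.0) for _ in range(n_values)] for _ in range(dims)]
    total_visits = 0
    current = [rng.randrange(n_values) for _ in range(dims)]

    def ucb(point):
        # Model-based estimate of a point, plus an exploration bonus
        score = 0.0
        for d, v in enumerate(point):
            n, total = stats[d][v]
            mean = total / n if n else 0.0
            explore = k * math.sqrt(math.log(total_visits + 1) / (n + eps))
            score += mean + explore
        return score

    for _ in range(iterations):
        reward = fitness(current)            # one (noisy) game evaluation
        total_visits += 1
        for d, v in enumerate(current):      # update the 1-tuple statistics
            n, total = stats[d][v]
            stats[d][v] = (n + 1, total + reward)
        # Propose mutated neighbours; move to the one the bandit model rates best
        neighbours = []
        for _ in range(10):
            cand = list(current)
            cand[rng.randrange(dims)] = rng.randrange(n_values)
            neighbours.append(cand)
        current = max(neighbours, key=ucb)

    # Recover the recommended point greedily from the learned mean estimates
    best = []
    for d in range(dims):
        means = [(stats[d][v][1] / stats[d][v][0]) if stats[d][v][0] else float("-inf")
                 for v in range(n_values)]
        best.append(max(range(n_values), key=means.__getitem__))
    return best
```

On a toy noisy objective (e.g. counting how many parameters match a hidden target, plus Gaussian noise), the statistics accumulated across all evaluations let the algorithm rate unsampled points, which is what makes it sample-efficient when each evaluation means running expensive stochastic game-playing agents.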

Item Type: Conference or Workshop Item (Paper)
Additional Information: Notes: 8 pages, 9 figures, 2 tables, CEC2017
Uncontrolled Keywords: cs.AI
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Science and Health
Faculty of Science and Health > Computer Science and Electronic Engineering, School of
SWORD Depositor: Elements
Depositing User: Elements
Date Deposited: 08 Sep 2017 12:58
Last Modified: 15 Jan 2022 00:52
