Gavin Brown

School of Computer Science, University of Manchester



A Unifying Framework for Information Theoretic Feature Selection

Feature selection and statistical hypothesis testing are fundamental tools in many data-intensive scientific fields, from health informatics to astronomy. Over the past three decades, mutual information has been used in all these areas to identify good features from data, with heuristics taking many different forms. In this talk I will describe recent work that identified a unifying principle lying behind three decades of literature in this area. This has enabled us to move forward to new and interesting challenges, such as hypothesis testing in “positive-unlabelled” data, and identifying precisely how many labelled examples we need for a valid statistical test.
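As a minimal illustration of the general idea (a sketch, not the speaker's unifying framework), discrete features can be ranked by their empirical mutual information with a class label. The data and feature names below are hypothetical toy values:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in bits for two discrete sequences."""
    n = len(xs)
    px = Counter(xs)           # marginal counts of X
    py = Counter(ys)           # marginal counts of Y
    pxy = Counter(zip(xs, ys)) # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Hypothetical toy data: f1 duplicates the label, f2 is independent of it.
labels = [0, 0, 1, 1, 0, 0, 1, 1]
f1     = [0, 0, 1, 1, 0, 0, 1, 1]   # identical to labels -> I = H(Y) = 1 bit
f2     = [0, 1, 0, 1, 0, 1, 0, 1]   # independent of labels -> I = 0 bits

scores = {name: mutual_information(f, labels)
          for name, f in [("f1", f1), ("f2", f2)]}
ranked = sorted(scores, key=scores.get, reverse=True)  # best feature first
```

A real selector would iterate this scoring over many candidate features, and the heuristics the talk surveys differ mainly in how they correct for redundancy between already-selected features.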


Short Bio

Gavin Brown completed his PhD in Machine Learning at the University of Birmingham, UK, for which he was awarded the British Computer Society’s Distinguished Dissertation Prize in 2004. Following this he was awarded a five-year Fellowship at the University of Manchester, where he is now a Reader in Machine Learning. His work has been financially supported by the UK research councils, the EU FP7 framework, and the pharmaceutical industry. In 2013, work with a PhD student on information theoretic feature selection was again awarded the BCS Prize, making them the only second-generation winners of this national prize in its 30-year history. Gavin is a keen public communicator, engaging in several public events per year on issues around artificial intelligence and machine learning – including several appearances on the BBC children’s channel, explaining robots and AI.


Illustrations of the city of Albacete courtesy of Alicia Gosalbez
Copyright © 2020 Conferencia CAEPIA 2015