Inferring change points in signal levels through deterministic minimization of a generalized global functional

Little, M (Aston University/MIT)
Thursday 16 January 2014, 15:50-16:10

Seminar Room 1, Newton Institute


Abrupt level change points are ubiquitous. Knowing the change points and levels of a time series is critical to many practical signal analysis problems in science and engineering. For this reason, among others, the problem of detecting level shifts, first studied in the 1940s in process control, is of enduring interest. In this talk I will detail a set of simple, novel, generalized, deterministic nonlinear algorithms for this problem. These algorithms are based on a global functional which, when minimized, yields the maximum a posteriori locations of the change points and values of the levels. This global functional approach subsumes some well-known algorithms for this problem that were developed in digital image processing contexts, and also unifies several algorithms from statistical machine learning that have hitherto been seen as distinct. The algorithms are computationally simple, and many are convex optimization problems for which standard, fast implementations are available.
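The abstract does not state the functional explicitly, but one classic instance of such a global, deterministically minimizable objective is the penalized least-squares (Potts-type) functional: the sum of squared residuals within each constant segment plus a penalty per change point. For that particular objective, the exact global minimum can be found by dynamic programming ("optimal partitioning"). The sketch below is an illustration of that standard technique, not necessarily the specific algorithms of the talk; the function name and interface are invented for the example.

```python
import numpy as np

def fit_piecewise_constant(x, penalty):
    """Exactly minimize  sum_k sum_{i in seg k} (x_i - mu_k)^2 + penalty * (#change points)
    over all segmentations, via O(n^2) optimal-partitioning dynamic programming.
    Returns (change_points, levels): indices where a new segment starts, and the
    least-squares level (segment mean) of each segment."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Prefix sums give O(1) within-segment cost:
    # cost(i, j) = sum over x[i:j] of (x - mean)^2
    s1 = np.concatenate(([0.0], np.cumsum(x)))
    s2 = np.concatenate(([0.0], np.cumsum(x * x)))

    def seg_cost(i, j):  # residual sum of squares of x[i:j], j > i
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / (j - i)

    best = np.full(n + 1, np.inf)
    best[0] = -penalty       # cancels the penalty charged to the first segment
    prev = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):   # last segment is x[i:j]
            c = best[i] + penalty + seg_cost(i, j)
            if c < best[j]:
                best[j], prev[j] = c, i
    # Backtrack segment boundaries, then compute each segment's mean level.
    bounds, j = [], n
    while j > 0:
        bounds.append(j)
        j = prev[j]
    bounds = [0] + bounds[::-1]
    levels = [(s1[b] - s1[a]) / (b - a) for a, b in zip(bounds[:-1], bounds[1:])]
    return bounds[1:-1], levels
```

For example, on a noiseless two-level signal, `fit_piecewise_constant([0.0]*10 + [5.0]*10, penalty=1.0)` recovers the single change point at index 10 and the two levels 0 and 5. Because the objective is separable over segments, the dynamic program finds the exact global minimizer, consistent with the deterministic, globally optimal flavor described in the abstract; convex relaxations (e.g. total-variation penalties) trade the combinatorial penalty for an L1 term and admit even faster solvers.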
