Abstract: |
Uncertainty quantification can begin by specifying the initial state of a system as a probability measure. Part of the state (the 'parameters') may not evolve and may not be directly observable. Many inverse problems generalise uncertainty quantification in that one modifies the probability measure to be consistent with the measurements, a forward model and the initial measure. The talk reviews the inverse problem interpreted as computing the posterior probability measure of the states, including both the parameters and the variables, from a sequence of noise-corrupted observations. Bayesian statistics provides a natural framework for a solution, but it leads to very challenging computational problems, particularly when the dimension of the state space is very large, as when it arises from the discretisation of a partial differential equation.
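For concreteness, and using notation that is not fixed in the abstract (the symbols u, y, G and Gamma below are assumptions made for illustration), the posterior measure described here can be sketched for a state-and-parameter vector u, data y, forward model G and Gaussian observation-error covariance \Gamma as

  \pi(u \mid y) \;\propto\; \exp\!\big(-\tfrac{1}{2}\,\|\Gamma^{-1/2}(y - G(u))\|^{2}\big)\,\pi_{0}(u),

where \pi_{0} is the prior density. With a Gaussian prior, maximising the log-posterior recovers a Tikhonov-regularised least-squares problem, which is the sense in which the Bayesian framework interprets and generalises Tikhonov regularisation.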
In this talk we show how the Bayesian framework leads to a new algorithm - the 'Variational Smoothing Filter' - that unifies the leading techniques in use today. In particular, the framework provides an interpretation and generalisation of Tikhonov regularisation, a method of forecast verification, and a way of quantifying and managing uncertainty. To deal with the problem that a good initial prior may not be Gaussian, as with a general prior intended to describe, for example, a geological structure, a Gaussian mixture prior is used. This has many desirable properties, including ease of sampling to make 'numerical rocks' or 'numerical weather' for visualisation and statistical summaries, and in principle it can approximate any probability density. Robustness is sought by combining a variational update with this full mixture representation of the conditional posterior density.
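As an illustrative sketch only (none of the function names or numerical values below come from the talk; they are assumptions made for the example), sampling from a Gaussian mixture prior to produce realisations such as 'numerical rocks' or 'numerical weather' might look like the following, given known component weights, means and covariances:

    import numpy as np

    # Illustrative sketch only: draw samples from a Gaussian mixture prior,
    # e.g. to generate 'numerical rock' realisations for visualisation and
    # statistical summaries. All names and numbers here are assumptions.
    rng = np.random.default_rng(seed=0)

    def sample_gaussian_mixture(weights, means, covs, n_samples):
        """Draw n_samples from sum_k weights[k] * N(means[k], covs[k])."""
        weights = np.asarray(weights, dtype=float)
        labels = rng.choice(len(weights), size=n_samples, p=weights)  # pick components
        dim = len(means[0])
        samples = np.empty((n_samples, dim))
        for k in np.unique(labels):
            idx = labels == k
            samples[idx] = rng.multivariate_normal(means[k], covs[k], size=idx.sum())
        return samples

    # Two-component mixture in 2D; each draw is one 'numerical' realisation.
    weights = [0.7, 0.3]
    means = [np.zeros(2), np.array([3.0, -1.0])]
    covs = [np.eye(2), 0.5 * np.eye(2)]
    draws = sample_gaussian_mixture(weights, means, covs, n_samples=1000)
    print(draws.mean(axis=0), draws.std(axis=0))

Such draws can then be summarised statistically or rendered directly, which is the visualisation role the abstract attributes to the mixture prior.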