Adaptive Markov chain Monte Carlo algorithms: some ergodicity results (joint work with Jeffrey Rosenthal)
Abstract
In this talk, I will present some recent results on adaptive Markov chain Monte Carlo algorithms, that is, MCMC algorithms in which the successive transition kernels are allowed to depend on the past of the process. I will start with a simple example showing that such algorithms do not necessarily converge to the expected distribution. Then I will discuss some general conditions under which an adaptive MCMC algorithm generates a stochastic process that is ergodic, with the appropriate stationary distribution. Finally, I will present an application to the random walk Metropolis (RWM) algorithm. We propose an adaptive RWM that sequentially adjusts its scale parameter using the Robbins-Monro stochastic approximation algorithm in order to find the optimal scale parameter as in Roberts et al. (1998). Our algorithm thus automatically determines and runs the optimally scaled RWM, with no manual tuning required.
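A minimal sketch of the Robbins-Monro adaptation of the proposal scale, not the speakers' exact algorithm: the ten-dimensional standard normal target, the step-size schedule, and the 0.234 target acceptance rate (the high-dimensional value commonly associated with the optimal-scaling results cited above) are all illustrative assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10                    # dimension of the illustrative N(0, I) target
target_accept = 0.234     # illustrative target acceptance rate (high-dimensional optimum)

def log_target(x):
    return -0.5 * np.dot(x, x)   # standard normal target, up to a constant

x = np.zeros(d)
log_sigma = 0.0                  # adapted log proposal scale
n_iter = 50_000
samples = np.empty((n_iter, d))

for n in range(n_iter):
    # Random walk Metropolis proposal with the current scale.
    prop = x + np.exp(log_sigma) * rng.standard_normal(d)
    accepted = np.log(rng.uniform()) < log_target(prop) - log_target(x)
    if accepted:
        x = prop
    # Robbins-Monro update: push the empirical acceptance rate towards the target.
    # The step sizes decay, so the adaptation diminishes as n grows.
    gamma_n = 1.0 / (n + 1) ** 0.6
    log_sigma += gamma_n * (float(accepted) - target_accept)
    samples[n] = x

print("adapted proposal scale:", np.exp(log_sigma))
print("marginal variance of second-half samples (target value 1):", samples[n_iter // 2:].var())
```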
Filtering with reversible jump MCMC in a class of doubly stochastic Poisson processes with marks
Abstract
For the modelling of ultra-high-frequency financial data, we propose a class of marked doubly stochastic Poisson processes in which the intensity process can be considered a generalization of the classical shot-noise process. For these processes, the filtering of the unobservable intensity can be performed by reversible jump Markov chain Monte Carlo algorithms. The proposed simulation method turns out to be useful in a number of situations, such as price forecasting and option pricing. Simulation experiments have shown good filtering performance.
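To fix ideas about the observation model (not the reversible jump filter itself), here is a minimal sketch that simulates a shot-noise intensity and then draws the observed marked Poisson events by thinning; the baseline level, shock rate, decay rate, and mark distributions are illustrative assumptions, not values from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 10.0          # time horizon
lam0 = 1.0        # baseline intensity (illustrative)
rho = 2.0         # rate of the shock process driving the intensity (illustrative)
decay = 1.5       # exponential decay of each shock's contribution (illustrative)

# 1) Simulate the shocks driving the intensity: times tau_i and sizes Y_i.
n_shocks = rng.poisson(rho * T)
tau = np.sort(rng.uniform(0.0, T, n_shocks))
Y = rng.exponential(0.5, n_shocks)

def intensity(t):
    """Shot-noise intensity lambda(t) = lam0 + sum_{tau_i <= t} Y_i exp(-decay (t - tau_i))."""
    active = tau <= t
    return lam0 + np.sum(Y[active] * np.exp(-decay * (t - tau[active])))

# 2) Simulate the observed marked Poisson events by thinning, using the
#    global bound lambda(t) <= lam0 + sum_i Y_i on [0, T].
lam_bar = lam0 + Y.sum()
n_cand = rng.poisson(lam_bar * T)
cand = np.sort(rng.uniform(0.0, T, n_cand))
keep = rng.uniform(size=n_cand) < np.array([intensity(t) for t in cand]) / lam_bar
events = cand[keep]
marks = rng.normal(0.0, 1.0, events.size)     # i.i.d. marks (illustrative)

print(f"{events.size} observed events on [0, {T}]")
```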
Particle methods for an ill-posed inverse problem: recovering volatility from option prices by evolutionary optimization (joint work with Sana Ben Hamida)
Reference
S. Ben Hamida, R. Cont, Recovering volatility from option prices by evolutionary optimization, Rapport Interne CMAP-534, Centre de Mathématiques Appliquées, École Polytechnique, Palaiseau, May 2004.
Particle filters and coin flips (joint work with Persi Diaconis and Richard Montgomery)
Abstract
We will show how particle filters can help analyse coin flipping. We used 100 frames for each of 50 coin flips to estimate the angular momentum vector in an ordinary person's coin flips. This study shows that there is a small bias (of the order of 1%) when a coin is flipped. To carry out the study, we painted special markings on the coins that allowed for online camera calibration, and used Bayesian computations to track the ellipses through the movements of the coin.
Particle methods for the simulation of rare events (joint work with Frédéric Cérou, Pierre Del Moral and Pascal Lezaud)
Abstract
Using the general framework of Feynman-Kac formulae and their approximation in terms of interacting particle systems, as developed in Del Moral (2004), we present a class of algorithms based on the notion of importance splitting: given a sequence of increasingly critical events, simulated trajectories of the underlying Markov process are either allowed to survive and give rise to continuing offspring trajectories, or are terminated, depending on whether or not the more critical event occurs. Within this general framework, it is possible not only to estimate the probability that the rare event occurs and the transition probabilities between two events of increasing criticality, but also to obtain statistical information about typical critical trajectories. Special attention is paid to the problem of extinction of the particle system, which arises when the more critical event does not occur for any simulated trajectory; we present the sequential algorithm proposed by Oudjane (2000), which guarantees that a given fixed number of trajectories survives at each generation. For each of these algorithms, a central limit theorem provides an expression for the asymptotic variance of the approximation error.
Slides .pdf file (120 Kb)
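A minimal sketch of the importance-splitting idea described in the abstract above, in which killed trajectories restart from a survivor chosen uniformly at random (a simpler variant than the fixed-survivor scheme of Oudjane (2000)); the Markov chain, the levels, and the killing rule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1000                       # number of simulated trajectories (particles)
drift, x0 = -0.2, 1.0          # Gaussian random walk with negative drift (illustrative)
levels = [2.0, 4.0, 6.0, 8.0]  # increasingly critical events: reach level l before 0

def run_until(x, level):
    """Run the walk from x until it reaches `level` (success) or falls below 0 (failure)."""
    while 0.0 < x < level:
        x += drift + rng.standard_normal()
    return x

states = np.full(N, x0)
p_hat = 1.0
for level in levels:
    stopped = np.array([run_until(x, level) for x in states])
    alive = stopped >= level
    frac = alive.mean()
    if frac == 0.0:            # extinction: no trajectory reached the next level
        p_hat = 0.0
        break
    p_hat *= frac              # product of conditional survival fractions
    # Resample: killed trajectories restart from a uniformly chosen survivor.
    survivors = stopped[alive]
    states = survivors[rng.integers(0, survivors.size, N)]

print("splitting estimate of P(reach", levels[-1], "before 0):", p_hat)
```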
Asymptotic analysis of the SIR algorithm
Abstract
We review the techniques to derive (uniform) consistency and asymptotic normality for the sampling importance resampling iteration, and we discuss the implications of these results for the design of iterative and sequential SIR methods.
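A minimal sketch of one sampling importance resampling iteration, the object whose asymptotics the abstract refers to; the Gaussian target and Student-t proposal are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(3)
N = 100_000   # proposal sample size
M = 10_000    # resample size

# Sampling step: draw from a heavy-tailed proposal (illustrative choices).
x = t.rvs(df=3, size=N, random_state=rng)

# Importance step: weights proportional to target / proposal, for a N(2, 1) target.
log_w = norm.logpdf(x, loc=2.0, scale=1.0) - t.logpdf(x, df=3)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Resampling step: draw M indices with probabilities proportional to the weights.
resampled = x[rng.choice(N, size=M, replace=True, p=w)]

# As N and M grow, the resampled points behave like draws from the target.
print("resampled mean (target mean is 2):", resampled.mean())
print("resampled std  (target std is 1):", resampled.std())
```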
Volatility of daily stock returns estimation by means of particle filter: The IBEX case
Slides .pdf file (512 Kb)
Adaptive particle filters and high-dimensional systems
Abstract
First, I will present a novel particle filter algorithm for the adaptive estimation of a partially observed system. Our main assumption for the identifiability of the parameters is that different parameters correspond to different limiting distributions for the observation process. Thus, we can use a weighted average of particle filters corresponding to different parameters, where the weights are given by comparing the limiting distributions of the simulated and the actual observations. Then, I will describe certain techniques used for the simulation of high-dimensional systems, under certain assumptions on the separation of scales. I will discuss the combination of these techniques and interacting particle filters for the estimation of partially observed high-dimensional systems, possibly with unknown parameters.
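The speaker's weighting criterion compares limiting distributions of simulated and actual observations; as a loosely related illustration only (not that method), the sketch below runs a small bank of bootstrap particle filters over a grid of candidate parameter values and weights them by their running predictive likelihoods. The linear-Gaussian model, the grid, and all numerical values are assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative state-space model: x_t = a x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 1).
a_true, sig_x, sig_y, T = 0.8, 0.5, 1.0, 200
x, ys = 0.0, []
for _ in range(T):
    x = a_true * x + sig_x * rng.standard_normal()
    ys.append(x + sig_y * rng.standard_normal())

candidates = np.array([0.2, 0.5, 0.8, 0.95])   # candidate values of the unknown parameter a
N = 500                                        # particles per filter
particles = np.zeros((candidates.size, N))
log_lik = np.zeros(candidates.size)            # running predictive log-likelihood per filter

for y in ys:
    for k, a in enumerate(candidates):
        # Bootstrap proposal: propagate through the dynamics for this candidate a.
        particles[k] = a * particles[k] + sig_x * rng.standard_normal(N)
        logw = -0.5 * ((y - particles[k]) / sig_y) ** 2 - np.log(sig_y * np.sqrt(2 * np.pi))
        # Predictive likelihood of this observation under filter k (log-sum-exp for stability).
        log_lik[k] += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
        # Multinomial resampling within filter k.
        w = np.exp(logw - logw.max()); w /= w.sum()
        particles[k] = particles[k][rng.choice(N, size=N, p=w)]

post = np.exp(log_lik - log_lik.max()); post /= post.sum()
for a, p in zip(candidates, post):
    print(f"a = {a:4.2f}: weight {p:.3f}")
```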
Sequential Monte Carlo samplers (joint work with Pierre Del Moral and Arnaud Doucet)
Abstract
In this talk, we develop a general methodology to sample from a sequence of probability distributions defined on a common space. We propose to approximate these distributions by a large set of random samples which evolves over time using simple sampling and resampling mechanisms. This methodology not only yields a whole new set of principled algorithms to make parallel Markov chain Monte Carlo runs interact, but also allows us to derive new algorithms to perform global optimization or to solve sequential Bayesian estimation problems. The talk is illustrated by several complex examples arising in Bayesian inference.
Slides .pdf file (200 Kb)
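A minimal sketch of the sampling/resampling mechanism described in the abstract above, for the common tempering construction with a random-walk Metropolis move at each step; the bimodal target, the temperature ladder, and the move kernel are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 2000
betas = np.linspace(0.0, 1.0, 21)          # temperature ladder (illustrative)

def log_p0(x):                              # easy initial distribution: N(0, 3^2)
    return -0.5 * (x / 3.0) ** 2

def log_pi(x):                              # target: mixture of N(-3, 1) and N(3, 1)
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

x = 3.0 * rng.standard_normal(N)            # exact draws from the initial distribution
logw = np.zeros(N)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # Importance weighting from the previous tempered target to the current one.
    logw += (b - b_prev) * (log_pi(x) - log_p0(x))
    # Resample when the effective sample size drops below N/2.
    w = np.exp(logw - logw.max()); w /= w.sum()
    if 1.0 / np.sum(w ** 2) < N / 2:
        x = x[rng.choice(N, size=N, p=w)]
        logw[:] = 0.0
    # Move step: one random-walk Metropolis sweep targeting the current tempered density.
    def log_gamma(z):
        return (1.0 - b) * log_p0(z) + b * log_pi(z)
    prop = x + rng.standard_normal(N)
    accept = np.log(rng.uniform(size=N)) < log_gamma(prop) - log_gamma(x)
    x = np.where(accept, prop, x)

w = np.exp(logw - logw.max()); w /= w.sum()
print("weighted mean under the target (should be near 0):", np.sum(w * x))
print("weighted mean of |x| (modes near +-3):", np.sum(w * np.abs(x)))
```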
Particle filter methods and MCMC for partially observed stochastic differential equations [cancelled]
Abstract
In this talk I will cover recent developments in MCMC and particle filter methods for stochastic differential equations (SDEs) that are only partially observed at discrete time intervals. In particular, I will focus on situations in which standard algorithms for both procedures degenerate. A solution which makes the efficiency of such methods invariant to the choice of discretisation of the SDE will be highlighted.
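To fix ideas, a minimal sketch of the standard baseline the abstract starts from (not the discretisation-invariant solution it highlights): a bootstrap particle filter whose transitions are simulated by Euler-Maruyama discretisation of the SDE. The Ornstein-Uhlenbeck dynamics, observation model, and step sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative latent SDE: dX_t = theta (mu - X_t) dt + sigma dW_t (Ornstein-Uhlenbeck),
# observed at unit time intervals with Gaussian noise: Y_k = X_k + N(0, obs_sd^2).
theta, mu, sigma, obs_sd = 1.0, 0.0, 0.5, 0.3
dt, substeps = 0.05, 20        # Euler-Maruyama step; 20 substeps per observation interval

def propagate(x):
    """Simulate the SDE between two observation times with Euler-Maruyama."""
    for _ in range(substeps):
        x = x + theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

# Simulate synthetic observations.
T, x_true, ys = 50, 0.0, []
for _ in range(T):
    x_true = propagate(np.array([x_true]))[0]
    ys.append(x_true + obs_sd * rng.standard_normal())

# Bootstrap particle filter on the discretised model.
N = 1000
particles = np.zeros(N)
means = []
for y in ys:
    particles = propagate(particles)                     # propagate through the dynamics
    logw = -0.5 * ((y - particles) / obs_sd) ** 2        # Gaussian observation weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    means.append(np.sum(w * particles))
    particles = particles[rng.choice(N, size=N, p=w)]    # multinomial resampling

print("posterior mean estimates at the last 5 observation times:", np.round(means[-5:], 3))
```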
Retrospective MCMC for inference for Dirichlet mixture models (joint work with Omiros Papaspiliopoulos)
Abstract
This talk will introduce a methodology for the Bayesian analysis of Dirichlet mixture models which involves imputation of the Dirichlet process itself. This is done without approximation, using a retrospective technique. The method is extended to allow for hyper-parameter uncertainty using non-centering methods, and the algorithm is shown to have good mixing properties for relatively large data sets.
Coupling constructions and MCMC convergence
Abstract
We review the use of coupling constructions, via minorisation and drift conditions, to prove quantitative bounds on the convergence to stationarity of MCMC algorithms. We discuss applications to Gibbs sampler algorithms.
Reference
J.S. Rosenthal, Quantitative convergence rates of Markov chains: A simple account, Electronic Communications in Probability 7 (paper #13), 123-128 (2002).
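A minimal numerical illustration of the simplest such coupling bound (on a small finite chain invented here for the purpose): when the whole state space is small, i.e. P(x, .) >= eps * nu(.) for every x, the coupling in which both chains regenerate jointly from nu with probability eps gives the quantitative bound ||P^n(x, .) - pi|| <= (1 - eps)^n.

```python
import numpy as np

# A small, arbitrary ergodic transition matrix (illustrative, not from the talk).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Doeblin minorisation: P(x, y) >= eps * nu(y) for every x, with
# eps = sum_y min_x P(x, y) and nu(y) = min_x P(x, y) / eps.
eps = P.min(axis=0).sum()

# Stationary distribution pi: left eigenvector of P with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

# Check the coupling bound against the exact total variation distances.
Pn = np.eye(P.shape[0])
for n in range(1, 11):
    Pn = Pn @ P
    tv = 0.5 * np.abs(Pn - pi).sum(axis=1).max()   # worst-case TV distance over starting points
    print(f"n = {n:2d}   max_x TV(P^n(x,.), pi) = {tv:.5f}   bound (1-eps)^n = {(1 - eps) ** n:.5f}")
```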
On the Brownian directed polymer in a Gaussian random environment (joint work with Carles Rovira)
Slides .pdf file (120 Kb)
Reference
C. Rovira, S. Tindel, On the Brownian directed polymer in a Gaussian random environment, to appear in Journal of Functional Analysis.
Portfolio optimization algorithm for a partially observed stochastic volatility model: how to use a genetic particle algorithm
Abstract
We describe the theoretical and practical use of a stochastic volatility filtering algorithm based on work by Del Moral, Jacod, and Protter, combined with either a straightforward Monte Carlo method or a partially analytic calculation, designed to optimize a portfolio of stock and bond under a risk-averse utility function in the case of stochastic volatility, assuming that observations and rebalancing times are discrete.