Abstracts MLQA 2011
ODEs and discrete simulation (slides)
Jean-Yves Le Boudec
We consider a generic model of N interacting objects, where each object has a state and the evolution of the system depends only on the collection of states at any point in time. This is quite a general modeling framework, which has been successfully applied to model many forms of distributed systems and communication protocols. When the number of objects N is large, one often uses simplifying assumptions called "mean field approximation", "fluid approximation", "fixed point method" or "decoupling assumption". In this talk we explain the meaning of these concepts, how they are related to Ordinary Differential Equations, and how fast simulation methods can be derived. We also show that the first two, namely the mean field approximation and the fluid approximation, are generally valid, whereas the last two, namely the fixed point method and the decoupling assumption, require more care and may not hold even in simple cases. We give sufficient conditions under which they are valid. We illustrate the concepts with some easy-to-follow examples.
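To make the relationship between the stochastic system and its mean-field ODE concrete, here is a minimal sketch (the SIS epidemic model and all parameter values are illustrative, not taken from the talk): a Gillespie simulation of N interacting two-state objects next to an Euler integration of the corresponding mean-field ODE. For large N, the fraction of infected objects in the simulation should track the ODE trajectory.

```python
import random

def simulate_ctmc(N, beta, gamma, t_end, seed=0):
    """Gillespie simulation of an SIS epidemic among N objects.

    Each object is Susceptible or Infected; infection occurs at rate
    beta * i * (1 - i) per object, recovery at rate gamma per infected
    object, where i is the fraction of infected objects.
    Returns the infected fraction at time t_end.
    """
    rng = random.Random(seed)
    infected, t = N // 10, 0.0          # start with 10% infected
    while t < t_end:
        i = infected / N
        r_inf = N * beta * i * (1 - i)  # total rate of S -> I events
        r_rec = N * gamma * i           # total rate of I -> S events
        total = r_inf + r_rec
        if total == 0:
            break
        t += rng.expovariate(total)     # time to next event
        if rng.random() < r_inf / total:
            infected += 1
        else:
            infected -= 1
    return infected / N

def mean_field(beta, gamma, t_end, dt=1e-3):
    """Euler integration of the mean-field ODE di/dt = beta*i*(1-i) - gamma*i."""
    i, t = 0.1, 0.0                     # same initial fraction as the simulation
    while t < t_end:
        i += dt * (beta * i * (1 - i) - gamma * i)
        t += dt
    return i
```

For beta = 1.0 and gamma = 0.3, the ODE has a stable fixed point at i* = 1 - gamma/beta = 0.7, and a simulation with N = 10000 settles close to that value, illustrating why the mean-field approximation is accurate for large N.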
Stochastic process algebras and ODEs (slides)
Stochastic process algebras have been successfully applied to the quantitative evaluation of systems for over a decade. Such models may be used to analyse the timeliness of responses or the utilisation of a system's resources.
By modelling systems as collections of individual agents, the process algebra approach allows the modeller to capture the exact form of interactions and constraints between entities. At any given time the state of the system is the collection of states exhibited by the individual agents. Inevitably, this approach suffers from the problem of state-space explosion, making analysis inefficient or even infeasible, particularly in systems where we are interested in collective dynamics. In these systems, although we model the behaviour of individuals, we aim to analyse the behaviour of the populations to which they belong. Examples include clients accessing a server, people moving through their physical environment, or molecules interacting within cells.
In this talk I will discuss recent work on such population-oriented models described in stochastic process algebra and interpreted as a set of ordinary differential equations, the so-called fluid approximation.
On the Relationship between ODE, Simulation and Stochastic Process Algebras (slides)
In this talk we will concentrate on the mathematical foundations of the fluid approximation of stochastic process algebra models. We will discuss limit results that guarantee that, in the limit of large populations, the trajectories of the Continuous Time Markov Chain constructed from a Stochastic Process Algebra model cannot be distinguished from the solution of the fluid ODE. We will also discuss the conditions that must be satisfied by SPA models for such limit theorems to be applicable, commenting also on issues like error bounds and speed of convergence (which are connected with the properties of the phase space of the ODE). Furthermore, we will present the relationships between the average behaviour of the SPA model and the fluid ODE, which can be proved to be a first-order approximation of the former, considering also approximations of higher-order moments. Finally, we will also discuss possible extensions of limit theorems to deal with situations in which the classic deterministic limit theorems are not applicable, for instance when not all the populations of agents can be assumed to increase to infinity. In these cases, the candidate limit process is no longer an ODE, but rather a (stochastic) hybrid system.
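A standard formulation of the kind of limit result discussed above, in the style of Kurtz's theorem for density-dependent chains (the notation here is illustrative, not that of the talk): a CTMC on state space $\mathbb{Z}^d$ whose transitions change the state by $l$ at rate $N f_l(X_N/N)$ is compared to the ODE driven by its drift.

```latex
% Density-dependent CTMC: X_N -> X_N + l at rate N * f_l(X_N / N).
% Normalized process and drift:
\[
  \bar X_N(t) = \frac{X_N(t)}{N},
  \qquad
  F(x) = \sum_{l} l \, f_l(x).
\]
% Fluid ODE and convergence on compact time intervals:
\[
  \frac{dx}{dt} = F(x), \quad x(0) = \lim_{N \to \infty} \bar X_N(0)
  \;\Longrightarrow\;
  \sup_{t \le T} \bigl\| \bar X_N(t) - x(t) \bigr\|
  \xrightarrow[N \to \infty]{} 0 \ \text{in probability.}
\]
```

The conditions mentioned in the abstract (Lipschitz continuity of $F$, convergence of initial states) are exactly what makes this implication hold.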
Stochastic Hybrid Analysis of Markov Population Models (slides)
In this talk we will consider discrete-state Markov processes that exhibit a population structure. They find various applications in different domains and are particularly well-suited to describe the interactions between different molecular species in the living cell. The direct analysis of such Markov population models is very challenging due to the "curse of dimensionality", but their regular structure allows fluid approximations on different levels. Such approximations result in ordinary or partial differential equations that describe the evolution of stochastic (hybrid) models in purely continuous or continuous-discrete state spaces. I will present various numerical techniques to analyze such models over time and in equilibrium. Moreover, the appropriateness of different fluid approximations will be discussed based on several examples from systems biology. Finally, I will talk about the importance of stability analysis for Markov population models and present techniques to reason about the mathematical foundations of multi-stability and oscillation of stochastic models.
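As a small illustration of the relationship between the discrete stochastic description and its fluid approximation (the birth-death model below is a standard textbook example, not one from the talk): for a species produced at constant rate k and degraded at rate g per molecule, the stationary distribution of the master equation is Poisson with mean k/g, which coincides exactly with the fixed point of the fluid ODE dx/dt = k - g*x. The match is exact here only because the propensities are linear; for nonlinear propensities the fluid approximation incurs an error, which is why its appropriateness has to be assessed case by case.

```python
def cme_stationary_mean(k, g, nmax=200):
    """Stationary mean of a birth-death Markov population model.

    Birth (production) at constant rate k, death (degradation) at rate
    g*n when n molecules are present. The stationary distribution
    satisfies detailed balance: k * pi(n) = g * (n+1) * pi(n+1),
    i.e. pi is Poisson(k/g). We compute it on a truncated state space.
    """
    pi = [1.0]                               # unnormalized probabilities
    for n in range(nmax):
        pi.append(pi[-1] * k / (g * (n + 1)))
    Z = sum(pi)
    return sum(n * p for n, p in enumerate(pi)) / Z
```

For k = 50 and g = 1, the stationary mean is 50, exactly the fixed point k/g of the fluid ODE.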
Approximation of continuous space systems and associated metrics and logics (slides)
We survey a few approximation techniques for Labelled Markov Processes, a model of probabilistic processes where the state space is a measure space. These techniques are either based on the structure of the model or on quotienting with respect to some partitioning of the state space. However, instead of quotienting with respect to an arbitrary partition, we obtain a more practical approximation by aggregating states that share some similarities. The techniques that we focus on have properties of convergence with respect to the set of satisfied formulas and with respect to some notion of distance (or metric) between processes.
Indeed, the accuracy of approximations is measured through three features: simulation relations, metrics and logics. Approximants S_i of a process S should
-- converge to S in the metric, that is, d(S_i, S) converges to 0 as the approximation becomes finer and finer;
-- be such that every formula satisfied by S is satisfied by some of its approximants;
-- possibly be simulated by S; one of our approximation schemes, defined with conditional expectation, does not satisfy this requirement.
Finally we also list a few other techniques that apply to alternative models of continuous systems.
Small Thematic Working Group on Hidden Markov Models (slides)
Markov Processes are perhaps the most widely used class of stochastic models for the quantitative analysis of IT systems. They come in many forms; examples include DTMCs and CTMCs, which admit no nondeterminism, and MDPs and CTMDPs, which do admit nondeterminism. Logics derived from Computation Tree Logic have proved useful as "query languages" for such models; examples include the logics PCTL and CSL as well as their extensions. A number of state-of-the-art stochastic model checkers are able to check whether a logical formula holds on a given model.
A Hidden Markov Model is a stochastic model in which observations depend on a Markov Process that cannot be observed. Hidden Markov Models have many uses in applied statistics, speech recognition, bioinformatics, environmental processes, econometrics, image processing and computer vision, and wind power forecasting, and have also been used in security. There has been much less focus on logics for Hidden Markov Models, and indeed on the extent to which logics provide the appropriate "query language" for them. Consequently, hardly any model checkers for Hidden Markov Models exist.
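For readers unfamiliar with the model, here is a minimal sketch of the classic forward algorithm (as in the Rabiner tutorial cited below) for computing the likelihood of an observation sequence under an HMM; the probabilities in the example are made up for illustration.

```python
def forward(pi, A, B, obs):
    """Forward algorithm: likelihood P(obs) under a Hidden Markov Model.

    pi[i]   - initial probability of hidden state i
    A[i][j] - transition probability from hidden state i to j
    B[i][o] - probability of emitting observation o in hidden state i
    obs     - sequence of observation indices
    """
    n = len(pi)
    # alpha[j] = P(obs[0..t], hidden state at t is j)
    alpha = [pi[j] * B[j][obs[0]] for j in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)
```

A useful sanity check: for any fixed length, the likelihoods of all possible observation sequences sum to 1, since the HMM defines a probability distribution over them.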
A main aim of this working group is to identify a small consortium that would be interested in developing logics, "query languages", and model checkers able to deal with some of the applications of Hidden Markov Models. Hopefully this would lead to useful tools in much the same way that existing stochastic model checkers make the analysis of Markov Processes available to a wider audience than applied statisticians.
A textbook on Hidden Markov Models: Zucchini, W. and I. MacDonald. 2009. Hidden Markov Models for Time Series. Chapman & Hall/CRC, London.
A tutorial paper on Hidden Markov Models: Lawrence R. Rabiner: A tutorial on Hidden Markov Models and selected applications in Speech Recognition. Proceedings of the IEEE, 1989, pages 257-286.
A paper on a logic for Hidden Markov Models: Lijun Zhang, Holger Hermanns, David N. Jansen: Logic and Model Checking for Hidden Markov Models. FORTE 2005, pages 98-112.