Poster Session at MLQA 2010

__NOTOC__
We invited posters under two categories:


* Presentation of recent or on-going work relating to models, logics, tools, and/or applications with respect to discrete, stochastic and/or continuous systems and properties.
* Overview of the recent research activities of a group, in relation to the themes of MLQA.


There were 11 poster submissions to the MLQA poster session. Click on a poster title to access the corresponding poster submission in PDF format.

==Michael Huth, Nir Piterman, Daniel Wagner. [http://wiki.ercim.eu/wg/MLQA/images/1/14/Submission_1.pdf p-Automata Approach to Probabilistic Verification].==

The poster introduces an automata-based approach to the
verification of probabilistic systems. Our so-called
p-automata combine the combinatorial structure of alternating
tree automata with the ability to quantify probabilities of
regular sets of paths. These foundations enable abstraction-based
probabilistic model checking for probabilistic specifications
that subsume Markov chains, PCTL, LTL and CTL*-like logics.

The poster is based on results from "New Foundations for
Discrete-Time Probabilistic Verification" to appear in
Proceedings of QEST 2010.
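
As a minimal illustration of the kind of probabilistic model checking such specifications cover (this sketch shows a standard PCTL-style reachability computation on a discrete-time Markov chain, not the p-automata construction itself; the chain and its states are made up):

<syntaxhighlight lang="python">
# Illustrative sketch: probability of eventually reaching a target set in a
# small discrete-time Markov chain -- the core computation behind PCTL
# operators such as P>=p [ F target ].  The three-state chain is hypothetical.
import numpy as np

P = np.array([[0.5, 0.3, 0.2],   # transition probabilities, rows sum to 1
              [0.0, 1.0, 0.0],   # state 1 is absorbing (the "target")
              [0.0, 0.0, 1.0]])  # state 2 is absorbing (a "failure" state)
target = {1}

# States with reachability probability 0 must be excluded from the unknowns;
# this simple filter suffices here because the only such state is the absorbing
# failure state (in general one would use a graph reachability check instead).
unknown = [s for s in range(len(P)) if s not in target and P[s, s] < 1.0]

# The remaining probabilities x satisfy x = P_uu x + (one-step prob into target).
A = np.eye(len(unknown)) - P[np.ix_(unknown, unknown)]
b = P[np.ix_(unknown, sorted(target))].sum(axis=1)
x = np.linalg.solve(A, b)

for s, p in zip(unknown, x):
    print(f"P(eventually reach target | start in state {s}) = {p:.3f}")
</syntaxhighlight>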

== Nataliya Skrypnyuk, Michael Smith. [http://wiki.ercim.eu/wg/MLQA/images/b/b2/Submission_2.pdf Abstractions of Stochastic Process Algebras.] ==

Stochastic process algebras are compositional modelling languages
that can be used to describe systems whose behaviour evolves
probabilistically over time - from distributed computer systems, to
signalling pathways in cells. Two such languages are Interactive
Markov Chains (IMC), and the Performance Evaluation Process Algebra
(PEPA), which handle synchronisation between model components in
different ways. The semantics of PEPA can be described as a continuous-time
Markov chain (CTMC), whereas for IMC it is a continuous-time
Markov decision process (CTMDP).

In both PEPA and IMC, it is very easy to write a model whose state space
is exponentially larger than its description. It is therefore important to
look for ways to abstract such models, so that they can become small
enough to analyse. Importantly, we want to perform the abstraction at
the language level, so that we maintain the compositionality of the model.
In this poster, we illustrate two different approaches to this - abstraction
of IMC using pathway analysis (a type of static analysis), and abstraction
of PEPA using compositional aggregation of states.
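
As a generic, language-independent illustration of state aggregation on the underlying CTMC (the poster's techniques work at the language level; this sketch only shows the effect on an explicit, hypothetical generator matrix and partition):

<syntaxhighlight lang="python">
# Illustrative sketch: aggregating a CTMC generator matrix Q with respect to a
# partition of its states, and checking ordinary lumpability (every state in a
# block must have the same total rate into each other block).
import numpy as np

Q = np.array([[-3.0,  2.0,  1.0,  0.0],
              [ 2.0, -3.0,  0.0,  1.0],
              [ 0.0,  0.0, -1.0,  1.0],
              [ 0.0,  0.0,  1.0, -1.0]])   # hypothetical 4-state generator
partition = [[0, 1], [2, 3]]               # hypothetical aggregation blocks

def aggregate(Q, partition, tol=1e-9):
    k = len(partition)
    Q_hat = np.zeros((k, k))
    for i, block_i in enumerate(partition):
        for j, block_j in enumerate(partition):
            # total rate from each state of block_i into block_j
            rates = [Q[s, block_j].sum() for s in block_i]
            if max(rates) - min(rates) > tol:
                raise ValueError("partition is not ordinarily lumpable")
            Q_hat[i, j] = rates[0]
    return Q_hat

print(aggregate(Q, partition))   # the 2x2 aggregated generator
</syntaxhighlight>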

== Igor Cappello and Paola Quaglia. [http://wiki.ercim.eu/wg/MLQA/images/c/cb/Submission_3.pdf Quantitative Assessment of Web Services: Applying SCOWS to a Real-Life Scenario]. ==

We present an example application of SCOWS to a real-life scenario
(a loan request) developed within the SENSORIA European Project. SCOWS
is a stochastic process algebra tailored to the
quantitative assessment of Web Services: applying the labelled
semantics to a SCOWS service S, it is possible to derive the Labelled
Transition System (LTS) representing all the possible behaviours of S.
The LTS can then be used to generate a CTMC on which quantitative
model checking can take place.

In order to automatically derive the LTS and minimize the state space
needed to represent it, we implemented SCOWS_lts, a tool written in
Java whose output can be imported in a model checker (e.g. PRISM) and
used to perform quantitative analysis.

We report some results obtained assessing the impact of one parameter
(the rate at which a clerk processes the submitted loan request) on
the overall performance of the service.
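
The kind of parameter sweep mentioned above can be illustrated on a deliberately oversimplified model: a three-state CTMC of a loan request passing through a clerk and a decision step, where the expected completion time is computed for several clerk rates. The structure and all rates below are hypothetical and are not taken from the SCOWS model.

<syntaxhighlight lang="python">
# Illustrative sketch: expected time for a (hypothetical) loan request to
# complete, as a function of the clerk's processing rate.  States:
#   0 = submitted, 1 = reviewed by clerk, 2 = decided (absorbing).
import numpy as np

def expected_completion_time(clerk_rate, decision_rate=2.0):
    # generator restricted to the transient states {0, 1}
    Q_tt = np.array([[-clerk_rate, clerk_rate],
                     [0.0,         -decision_rate]])
    # expected times to absorption t solve  -Q_tt t = 1
    t = np.linalg.solve(-Q_tt, np.ones(2))
    return t[0]   # starting from "submitted"

for clerk_rate in [0.5, 1.0, 2.0, 4.0]:
    print(f"clerk rate {clerk_rate:>4}: expected completion time "
          f"{expected_completion_time(clerk_rate):.2f}")
</syntaxhighlight>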

== Ender Yüksel, Hanne Riis Nielson, and Flemming Nielson. [http://wiki.ercim.eu/wg/MLQA/images/4/44/Submission_4.pdf High Security at a Low Cost]. ==

In the future, tiny devices with microcontrollers and sensors will be in charge
of numerous activities in our lives. Tracking our energy consumption and CO2
emission, controlling our living conditions, enforcing security, and monitoring
our health are some examples of their functions. These devices will form
wireless networks to communicate with one another; moreover, their power
consumption will be very low. It is not hard to predict that our modern society
will depend on the correct operation of these devices and on the security of
the networks they operate in.

Such sensor-based systems, also known as "cyber-physical systems", achieve
security by means of cryptographic protocols. In a simple setting where power
consumption must be kept to a minimum and processing power is limited, it is
likely that all devices in the network will share the same cryptographic key.
In this study, we work on the trade-off between two conflicting demands: "the
cryptographic key should be changed frequently to preserve security" and "the
cryptographic key should be changed rarely to save power". We work on the
ZigBee wireless sensor network standard, which offers simple and low-resource
communication. We model the system as a continuous-time Markov chain, and
analyze it by posing a number of questions that shed light on its behaviour.
The properties we are interested in are expressed in continuous stochastic
logic, and the probabilistic model checker PRISM is used in the analysis.
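
As a toy illustration of the trade-off (not the CTMC analysed on the poster), consider a two-state chain in which the shared key is compromised at rate lambda and refreshed at rate mu: in the long run the key is compromised a fraction lambda/(lambda+mu) of the time, while the energy spent on updates grows with mu. All numbers below are made up.

<syntaxhighlight lang="python">
# Illustrative sketch of the security/energy trade-off: a two-state CTMC where
# the shared key is either fresh or compromised.  Compromise happens at rate
# lam, key updates restore freshness at rate mu.  In steady state the key is
# compromised a fraction lam / (lam + mu) of the time, while the average power
# spent on key updates is proportional to mu.  All rates and costs are made up.
lam = 0.1                     # hypothetical compromise rate (per day)
update_cost = 1.0             # hypothetical energy cost of one key update

for mu in [0.1, 0.5, 1.0, 5.0, 10.0]:      # candidate key-update rates
    p_compromised = lam / (lam + mu)        # steady-state insecure fraction
    energy_per_day = update_cost * mu       # average energy spent on updates
    print(f"update rate {mu:>5}: insecure fraction {p_compromised:.3f}, "
          f"energy/day {energy_per_day:.1f}")
</syntaxhighlight>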

== Erik de Vink, Suzana Andova, and Luuk Groenewegen. [http://wiki.ercim.eu/wg/MLQA/images/8/84/Submission_5.pdf Towards Dynamic Adaptation of Probabilistic Systems]. ==

Dynamic system adaptation is modeled in Paradigm as coordination of
collaborating components. A special component McPal allows for addition
of new behavior, of new constraints and of new control in view of a new
collaboration. McPal gradually adapts the system dynamics. It is shown
that the approach also applies to the probabilistic setting. For a
client-server example, where McPal gradually adds probabilistic behavior
to deterministic components, precise modeling of changing system
dynamics is achieved. This modeling of the transient behavior, spanning
the complete migration range from as-is collaboration to to-be collaboration,
serves as a stepping stone for quantitative analysis of the
system during adaptation.

== Fuyuan Zhang and Piotr Filipiuk. [http://wiki.ercim.eu/wg/MLQA/images/5/5c/Submission_6.pdf Model Checking is Static Analysis]. ==

The link between model checking and static analysis has been of great interest
to academia for many years. Early work of Bernhard Steffen and David Schmidt
has shown that classic data-flow analysis is an instance of model checking. The
poster gives an overview of our research, which aims to show that, conversely, model
checking is static analysis of modal logics and that model checking can be carried
out by a static analysis engine, namely the ALFP Suite. We use Flow Logic, a
state-of-the-art approach to static analysis that bridges the gap between a number of approaches to static analysis, including Data Flow Analysis, Constraint Based Analysis, Abstract Interpretation, and Type and Effect Systems. In these developments it has been demonstrated that Flow Logic is a robust approach that is able to deal with a variety of calculi, programming languages and, recently, modal logics such as CTL, ACTL and the alternation-free mu-calculus.

In order to calculate the analysis solution, we use the ALFP Suite, which computes the least model guaranteed by the Moore family theorem for ALFP. In its current version, the suite supports two solvers: a differential worklist solver based on a representation of relations as prefix trees, and a solver in continuation-passing style based on a BDD representation of relations.
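
As a small, generic illustration of casting a model-checking question as the least-model computation a Datalog/ALFP-style solver performs (this is not the poster's Flow Logic encoding), the sketch below computes the CTL property EF p on an explicit transition relation by a worklist fixed-point iteration; the transition system is hypothetical.

<syntaxhighlight lang="python">
# Illustrative sketch: computing the CTL property EF p as a least fixed point,
# in the spirit of a Datalog/ALFP-style least-model computation:
#   EF(s) holds if p holds in s, or some successor s' of s satisfies EF(s').
# The five-state transition system below is made up.
transitions = {0: [1], 1: [2, 3], 2: [2], 3: [4], 4: [4]}
holds_p = {4}                              # states where the atomic prop p holds

def ef(transitions, holds_p):
    sat = set(holds_p)                     # seed facts: EF(s) for every s |= p
    predecessors = {s: [] for s in transitions}
    for s, succs in transitions.items():
        for t in succs:
            predecessors[t].append(s)
    worklist = list(sat)
    while worklist:                        # propagate EF backwards to a fixpoint
        t = worklist.pop()
        for s in predecessors[t]:
            if s not in sat:
                sat.add(s)
                worklist.append(s)
    return sat

print(sorted(ef(transitions, holds_p)))    # states satisfying EF p
</syntaxhighlight>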

== Vashti Galpin, Jane Hillston, and Luca Bortolussi. [http://wiki.ercim.eu/wg/MLQA/images/8/83/Submission_7.pdf A Stochastic Hybrid Process Algebra]. ==

The process algebra HYPE was developed to model hybrid systems, that is, systems
that exhibit both discrete and continuous behaviour. Our new
process algebra extends HYPE by including the ability to model stochastic
behaviour.

The poster will introduce stochastic HYPE, keeping notational overhead to a minimum,
through the use of a networking example. It will also contain graphs describing
the evolution of the example over time based on stochastic simulation.
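
As a generic illustration of the stochastic hybrid behaviour such models describe (this is a hand-coded toy, not stochastic HYPE syntax or its semantics), the sketch below simulates a buffer level that evolves continuously while its inflow is switched on and off at exponentially distributed times; all names and rates are hypothetical.

<syntaxhighlight lang="python">
# Illustrative sketch of a stochastic hybrid trajectory: a buffer level evolves
# continuously (constant inflow minus constant drain), while the inflow is
# switched on/off by stochastic events with exponentially distributed delays.
import random

random.seed(1)
level, t, on = 0.0, 0.0, True
inflow, drain = 2.0, 1.0                 # flow rates: inflow while on, drain always
switch_rate = {True: 0.5, False: 1.0}    # rate of leaving the current mode

trajectory = [(t, level)]
while t < 20.0:
    dwell = random.expovariate(switch_rate[on])   # time until the next mode switch
    rate = (inflow if on else 0.0) - drain        # current continuous dynamics
    level = max(0.0, level + rate * dwell)        # integrate the flow, clipped at 0
    t += dwell
    on = not on                                   # discrete jump: toggle the mode
    trajectory.append((t, level))

for time, lvl in trajectory:
    print(f"t = {time:6.2f}  level = {lvl:5.2f}")
</syntaxhighlight>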

== David Cachera, Thomas Jensen, Arnaud Jobin, and Pascal Sotin. [http://wiki.ercim.eu/wg/MLQA/images/1/12/Submission_8.pdf A Linear Operator Framework for Analysing Resource Usage]. ==

We present a semantics-based framework for analysing the quantitative
behaviour of programs with regard to resource usage. We start from an
operational semantics in which costs are modelled using a dioid
structure, which generalizes the classical (max,plus) semiring
structure over reals. The dioid structure of costs allows for defining
the quantitative semantics as a linear operator. We then develop an
approximation framework of such a semantics in order to effectively
compute global cost information from the program. We show how this
framework is related to, and thus can benefit from, the theory of
abstract interpretation for analyzing qualitative properties. We focus
on the notion of long-run cost which models the asymptotic average
cost of a program, and show that our abstraction technique provides a
correct approximation of the concrete long-run cost. We illustrate our
approach on an application example devoted to the computation of
energy consumption for a language with complex array operations and
explicit energy mode management.
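
The sketch below illustrates the underlying (max,+) dioid: a cost matrix acts as a linear operator, and the long-run (asymptotic average) cost per step of an irreducible cost matrix, its maximum cycle mean, can be approximated by iterating that operator. This is a generic illustration of the (max,+) setting, not of the authors' abstraction framework; the cost matrix is hypothetical.

<syntaxhighlight lang="python">
# Illustrative sketch of the (max,+) dioid: "addition" is max, "multiplication"
# is +, so a cost matrix acts as a linear operator on cost vectors.  The
# long-run cost per step of an irreducible cost matrix is its maximum cycle
# mean, approximated here by iterating the operator.
NEG_INF = float("-inf")

def maxplus_matmul(A, B):
    """(max,+) product: entry (i,j) is the best total cost over i -> k -> j."""
    n, p = len(A), len(B[0])
    return [[max((A[i][k] + B[k][j] for k in range(len(B))
                  if A[i][k] > NEG_INF and B[k][j] > NEG_INF),
                 default=NEG_INF)
             for j in range(p)] for i in range(n)]

# Hypothetical per-transition costs between three program points.
A = [[1.0,     3.0,     NEG_INF],
     [NEG_INF, 2.0,     4.0],
     [2.0,     NEG_INF, 1.0]]

power, steps = A, 1
for _ in range(200):                      # iterate the linear operator
    power = maxplus_matmul(power, A)
    steps += 1
long_run_cost = max(max(row) for row in power) / steps
print(f"approximate long-run cost per step: {long_run_cost:.3f}")
</syntaxhighlight>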

== Christoffer Sloth and Rafael Wisniewski. [http://wiki.ercim.eu/wg/MLQA/images/e/eb/Submission_9.pdf Verification of Continuous Dynamical Systems by Timed Automata]. ==

The poster outlines a method for abstracting continuous dynamical systems by finite
timed automata, which allows automatic verification of continuous systems. The novelty
of the method is that the abstraction is based on a partition of the state space
obtained by intersecting set-differences of positive invariant sets.

Continuous systems are abstracted by timed automata because tools for the efficient
verification of such models exist. Additionally, Lyapunov functions are chosen for
generating the partition, because their sub-level sets are positive invariant; hence, it is
possible to determine a priori whether the abstraction is sound or complete. Furthermore, for
certain systems it is possible to approximate the reachable set of the continuous system
arbitrarily closely by the timed automaton. The structure added to the abstraction by
partitioning the state space by positive invariant sets allows the timed automaton to be
composed of multiple timed automata. This makes it possible to parallelize the verification
process of the timed automaton.

Some methods require explicit solutions of the differential equations that describe the
continuous dynamics, which are usually unknown. In contrast, the proposed method relies
only on the Lyapunov functions and their derivatives, which can be calculated.

The proposed abstraction is applied to an example, which illustrates how sound and
complete abstractions are generated.
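
The sketch below illustrates, for a one-dimensional linear system, the timing information such an abstraction exploits: between two sub-level sets of a Lyapunov function, bounds on the derivative of V yield lower and upper bounds on the sojourn time, which can serve as clock constraints of a timed-automaton location. The system, the Lyapunov function, and the level values are hypothetical simplifications, not the general construction on the poster.

<syntaxhighlight lang="python">
# Illustrative sketch (1-D linear system dx/dt = -a*x with Lyapunov function
# V(x) = x**2): between the sub-level sets {V <= c_hi} and {V <= c_lo}, bounds
# on |dV/dt| give minimum and maximum sojourn times for the corresponding cell.
import math

a = 1.0                           # dx/dt = -a*x, hence dV/dt = -2*a*V
levels = [4.0, 2.0, 1.0, 0.5]     # decreasing Lyapunov level values

for c_hi, c_lo in zip(levels, levels[1:]):
    vdot_min = 2 * a * c_lo           # slowest decrease of V on the cell
    vdot_max = 2 * a * c_hi           # fastest decrease of V on the cell
    t_min = (c_hi - c_lo) / vdot_max  # lower bound on time spent in the cell
    t_max = (c_hi - c_lo) / vdot_min  # upper bound on time spent in the cell
    t_exact = math.log(c_hi / c_lo) / (2 * a)   # known here, shown for reference
    print(f"cell {c_lo} <= V <= {c_hi}: "
          f"t in [{t_min:.3f}, {t_max:.3f}] (exact {t_exact:.3f})")
</syntaxhighlight>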

== Lei Song, Flemming Nielson, and Bo Friis Nielson. [http://wiki.ercim.eu/wg/MLQA/images/9/94/Submission_10.pdf A Broadcast Based Stochastic Calculus]. ==

In our current work we give a stochastic calculus based on broadcast communication. Only
broadcast actions are associated with rates, while all reception actions are passive and
are assigned weights. By doing so there is no need to handle synchronisation between
actions with different rates. We also give a labelled transition system for our calculus,
from which a CTMC can be obtained naturally. We show the usefulness of our calculus by
giving several examples from the performance analysis setting, such as the batch Markovian
arrival process (BMAP), the marked Markovian arrival process (MMAP) and closed queueing networks.
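
As a small illustration of how passive weights can induce CTMC rates (using the standard PEPA-style apportionment of an active rate over passive weights, which may differ in detail from the rule of the calculus on the poster; the names and numbers are hypothetical):

<syntaxhighlight lang="python">
# Illustrative sketch: one common way passive weights induce CTMC rates.  A
# broadcast is performed at rate r; a receiver offering several alternative
# passive receptions with weights w_i takes alternative i with probability
# w_i / sum(w), so the induced transition rates are r * w_i / sum(w).
def induced_rates(broadcast_rate, weights):
    total = sum(weights.values())
    return {alt: broadcast_rate * w / total for alt, w in weights.items()}

rates = induced_rates(broadcast_rate=3.0,
                      weights={"handle_normally": 2.0, "drop_message": 1.0})
for alternative, rate in rates.items():
    print(f"{alternative}: rate {rate:.2f}")
</syntaxhighlight>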

== Massimo Callisto De Donato, Flavio Corradini, Maria Rita Di Berardini, Emanuela Merelli, and Luca Tesei. [http://wiki.ercim.eu/wg/MLQA/images/0/00/Submission_11.pdf Tailoring the Shape Calculus for Quantitative Analysis]. ==

The Shape Calculus, a bio-inspired language, is a non-deterministic calculus in which processes
are composed of a 3D shape moving in a space and of a behaviour. A process behaviour is formally
expressed in a CCS-like process algebra with deterministic time where channels - the active sites -
are particular surfaces of the shape. A site can bind, after a proper collision, to another compatible
channel, forming a more complex shape and a new process. Alongside the introduction of the
Shape Calculus we defined BioShape, an environment in which a network of Shape Calculus processes,
together with more specific information about motion and behaviour, can be simulated. In particular,
we showed that such a tool is suitable for representing a biological phenomenon at different spatial
and temporal scales and can be used to perform in silico experiments.

Our aim is to complete the Shape Calculus by providing verification techniques at the specification level.
The objective is to find the right abstractions and boundaries that permit the application of existing
quantitative model checking or quantitative equivalence checking techniques to the evolution of a
given network of Shape Calculus processes. Suitable logic languages for specifying the properties
must also be identified. In this way we intend to improve the Shape Calculus and to couple it with
the BioShape tool, so that the two work in synergy to obtain quantitative information about
simulated environments.

Go back to [[MLQA 2010|MLQA meeting at FLoC 2010]]
