MLQA 2013


Models and Logics for Quantitative Analysis

March 24, 2013, Rome, Italy

 

Overview

The fifth annual meeting of the MLQA working group will take place on Sunday 24 March 2013 as part of the European Joint Conferences on Theory and Practice of Software (ETAPS) in Rome, Italy. This year's edition will be held in parallel with the Quantitative Aspects of Programming Languages and Systems (QAPL) workshop. Invited speakers will be shared between the two events.



Invited speakers

  • Martin Fränzle, University of Oldenburg
    A tight integration of symbolic, numeric, and statistical methods for the analysis of cyber-physical systems
  • Paolo Zuliani, Newcastle University
    Statistical Model Checking for Cyber-Physical Systems
  • Flemming Nielson, Technical University of Denmark
    Guarding against Denial of Service Attacks

Abstracts of the invited talks are available below.


Preliminary Programme

  • 09:00 - 10:00 Invited talk: A tight integration of symbolic, numeric, and statistical methods for the analysis of cyber-physical systems, by Martin Fränzle
  • 10:00 - 11:00 Invited talk: Statistical Model Checking for Cyber-Physical Systems, by Paolo Zuliani
  • 11:00 - 11:30 Coffee break
  • 11:30 - 12:30 Invited talk: Guarding against Denial of Service Attacks, by Flemming Nielson

Organiser



Minutes

Minutes of MLQA 2013 will be available after the workshop.



Abstracts of Invited Talks

  • Martin Fränzle, University of Oldenburg
    Title: A tight integration of symbolic, numeric, and statistical methods for the analysis of cyber-physical systems
    Abstract: Cyber-physical systems emerge in various application domains, among them transportation and power networks, provoking a quest for quantitatively analyzing them with respect to multiple viewpoints, including functional safety, reliability, and the interaction of the geometric perspective with the former two. Based on examples from various domains under different viewpoints, we identify stochastic hybrid systems as a model able to represent the dynamics relevant to these different viewpoints. We suggest a symbolic representation of their analysis problems based on an appropriate hybrid logic and present analysis tools automatically discharging the proof obligations arising by a blend of symbolic, safe numeric, and statistical reasoning.
  • Paolo Zuliani, Newcastle University
    Title: Statistical Model Checking for Cyber-Physical Systems
    Abstract: Statistical model checking is a simulation-based approach for verifying temporal logic properties of complex stochastic systems. Traditional model checking techniques work by efficiently searching the entire state space of the system. However, many interesting systems (e.g., cyber-physical systems) have exceedingly large, or even infinite, state spaces. This makes techniques based on exhaustive search unfeasible. Statistical model checking is instead based on sampling and system simulation, and thereby scales better with system size. We present a statistical model checking approach based on Bayesian statistics, and we apply it to cyber-physical systems coded as Stateflow/Simulink models. Next, we illustrate recent developments involving rare events and non-deterministic systems. We conclude by discussing the strengths and weaknesses of the approach and possible directions for future research.

  • Flemming Nielson, Technical University of Denmark
    Title: Guarding against Denial of Service Attacks
    Abstract: We outline an approach to the design of systems ensuring that the best possible operation is guaranteed even in the presence of denial of service attacks. As our vehicle of exposition we choose a process calculus that allows us to time out on expected inputs, to base subsequent actions on the extent to which expected inputs were successful, and to suitably encrypt the messages exchanged. This is accompanied by an array of analysis techniques for determining the extent to which suitable caution has been exercised: a robustness analysis for ensuring that default data are available when actual data are missing, a safety and security analysis for additionally ensuring that the flow of information does not violate integrity models like the Biba model, and a quantitative analysis for determining the probability distributions over the trustworthiness of the data upon which decisions are made.
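
The statistical model checking approach outlined in Paolo Zuliani's abstract above can be illustrated with a minimal sketch. The sketch below is not the speaker's tool or algorithm; it only shows the general idea of Bayesian statistical model checking on a toy system: each simulation yields a Bernoulli outcome for the property of interest, and a Beta prior over the satisfaction probability is updated from the observed outcomes. The functions simulate_trace and holds are hypothetical stand-ins for a system simulator and a property checker.

 # Minimal sketch of Bayesian statistical model checking (illustrative only,
 # not the approach or tool presented in the talk). Requires Python 3.8+.
 import random
 from math import sqrt
 
 def simulate_trace(steps=100):
     # Hypothetical toy system: a Gaussian random walk.
     x = 0.0
     return [x := x + random.gauss(0, 1) for _ in range(steps)]
 
 def holds(trace, threshold=15.0):
     # Safety property: the walk never exceeds the threshold.
     return max(trace) < threshold
 
 def bayesian_smc(num_samples=1000, alpha=1.0, beta=1.0):
     # Each simulated trace gives a Bernoulli outcome for the property.
     # With a Beta(alpha, beta) prior on the satisfaction probability p,
     # the posterior after n samples with k successes is Beta(alpha+k, beta+n-k).
     k = sum(holds(simulate_trace()) for _ in range(num_samples))
     a, b = alpha + k, beta + num_samples - k
     mean = a / (a + b)
     std = sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
     return mean, std
 
 mean, std = bayesian_smc()
 print(f"Posterior mean of P(property holds): {mean:.3f} (+/- {std:.3f})")

In practice the number of samples is not fixed in advance but chosen adaptively (for example via sequential tests) to reach a desired level of confidence; the fixed-sample version above is kept deliberately simple.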