[1]  David Monniaux. Réduction de réseaux et factorisation. rapport de stage de première année de magistère, École normale supérieure de Lyon, 1996. French. [ bib  .ps.gz ] 
[2]  David Monniaux. Méthodes formelles et cryptographie. rapport de stage de seconde année de magistère, École normale supérieure de Lyon, 1997. French (introduction), English (scientific matters). [ bib  .dvi.gz  .pdf ] 
[3]  David Monniaux. Réalisation mécanisée d'interpréteurs abstraits. Rapport de DEA, Université Paris VII, 1998. French. [ bib  .dvi.gz  .pdf ] 
[4] 
David Monniaux.
Decision procedures for the analysis of cryptographic protocols by
logics of belief.
In 12th Computer Security Foundations Workshop, pages 44–54.
IEEE, 1999.
[ bib 
DOI 
.ps.gz 
.pdf ]
Belief-logic deductions are used in the analysis of cryptographic protocols. We show a new method to decide such logics. In addition to the familiar BAN logic, it is also applicable to more advanced protocol security logics, GNY in particular; it employs an efficient forward-chaining algorithm, the completeness and termination of which are proved. Theoretical proofs, implementation decisions and results are discussed.

[5] 
David Monniaux.
Abstracting cryptographic protocols with tree automata.
In Sixth International Static Analysis Symposium (SAS'99),
number 1694 in Lecture Notes in Computer Science, pages 149–163. Springer
Verlag, 1999.
[ bib 
.ps.gz 
.pdf ]
Cryptographic protocols have so far been analyzed for the most part by means of testing (which does not yield proofs of secrecy) and theorem proving (costly). We propose a new approach, based on abstract interpretation and using regular tree languages. The abstraction we use seems fine-grained enough to be able to certify some protocols. Both the concrete and abstract semantics of the protocol description language and implementation issues are discussed in the paper.

[6] 
David Monniaux.
Abstract interpretation of probabilistic semantics.
In Seventh International Static Analysis Symposium (SAS'00),
number 1824 in Lecture Notes in Computer Science, pages 322–339. Springer
Verlag, 2000.
[ bib 
DOI 
.ps.gz 
.pdf ]
Following earlier models, we lift standard deterministic and nondeterministic semantics of imperative programs to probabilistic semantics. This semantics allows for random external inputs of known or unknown probability and random number generators. We then propose a method of analysis of programs according to this semantics, in the general framework of abstract interpretation. This method lifts an “ordinary” abstract lattice, for non-probabilistic programs, to one suitable for probabilistic programs. Our construction is highly generic. We discuss the influence of certain parameters on the precision of the analysis, basing ourselves on experimental results.

[7] 
David Monniaux.
An abstract Monte-Carlo method for the analysis of probabilistic
programs (extended abstract).
In 28th Symposium on Principles of Programming Languages (POPL
'01), pages 93–101. Association for Computing Machinery, 2001.
[ bib 
DOI 
arXiv 
.ps.gz 
.pdf ]
We introduce a new method, a combination of random testing and abstract interpretation, for the analysis of programs featuring both probabilistic and non-probabilistic nondeterminism. After introducing “ordinary” testing, we show how to combine testing and abstract interpretation and give formulas linking the precision of the results to the number of iterations. We then discuss complexity and optimization issues and end with some experimental results.

[8] 
David Monniaux.
Backwards abstract interpretation of probabilistic programs.
In European Symposium on Programming Languages and Systems (ESOP
'01), number 2028 in Lecture Notes in Computer Science, pages 367–382.
Springer Verlag, 2001.
[ bib 
DOI 
.ps.gz 
.pdf ]
We introduce an extension of the framework of backwards abstract interpretation to probabilistic programs. This extension is proved to be sound with respect to a semantics that we introduce and prove to be equivalent to the standard semantics of probabilistic programs. We then propose a generic construction lifting ordinary abstract interpretation lattices to probabilistic programs.

[9] 
David Monniaux.
An abstract analysis of the probabilistic termination of programs.
In 8th International Static Analysis Symposium (SAS'01), number
2126 in Lecture Notes in Computer Science, pages 111–126. Springer Verlag,
2001.
[ bib 
DOI 
.ps.gz 
.pdf ]
It is often useful to introduce probabilistic behavior in programs, either because of the use of internal random generators (probabilistic algorithms), or because of some external devices (networks, physical sensors) with known statistics of behavior. Previous work on probabilistic abstract interpretation has addressed safety properties but has somewhat neglected probabilistic termination. In this paper, we propose a method to automatically prove the probabilistic termination of programs using exponential bounds on the tail of the distribution. We apply this method to an example and give some directions as to how to implement it. We also show that this method can be applied to make otherwise unsound statistical methods on average running times sound.

[10] 
David Monniaux.
Analyse de programmes probabilistes par interprétation
abstraite.
Thèse de doctorat, Université Paris IX Dauphine, 2001.
Résumé étendu en français. Contents in English.
[ bib 
.pdf ]
The study of probabilistic programs is of considerable interest for the validation of networking protocols, embedded systems, or simply for compiling optimizations. It is also a difficult matter, due to the undecidability of properties on infinite-state deterministic programs, as well as the difficulties arising from probabilistic aspects.

[11] 
David Monniaux.
Analysis of cryptographic protocols using logics of belief: an
overview.
Journal of Telecommunications and Information Technology,
4:57–67, 2002.
[ bib 
.pdf ]
When designing a cryptographic protocol or explaining it, one often uses arguments such as “since this message was signed by machine B, machine A can be sure it came from B” in informal proofs justifying how the protocol works. Since it is often easy, in such informal proofs, to overlook an essential assumption, such as a trust relation or the belief that a message is not a replay from a previous session, it seems desirable to write such proofs in a formal system. While such logics do not replace the recent techniques of automatic proofs of safety properties, they help in pointing out the weaknesses of the system. In this paper, we briefly present the BAN (Burrows–Abadi–Needham) formal system as well as some derivatives. We show how to prove some properties of a simple protocol, as well as how to detect undesirable assumptions. We then explain how the manual search for proofs can be made automatic. Finally, we explain how the lack of proper semantics can be a bit worrying.

[12] 
B. Blanchet, P. Cousot, R. Cousot, J. Feret, L. Mauborgne, A.
Miné, D. Monniaux, and X. Rival.
Design and implementation of a special-purpose static program
analyzer for safety-critical real-time embedded software.
In T. Mogensen, D.A. Schmidt, and I.H. Sudborough, editors,
The Essence of Computation: Complexity, Analysis, Transformation,
number 2566 in Lecture Notes in Computer Science, pages 85–108. Springer
Verlag, 2002.
[ bib 
.ps.gz 
.pdf ]
We report on an experience in the design and implementation of a special-purpose static program analyzer for the verification of critical embedded real-time software.

[13] 
David Monniaux.
Abstraction of expectation functions using Gaussian distributions.
In Lenore D. Zuck, Paul C. Attie, Agostino Cortesi, and Supratik
Mukhopadhyay, editors, Verification, Model Checking, and Abstract
Interpretation: VMCAI '03, number 2575 in Lecture Notes in Computer Science,
pages 161–173. Springer Verlag, 2003.
[ bib 
DOI 
.ps.gz 
.pdf ]
We consider semantics of infinite-state programs, both probabilistic and nondeterministic, as expectation functions: for any set of states A, we associate to each program point a function mapping each state to its expectation of starting a trace reaching A. We then compute a safe upper approximation of these functions using abstract interpretation. This computation takes place in an abstract domain of extended Gaussian (normal) distributions.

[14] 
Bruno Blanchet, Patrick Cousot, Radhia Cousot, Jérôme Feret, Laurent
Mauborgne, Antoine Miné, David Monniaux, and Xavier Rival.
A static analyzer for large safety-critical software.
In ACM SIGPLAN Conference on Programming language design and
implementation (PLDI), pages 196–207. ACM, 2003.
[ bib 
DOI 
arXiv 
.ps.gz 
.pdf ]
We show that abstract interpretation-based static program analysis can be made efficient and precise enough to formally verify a class of properties for a family of large programs with few or no false alarms. This is achieved by refinement of a general-purpose static analyzer and later adaptation to particular programs of the family by the end-user through parametrization. This is applied to the proof of soundness of data manipulation operations at the machine level for periodic synchronous safety-critical embedded software. The main novelties are the design principle of static analyzers by refinement and adaptation through parametrization, the symbolic manipulation of expressions to improve the precision of abstract transfer functions, ellipsoid and decision tree abstract domains, all with sound handling of rounding errors in floating-point computations, widening strategies (with thresholds, delayed) and the automatic determination of the parameters (parametrized packing).

[15]  David Monniaux. Abstracting cryptographic protocols with tree automata. Science of Computer Programming, 47(2–3):177–202, 2003. [ bib  http  .ps.gz  .pdf ] 
[16] 
David Monniaux.
Abstract interpretation of programs as Markov decision processes.
In Static Analysis Symposium (SAS '03), number 2694 in Lecture
Notes in Computer Science, pages 237–254. Springer Verlag, 2003.
[ bib 
DOI 
.ps.gz 
.pdf ]
We propose a formal language for the specification of trace properties of probabilistic, nondeterministic transition systems, encompassing the properties expressible in Linear Time Logic. Those formulas are in general undecidable on infinite deterministic transition systems and thus on infinite Markov decision processes. This language has both a semantics in terms of sets of traces and another semantics in terms of measurable functions; we give and prove theorems linking the two semantics. We then apply abstract interpretation-based techniques to give upper bounds on the worst-case probability of the studied property. We propose an enhancement of this technique when the state space is partitioned (for instance, along the program points), allowing the use of faster iteration methods.

[17] 
David Monniaux.
Abstract interpretation of programs as Markov decision processes.
Science of Computer Programming, 58(1–2):179–205, October
2005.
[ bib 
http 
.ps.gz 
.pdf ]
We propose a formal language for the specification of trace properties of probabilistic, nondeterministic transition systems, encompassing the properties expressible in Linear Time Logic. Those formulas are in general undecidable on infinite deterministic transition systems and thus on infinite Markov decision processes. This language has both a semantics in terms of sets of traces and another semantics in terms of measurable functions; we give and prove theorems linking the two semantics. We then apply abstract interpretation-based techniques to give upper bounds on the worst-case probability of the studied property. We propose an enhancement of this technique when the state space is partitioned (for instance, along the program points), allowing the use of faster iteration methods.

[18] 
David Monniaux.
Compositional analysis of floating-point linear numerical filters.
In Computer-aided verification (CAV), number 3576 in Lecture
Notes in Computer Science, pages 199–212. Springer Verlag, 2005.
[ bib 
DOI 
.pdf ]
Digital linear filters are used in a variety of applications (sound treatment, control/command, etc.), implemented in software, in hardware, or a combination thereof. For safety-critical applications, it is necessary to bound all variables and outputs of all filters. We give a compositional, effective abstraction for digital linear filters expressed as block diagrams, yielding sound, precise bounds for fixed-point or floating-point implementations of the filters.

[19] 
Patrick Cousot, Radhia Cousot, Jérôme Feret, Laurent Mauborgne, Antoine
Miné, David Monniaux, and Xavier Rival.
The ASTRÉE analyzer.
In European symposium on programming (ESOP), number 3444 in
Lecture Notes in Computer Science, pages 21–30, 2005.
[ bib 
DOI 
.ps.gz 
.pdf ]
Astrée is an abstract interpretation-based static program analyzer aiming at proving automatically the absence of run-time errors in programs written in the C programming language. It has been applied with success to large embedded control-command safety-critical real-time software generated automatically from synchronous specifications, producing a correctness proof for complex software without any false alarm in a few hours of computation.

[20] 
David Monniaux.
The parallel implementation of the Astrée static analyzer.
In Kwangkeun Yi, editor, Programming Languages and Systems
(APLAS), number 3780 in Lecture Notes in Computer Science. Springer Verlag,
2005.
[ bib 
DOI 
arXiv 
.ps.gz 
.pdf ]
The Astrée static analyzer is a specialized tool that can prove the absence of runtime errors, including arithmetic overflows, in large critical programs. Keeping analysis times reasonable for industrial use is one of the design objectives. In this paper, we discuss the parallel implementation of the analysis.

[21]  David Monniaux and Jean-Baptiste Soufron. DRM as a dangerous alternative to copyright licences. Upgrade, 7(3), 2006. [ bib  .pdf ] 
[22] 
David Monniaux.
Optimal abstraction on realvalued programs.
In Gilberto Filé and Hanne Riis Nielson, editors, Static
analysis (SAS '07), number 4634 in Lecture Notes in Computer Science, pages
104–120. Springer Verlag, 2007.
[ bib 
DOI 
.ps.gz 
.pdf ]
In this paper, we show that it is possible to abstract program fragments over real variables using formulas in the theory of real closed fields. This abstraction is compositional and modular. We first propose an exact abstraction for programs without loops. Given an abstract domain (in a wide class including intervals and octagons), we then show how to obtain an optimal abstraction of program fragments with respect to that domain. This abstraction allows computing optimal fixed points inside that abstract domain, without the need for a widening operator.

[23] 
David Monniaux.
The pitfalls of verifying floatingpoint computations.
TOPLAS, 30(3):12, May 2008.
[ bib 
DOI 
arXiv 
http 
.pdf ]
Current critical systems commonly use a lot of floating-point computations, and thus the testing or static analysis of programs containing floating-point operators has become a priority. However, correctly defining the semantics of common implementations of floating-point is tricky, because semantics may change with many factors beyond source-code level, such as choices made by compilers. We here give concrete examples of problems that can appear and solutions to implement in analysis software.
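One of the semantic subtleties this line of work warns about can be shown in a minimal Python sketch (illustrative, not taken from the paper): IEEE-754 addition is not associative, so a compiler that reassociates an expression changes the computed result.

```python
# Illustrative example (not from the paper): floating-point addition
# is not associative, so reassociation changes the computed value.
a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c    # evaluates to 0.6000000000000001
right = a + (b + c)   # evaluates to 0.6
assert left != right                # the two orders disagree...
assert abs(left - right) < 1e-15    # ...by a rounding-sized amount
```

The discrepancy is tiny here, but in an analyzer or a verified program the mere fact that the result depends on evaluation order is what matters.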

[24] 
Patrick Cousot, Radhia Cousot, Jérôme Feret, Antoine Miné, Laurent Mauborgne,
David Monniaux, and Xavier Rival.
Varieties of static analyzers: A comparison with ASTRÉE.
In Theoretical Aspects of Software Engineering (TASE '07).
IEEE, 2007.
[ bib 
DOI ]
We discuss the characteristic properties of ASTRÉE, an automatic static analyzer for proving the absence of run-time errors in safety-critical real-time synchronous control/command C programs, and compare it with a variety of other program analysis tools.

[25] 
David Monniaux.
Verification of device drivers and intelligent controllers: a case
study.
In Christoph Kirsch and Reinhard Wilhelm, editors, EMSOFT,
pages 30–36. ACM & IEEE, 2007.
[ bib 
DOI 
.ps.gz 
.pdf ]
The soundness of device drivers generally cannot be verified in isolation, but has to take into account the reactions of the hardware devices. In critical embedded systems, interfaces were often simple “volatile” variables, and the interface specification typically a list of bounds on these variables. Some newer systems use “intelligent” controllers that handle dynamic worklists in shared memory and perform direct memory accesses, all asynchronously from the main processor. Thus, it is impossible to truly verify the device driver without taking the intelligent device into account, because incorrect programming of the device can lead to dire consequences, such as memory zones being erased. We have successfully verified a device driver extracted from a critical industrial system, asynchronously combined with a model of a USB OHCI controller. This paper studies this case and introduces a model and analysis techniques for this asynchronous composition.

[26]  Patrick Cousot, Radhia Cousot, Jérôme Feret, Laurent Mauborgne, Antoine Miné, David Monniaux, and Xavier Rival. Combination of abstractions in the Astrée static analyzer. In Advances in Computer Science: ASIAN 2006. Secure Software and Related Issues, number 4435 in Lecture Notes in Computer Science, pages 272–300. Springer Verlag, 2008. [ bib  DOI ] 
[27] 
David Monniaux.
A quantifier elimination algorithm for linear real arithmetic.
In LPAR (Logic for Programming Artificial Intelligence and
Reasoning), number 5330 in Lecture Notes in Computer Science, pages
243–257. Springer Verlag, 2008.
[ bib 
DOI 
arXiv 
.pdf ]
We propose a new quantifier elimination algorithm for the theory of linear real arithmetic. This algorithm uses satisfiability modulo this theory and polyhedral projection as subroutines; there are good algorithms and implementations for both of these. The quantifier elimination algorithm presented in the paper is compared, on examples arising from program analysis problems and on random examples, to several other implementations, each of which fails on some of the examples that our algorithm solves easily.

[28] 
David Monniaux.
Automatic modular abstractions for linear constraints.
In POPL (Principles of programming languages), pages 140–151.
ACM, 2009.
[ bib 
DOI 
arXiv 
.pdf ]
We propose a method for automatically generating abstract transformers for static analysis by abstract interpretation. The method focuses on linear constraints on programs operating on rational, real or floating-point variables and containing linear assignments and tests.

[29] 
David Monniaux.
On using floating-point computations to help an exact linear
arithmetic decision procedure.
In Computer-aided verification (CAV), number 5643 in Lecture
Notes in Computer Science, pages 570–583. Springer Verlag, 2009.
[ bib 
DOI 
.pdf ]
We consider the decision problem for quantifier-free formulas whose atoms are linear inequalities interpreted over the reals or rationals. This problem may be decided using satisfiability modulo theory (SMT), using a mixture of a SAT solver and a simplex-based decision procedure for conjunctions. State-of-the-art SMT solvers use simplex implementations over rational numbers, which perform well for typical problems arising from model checking and program analysis (sparse inequalities, small coefficients) but are slow for other applications (denser problems, larger coefficients).

[30] 
David Monniaux.
Analyse statique : de la théorie à la pratique.
Habilitation to direct research, Université Joseph Fourier,
Grenoble, France, July 2009.
[ bib 
.pdf ]
Software operating critical systems (aircraft, nuclear power plants) should not fail, whereas most computerised systems of daily life (personal computers, ticket vending machines, cell phones) fail from time to time. This is not a simple engineering problem: it has been known, since the works of Turing and Cook, that proving that programs work correctly is intrinsically hard.

[31] 
David Monniaux.
A minimalistic look at widening operators.
Higher-Order and Symbolic Computation, 22(2):145–154, December
2009.
[ bib 
DOI 
.pdf ]
We consider the problem of formalizing in higher-order logic the familiar notion of widening from abstract interpretation. It turns out that many axioms of widening (e.g. widening sequences are ascending) are not useful for proving correctness. After keeping only useful axioms, we give an equivalent characterization of widening as a lazily constructed well-founded tree. In type systems supporting dependent products and sums, this tree can be made to reflect the condition of correct termination of the widening sequence.
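For readers unfamiliar with widening, here is a minimal Python sketch (illustrative, not from the paper) of the classical interval widening: bounds that keep moving between iterations are pushed to infinity, which is what guarantees that ascending iteration sequences stabilize in finitely many steps.

```python
def widen(a, b):
    """Classical interval widening: keep stable bounds, push unstable
    ones to infinity so that the iteration sequence stabilizes."""
    (alo, ahi), (blo, bhi) = a, b
    lo = alo if alo <= blo else float("-inf")
    hi = ahi if ahi >= bhi else float("inf")
    return (lo, hi)

# Fixpoint iteration for a loop whose body increments a counter:
x = (0, 0)
for _ in range(3):
    x = widen(x, (x[0], x[1] + 1))  # abstract effect of one loop turn
assert x == (0, float("inf"))  # stable after one widening step
```

Without widening, the upper bound would grow by one forever; with it, the sequence reaches a post-fixpoint immediately, at the cost of precision.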

[32] 
David Monniaux.
Quantifier elimination by lazy model enumeration.
In Computer-aided verification (CAV), number 6174 in Lecture
Notes in Computer Science, pages 585–599. Springer Verlag, 2010.
[ bib 
DOI 
.pdf ]
We propose a quantifier elimination scheme based on nested lazy model enumeration through SMT-solving, and projections. This scheme may be applied to any logic that fulfills certain conditions; we illustrate it for linear real arithmetic. The quantifier elimination problem for linear real arithmetic is doubly exponential in the worst case, and so is our method. We have implemented it and benchmarked it against other methods from the literature.

[33] 
David Monniaux.
Automatic modular abstractions for template numerical constraints.
Logical Methods in Computer Science, June 2010.
[ bib 
DOI 
arXiv 
.pdf ]
We propose a method for automatically generating abstract transformers for static analysis by abstract interpretation. The method focuses on linear constraints on programs operating on rational, real or floating-point variables and containing linear assignments and tests. Given the specification of an abstract domain, and a program block, our method automatically outputs an implementation of the corresponding abstract transformer. It is thus a form of program transformation.

[34] 
Thomas Gawlitza and David Monniaux.
Improving strategies via SMT solving.
In Gilles Barthe, editor, ESOP, number 6602 in Lecture Notes in
Computer Science, pages 236–255. Springer Verlag, 2011.
[ bib 
DOI 
arXiv 
.pdf ]
We consider the problem of computing numerical invariants of programs by abstract interpretation. Our method eschews two traditional sources of imprecision: (i) the use of widening operators for enforcing convergence within a finite number of iterations and (ii) the use of merge operations (often, convex hulls) at the merge points of the control flow graph. It instead computes the least inductive invariant expressible in the domain at a restricted set of program points, and analyzes the rest of the code en bloc. We emphasize that we compute this inductive invariant precisely. For that, we extend the strategy improvement algorithm of [Gawlitza and Seidl, 2007]. If we applied their method directly, we would have to solve an exponentially sized system of abstract semantic equations, resulting in memory exhaustion. Instead, we keep the system implicit and discover strategy improvements using SAT modulo real linear arithmetic (SMT). For evaluating strategies we use linear programming. Our algorithm has low polynomial space complexity and performs, in the worst case and on contrived examples, exponentially many strategy improvement steps; this is unsurprising, since we show that the associated abstract reachability problem is Π₂ᵖ-complete.

[35] 
David Monniaux and Pierre Corbineau.
On the generation of Positivstellensatz witnesses in degenerate
cases.
In Marko Van Eekelen, Herman Geuvers, Julien Schmaltz, and Freek
Wiedijk, editors, Interactive Theorem Proving (ITP), volume 6898 of
Lecture Notes in Computer Science, pages 249–264. Springer Verlag,
August 2011.
[ bib 
DOI 
arXiv 
.pdf ]
One can reduce the problem of proving that a polynomial is nonnegative, or more generally of proving that a system of polynomial inequalities has no solutions, to finding polynomials that are sums of squares of polynomials and satisfy some linear equality (Positivstellensatz). This produces a witness for the desired property, from which it is reasonably easy to obtain a formal proof of the property suitable for a proof assistant such as Coq.

[36] 
David Monniaux and Laure Gonnord.
Using bounded model checking to focus fixpoint iterations.
In Eran Yahav, editor, Static analysis (SAS), volume 6887 of
Lecture Notes in Computer Science, pages 369–385. Springer Verlag,
2011.
[ bib 
DOI 
arXiv 
.pdf ]
Two classical sources of imprecision in static analysis by abstract interpretation are widening and merge operations. Merge operations can be done away with by distinguishing paths, as in trace partitioning, at the expense of enumerating an exponential number of paths. In this article, we describe how to avoid such systematic exploration by focusing on a single path at a time, designated by SMT-solving. Our method combines well with acceleration techniques, thus doing away with widenings as well in some cases. We illustrate it over the well-known domain of convex polyhedra.

[37] 
David Monniaux and Julien Le Guen.
Stratified static analysis based on variable dependencies.
In Third International Workshop on Numerical and Symbolic
Abstract Domains, 2011.
[ bib 
DOI 
arXiv 
.pdf ]
In static analysis by abstract interpretation, one often uses widening operators in order to enforce convergence within finite time to an inductive invariant. Certain widening operators, including the classical one over finite polyhedra, exhibit an unintuitive behavior: analyzing the program over a subset of its variables may lead to a more precise result than analyzing the original program! In this article, we present simple workarounds for such behavior.

[38] 
David Monniaux and Martin Bodin.
Modular abstractions of reactive nodes using disjunctive invariants.
In Programming Languages and Systems (APLAS), pages 19–33,
2011.
[ bib 
DOI 
arXiv 
.pdf ]
We wish to abstract nodes in a reactive programming language, such as Lustre, into nodes with a simpler control structure, with a bound on the number of control states. In order to do so, we compute disjunctive invariants in predicate abstraction, with a bounded number of disjuncts, then we abstract the node, each disjunct representing an abstract state. The computation of the disjunctive invariant is performed by a form of quantifier elimination expressed using SMT-solving.

[39] 
Diego Caminha Barbosa de Oliveira and David Monniaux.
Experiments on the feasibility of using a floating-point simplex in
an SMT solver.
In Pascal Fontaine, Renate A. Schmidt, and Stephan Schulz, editors,
Workshop on Practical Aspects of Automated Reasoning (PAAR), volume 21
of EPiC Series. Easychair, 2012.
[ bib 
http 
.pdf ]
SMT solvers use simplex-based decision procedures to solve decision problems whose formulas are quantifier-free and whose atoms are linear constraints over the rationals. State-of-the-art SMT solvers use rational (exact) simplex implementations, which have shown good performance for typical software, hardware or protocol verification problems over the years. Yet most other scientific and technical fields use (inexact) floating-point computations, which are deemed far more efficient than exact ones. It is therefore tempting to use a floating-point simplex implementation inside an SMT solver, though special precautions must be taken to avoid unsoundness.

[40] 
AnhDung Phan, Nikolaj Bjørner, and David Monniaux.
Anatomy of alternating quantifier satisfiability (work in progress).
In Pascal Fontaine and Amit Goel, editors, 10th International
Workshop on Satisfiability Modulo Theories (SMT), 2012.
[ bib 
.pdf ]
We report on work in progress to generalize an algorithm recently introduced for checking satisfiability of formulas with quantifier alternation. The algorithm uses two auxiliary procedures: a procedure for producing a candidate formula for quantifier elimination and a procedure for eliminating or partially eliminating quantifiers. We also apply the algorithm to Presburger arithmetic formulas and evaluate it on formulas from a model checker for Duration Calculus. We report on experiments on different variants of the auxiliary procedures. So far, there is an edge to applying SMTTEST, proposed in [Monniaux, CAV 2010], while we found that a simpler approach, which just eliminates quantified variables per round, is almost as good. Both approaches offer drastic improvements over default quantifier elimination.

[41] 
Julien Henry, David Monniaux, and Matthieu Moy.
Succinct representations for abstract interpretation.
In Static analysis (SAS), volume 7460 of Lecture Notes in
Computer Science, pages 283–299. Springer Verlag, 2012.
[ bib 
DOI 
arXiv 
.pdf ]
Abstract interpretation techniques can be made more precise by distinguishing paths inside loops, at the expense of possibly exponential complexity. SMT-solving techniques and sparse representations of paths and sets of paths avoid this pitfall.

[42] 
Julien Henry, David Monniaux, and Matthieu Moy.
PAGAI: a path-sensitive static analyzer.
In Bertrand Jeannet, editor, Tools for Automatic Program
Analysis (TAPAS), 2012.
[ bib 
DOI 
arXiv 
.pdf ]
We describe the design and the implementation of PAGAI, a new static analyzer working over the LLVM compiler infrastructure, which computes inductive invariants on the numerical variables of the analyzed program.

[43] 
Thomas Gawlitza and David Monniaux.
Invariant generation through strategy iteration in succinctly
represented control flow graphs.
Logical Methods in Computer Science, September 2012.
[ bib 
DOI 
arXiv ]
We consider the problem of computing numerical invariants of programs, for instance bounds on the values of numerical program variables. More specifically, we study the problem of performing static analysis by abstract interpretation using template linear constraint domains. Such invariants can be obtained by Kleene iterations that are, in order to guarantee termination, accelerated by widening operators. In many cases, however, applying this form of extrapolation leads to invariants that are weaker than the strongest inductive invariant that can be expressed within the abstract domain in use. Another wellknown source of imprecision of traditional abstract interpretation techniques stems from their use of join operators at merge nodes in the control flow graph. The mentioned weaknesses may prevent these methods from proving safety properties.

[44]  David Monniaux. Applying the Z-transform for the static analysis of floating-point numerical filters. Draft of a possible extended version of the CAV 2005 paper, 2005. [ bib  arXiv ] 
[45] 
David Monniaux.
Introduction à la calculabilité.
Quadrature, 86:17–28, 2012.
[ bib 
.pdf ]
Summary: Whether or not diophantine equations (that is, polynomial equations with integer coefficients and solutions) have solutions does not seem, a priori, to be related to bugs in computer programs. Computability theory establishes such a link: a classical result is that the problem of finding out whether a system of diophantine equations has a solution is the same as determining whether the execution of a computer program will eventually stop. This article introduces computability theory: the difficulty of defining the class of computable functions (primitive recursive functions are insufficient), Turing machines, Church's thesis, and classical results (the halting problem, Rice's theorem, etc.). Keywords: computable functions; Turing; diophantine equations; Turing machines 

[46] 
Alexis Fouilhé, David Monniaux, and Michaël Périn.
Efficient generation of correctness certificates for the abstract
domain of polyhedra.
In Static analysis (SAS), 2013.
[ bib 
DOI 
arXiv 
.pdf ]
Polyhedra form an established abstract domain for inferring runtime properties of programs using abstract interpretation. Computations on them need to be certified for the whole static analysis results to be trusted. In this work, we look at how far we can get down the road of a posteriori verification to lower the overhead of certification of the abstract domain of polyhedra. We demonstrate methods for making the cost of inclusion certificate generation negligible. From a performance point of view, our single-representation, constraints-based implementation compares with state-of-the-art implementations.

[47] 
Thomas Braibant, Jacques-Henri Jourdan, and David Monniaux.
Implementing hash-consed data structures in Coq.
In Interactive theorem proving (ITP), volume 7998 of
Lecture Notes in Computer Science. Springer Verlag, 2013.
[ bib 
DOI 
arXiv ]
We report on three different approaches to using hash-consing in programs certified with the Coq system, using binary decision diagrams (BDD) as a running example. The use cases include execution inside Coq, or execution of the extracted OCaml code. There are different trade-offs between faithful use of pristine extracted code, and code that is fine-tuned to make use of OCaml programming constructs not available in Coq. We discuss the possible consequences in terms of performance and guarantees.
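Hash-consing, the technique at stake here, stores every node in a global table so that structurally equal values are represented by the same object, making equality tests constant-time. A minimal Python sketch of a hash-consed BDD node (illustrative only; the paper's code is Coq and extracted OCaml, and a production version would use weak references so unused nodes can be garbage-collected):

```python
TRUE, FALSE = object(), object()   # terminal BDD nodes

class BDD:
    """Hash-consed BDD nodes: constructing the same node twice yields the
    same Python object, so structural equality is pointer equality."""
    _table = {}                    # the hash-consing table
    __slots__ = ("var", "low", "high")

    def __new__(cls, var, low, high):
        # Children are themselves hash-consed, so their ids identify them.
        key = (var, id(low), id(high))
        node = cls._table.get(key)
        if node is None:
            node = super().__new__(cls)
            node.var, node.low, node.high = var, low, high
            cls._table[key] = node
        return node

a = BDD("x", FALSE, BDD("y", FALSE, TRUE))
b = BDD("x", FALSE, BDD("y", FALSE, TRUE))
```

Here `a is b` holds: the two constructions share one object per distinct node, which is what makes memoized operations on BDDs cheap.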

[48] 
Julien Henry, Mihail Asavoae, David Monniaux, and Claire Maiza.
How to compute worst-case execution time by optimization modulo
theory and a clever encoding of program semantics.
In Youtao Zhang and Prasad Kulkarni, editors, LCTES, pages
43-52. ACM, 2014.
[ bib 
DOI 
arXiv ]
In systems with hard real-time constraints, it is necessary to compute upper bounds on the worst-case execution time (WCET) of programs; the closer the bound to the real WCET, the better. This is especially the case for synchronous reactive control loops with a fixed clock; the WCET of the loop body must not exceed the clock period. We compute the WCET (or at least a close upper bound thereof) as the solution of an optimization modulo theory problem that takes into account the semantics of the program, in contrast to other methods that compute the longest path whether or not it is feasible according to these semantics. Optimization modulo theory extends satisfiability modulo theory (SMT) to maximization problems. Immediate encodings of WCET problems into SMT yield formulas intractable for all current production-grade solvers; this is inherent to the DPLL(T) approach to SMT implemented in these solvers. By conjoining some appropriate "cuts" to these formulas, we considerably reduce the computation time of the SMT solver. We experimented with our approach on a variety of control programs, using the OTAWA analyzer both as baseline and as underlying microarchitectural analysis for our analysis, and show notable improvement on the WCET bound on a variety of benchmarks and control programs.
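The gap between the syntactic longest path and a semantics-aware bound can be seen on a two-branch toy program (the costs and control structure below are made up for illustration, not taken from the paper's benchmarks):

```python
# Toy CFG: `if c: a_then else a_else; if c: b_then else b_else`.
# The syntactic longest path takes the expensive side of both branches,
# but the shared condition c makes that path infeasible.
COST = {"a_then": 10, "a_else": 2, "b_then": 2, "b_else": 10}

def path_cost(c):
    """Execution cost of the only two semantically feasible paths."""
    first = COST["a_then"] if c else COST["a_else"]
    second = COST["b_then"] if c else COST["b_else"]
    return first + second

# Longest syntactic path: expensive side of each branch, ignoring feasibility.
syntactic_wcet = max(COST["a_then"], COST["a_else"]) + max(COST["b_then"], COST["b_else"])
# Semantics-aware bound: maximize only over feasible executions.
semantic_wcet = max(path_cost(c) for c in (True, False))
```

Here the syntactic bound is 20 while the semantics-aware bound is 12; the paper obtains the latter kind of bound by encoding the program semantics into an optimization modulo theory problem rather than enumerating paths.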

[49] 
Thomas Braibant, Jacques-Henri Jourdan, and David Monniaux.
Implementing and reasoning about hash-consed data structures in
Coq.
Journal of Automated Reasoning, pages 1-34, June 2014.
[ bib 
DOI 
arXiv ]
We report on four different approaches to implementing hash-consing in Coq programs. The use cases include execution inside Coq, or execution of the extracted OCaml code. We explore the different trade-offs between faithful use of pristine extracted code, and code that is fine-tuned to make use of OCaml programming constructs not available in Coq. We discuss the possible consequences in terms of performance and guarantees. We use the running example of binary decision diagrams and then demonstrate the generality of our solutions by applying them to other examples of hash-consed data structures. Keywords: Coq; Hash-consing; Binary decision diagrams 
[50] 
David Monniaux and Peter Schrammel.
Speeding up logico-numerical strategy iteration.
In Markus Müller-Olm and Helmut Seidl, editors, Static
Analysis - 21st International Symposium, SAS 2014, Munich, Germany,
September 11-13, 2014. Proceedings, volume 8723 of Lecture Notes in
Computer Science, pages 253-267. Springer, 2014.
[ bib 
DOI 
arXiv ]
We introduce an efficient combination of polyhedral analysis and predicate partitioning. Template polyhedral analysis abstracts numerical variables inside a program by one polyhedron per control location, with a priori fixed directions for the faces. The strongest inductive invariant in such an abstract domain may be computed by upward strategy iteration. If the transition relation includes disjunctions and existential quantifiers (a succinct representation for an exponential set of paths), this invariant can be computed by a combination of strategy iteration and satisfiability modulo theory (SMT) solving. Unfortunately, the above approaches lead to unacceptable space and time costs if applied to a program whose control states have been partitioned according to predicates. We therefore propose a modification of the strategy iteration algorithm where the strategies are stored succinctly, and the linear programs to be solved at each iteration step are simplified according to an equivalence relation. We have implemented the technique in a prototype tool and we demonstrate on a series of examples that the approach performs significantly better than previous strategy iteration techniques.

[51] 
Francesco Alberti and David Monniaux.
Polyhedra to the rescue of array interpolants.
In ACM Symposium on Applied Computing, software verification and
testing track, pages 1745-1750. ACM, 2015.
[ bib 
DOI 
arXiv ]
We propose a new approach to the automated verification of the correctness of programs handling arrays. An abstract interpreter supplies auxiliary numeric invariants to an interpolation-based refinement procedure suited to array programs. Experiments show that this combination approach, implemented in an enhanced version of the Booster software model-checker, performs better than the pure interpolation-based approach, at no additional cost.

[52] 
Laure Gonnord, David Monniaux, and Gabriel Radanne.
Synthesis of ranking functions using extremal counterexamples.
In Programming Language Design and Implementation (PLDI), pages
608-618. ACM, 2015.
[ bib 
DOI 
arXiv ]
We present a complete method for synthesizing lexicographic linear ranking functions (and thus proving termination), supported by inductive invariants, in the case where the transition relation of the program includes disjunctions and existentials (large block encoding of control flow). Previous work would either synthesize a ranking function at every basic block head, not just loop headers, which reduces the scope of programs that may be proved to be terminating, or expand large block transitions including tests into (exponentially many) elementary transitions, prior to computing the ranking function, resulting in a very large global constraint system. In contrast, our algorithm incrementally refines a global linear constraint system according to extremal counterexamples: only constraints that exclude spurious solutions are included. Experiments with our tool Termite show marked performance and scalability improvements compared to other systems.
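A lexicographic linear ranking function proves termination by mapping each state to a tuple that is bounded below on loop entry and strictly decreases (in lexicographic order) on every transition. A small illustrative check in Python; the loop and the candidate function are made-up examples, not taken from the paper or the Termite tool, and this sketch only *checks* a given candidate on concrete states, whereas the paper *synthesizes* one from extremal counterexamples via constraint solving:

```python
import itertools

def rank(x, y):
    return (x, y)       # candidate lexicographic linear ranking function

def successors(x, y):
    """Transitions of: while x > 0 and y > 0: either {x -= 1; y = 5} or {y -= 1}."""
    if x > 0 and y > 0:
        yield (x - 1, 5)    # first branch may reset y: the x component carries the proof
        yield (x, y - 1)    # second branch decreases y and leaves x alone

def check(states):
    """Check boundedness and strict lexicographic decrease on sample states."""
    for x, y in states:
        for nx, ny in successors(x, y):
            assert rank(x, y) >= (0, 0)       # bounded below inside the loop
            assert rank(nx, ny) < rank(x, y)  # strictly decreases on each step
    return True
```

`check(itertools.product(range(8), range(8)))` passes: the first branch decreases the leading component `x`, so the reset of `y` is harmless, which is what the lexicographic order captures.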

[53] 
David Monniaux and Francesco Alberti.
A simple abstraction of arrays and maps by program translation.
In Static analysis (SAS), volume 9291 of Lecture Notes in
Computer Science, pages 217-234. Springer Verlag, 2015.
[ bib 
DOI 
arXiv ]
We present an approach for the static analysis of programs handling arrays, with a Galois connection between the semantics of the array program and the semantics of purely scalar operations. The simplest way to implement it is by automatic, syntactic transformation of the array program into a scalar program, followed by analysis of the scalar program with any static analysis technique (abstract interpretation, acceleration, predicate abstraction, ...). The scalar invariants thus obtained are translated back onto the original program as universally quantified array invariants. We illustrate our approach on a variety of examples, leading to the “Dutch flag” algorithm.
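The translation can be demonstrated on an array-fill loop: fix an arbitrary symbolic cell index `k`, replace the array by a single scalar standing for `a[k]`, and update that scalar only when the loop writes to index `k`. A toy Python rendering of the transformed program (illustrative only, not the paper's formal transformation):

```python
def scalar_abstraction(n, k):
    """Scalar translation of: for i in range(n): a[i] = 0.
    The array is replaced by one scalar tracking the distinguished cell a[k]."""
    defined, a_k = False, None       # a[k] has not been written yet
    for i in range(n):
        if i == k:                   # the write a[i] = 0 hits the observed cell
            defined, a_k = True, 0
    return defined, a_k
```

Because the conclusion `a[k] == 0` holds for an arbitrary `k < n`, the scalar invariant reads back as the universally quantified array invariant: for all k < n, a[k] = 0.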

[54] 
Egor George Karpenkov, David Monniaux, and Philipp Wendler.
Program analysis with local policy iteration.
In Barbara Jobstmann and K. Rustan M. Leino, editors,
Verification, Model Checking, and Abstract Interpretation (VMCAI), volume
9583 of Lecture Notes in Computer Science, pages 127-146. Springer
Verlag, January 2016.
[ bib 
DOI 
arXiv 
http ]
We present a new algorithm for deriving numerical invariants that combines the precision of max-policy iteration with the flexibility and scalability of conventional Kleene iterations. It is defined in the Configurable Program Analysis (CPA) framework, thus allowing inter-analysis communication. It uses adjustable-block encoding in order to traverse loop-free program sections, possibly containing branching, without introducing extra abstraction. Our technique operates over any template linear constraint domain, including the interval and octagon domains; templates can also be derived from the program source. The implementation is evaluated on a set of benchmarks from the Software Verification Competition (SV-COMP). It competes favorably with state-of-the-art analyzers.

[55] 
Alexandre Maréchal, Alexis Fouilhé, Tim King, David Monniaux, and
Michaël Périn.
Polyhedral approximation of multivariate polynomials using
Handelman's theorem.
In Barbara Jobstmann and K. Rustan M. Leino, editors,
Verification, Model Checking, and Abstract Interpretation (VMCAI), volume
9583 of Lecture Notes in Computer Science, pages 166-184. Springer
Verlag, January 2016.
[ bib 
DOI 
arXiv 
http ]
Convex polyhedra are commonly used in the static analysis of programs to represent overapproximations of sets of reachable states of numerical program variables. When the analyzed programs contain nonlinear instructions, they do not directly map to standard polyhedral operations: some kind of linearization is needed. Convex polyhedra are also used in satisfiability modulo theory solvers, which combine a propositional satisfiability solver with a fast emptiness check for polyhedra. Existing decision procedures become expensive when nonlinear constraints are involved: a fast procedure to ensure emptiness of systems of nonlinear constraints is needed. We present a new linearization algorithm based on Handelman's representation of positive polynomials. Given a polyhedron and a polynomial (in)equality, we compute a polyhedron enclosing their intersection as the solution of a parametric linear programming problem. To get a scalable algorithm, we provide several heuristics that guide the construction of Handelman's representation. To ensure the correctness of our polyhedral approximation, our OCaml implementation generates certificates verified by a checker certified in Coq.
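Handelman's theorem represents a polynomial that is nonnegative on a polytope {C_i ≥ 0} as a nonnegative combination of products of the constraints C_i, which is what turns a polynomial guard into linear ones. A tiny worked instance (a made-up example, unrelated to the implementation described above): on the box 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, the degree-2 Handelman product (1-x)(1-y) ≥ 0 expands to 1 - x - y + xy ≥ 0, i.e. the linear lower bound xy ≥ x + y - 1.

```python
from fractions import Fraction  # exact rationals avoid spurious float failures

# Handelman product of the box constraints (1 - x) >= 0 and (1 - y) >= 0:
# (1 - x)(1 - y) = 1 - x - y + x*y >= 0, hence the linear bound below.
def handelman_lower_bound(x, y):
    return x + y - 1

def check_on_box(step=10):
    """Verify x*y >= x + y - 1 on a rational grid over the unit box."""
    for i in range(step + 1):
        for j in range(step + 1):
            x, y = Fraction(i, step), Fraction(j, step)
            assert x * y >= handelman_lower_bound(x, y)
    return True
```

The grid check is only a sanity test; the bound holds on the whole box since x·y - (x + y - 1) = (1-x)(1-y) ≥ 0 there.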

[56] 
David Monniaux.
A survey of satisfiability modulo theory.
In Computer Algebra in Scientific Computing, volume 9890 of
Lecture Notes in Computer Science. Springer Verlag, September 2016.
[ bib 
arXiv 
http ]
Satisfiability modulo theory (SMT) consists in testing the satisfiability of first-order formulas over linear integer or real arithmetic, or other theories. In this survey, we explain the combination of propositional satisfiability and decision procedures for conjunctions known as DPLL(T), and the alternative “natural domain” approaches. We also cover quantifiers, Craig interpolants, polynomial arithmetic, and how SMT solvers are used in automated software analysis.
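The propositional core of DPLL(T) surveyed here, decide a literal, propagate unit clauses, backtrack on conflict, fits in a few lines of Python. This is a plain SAT sketch with no theory solver attached, for illustration only:

```python
def dpll(clauses, assignment=None):
    """clauses: list of lists of nonzero ints (negative = negated variable).
    Returns a satisfying set of true literals, or None if unsatisfiable."""
    assignment = set() if assignment is None else assignment
    # Unit propagation: a clause with a single unfalsified literal forces it.
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(l in assignment for l in clause):
                continue                      # clause already satisfied
            free = [l for l in clause if -l not in assignment]
            if not free:
                return None                   # conflict: clause falsified
            if len(free) == 1:
                assignment.add(free[0])
                changed = True
    # Decision: branch on the first unassigned literal, backtrack on failure.
    for clause in clauses:
        for l in clause:
            if l not in assignment and -l not in assignment:
                for lit in (l, -l):
                    result = dpll(clauses, assignment | {lit})
                    if result is not None:
                        return result
                return None
    return assignment                         # every clause is satisfied
```

For instance `dpll([[1, 2], [-1, 2], [-2]])` returns `None` (unsatisfiable), while `dpll([[1, 2], [-1, 3], [-3]])` returns a model. DPLL(T) extends this loop by asking a theory solver whether the current set of literals is consistent, which is the combination the survey describes.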

[57] 
David Monniaux and Laure Gonnord.
Cell morphing: From array programs to array-free Horn clauses.
In Static analysis, volume 9837 of Lecture Notes in
Computer Science. Springer Verlag, September 2016.
[ bib 
arXiv 
http ]
Automatically verifying safety properties of programs is hard. Many approaches exist for verifying programs operating on Boolean and integer values (e.g. abstract interpretation, counterexample-guided abstraction refinement using interpolants), but transposing them to array properties has been fraught with difficulties. Our work addresses that issue with a powerful and flexible abstraction that morphs concrete array cells into a finite set of abstract ones. This abstraction is parametric both in precision and in the backend analysis used. From our programs with arrays, we generate nonlinear Horn clauses over scalar variables only, in a common format with clear and unambiguous logical semantics, for which several solvers exist. We thus avoid the use of solvers operating over arrays, which are still very immature. Experiments with our prototype VAPHOR show that this approach can prove automatically and without user annotations the functional correctness of several classical examples, including selection sort, bubble sort, insertion sort, as well as examples from the literature on array analysis.

[58] 
Egor George Karpenkov and David Monniaux.
Formula slicing: Inductive invariants from preconditions.
In Roderick Bloem and Eli Arbel, editors, Hardware and Software:
Verification and Testing (Haifa Verification Conference), volume 10028 of
Lecture Notes in Computer Science. Springer Verlag, November 2016.
[ bib 
http ]
We propose a “formula slicing” method for finding inductive invariants. It is based on the observation that many loops in the program affect only a small part of the memory, and many invariants which were valid before a loop are still valid after.
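This observation can be turned into an algorithm sketch: take the conjuncts of a formula that holds before the loop, and iteratively drop every conjunct that some iteration fails to preserve; the surviving conjunction is inductive. A toy Python version over sampled concrete states (the paper performs this check symbolically with an SMT solver; the loop, predicates, and function names here are illustrative stand-ins):

```python
def slice_invariant(conjuncts, step, guard, samples):
    """Keep the largest subset of `conjuncts` that the loop body `step`
    preserves on the sampled states (a testing approximation of the
    symbolic inductiveness check)."""
    kept = list(conjuncts)
    changed = True
    while changed:
        changed = False
        for s in samples:
            if not guard(s) or not all(p(s) for p in kept):
                continue                  # state outside guard or candidate invariant
            t = step(s)
            for p in list(kept):
                if not p(t):              # conjunct not preserved: slice it away
                    kept.remove(p)
                    changed = True
    return kept

# Loop: while i < n: i += 1, started from i == 0 and n == 10.
guard = lambda s: s["i"] < s["n"]
step = lambda s: {"i": s["i"] + 1, "n": s["n"]}
conjuncts = [
    lambda s: s["i"] == 0,       # holds initially, not preserved by the loop
    lambda s: s["n"] == 10,      # preserved: the loop never touches n
    lambda s: s["i"] <= s["n"],  # preserved thanks to the guard
]
samples = [{"i": i, "n": 10} for i in range(12)]
```

Here `slice_invariant` drops `i == 0` and keeps `n == 10` and `i <= n`, whose conjunction is indeed inductive for the loop.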

[59]  Valentin Touzeau, Claire Maïza, and David Monniaux. Model checking of cache for WCET analysis refinement. In 10th Junior Researcher Workshop on Real-Time Computing, 2016. [ bib  .pdf ] 
[60]  Virginie Gautron and David Monniaux. De la surveillance secrète à la prédiction des risques : les dérives du fichage dans le champ de la lutte contre le terrorisme. Archives de politique criminelle, 38, 2016. [ bib ] 
[61]  Ahmed Bouajjani and David Monniaux, editors. Verification, Model Checking, and Abstract Interpretation (VMCAI). Springer Verlag, 2017. [ bib  DOI ] 
This file was generated by bibtex2html 1.98.