PADEC - Coq Library
Model and General Results about the Model
- Algorithm: network and algorithm definitions
- RelModel: semantics of the model (relational version)
- FunModel: semantics of the model (functional version and equivalence w.r.t. the relational semantics)
- Exec: execution of the system (type and support)
- Self_Stabilization: definition of the properties
- Fairness: definition of scheduling assumptions (daemon)
Tools for Termination or Convergence
- P_Q_Termination: tools for proving convergence of an algorithm. Relies on the Dershowitz-Manna order on finite multisets to define sufficient conditions on local potentials. These tools use the CoLoR library.
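For intuition only (outside the Coq development), the Dershowitz-Manna order on finite multisets of naturals can be sketched in Python: M dominates N when, after cancelling common elements, M keeps at least one element and every leftover element of N is strictly below some leftover element of M. The names below (`dm_greater`) are ours, not the library's.

```python
from collections import Counter

def dm_greater(m, n):
    """Dershowitz-Manna order on finite multisets of naturals:
    m > n iff n is obtained from m by removing a non-empty
    sub-multiset X and adding only elements strictly smaller
    than some element of X."""
    M, N = Counter(m), Counter(n)
    X = M - N                # elements of m not cancelled by n
    Y = N - M                # elements of n not cancelled by m
    if not X:                # nothing removed: m cannot be strictly greater
        return False
    # every leftover element of n must be dominated by a leftover of m
    return all(any(x > y for x in X) for y in Y.elements())

# A step replacing one potential 3 by two potentials 2 decreases the measure:
print(dm_greater([3, 1], [2, 2, 1]))  # → True
print(dm_greater([2, 2, 1], [3, 1]))  # → False
```

This is why a bounded local potential that only ever gets replaced by finitely many strictly smaller ones yields a well-founded global measure, hence termination.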
Tools for Composition
- Composition: collateral composition - definition, proof of correctness under weakly fair assumptions
- Compo_ex: example of how to use the composition operator, based on "Self-Stabilizing Small k-Dominating Sets"
Tools for Complexity
- Steps: step complexity. Tools to measure stabilization times (and other performances) in steps. Relies on Stream_Length
- Spanning_Tree: definition of tree topologies; in particular spanning tree.
- Ring: definition of ring topologies; in particular rooted unidirectional rings.
Toy examples of Network and Algorithm
Step Complexity: Dijkstra's first token ring algorithm
- Proof of the self-stabilization and specification
- Proof of a step complexity bound for stabilization time
- Proof that this complexity bound is exact (by building a worst case prefix of execution)
- Dijkstra_specifications: specification of token circulation (a unique token always eventually passes through each node).
- Dijkstra_algo: Dijkstra first algorithm for token circulation on a rooted ring.
- Dijkstra_common: Common Definitions and Results about Dijkstra's Token Ring Circulation.
- Dijkstra_Z: Definitions and results about convexity and about the potential Z.
- Dijkstra_sum_dist: Definition and results about Potential sum_dist.
- Dijkstra_complexity: Proof of the complexity bound; gathers previous results from sum_dist, Z and degenerate cases.
- Dijkstra_specifications_proved: Proof that the algorithm is self-stabilizing w.r.t. token ring circulation
- Dijkstra_worst_case: Builds a prefix of execution which reaches a legitimate configuration using exactly the number of steps announced by the bound in Dijkstra_complexity
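For intuition (independently of the Coq proofs), the algorithm can be simulated in a few lines of Python: node 0 is the root of a unidirectional ring of n nodes, each holding a value in Z/KZ with K ≥ n, and a central daemon fires one enabled node per step. The names (`privileged`, `fire`) and the daemon's choice of the lowest-index enabled node are ours, not the library's.

```python
# Dijkstra's first (K-state) token ring, simulated under a central daemon.
K = 7                        # number of states; stabilization needs K >= n

def privileged(x):
    """Enabled nodes: the root (node 0) holds the token when its value
    equals its predecessor's; any other node i when x[i] != x[i-1]."""
    return [i for i in range(len(x)) if (x[i] == x[i - 1]) == (i == 0)]

def fire(x, i):
    """One move of node i: the root increments modulo K, others copy."""
    x[i] = (x[i - 1] + 1) % K if i == 0 else x[i - 1]

x = [3, 1, 4, 1, 5]          # arbitrary, possibly illegitimate configuration
tokens = []
for _ in range(150):         # comfortably above the step bound for n = 5
    tokens.append(len(privileged(x)))
    fire(x, privileged(x)[0])

print(tokens[-5:])           # once stabilized, exactly one token per step
```

Every configuration has at least one privileged node, and once a legitimate configuration is reached the unique token simply circulates, which is what the trace of token counts converging to 1 illustrates.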
k-Dominating Set Algorithm
Based on "Competitive Self-Stabilizing k-Clustering". Focuses on the first rule of the second algorithm, which builds a k-dominating set on a directed tree.
- KDomSet_specification: specification of the problem, i.e., a self-stabilizing construction of a k-dominating set on a directed tree.
- KDomSet_algo: definition of the algorithm and general results about a step of this algorithm.
- KDomSet_correctness: correctness of the algorithm, i.e. once the algorithm has converged, the specification is met, that is the variables represent a k-dominating set.
- KDomSet_termination: convergence of the algorithm, i.e., every execution of the algorithm is finite.
- KDomSet_count: we certify that the size of the computed k-hop dominating set is at most ⌊ (n-1)/(k+1) ⌋ + 1, where n is the number of nodes.
- KDomSet_specification_proved: the algorithm is indeed self-stabilizing w.r.t. the assumption (the network is a directed tree) and the specification (a k-hop dominating set of at most ⌊ (n-1)/(k+1) ⌋ + 1 elements)
- Old Version, using an ad hoc definition for Spanning Tree: KDomSet_specification, KDomSet_alpha, KDomSet_correctness, KDomSet_termination, KDomSet_count, KDomSet_specification_proved
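The ⌊ (n-1)/(k+1) ⌋ + 1 bound can be made concrete with a simple centralized construction (this is not the distributed algorithm verified here, which computes the set via the alpha variables): on a tree rooted at r, keep r plus the smallest depth class modulo k+1 among the non-root nodes. All names below are illustrative.

```python
def k_dominating(parent, k):
    """Centralized construction of a k-hop dominating set on a rooted tree.
    parent[v] is v's parent; the root satisfies parent[root] == root.
    Keeps the root plus the smallest depth class modulo k+1 among the
    other nodes, hence at most (n-1)//(k+1) + 1 nodes in total."""
    n = len(parent)
    root = next(v for v in range(n) if parent[v] == v)

    def depth(v):
        d = 0
        while v != root:
            v, d = parent[v], d + 1
        return d

    classes = [[] for _ in range(k + 1)]
    for v in range(n):
        if v != root:
            classes[depth(v) % (k + 1)].append(v)
    return {root} | set(min(classes, key=len))

def is_k_dominating(parent, dom, k):
    """Every node must reach a member of dom in at most k upward hops."""
    for v in range(len(parent)):
        u, ok = v, False
        for _ in range(k + 1):
            if u in dom:
                ok = True
                break
            u = parent[u]
        if not ok:
            return False
    return True

# Path 0-1-2-3-4-5-6 rooted at 0, with k = 2: bound is (7-1)//3 + 1 = 3.
parent = [0, 0, 1, 2, 3, 4, 5]
dom = k_dominating(parent, 2)
print(len(dom), is_k_dominating(parent, dom, 2))  # → 3 True
```

The counting argument is the one behind the bound: the n-1 non-root nodes split into k+1 depth classes, so the smallest class has at most ⌊ (n-1)/(k+1) ⌋ elements, and adding the root covers the nodes of depth at most k.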
Also based on "Competitive Self-Stabilizing k-Clustering". This proof extends the previous case study: we study the whole second algorithm (three rules), which builds a k-clustering of the network using a spanning tree.
- KClustering_specification: specification of the problem, i.e., a self-stabilizing k-clustering built over a spanning tree of the network. The algorithm also requires that the network is identified and that the channels outgoing from a node are locally ordered.
- KClustering_algo: definition of the algorithm and general results about a step of this algorithm.
- KClustering_def_termination: tools and definitions for proving convergence of the algorithm.
- KClustering_alpha_termination: proof of convergence of the first rule of the algorithm (computation of the alpha variables that build the k-dominating set), as in the previous case study.
- KClustering_pcl_termination: proof of convergence of the second rule of the algorithm (computation of the links for a node to reach its cluster head).
- KClustering_hd_termination: proof of convergence of the third rule of the algorithm (propagation of the cluster heads' IDs to the nodes).
- KClustering_correctness_alpha: correctness of the algorithm w.r.t. the alpha variables, i.e., once the algorithm has converged, the alpha variables represent a k-dominating set (as in the previous case study).
- KClustering_correctness: correctness of the algorithm, i.e., once the algorithm has converged, the specification is met, that is, the variables represent a k-clustering.
- KClustering_count: we certify that the algorithm computes at most ⌊ (n-1)/(k+1) ⌋ + 1 clusters, where n is the number of nodes (as in the previous case study).
- KClustering_specification_proved: the algorithm is indeed self-stabilizing w.r.t. the assumptions (the network is identified, with locally ordered channels, and endowed with a spanning tree) and the specification (a k-clustering of at most ⌊ (n-1)/(k+1) ⌋ + 1 clusters).
- SetoidUtil: setoid and partial setoid support - mainly for base types and function types.
- Stream: (finite or infinite) streams of at least one element. Partial setoid support. Tool support, in particular temporal logic operators.
- Relation_Stream: relational streams, definition and properties.
- Stream_Length: tools to measure the length of a (finite) stream, or of a prefix of a stream before it satisfies some predicate.
- Simulation: simulation and squeezing. [Squeeze] is an operator on streams that removes consecutive duplicated elements. It removes any finite number of duplicates, and also an infinite number of duplicates provided they occur at the end of the stream. [Simulation] defines an order on streams: X ≤ Y when X contains fewer consecutive duplicates than Y. (Squeeze X) is the minimum stream among the streams Y such that Y ≤ X.
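On finite streams, [Squeeze] behaves like the following Python sketch (the library works on coinductive streams; this is only the finite-list intuition):

```python
from itertools import groupby

def squeeze(xs):
    """Keep one representative per run of consecutive duplicates."""
    return [key for key, _ in groupby(xs)]

print(squeeze([1, 1, 2, 2, 2, 3, 1, 1]))  # → [1, 2, 3, 1]
```

Note that non-adjacent repetitions survive (the trailing 1 above), which is exactly why the operator is useful for collapsing stuttering steps of an execution without losing distinct configurations.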
- Count: tools for set cardinalities (comparison, counting, in the case of finite sets).
- OptionUtil: option type, setoid and partial setoid definition.
- NatUtils: tools for natural numbers (bounded values, minimum, modulo...).
- Zutils: tools for integers (in particular Z/kZ).
- Listutils: tools for lists, two equalities on lists (pointwise equality, set equality), setoid and partial setoid support, tools for: [InA], [NoDupA], [ForallA], [map], [filter], [map_filter], [RestrictA], [ConsistentA], [assocA], [rev], [existsb], [forallb].
- FoldRight: tools for [fold_right], in particular using [equivlistA].
- RelationA: includes tools for chains / paths of elements linked by a relation.
- TransClosAccDec: decidability of transitive closure within finite sets.
- WellfoundedUtil: tools for proving and using wellfoundedness.
- All_In: decidability properties over a finite type stored into a list.
- BoolSet: sets represented as a Boolean function.