See also notes on:

  • Model Checking
  • SMT
  • separation logic
  • CHC invariant generation

Hoare Logic. I have heard somewhere that we should actually distinguish Floyd's work on flowcharts as something different. Maybe in a Lamport paper?

Axiom schema

Propositional Hoare logic - see KAT

Weakest Precondition


Frama-C / Why3 implementation of Stackify, LLVM's algorithm for restructuring control flow for wasm



Model checking. Takes in Lustre. Multiple SMT backends. OCaml implementations of IC3 and invariant generation. SV-COMP: based on these results, given that I don't know shit, it looks like CPAchecker is a reasonable default if you're gonna pick one.

VeriAbs - wins in reachability. Can you even buy it?

Ada SPARK






Using SMT arrays. A big block of 2^64 addresses, or different models. Tagged memory separating heap and stack. Chaos when it gets confused. Does pointer arithmetic get it confused?

Synthesizing an Instruction Selection Rule Library from Semantic Specifications. A Formal Model of a Large Memory that Supports Efficient Execution.


jitterbug serval

Static Program Analysis book. dataflow analysis etc.

modelling the heap

Why is structuring unstructured control flow so important?

Weakest-precondition of unstructured programs - Barnett, Leino. What can you do with an ISA spec? - Alastair Reid.

Imp stmt to stack machine. Imp expr to stack machine.

Expr as state? Expr + context as state. Ok sure. List stack as state. Try just a binary operator. Try booleans rather than nats.

There is a single reflection step to a machine
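To make the expression-to-stack-machine exercise concrete, here is a minimal Python sketch (a made-up tuple encoding, not any particular formalization), with compiler correctness checked on one example:

```python
# Expressions: ("lit", n) | ("add", e1, e2). Stack machine: PUSH n | ADD.
def compile_expr(e):
    if e[0] == "lit":
        return [("PUSH", e[1])]
    if e[0] == "add":
        return compile_expr(e[1]) + compile_expr(e[2]) + [("ADD",)]
    raise ValueError(e)

def eval_expr(e):
    return e[1] if e[0] == "lit" else eval_expr(e[1]) + eval_expr(e[2])

def run(prog, stack=()):
    stack = list(stack)
    for instr in prog:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:  # ADD: pop two, push the sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack

e = ("add", ("lit", 1), ("add", ("lit", 2), ("lit", 3)))
assert run(compile_expr(e)) == [eval_expr(e)]  # both give 6
```

The correctness statement worth proving in a proof assistant is the general one: `run(compile_expr(e), s) == s + [eval_expr(e)]` for all `e` and initial stacks `s`, by induction on `e`.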

forall s1 : S1, s2 : S2, (p : s1 ~ s2), (R1 s1 s1’) (R2 s2 s2’) : s1’ ~ s2’ (a simulation square: related states step to related states)

Maybe verifying a pipelined processor (how hard could it be, amirite?!) would be a fun concurrency example to attempt in Ivy or TLA+ or whatever. Instruction semantics for x86 in K. Synthesizing / automatically learning semantics of x86.

The Hamming book. Gilbert Strang book. Numerical analysis. Geometrical algorithms. Stuff in CLRS?

limbs circuit. 3d printed snapping.

Use z3-wasm.

Specit - word based specification challenges. Prove equivalence using Z3?

Lets as assignment. Injecting block assignment variables after phi nodes. Gated SSA vs

SMT-LIB preprocessor - WP mode.

boogie why

Different styles of proving on CFGs.

The CFG is already giving you a lot: it pretends you know what jumps are possible. This does let you avoid treating every instruction as a potential jump target.

Nand2Tetris style, we could model the gates of the hardware. And then unfold in time using BMC

  • Do we maintain the instruction pointer as a concept?
  • For every block, with every entrance and exit, one could manually state summary entrance and summary exit predicates. For every edge linking an exit to an entrance, one requires that P => Q. And in addition that the entrance predicates imply the exit predicates of the block itself.
  • DAGs present no problems as CFGs. You can finitely produce a transition relation for them, or run WP on them. So one perspective is that you need to cut enough edges to make the CFG a DAG. And every time you cut an edge, you need a predicate associated with that edge, or perhaps one with the entrance and one with the exit of that edge.
  • Lamport had some mention of the Floyd method as being more general than the Hoare method. Floyd seemed to be considering CFGs. TLA+ does explicitly model the program counter.
  • Symbolic execution branches at the logical level instead of at the control level. This does not lend itself obviously to something that works in the presence of loops.

We could do the Micro-WP to demonstrate these styles. But it is a pain. Infer a CFG for Nand2Tetris? Perhaps hard, because it can be difficult to know what locations you may jump to. We could instead work in a CFG intermediate representation that compiles.

class Block:
    code: list[instr]  # no jumps
    jump: A1, A2, JMP

What changes do you need to make to use arbitrary control flow graphs vs structured programs? Rustan Leino's book.

Dijkstra monads - this might be a stretch. F* Dijkstra monad + interaction trees. Interaction trees ~ a free monad rearranged for a total language; related to freer monads - the Kiselyov thing. This is what Lexi King was working on, yeah?

The General monad - McBride. From C to interaction trees - Li-yao Xia.

Dijkstra and Scholten. That link off of Leino.

Could I make an equation-style proof system in Z3py? Probably, right? Take Agda as an example. Backhouse. Hehner.

I’ve been feeling like I should be doing manual Hoare logic / imperative proofs.

There is a vast assortment of tools out there that aren’t proof assistants.

Boogie, Dafny, Frama-C, Viper, VeriFast, WhyML, Why3, LiquidHaskell, SPARK/Ada, Spec#, JML, ESC/Java, Whiley, ESC/Modula-3

Dafny VS Code plugin

Viper VS Code plugin


VeriFast tutorial. VCC. ZetZ. Dafny discussion. Verilog + SymbiYosys. KeY, KeYmaera X. CBMC, ESBMC, EBMC. CPAchecker. TLA might be in this category. Event-B. Alloy. God this list is nuts. VerifyThis. SV-COMP.

Eiffel for pre/post conditions. Chalice. ATS.

F*, Iris, VST, Bedrock. Isabelle?

It’s interesting that logical specs are so foreign, and somewhat long-winded when applied to imperative code, that they aren’t that much more understandable or high-assurance. Really it might be about formally proving equivalence between specs in different languages. Python and C, for example.

A good question is: what are interesting programs to prove?

  1. List manipulation
  2. sorts
  3. red black trees
  4. find

Fun old timey books.

If you go before 1980, a decent chunk of all books had assembly in mind.

  • A Discipline of Programming - Dijkstra
  • Reynolds - The Craft of Programming
  • Knuth - The Art of Computer Programming
  • The Science of Programming - D. Gries
  • Pascal - Wirth
  • Structured Programming - Dijkstra, Hoare
  • Eric Hehner
  • ACM classic books
  • lambda papers
  • Per Brinch Hansen
  • some relevant EWD notes. Derivation of algorithms
  • Winskel

old model checking notes

author: philzook58
comments: true
date: 2020-11-19 21:28:58+00:00
layout: post
link:
published: false
slug: Model Checking - TLA+
title: Model Checking - TLA+
wordpress_id: 879
---

CFA - control flow automata. Abstracting out control flow. An over-approximation is that you can nondeterministically take branches.

Model checking. Software Model Checking for People who Love Automata. Nested interpolants - Heizmann.

Interpolants - McMillan. BLAST and other Jhala work. Automating grammar comparison. file:///home/philip/Documents/coq/game/3434298.pdf - verifying context-free API protocols. Java PathFinder. Ultimate program analyzers. CPAchecker.

Ivy: EPR, a decidable fragment?

exists* forall* is decidable - Cody mentions this is synthesis? Connection here? Other fragments too: monadic full first-order logic (without function symbols?). Goldfarb, Gurevich, Rabin, Shelah: all decidable and undecidable prefix classes completely characterized.

Modal logics are “robustly decidable” - the translation to first-order logic has a particular guarded form.

LTL vs CTL. Ok, I think I’ve got it. When you’re model checking a CTL formula, you’re checking a single Tree |= CTL_formula, a single entry in the entailment relation. But when you’re model checking an LTL formula you’re checking a family of paths, p |= LTL_formula for each path p, a bunch of entries of the entailment relation.

Stuttering is important. Stuttering is when y’ = y. It is this option which allows refinement.

Using Z3, I feel, has a decent shot of being more scalable than the checking used by TLC, which as I understand it is brute-force search for countermodels.

Back on the TLA+ train

  • Specifying Systems

I find something very conceptually pleasing about making the program counter an explicit thing. Every language has things that are implicit. Powerful new language constructs are often backed by runtime data structures. Even in “low level” languages like C, there is a whole lot of implicit stack shuffling. The stack exists. It is a data structure. However, even in assembly I tend to take the program counter for granted most of the time. When you think about what the instruction add does, you don’t tend to mention the movement of the program counter. It is so obvious that it increments to the next instruction that it usually remains tacit.


We’re starting a reading group on TLA+ at work.

reifying the program counter

TLA+ is more of a spec or modelling language. No implicit temporal flow.

It actually is similar to Verilog, which can be considered a layer underneath using a CPU. One needs to build a program counter to do what imperative programs do.

PlusCal is an imperative-ish language that compiles to TLA+.

Everything uses the TLA

If you don’t use x’, then tla is just a constraint satisfaction solver.

maybe x == x’ as a constraint?

Hmm. One NQueens example actually writes out the brute force search algorithm. That’s intriguing.

I have no idea how TLC is working

Can encode the relation R(x,x’) into a constraint solver, and also the induction principle P(x) /\ R(x,x’) => P(x’). Difficult to know if you are in a reachable state though. Can unroll in time.

Keep track of possible states using a BDD structure.





Kind model checker - SMT-based and some other techniques. Takes in Lustre files, which feel like some kind of verilog. I suppose you could model check with Verilog too. Outputs Rust (experimentally).

How does TLC work

It sounds like it does some kind of explicit state modelling?

Thesis about TLC

Explicit-state model checker. It actually BFSes the entire reachable state space?

Symbolic state model checker - BDD based storage of the set of states

Refinement model checker - Holds state space via abstraction?

Seems like it would be feasible to build an embedded model checker of this style in Haskell.

par combinators. Maybe lazily distribute out products.

Petri nets keep showing up.

Binary Decision Diagrams for model checking

state-nextstate relation. Relation Algebraic perspective?

Decision Diagrams have a quasi applicative instance. The domain needs an Ord constraint. Useful when the range is small? (i.e. binary), but domain can be quite large. Map data structures are useful mostly for partial functions or for small domains.

The applicative instance is how you build complicated BDDs from simple ones.

Decision diagrams let you decide questions about functions. All sorts of queries become possible.
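A toy Python version of the apply idea (Python rather than Haskell for brevity; the tuple encoding of nodes is made up): combining two diagrams pointwise over a shared variable order is exactly how complicated BDDs get built from simple ones.

```python
# A tiny reduced ordered BDD: leaves are True/False, internal nodes are
# ("node", var, lo, hi) with variables tested in increasing order.
def node(var, lo, hi):
    return lo if lo == hi else ("node", var, lo, hi)  # merge redundant tests

def apply_op(op, u, v):
    """Pointwise combine two BDDs with a boolean op -- the 'applicative' step."""
    if isinstance(u, bool) and isinstance(v, bool):
        return op(u, v)
    uvar = u[1] if not isinstance(u, bool) else None
    vvar = v[1] if not isinstance(v, bool) else None
    var = min(x for x in (uvar, vvar) if x is not None)  # split on the top variable
    ulo, uhi = (u[2], u[3]) if uvar == var else (u, u)
    vlo, vhi = (v[2], v[3]) if vvar == var else (v, v)
    return node(var, apply_op(op, ulo, vlo), apply_op(op, uhi, vhi))

a = ("node", 0, False, True)   # the variable x0
b = ("node", 1, False, True)   # the variable x1
conj = apply_op(lambda p, q: p and q, a, b)
assert conj == ("node", 0, False, ("node", 1, False, True))  # x0 AND x1
```

A real implementation adds hash-consing so structurally equal subdiagrams are shared, and memoizes `apply_op` on node pairs; that sharing is precisely the part that is awkward to express in pure Haskell.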

Sharing in Haskell

Either store indices of shared locations

Could have a representable stored monad.

Fix of ExprF.

Use HOAS. Use explicit index locations (mockup of an internal pointer system). Use categorical DSL, use actual physical pointers

In categorical style, we need to explicitly state dup and join nodes. We will have both, I think. Unless I build the tree from the ground up. I kind of have a stack.

Hillel Wayne’s short tutorial

He also has a book

Mary Sheeran et al. Checking safety properties using SAT solvers

Lava and Ruby papers

This is a really good talk on bounded model checking. How to show it is complete: loop-free paths. Find the property being violated. Weaken the transition relation to allow more states if that gives an easier theory to reason in. Interval arithmetic in Z3?

Fast polyhedral domain library: intervals, octagons, polyhedra. Has Python bindings.

PPL - Parma Polyhedra Library. Apron - I’m seeing a lot of OCaml bindings? - not in awesome shape though. Interesting blog. isl - integer set library.

Relaxation ~ abstraction. We might want to relax an LP to an outer box. A box is easier to work on. Or octagons.

Relax a SAT instance to 2-SAT. Bounded model checking - the transition relation, laid out in time. Check that you never repeat a state. Induction: I[x] => P[x], P[x] /\ T[x,x’] => P[x’]. Can strengthen induction in various ways? By rolling out in time? Seahorn. FDR4. IC3 algorithm? Operations Research/optimization ~ abstract interpretation: relaxation ~ abstraction? Galois connection.

  • lattice feasible - satisfiable

TLA+, SPIN, NuSMV are all model checkers

Simple programming systems, simple robots, distributed systems, or AI things can be simulated using the technique of finite automata.

Finite Automata have finite states and labelled transitions. You can choose to make the transition deterministic or nondeterministic, meaning it is obvious which edge to take or you still have some choices.

This automata perspective is useful for a couple of things. If you wanted to implement your own regular expression evaluator, it is useful. It is also a good perspective for writing FPGAs, or converting a program into a state machine for the FPGA.

Your process has a register called the program counter (roughly corresponds to the line number in your program, in a sense). Most of the numbers it manipulates are 64 bits. The program counter paired with the variable values can give you a finite state you can work with (though not if you are dynamically building data structures a la malloc).

The labelled arrow is somewhat like the monad arrow (a -> m b). This also makes sense in the context of kmett’s comments during Foner’s comonad talk.

The IO arrow is a catch-all for all the nondeterminism possible. But perhaps by making (-> m) a more first-class object you could prove more?

Constructions on NFA:

Concat: Take any edge that leads to an accepting state and bring it into the start state of the second automaton

Union: Just place the automata side by side

Intersection: Product construction. Make states pairs of original states. Make actions go at the same time. Only accept if both states are accepting.

Star: Take any incoming edge to an accepting state and loop it back to the start state

Complement: Determinize NFA (see below) and swap accept and reject states.

With these combinators you could easily construct an NFA from a regular expression. Just interpret the regular expression syntax tree.

Some useful derived operations include:

Do n times: can just concat the expression n times

one or more (+): concat the star with one instance

complement with respect to: complement and then intersect

Determinization - Make states the power set of the original states. Only need to keep the ones you can actually visit from the start state
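The product and subset constructions above can be sketched in a few lines of Python (a made-up dictionary encoding of NFAs, purely illustrative):

```python
class NFA:
    """delta maps (state, symbol) -> set of successor states."""
    def __init__(self, delta, start, accept):
        self.delta, self.start, self.accept = delta, start, accept

    def step(self, states, sym):
        return {q2 for q in states for q2 in self.delta.get((q, sym), ())}

    def accepts(self, word):
        states = {self.start}
        for sym in word:
            states = self.step(states, sym)
        return bool(states & self.accept)

def intersect(n1, n2):
    """Product construction: run both automata in lockstep, accept iff both do."""
    delta = {}
    for (q1, s1), qs1 in n1.delta.items():
        for (q2, s2), qs2 in n2.delta.items():
            if s1 == s2:
                delta.setdefault(((q1, q2), s1), set()).update(
                    (a, b) for a in qs1 for b in qs2)
    return NFA(delta, (n1.start, n2.start),
               {(a, b) for a in n1.accept for b in n2.accept})

def determinize(n, alphabet):
    """Subset construction, keeping only subsets reachable from the start."""
    start = frozenset({n.start})
    delta, frontier, seen = {}, [start], {start}
    while frontier:
        qs = frontier.pop()
        for sym in alphabet:
            nxt = frozenset(n.step(qs, sym))
            delta[(qs, sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return delta, start, {qs for qs in seen if qs & n.accept}

# "ends in a" intersected with "contains a b", over the alphabet {a, b}.
ends_a = NFA({(0, "a"): {0, 1}, (0, "b"): {0}}, 0, {1})
has_b = NFA({(0, "a"): {0}, (0, "b"): {1}, (1, "a"): {1}, (1, "b"): {1}}, 0, {1})
both = intersect(ends_a, has_b)
assert both.accepts("ba") and not both.accepts("ab")
```

Note `accepts` is itself the subset construction done lazily: it tracks the set of live states, so determinization is just memoizing that set-of-states evolution into a table.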

The Algebra of Regular Expressions:

Kleene star is a geometric series. a* = 1 + a + a^2 + a^3 + … ~ 1/(1-a). As usual, it is not clear that the right hand side has an interpretation.

Matrix regular expressions

Derivatives of regular expressions

Buchi Automata and $latex \omega$-regular expressions

Buchi automata accept infinite strings.

They are also represented by transition diagrams, but the acceptance condition is now that a string is accepted if it passes through the accepting state infinitely often. Basically, this means the string has to get into a cycle.

LTL - Linear Temporal Logic. This is one logic for describing properties of a transition system. It consists of ordinary boolean logic combined with the temporal operators X, F, G and U (which are not all independent of each other). Each state in the transition system is labelled with which propositions are true in that state. $latex G \phi$ is true if the statement $latex \phi$ is true on all possible states you could end up on, forever. $latex F \phi$ is true if you always enter a state at some point for which $latex \phi$ is true. In other words, $latex \phi$ is eventually true. $latex X \phi$ is true if in the next state $latex \phi$ is true.
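These operators can be made concrete with a toy evaluator over a lasso-shaped path, prefix + loop repeated forever (the tuple encoding of formulas is invented for illustration; real model checkers do not work this way):

```python
def holds(phi, prefix, loop, i=0):
    """Evaluate an LTL formula at position i of the infinite word
    prefix + loop^omega (each position is a set of atomic propositions)."""
    n, N = len(prefix), len(prefix) + len(loop)
    if i >= N:                        # truth past the first loop pass is periodic
        i = n + (i - n) % len(loop)
    op = phi[0]
    if op == "atom":
        labels = prefix[i] if i < n else loop[i - n]
        return phi[1] in labels
    if op == "not":
        return not holds(phi[1], prefix, loop, i)
    if op == "and":
        return holds(phi[1], prefix, loop, i) and holds(phi[2], prefix, loop, i)
    if op == "X":
        return holds(phi[1], prefix, loop, i + 1)
    # the positions visited from i onward (loop positions recur forever)
    future = range(i, N) if i < n else range(n, N)
    if op == "F":
        return any(holds(phi[1], prefix, loop, j) for j in future)
    if op == "G":
        return all(holds(phi[1], prefix, loop, j) for j in future)
    if op == "U":  # walk the actual visit order: rest of this pass, then one full loop
        for j in list(range(i, N)) + list(range(n, N)):
            if holds(phi[2], prefix, loop, j):
                return True
            if not holds(phi[1], prefix, loop, j):
                return False
        return False
    raise ValueError(phi)

# Word: {p}, then ({q}, {}) repeating forever.
prefix, loop = [{"p"}], [{"q"}, set()]
assert holds(("G", ("F", ("atom", "q"))), prefix, loop)  # q holds infinitely often
assert not holds(("G", ("atom", "p")), prefix, loop)
assert holds(("U", ("atom", "p"), ("atom", "q")), prefix, loop)
```

The lasso representation is the same shape a Buchi-automaton emptiness check produces for a counterexample, which is why this finite evaluation of an infinite-word logic is possible at all.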