A classification

In this post, I propose a classification for expression transformations. This classification aims at generality and is therefore rather abstract: Coq tactics won't be treated in much depth yet.

1. Evaluation-based transformations

The bulk of expression transformations are based on evaluation. In this family of transformations, it is obvious enough that the expression we end up with is equivalent to the one we started from: evaluation preserves equality by definition.

There is an important distinction between reduction steps and more general evaluation-based expression transformations. Applying single reduction steps by hand is a rare occurrence — usually, people defer to automatic procedures which decide which reductions should be applied at which point in the term, in which order, as well as when to stop applying reductions.

1.1. Total concrete evaluation

In everyday talk, evaluation usually refers to total concrete evaluation: you start from a closed expression and you turn it into the value it stands for. The name is quite explicit — you keep applying reduction steps until you can't. In civilized rewriting systems, the order in which these reductions are applied doesn't matter, thanks to the confluence property.
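In Coq, for instance, total concrete evaluation of a closed term is a single command:

    Eval compute in 2 * (3 + 4).
    (* Output: = 14 : nat *)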

Non-confluent rewriting systems are not strictly incompatible with total evaluation; however, the equivalence relation then can't just be equality. You could embrace nondeterminism and decide that your expression doesn't denote a single value but rather the set of values compatible with it. However, whether this respects our definition of total concrete evaluation is at least questionable (we don't end up with a value). Alternatively, you could evaluate the expression down to one of the values it is compatible with, which is not the nicest equivalence property to work with but meets our criterion more manifestly.

As you will have guessed, total concrete evaluation is an evaluation procedure built out of concrete steps only.

However, the real world is a complex place where some expressions just can't be reduced down to values: 5 + 3 + 2 is 10, but what is 5 + 3 + a, when a is an abstract term? This expression can be reduced to 8 + a, but it can't be further simplified. This is not a value but it is a simpler expression, which is already interesting for many purposes. So evaluation-based expression transformations don't all have to be total.
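A quick check in Coq (note that compute leaves the result in unary successor form, which is just 8 + a fully unfolded):

    Eval compute in fun a : nat => 5 + 3 + a.
    (* Output: = fun a : nat => S (S (S (S (S (S (S (S a))))))) *)
    (* The concrete prefix reduced away; the abstract a blocks further steps. *)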

Also, there are situations in which you want to evaluate only parts of your expression, be it because you need it to match a specific form for a proof you are writing, or because you want to control the reduction process to ensure that you do not stumble into performance traps.

In the next two subsections, we will see what happens when we relax the constraints of concreteness and of totality, respectively.

1.2. Total symbolic evaluation

A symbolic transformation is a transformation which can handle abstract expressions. It is not to be conflated with abstract interpretation, an approximation-based technique mostly used in static analysis. Symbolic evaluation is what you want to use when you handle open terms.

For instance, you may want to prove that 0 + n is equal to n for any value of n. The addition may be implemented as follows:

    Fixpoint add (m n : nat) : nat :=
      match m with
      | 0 => n
      | S m' => add m' (S n)
      end.

A way of proving this would be to specialize the function with the argument 0 (note how n remains abstract at this point). The code is then equivalent to this:

    (* Pseudo-Coq: the argument m has been specialized to 0 *)
    Fixpoint add (0 n : nat) : nat :=
      match 0 with
      | 0 => n  (* This branch will be taken *)
      | S m' => add m' (S n)
      end.

As the comment points out, there is enough information to simplify the match:

    Fixpoint add (0 n : nat) : nat := n.

At this point, the proof is trivial. Another example would be the following C program:

    int main(int argc, char *argv[])
    {
        return argc + <an expression of value 5 which takes a minute to compute>;
    }

Compiling this program with basic optimizations should take about a minute and give us something equivalent to:

    int main(int argc, char *argv[])
    {
        return argc + 5;
    }

As the slow computation was pure, it could be done at compile time. The resulting program then runs nearly instantaneously! In fact, most compilation pipelines include optimization passes which basically amount to total symbolic evaluation.

Just like total concrete evaluation, total symbolic evaluation is an evaluation procedure. The main difference between the two is that total symbolic evaluation procedures can handle abstract terms, which contain unbound variables. Just what is meant by handling abstract terms varies a lot. Just as there are concrete evaluation steps, there are symbolic ones. Think about a - a: this is clearly enough equivalent to 0, but it takes some reasoning to show that this indeed constitutes a legal reduction step. Some rewriting systems include rules for doing e.g. arithmetic with opaque variables in their core, but not all do; in Coq, for instance, it takes some justification.
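In Coq, a - a is stuck: subtraction pattern-matches on its arguments, and a is abstract. A minimal sketch of the justification, using the standard library lemma Nat.sub_diag:

    Require Import Arith.

    Goal forall a : nat, a - a + 1 = 1.
    Proof.
      intro a.
      (* a - a does not reduce by itself: the match inside subtraction is stuck. *)
      rewrite Nat.sub_diag.  (* Nat.sub_diag : forall n, n - n = 0 *)
      reflexivity.           (* 0 + 1 then reduces concretely to 1. *)
    Qed.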

Another related example is given by the expression 4 + 1 + a + 2 + a. The procedure we described hitherto would turn this into 5 + a + 2 + a, which falls short of the better 7 + 2*a. Some procedures would always take their input expression to some sort of normal form, with everything as reduced as possible, while others would be more limited.

In Coq, reduction steps are defined at a low level compared to that of e.g. arithmetic expressions. Still, you have symbolic steps which can deal with abstract input, for instance for turning (fun x => S x) a into S a. Some steps remain hopelessly concrete, for instance the one that picks the appropriate branch of a pattern match: you need the head of the matchee to be concrete, there is no way around that. Deeper parts of the matchee can be opaque though, no trouble there.
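Both points can be checked directly; a minimal sketch, with a as an abstract section variable:

    Section SymbolicSteps.
      Variable a : nat.

      (* Beta reduction is a symbolic step: it fires even though a is abstract. *)
      Eval cbv beta in ((fun x => S x) a).
      (* Output: = S a *)

      (* Iota reduction (branch selection) needs a concrete head constructor.
         Here the head of the matchee is S, so the branch can be picked even
         though the deeper part, a, is opaque. *)
      Eval cbv beta iota in match S a with 0 => 0 | S y => y end.
      (* Output: = a *)
    End SymbolicSteps.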

1.3. Partial evaluation

Up to this point, we mentioned evaluation procedures taking terms to some normal form — taking arithmetic expressions to integers, for instance. We are now going to zoom in on the individual steps out of which these procedures were built. This is where partial evaluation enters the picture.

Sometimes, less is more. Strict impossibility is not the only reason why you may not want to fully reduce a term: you may need your expression to match a specific form for a proof you are writing, or it may be that not all ways of reducing your expression are equally fast, for example. In these situations, it pays to show some parsimony.

Assume the goal we are working on contains the term if <x> then <y> else <z>. You know that <x> evaluates to true, but you aren't done proving that just yet, so the term can't be reduced by evaluation. You notice that applying tactics takes more time than it should, and you wonder if the size of the object you are reasoning about plays a role in that. The subterm <y> is very large, but shouldn't be that slow to compute. The subterm <z>, on the other hand, looks like it would take a lot of time to reduce without freeing up much memory. We want to get a smaller object as efficiently as possible. What we would like to do, then, is to reduce the subterm <y> only. What we would like, in other words, is partial evaluation — being able to apply reduction steps at targeted points in the term. Note that in my definition partial evaluation is not limited to the total evaluation of subterms.
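As a toy Coq sketch of this kind of surgical reduction (with 2 + 2 standing in for the large but cheap <y>), simpl accepts a pattern restricting it to the matching subterms:

    Goal (if true then 2 + 2 else 1 + 1) = 4.
    Proof.
      (* Reduce only the subterms matching 2 + 2; the if and 1 + 1 are untouched. *)
      simpl (2 + 2).
      (* The goal is now: (if true then 4 else 1 + 1) = 4 *)
      reflexivity.
    Qed.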

I mentioned concrete and symbolic steps earlier. For instance, turning 2 + 2 into 4 would be a concrete step, whereas turning 2 + a into a + 2 or a + a into 2*a would be symbolic steps (symbolic steps can handle symbols, concrete ones can't). It takes a symbolic step to take 2 + a + 2 to a + 2 + 2, and then a concrete one to take this to a + 4.
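In Coq, the symbolic steps need lemmas while the concrete one is pure computation. A sketch, with the reassociation that the prose glosses over made explicit:

    Require Import Arith.

    Goal forall a : nat, 2 + a + 2 = a + 4.
    Proof.
      intro a.
      rewrite (Nat.add_comm 2 a).  (* symbolic step: 2 + a becomes a + 2 *)
      rewrite <- Nat.add_assoc.    (* reassociate (a + 2) + 2 into a + (2 + 2) *)
      reflexivity.                 (* concrete step: 2 + 2 computes to 4 *)
    Qed.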

Partial evaluation is not just about running steps in isolation. Procedures for automatically applying series of steps fall under this umbrella — in fact, total evaluation procedures can be viewed as partial evaluation procedures which just so happen to always return fully reduced terms.

1.4. Purity or lack thereof

As the C examples above suggest, it is possible to devise rewriting systems even for impure languages. These rewriting systems can deal with the pure fragments of terms of such languages, but they are not limited to that, as we shall see. Of course, these rewriting systems are much more complex than their pure counterparts (in particular due to the need to keep track of state), and they are based on less elegant equivalence relations (some sort of compatibility rather than equality).

Consider the following function:

    int foo()
    {
        int min = 0;
        int max = get_user_int();
        return get_rand_int(min, max) + get_rand_int(min, max);
    }

Simplifying its pure fragments leaves us with the following:

    int foo()
    {
        int max = get_user_int();
        return get_rand_int(0, max) + get_rand_int(0, max);
    }

If our relation of choice is about compatibility, then the following transformation is surely legal, as we can imagine situations in which the second run of get_rand_int indeed yields 4:

    int foo()
    {
        int max = get_user_int();
        return get_rand_int(0, max) + 4;
    }

However, this reduction step disallows other ones which would otherwise have been legal, such as the one leading to this term (take get_user_int to return 2):

    int foo()
    {
        int max = 2;
        return get_rand_int(0, max) + 4;
    }

Nevertheless, these (also reduction-based) transformations are valid expression transformations worthy of a place in the classification.

2. General transformations

The transformations that we considered up to this point were based on evaluation. That is, all of them took steps towards a normalized form for the expression (i.e., a fully evaluated form). However, this need not always be the case!

Expression transformations can move terms away from their normalized form (equality is symmetric, so any transformation can be undone!), or in fact, you can replace a term with one which may seem unrelated, but is still somehow equivalent to it.

In doing this, we stray away from the strict expression -> expression model of transformations. For instance, the following becomes a valid transformation:

    <e> -> <proof that a = b> -> <e where a is replaced with b>

Note how arbitrary context elements can be used this way: maybe the proof that a = b comes from a Coq axiom or a previously proven lemma! In fact, this prototype describes a common use case of the rewrite tactic.
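A minimal sketch of that use case:

    Goal forall a b : nat, a = b -> a + 1 = b + 1.
    Proof.
      intros a b H.
      rewrite H.  (* replace a with b, in exchange for the proof H : a = b *)
      reflexivity.
    Qed.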

General transformations are a usual sight in theorem provers. You need to prove that your seemingly arbitrary transformation is legal, after all. Or do you? For instance, unrolling a loop in assembly is a form of general transformation. There is usually no proof of correctness of the implementation of this optimization pass, but that is to be expected of unverified systems. Another example of a general transformation outside of theorem provers is what happens when you jump from one language (intermediate representation or not) to another in a compilation pipeline.

With general transformations, we stray away from reduction steps. Instead, we directly replace a whole term or subterms with something we assume — or prove — equivalent. In theorem provers, we usually have powerful building blocks (such as Coq's rewrite) for doing arbitrary replacements in exchange for some proof. These replacements usually are not limited to concrete terms or subterms (e.g. you can transform a - a into 0). Also, general transformations can deal with impure code. For instance, the following:

    int sum = 0;
    for (int i = 0; i < 3; ++i) {
        sum += get_rand_int(0, 8);
    }
    printf("Total: %d\n", sum);

can be rewritten into:

    int sum = 0;
    sum += get_rand_int(0, 8);
    sum += get_rand_int(0, 8);
    sum += get_rand_int(0, 8);
    printf("Total: %d\n", sum);

Or even:

printf("Total: %d\n", get_rand_int(0, 8) + get_rand_int(0, 8) + get_rand_int(0, 8));

You could also choose to view reduction-based expression transformations as special cases of general transformations. This is true and fine, but it does not mean that reduction-based transformations are not special enough to deserve their own subcategory.

3. Recap

3.1. Reduction steps, rewriting systems and evaluation-based expression transformations

Reduction steps are expression transformations — usually very basic ones — that are local in nature, such as function definition unfolding or constant folding. They can be applied at a specified position in a given term, assuming the local context matches their prerequisites (e.g. don't try to apply definition unfolding to a constant).

The set of reduction steps available for the terms belonging to some language can be studied through the lens of rewriting systems. Transitions in a rewriting system usually directly correspond to the application of a reduction step.

Procedures for selecting which reduction steps to apply where and when (and also when to stop applying reduction steps) can be devised. For instance, it is often the case that sequentially picking reduction steps at random from the list of those compatible with the term at hand is guaranteed to lead to the same term, namely a normal form for the term in question. The rewriting system could then be said to be "confluent" (strictly speaking, confluent and terminating). A simple procedure for normalizing terms would then be to do just that. These procedures constitute evaluation-based expression transformations.

The order in which steps are taken sometimes matters. Even for confluent rewriting systems, where all paths are functionally equivalent, it may be that some order of evaluation drastically cuts down the complexity of the normalization. This is why being able to give the user fine-grained control over how reductions are performed matters in some admittedly rare contexts, such as in proof assistants.

3.2. General transformations

Not all expression transformations are related to reduction. In fact, at times, seemingly unrelated terms end up being equivalent, counterintuitively. It may be that term A is 1 + 7 and that term B is opaque, but that we somehow have a proof of B = 8 in our context (a local hypothesis, say). We should be able to replace A with B, yet our rewriting system probably won't let us, as these systems are usually geared towards reductions, not expansions or more radical transformations such as the one proposed here. The larger set of expression transformations which includes procedures for doing such transformations is that of general transformations. Reduction steps and evaluation-based expression transformations are special cases of general transformations.
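In Coq, a small detour makes the replacement go through: convert A to its normal form first, then rewrite with the hypothesis, this time as an expansion rather than a reduction. A sketch:

    Goal forall B : nat, B = 8 -> 1 + 7 = B.
    Proof.
      intros B H.
      change (1 + 7) with 8.  (* legal: 1 + 7 and 8 are definitionally equal *)
      rewrite <- H.           (* expansion: replace the value 8 with the opaque B *)
      reflexivity.
    Qed.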

3.3. Transformation location

It is always legal to replace a subterm of a term with something equivalent to it, so expression transformations can be applied at given positions. However, this does not mean that each practical implementation of an expression transformation lets you pick where to apply it. For instance, procedures for total evaluation almost always apply to the head of the term.

3.4. Concrete vs. symbolic transformations

Steps can be concrete or symbolic. Concrete steps may only deal with concrete inputs (i.e. do evaluation), whereas symbolic steps open the way to symbolic manipulations. This mostly depends on the language being treated: if the language supports symbolic variables, then at least some of the reduction steps are bound to be symbolic. For instance, when optimizing a function in some programming language, one has to deal with the fact that the future arguments can only be represented by abstract variables.

It is not only steps which can be concrete or symbolic; this notion in fact extends to procedures. Procedures written for languages which have a notion of abstract variables may be concrete when they content themselves with applying steps to concrete fragments of the term only. Otherwise, they are symbolic. The same goes for general expression transformations.

3.5. Purity vs. impurity

Impure expressions can also be transformed, but the side effects have to be preserved. Dealing with impure expressions forces contact with unpleasant equivalence relations.

4. A taxonomy of expression transformations

In this section, I finally give my classification of expression transformations.

4.1. The classification

4.2. Examples

Let's populate our classification with some concrete examples to ground all of this.

Compiler optimization passes

Function evaluation strategies

Some Coq tactics

Lambda calculus reductions

In the next episode…

We will go through a whirlwind tour of Coq's tactics and we will insert them into our classification.