Next: 2. Static algebras Up: EVOLVING ALGEBRAS: AN ATTEMPT TO Previous: Preface

1. Another computation model

Quisani: Somebody told me that you are doing semantics these days.

Author: Somebody was right.

Q: Sure, somebody is usually right. Tell me about your semantics.

A: The original idea was to provide operational semantics for algorithms by elaborating upon what may be called Turing's implicit thesis: every algorithm is simulated by an appropriate Turing machine [Gu1]. Turing did not claim this explicitly; his thesis was: every computable function is computable by some Turing machine. But his informal proof of the thesis [Tu] gives the stronger version. In the sense of the stronger thesis, Turing machines give operational semantics to algorithms. Unfortunately, this semantics is not very good (and of course I do not claim that semantics was Turing's goal). Turing machine simulation may be very clumsy. In particular, one step of the algorithm may require a long sequence of steps of the simulating Turing machine. I was looking for machines able to simulate algorithms much more closely. In particular, the simulation should be lock-step, so that the simulating machine makes only a bounded number of steps to simulate one step of the given algorithm. Evolving algebras, or EAs, are supposed to be such machines.

Q: There are abstract machines in the literature better suited to simulate algorithms than Turing machines: Kolmogorov-Uspensky machines [KU], storage modification machines of Schönhage [Sch], various random access machines.

A: Each of these machine models has a fixed level of abstraction, which may be very low for some algorithms. An evolving algebra, on the other hand, may be tailored to an arbitrary abstraction level. One may have a whole hierarchy of evolving algebras of various abstraction levels for the same algorithm. See [GH] for an example, where you will find EAs for the C programming language at various abstraction levels. A hierarchy of evolving algebras is constructed by Egon Börger and Dean Rosenzweig in [BRo1], where the technique of successive refinements is used to reconstruct the Warren Abstract Machine (a virtual machine model which underlies most current Prolog implementations and incorporates crucial optimization techniques) starting from a more abstract EA for Prolog developed by Börger in [Bo1-Bo3].

Q: How do you tailor an EA machine to the abstraction level of an algorithm whose individual steps are complicated algorithms all by themselves? For example, the algorithm may be written in a high level language that allows, say, multiplying integer matrices in one step.

A: You model the given algorithm modulo those algorithms needed to perform single steps. In your case, matrix multiplication will be built in as an operation.
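To make the idea concrete, here is a toy sketch (my own illustration, not from the text) of building matrix multiplication in as a single operation: the EA treats it as an opaque built-in, so one transition rule applies it in one step regardless of its internal cost. All names below (`matmul`, `step`, the state layout) are hypothetical.

```python
# Illustrative sketch: matrix multiplication as a built-in operation
# of the evolving algebra, applied by one rule in a single step.

def matmul(a, b):
    """The built-in operation: integer matrix multiplication."""
    return [[sum(x * y for x, y in zip(row, col))
             for col in zip(*b)]
            for row in a]

def step(state):
    """One EA step at this abstraction level: C := A * B in one update."""
    new_state = dict(state)
    new_state["C"] = matmul(state["A"], state["B"])
    return new_state

state = {"A": [[1, 2], [3, 4]], "B": [[5, 6], [7, 8]], "C": None}
state = step(state)
# state["C"] == [[19, 22], [43, 50]]
```

At a lower abstraction level the same multiplication would itself be unfolded into many steps; here it costs one.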

Q: Coming back to Turing, there could be a good reason for him to speak about computable functions rather than algorithms. We don't really know what algorithms are.

A: I agree. Notice, however, that there are different notions of algorithm. On the one hand, an algorithm is an intuitive idea which you have in your head before writing code. The code then implements the algorithm. The same algorithm may be coded in different programming languages. It may have many different abstraction levels. In particular, a source program and the result of its compilation implement versions of the same algorithm that differ in their abstraction levels. We can argue about which, if any, of the abstraction levels is the natural one. The question of when two such intuitive algorithms are the same may be hard.

On the other hand, there is a more naive notion according to which an algorithm is essentially a program (together with some computing environment, which is often implicit). In particular, different programs give different algorithms. I do not want to completely equate programs and algorithms, though; one usually speaks about programs in particular programming languages. The notion of algorithm is a little more general. A programming language itself can be seen as an algorithm: a universal algorithm that takes a given program as a part of its data.

Q: Do you want to capture the naive notion of algorithm by means of evolving algebras?

A: The goal is good operational semantics, but it would be great to formalize properly the notion of algorithms, wouldn't it?

Q: To what extent is this formalization goal achieved?

A: Well, I will explain to you the notion of sequential evolving algebras. It seems to me that it captures the notion of sequential algorithms, in the sense that every sequential algorithm can be closely simulated by an appropriate sequential EA.

Q: What do you mean by ``closely simulated''?

A: The simulating EA is on the same abstraction level and does not use much more resources than the given algorithm. In particular, the simulation is lock-step.

Q: But who knows what kind of resources the given algorithm uses?

A: Well, give me a sequential algorithm and tell me which resources you care about. Then we will be able, I think, to construct a sequential EA simulating your algorithm closely with respect to the resources in question.

Q: What are your arguments?

A: Speculation and experimentation. In particular, EAs were constructed for a number of sequential programming languages, e.g., Modula-2 [GMr], Prolog [Bo1-Bo3], Prolog III [BS], Protos-L [BB] and C [GH]. In this modeling of programming languages, the goal was direct and natural operational semantics. But these models confirm the thesis as well.

Q: What about nonsequential, say distributed parallel, algorithms?

A: The notion of sequential EA has been generalized to the distributed parallel case [GMs]. In particular, EAs were constructed for Occam [GMs], Parlog [BRi1] and Concurrent Prolog [BRi2]. See also [GR]. I do not know any distributed parallel algorithm that presents a conceptual challenge for EA formalization. There is, however, a difference between the sequential and distributed parallel cases. An arbitrary sequential algorithm can be viewed as a sequential transition system; analyzing such systems, you discover sequential evolving algebras and may justify the thesis to an extent. The notion of distributed parallel algorithms seems open-ended at the moment.

Q: Even if one buys your thesis, he/she may not like your semantics.

A: I agree but hope that he/she will like it. EAs are relatively easy to understand and design. I use them in class. Even a very small program in an unusual language can be difficult to understand directly. Sketching an appropriate EA on a blackboard may help. You can define various machine models as special classes of evolving algebras.

Q: Do you have software to run an EA machine?

A: Yes. For this purpose, we use an EA interpreter written in C here at Michigan. You can run your EA for a specified number of steps and then examine the resulting state; you can run your EA until some predicate becomes true; and so on.
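The two modes of running just described can be sketched as a small driver loop. This is a hypothetical illustration in Python, not the interface of the actual C interpreter; the names `run_steps` and `run_until` and the counter example are my assumptions.

```python
# Hypothetical sketch of the two modes of running an EA described
# above: a fixed number of steps, or until a predicate becomes true.

def run_steps(state, step, n):
    """Run the EA for n steps and return the resulting state."""
    for _ in range(n):
        state = step(state)
    return state

def run_until(state, step, predicate, max_steps=10**6):
    """Run the EA until the predicate holds (or a step bound is hit)."""
    for _ in range(max_steps):
        if predicate(state):
            return state
        state = step(state)
    raise RuntimeError("predicate never became true")

# Example: a step that increments a counter.
step = lambda s: {"x": s["x"] + 1}
print(run_steps({"x": 0}, step, 5))                      # {'x': 5}
print(run_until({"x": 0}, step, lambda s: s["x"] >= 3))  # {'x': 3}
```

The point is only that, once steps are explicit state transformers, both modes of execution fall out of the same loop.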

Q: Do you see any use for evolving algebras outside the classroom?

A: Yes. Here are some examples. ISO, the International Organization for Standardization, adapted Egon Börger's proposal [BD] of an EA-based precise semantics of full Prolog [WG17]. (Actually, Egon Börger is not completely happy with the adaptation. In cooperation with Dean Rosenzweig, he prepared a simpler evolving algebra description of full Prolog [BRo2].) Georg Gottlob, Gerti Kappel, and Michael Schrefl used EAs to specify the semantics of characteristic features of object-oriented database models [GKS, KS]. An EA-based manual for a programming language could be precise and relatively easy to read.

Q: Do you expect evolving algebras to be used for proving things about algorithms?

A: Yes. The possibility of tailoring EAs to any given abstraction level is especially useful. I have already mentioned Börger and Rosenzweig's work on the Warren Abstract Machine [BRo1]. Starting from a Prolog evolving algebra at a high abstraction level (essentially the level of SLD-resolution), a hierarchy of more and more refined evolving algebras is constructed; the final algebra is the first formal abstract specification of the WAM in the literature. It is proved that, under appropriate assumptions, every (i+1)-st algebra correctly implements the i-th algebra. This is a solid foundation for constructing provably correct compilers from Prolog to the WAM; the mentioned proof assumptions give rise to conditions that guarantee correctness. Using similar techniques, my student Jim Huggins is attempting to prove the correctness of the Kermit file transfer protocol.

The EA approach is appropriate for the use of temporal and dynamic logics, though we have only started to explore this. In a sense, the EA approach provides a foundation for the use of such logics. If you are looking for models of first-order temporal and dynamic logics, think about evolving algebras of appropriate kinds. (By the way, evolving algebras are sometimes called dynamic algebras, but the latter term is used in the dynamic logic area in a very different sense; see [Pr] for example.)

Q: Now tell me what evolving algebras are.

A: The basic idea is very simple, at least in the sequential case, when time is sequential (the algorithm starts in some initial state S(0) and goes through states S(1), S(2), etc.) and only a bounded amount of work is done on each step. Each state can be represented by a first-order structure: a set with relations and functions. (The term ``first-order structure'' is misleading. It is logic that is first-order, not structures. Models of second-order logic, for example, can be easily and naturally represented by ``first-order'' structures. But the term is common and we will use it.) Thus, the run can be seen as a succession of first-order structures, but this isn't a very fruitful way to see the process. How do we get from a state S(i) to the next state S(i+1)? Following the algorithm, we perform a bounded number of changes in S(i). It turns out that the whole algorithm can be rewritten as a finite number of transition rules of very simple form.
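The basic idea above can be sketched in a few lines: a state is a collection of (dynamic) functions, here modeled as a dictionary, and a step fires the guarded transition rules, each performing a bounded number of updates. This is my own toy illustration under assumed names (`step`, `rules`, the countdown example), not a definitive formulation.

```python
# A toy sequential evolving algebra: each state S(i) is a dictionary of
# values; a step collects the updates of all rules whose guards hold in
# S(i) and applies them at once, yielding S(i+1).

def step(state, rules):
    """Compute the next state from the current one."""
    updates = {}
    for guard, update in rules:
        if guard(state):
            updates.update(update(state))
    new_state = dict(state)
    new_state.update(updates)
    return new_state

# Example: one rule that counts n down to 0 while accumulating a sum.
rules = [
    (lambda s: s["n"] > 0,
     lambda s: {"sum": s["sum"] + s["n"], "n": s["n"] - 1}),
]

state = {"n": 3, "sum": 0}
while state["n"] > 0:
    state = step(state, rules)
# state is now {"n": 0, "sum": 6}
```

Note that the updates of a step are computed against the old state and then applied simultaneously, which matches the picture of a bounded number of changes taking S(i) to S(i+1).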

By the way, I am thinking about the states of the algorithm as something feasible. Certainly, any computer executing the algorithm is able (as long as it executes correctly) to represent the states. On the other hand, the whole process may be huge, unwieldy and infeasible to represent.

It isn't obvious how to generalize the basic idea to the case of asynchronous distributed algorithms; see [GMs] in this connection.

Q: It seems that you have an evolving first-order structure. Why do you call it an evolving algebra?

A: For technical reasons, it is convenient to replace relations by appropriate functions. In universal algebra, a first-order structure without relations is called an algebra [Gr]; we will use the term ``static algebra''.




huggins@acm.org
Fri Dec 9 14:18:02 EST 1994