A CALCULUS OF VALUE BROADCASTS K. V. S. PRASAD

April 7, 1994

Abstract. Computation can be modelled as a sequence of values, each broadcast by one agent and instantaneously audible to all those in parallel with it. Listening agents receive the value; others lose it. Subsystems interface via translators; these can scramble values and thus hide or restrict them. Examples show the calculus describing this model to be a powerful and natural programming tool. Weak bisimulation, a candidate for observational equivalence, is defined on the basis that receiving a value can be matched by losing it.

1. Introduction

This paper presents a new version of a calculus of broadcasting systems, CBS [Pra91b], a CCS-like calculus [Mil89] with broadcast communication instead of handshake. No knowledge of older versions of CBS is necessary, but familiarity with CCS will be helpful. [Pra91b] should be consulted for motivation of the communication model, discussion of design decisions, and comparison with CCS.¹

1.1. Preview. CBS agents evolve by transmitting (i.e., broadcasting), receiving, or losing data values. The expression 0 denotes the agent that transmits nothing, and whose only response to transmissions by others is to lose the transmitted value.

    0 --5:--> 0

The expression x? P denotes an agent that is listening. It receives any value v transmitted by the environment, and subsequently acts like the agent P[v/x].

    x? P --5?--> P[5/x]
    x? x! 0 --6?--> 6! 0

The expression 5! Q denotes an agent that transmits the value 5 and subsequently acts like Q. This agent loses all values transmitted by others.

    5! Q --5!--> Q
    5! Q --6:--> 5! Q
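The three kinds of response can be mimicked by a toy interpreter for the prefix fragment. The Python below is only an illustrative sketch, not the paper's implementation; the constructor and function names (Nil, In, Out, react) are my own.

```python
# Toy model of the CBS prefix fragment. An agent responds to a value
# broadcast by its environment either by receiving it or by losing it.
from dataclasses import dataclass
from typing import Callable, Union

@dataclass
class Nil:                      # 0: transmits nothing, loses everything
    pass

@dataclass
class In:                       # x? P: the body is a function of the value
    body: Callable[[int], "Agent"]

@dataclass
class Out:                      # v! P: transmits v; loses values meanwhile
    value: int
    cont: "Agent"

Agent = Union[Nil, In, Out]

def react(agent, v):
    """Response of `agent` to a broadcast of v: ('recv' or 'lose', next state)."""
    if isinstance(agent, In):   # x? P --v?--> P[v/x]
        return ("recv", agent.body(v))
    return ("lose", agent)      # 0 and w! P lose v, unchanged

# x? x! 0 receives 6 and becomes 6! 0; 5! Q loses 6 and stays 5! Q.
kind, nxt = react(In(lambda x: Out(x, Nil())), 6)
```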

Key words and phrases. Broadcast, parallel computation, distributed computing, process calculi, CCS, communicating processes, bisimulation, observational equivalence. Author's address: Department of Computer Sciences, Chalmers University of Technology, S-412 96 Gothenburg, Sweden. E-mail: [email protected]. Reference: This paper was presented at PARLE'93, Springer Verlag LNCS 694. ¹The version of [Pra91b] is referred to as Channelled CBS. The Patterned CBS that has been privately circulated is an intermediate version.


Transmission is autonomous, while reception and loss are both responses to transmissions by the environment, and are therefore controlled. Thus 5! 6! 0 will transmit 5 and then 6, while x? (x + 1)! 0 is stuck until its environment transmits (8, say), when it will reply (9). There is no prefix corresponding to loss: there is no agent v: P. Transmitted values are received or lost instantaneously by each of the agents composed in parallel with the transmitter. CBS uses synchronous cooperation in that every agent has to transmit or respond at every step. But transmissions cannot be combined, only interleaved. The expression x? P | 5! Q | y? y! 0 denotes a system of three agents in parallel. The derivations below show two possible evolutions, the first being a transmission by the system, and the second a response.

    x? P | 5! Q | y? y! 0 --5!--> P[5/x] | Q | 5! 0
    x? P | 5! Q | y? y! 0 --4?--> P[4/x] | 5! Q | 4! 0

Communication is synchronous: it takes place only if transmission and reception are performed simultaneously. But the transmitter does not wait for the receiver. The composition of independently designed components is facilitated by translation and by noise. A translator φ is a pair of functions, ↑φ and ↓φ. The agent φP transmits the value ↑φ 5 if P transmits 5, and receives or loses 4 according as P receives or loses ↓φ 4. Values are hidden and restricted by translation to noise by ↑φ and ↓φ respectively. Noise, denoted by the distinguished value τ, is always lost.

    φ(x? P) --4:--> φ(x? P)   if ↓φ 4 = τ

τ? P is not part of the syntax of agents. Translating functions must map τ to τ.
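As an illustrative sketch (the function names are assumptions, not the paper's code), a translator can be modelled as a pair of functions given by their graphs, defaulting to the identity and sending τ to τ. Mapping a value to τ on the way out hides it; on the way in it restricts it.

```python
# A translator phi as a pair (up, down) over values extended with the
# noise value tau. Unspecified values are left unchanged, and both
# functions must map tau to tau.
TAU = "tau"   # stand-in for the distinguished noise value

def translator(up_graph, down_graph):
    up = lambda w: TAU if w == TAU else up_graph.get(w, w)
    down = lambda w: TAU if w == TAU else down_graph.get(w, w)
    return up, down

# Hiding: outputs of 42 become noise. Restriction: inputs of 7 are heard
# as noise, hence lost by phi P even if P could have received 7.
up, down = translator({42: TAU}, {7: TAU})
```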

2. The Syntax and Operational Semantics of CBS

Assume a set E of data expressions containing a set of values V and a set of variables X. Let v range over V, x over X, and e over E. Let v~ and x~ be finite sequences of values and variables respectively. Let e[~v/~x] denote the result of substituting v~ for x~ in e. Assume also that E contains a syntactic category of boolean expressions ranged over by b and taking values in {tt, ff}. Assume τ ∈ E but τ ∉ V, and let Vτ = V ∪ {τ}. Let w, w′ range over Vτ. Let φ be a pair of functions ↑φ, ↓φ : Vτ → Vτ satisfying ↑φ τ = τ and ↓φ τ = τ. Let A range over agent constants. Then CBS agent expressions are given by

    P ::= 0 | x? P | e! P | P + P | if b then P else P | A(~v) | φP | P | P

Occurrences of x in P become bound in the prefix term x? P, and the scope of x in x? P is P. Bound variables are assumed to be renamed as necessary to avoid clashes under substitution. Let fv(P) denote the set of free variables in P. For example, fv(x! 0) = {x}. An agent P is closed if fv(P) = ∅. Thus x? x! 0 is a closed agent but x! 0 is not. The set of all agents is denoted P, and the set of closed agents Pcl. The substitution of data expressions for data variables extends from E to P in the evident way. Agent constants are declared in definitions of the form

    A(~x) def= P

    Operator     Transmit                  Receive                     Lose
    0                                                                  0 --w:-->
    x? P                                   x? P --v?--> P[v/x]         x? P --τ:-->
    w! P         w! P --w!--> P                                        w! P --w′:-->

    Sum:         P1 --w\--> P1′  implies  P1 + P2 --w\--> P1′
                 P1 --w:-->  and  P2 --w:-->  imply  P1 + P2 --w:-->

    Branch:      P1 --w\--> P1′  implies  if tt then P1 else P2 --w\--> P1′
                 P1 --w:-->  implies  if tt then P1 else P2 --w:-->

    Define:      P[~v/~x] --w\--> P′  implies  A(~v) --w\--> P′    where A(~x) def= P
                 P[~v/~x] --w:-->  implies  A(~v) --w:-->

    Translate:   P --w!--> P′  implies  φP --(↑φ w)!--> φP′
                 P --(↓φ w)[--> P′  implies  φP --w[--> φP′

    Compose:     P1 --w]1--> P1′  and  P2 --w]2--> P2′  and  ]1 • ]2 = ] ≠ ⊥
                 imply  P1 | P2 --w]--> P1′ | P2′

                 •   !   ?   :
                 !   ⊥   !   !
                 ?   !   ?   ?
                 :   !   ?   :

    P --w:--> is an abbreviation for P --w:--> P. There is a symmetric Sum rule
    for P2. Note that \ ∈ {!, ?}, ] ∈ {!, ?, :}, and [ ∈ {?, :}. There are
    similar rules for if ff then P1 else P2. ⊥ means "undefined" in the
    synchronisation algebra (boxed) for •.

    Table 1. Transition rules for closed agents
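The synchronisation algebra for parallel composition can be transcribed directly. The sketch below (names illustrative, not from the paper) computes how the actions of two parallel components combine on a single broadcast.

```python
# Synchronisation algebra for | : '!' transmit, '?' receive, ':' lose.
# None encodes the undefined entry: two transmissions never combine.
SYNC = {
    ("!", "!"): None, ("!", "?"): "!", ("!", ":"): "!",
    ("?", "!"): "!",  ("?", "?"): "?", ("?", ":"): "?",
    (":", "!"): "!",  (":", "?"): "?", (":", ":"): ":",
}

def compose(a1, a2):
    """Action of P1 | P2 on value w, given the actions of P1 and P2."""
    return SYNC[(a1, a2)]

# A transmission beside a reception or a loss is still a transmission;
# interleaving shows up as transmission-beside-loss.
```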

where x~ = x1, ..., xn and it is assumed that fv(P) ⊆ {x1, ..., xn}. The scope of the parameters x~ is P. In these definitions, assume that A does not occur in P except as a prefixed subexpression: that is, assume that recursion is guarded. Definitions may be mutually recursive. An agent is a term containing no undefined agent names. Dealing with such agent variables is standard, and is largely ignored in this paper. An open agent here is one with free data variables. The actions of transmitting, receiving and losing the value w are denoted by w!, w? and w: respectively. It is convenient to use variables that range over punctuation. Let \ range over {!, ?}, ] over {!, ?, :} and [ over {?, :}. Punctuation variables are treated in the same way as other variables: thus if v\ and v′\ both occur in the same context, they can stand for v? and v′?, or for v! and v′!, but not for v! and v′?. This last pair is an instance of v\1 and v′\2. The operational semantics of closed agents is presented in Table 1. Open agents perform no actions. The first Sum, Branch and Define rules apply to both transmission and reception. The second Translate rule applies to both reception and loss, and the Compose rule to all three kinds of action. Note that ↑φ w ] and ↓φ w ] mean unambiguously (↑φ w) ] and (↓φ w) ].


2.1. Properties of the transition system. Let ≡ denote syntactic equality. Let P --w]--> abbreviate "there exists P′ such that P --w]--> P′", and let P --w]/--> abbreviate "there is no P′ such that P --w]--> P′". The first proposition below has been anticipated by the abbreviation P --w:--> for P --w:--> P.

Proposition 1. If P --w:--> P′ then P ≡ P′.

Proposition 2. P --w:--> iff P --w?/-->.

The proofs are by induction on the structure of P. Note that neither P ≡ P1 | P2 nor P ≡ P1 + P2 can lose a value unless both P1 and P2 can. Also, x? P --v:/--> for every v ∈ V. Replacing losses by the predicate "cannot receive" yields essentially the same calculus. A transition system with negative premises is well defined if these can be derived first, independently of the positive transitions [Gro90]. Losses are similar.

Proposition 3. The relation --w:--> is independent of --w′!--> and --v?-->.

Proposition 4. For all P, the set {w | P --w!-->} is finite.

Proof. By induction on the structure of P. There are only finite sums in CBS.

Proposition 5 (Image finiteness). For all P and w, the set {P′ | P --w]--> P′} is finite.

Proof. By induction on the structure of P. The result has already been proved for losses. Again, it is important that sums are finite. For P1 | P2, there are several cases to consider. For example, w! can be the result of P1 --w!--> P1′ and P2 --w?--> P2′. There are only finitely many P1′ and P2′, by induction hypothesis. For φP --w!--> P′, we have to consider all v such that ↑φ v = w. But there are only finitely many such v, by the previous proposition.

Design notes. Let A def= x? A. Then the usual recursion rule, if used also for losses, would allow A --τ:--> x? A. The new rule, like that for conditional, prevents changes of state upon loss. The restriction to guarded recursion disallows X def= X. Here X would do neither v? nor v:. Synchronisation between actions is possible only when they refer to the same transmission. | is commutative and associative because • is. Interleaving is expressed as synchronisation between a \ action and a loss. Since any agent can do exactly one of w: and w?, reception is enforced whenever possible.

3. Strong Bisimulation

The rest of this paper assumes that V does not contain P, thereby restricting attention to a first order version of CBS.

Definition 6 (Strong bisimulation for closed agents). R ⊆ Pcl × Pcl is a strong bisimulation if whenever P R Q,
(i) if P --w]--> P′ then there exists Q′ such that Q --w]--> Q′ and P′ R Q′,
(ii) if Q --w]--> Q′ then there exists P′ such that P --w]--> P′ and P′ R Q′.


As in CCS, the largest strong bisimulation is an equivalence, denoted ∼. The next lemma says that losses can be ignored when proving strong bisimulation. Thus losses are only significant in derivations when they contribute to transmission or reception.

Lemma 7. If R ⊆ Pcl × Pcl is a relation such that whenever P R Q,
(i) if P --w\--> P′ then, for some Q′, Q --w\--> Q′ and P′ R Q′,
(ii) if Q --w\--> Q′ then, for some P′, P --w\--> P′ and P′ R Q′
then R is a strong bisimulation.

Proof. Suppose P R Q and R satisfies (i) and (ii). Now if P --w:--> P, then by Proposition 2, P --w?/-->. Then Q --w?/-->, for otherwise P --w?-->, since receptions are covered by the conditions on R. By another application of Proposition 2, Q --w:--> Q. Similarly for the other direction.

Definition 8 (Strong bisimulation for open agents). Let P, Q ∈ P contain free variables x~ at most. Then P ∼ Q if for all indexed sets v~ of values, P[~v/~x] ∼ Q[~v/~x].

It is now possible to state that P ∼ Q implies x? P ∼ x? Q. In fact,

Proposition 9. ∼ is a congruence for CBS.

Proof. By adapting the corresponding proof in [Mil89].

Definition 10. For any L ⊆ V, if the transmissions w! of P and all its derivatives are such that w ∈ L ∪ {τ} then P has sort L, written P : L.

Translating functions ↑φ, ↓φ : Vτ → Vτ can be extended to E in the evident way. Define φψ to be the pair ⟨↑φ ∘ ↑ψ, ↓ψ ∘ ↓φ⟩.

Proposition 11 (Strong bisimulation laws).
(1) (a) (P/∼, +, 0) is a commutative monoid. (b) P + P ∼ P
(2) (P/∼, |, 0) is a commutative monoid.
(3) (a) φ(ψ P) ∼ (φψ) P
    (b) φ(e! P) ∼ (↑φ e)! φP
    (c) φ(P1 + P2) ∼ φP1 + φP2
    (d) φ(P1 | P2) ∼ φP1 | φP2 if for all v ∈ L1 ∪ L2, ↓φ(↑φ v) = v, where P1 : L1 and P2 : L2.

3.1. Axioms for finite agents. The equations 1(a) and 1(b) of Proposition 11 constitute a complete axiom system for strong bisimulation, as they do for CCS. For pure CBS, the proof is identical to that of [Mil89]; this reflects the fact that strong bisimulation ignores the communication model. Value passing CBS needs in addition an inference system for reasoning about data [Hen91], but the process content of the proof is essentially the same. The standard form is

    P ∼ Σ(i=1..k) x? Pi + Σ(i=1..l) wi! Pi

where each Pi is also in standard form. By convention, the empty sum is 0.


Proposition 12 (Expansion theorem). For r ∈ {1, 2} let

    Pr ∼ Σ(i=1..kr) viʳ! Siʳ + Σ(i=1..lr) τ! Tiʳ + Σ(i=1..mr) x? Uiʳ

Now let r, s ∈ {1, 2} and r ≠ s. Then

    P1 | P2 ∼ Σ(r,i,j) viʳ! (Siʳ | Ujˢ[viʳ/x]) + Σ(r,i) τ! (Tiʳ | Ps) + Σ(r,i,j) x? (Uiʳ | Ujˢ)

where i ranges from 1 to kr, lr, and mr respectively in the three sums, and j ranges from 1 to ms. If ms = 0 then j takes the value 0. The convention is that U0ˢ = U0ˢ[viʳ/x] = Ps.

4. Weak Bisimulation

Definition 13 (Weak bisimulation for closed agents). R ⊆ Pcl × Pcl is a weak bisimulation if whenever P R Q,
(i) if P --v!--> P′ then there exists Q′ such that Q --τ!*--> --v!--> --τ!*--> Q′ and P′ R Q′,
(ii) if P --τ!--> P′ then there exists Q′ such that Q --τ!*--> Q′ and P′ R Q′,
(iii) if P --v[1--> P′ then there exists Q′ such that Q --τ!*--> --v[2--> --τ!*--> Q′ and P′ R Q′,
and similarly with P and Q interchanged.

The largest weak bisimulation is an equivalence, denoted ≈. Note that v? and v: match each other; both are silent responses to a v! by the environment. In Law 1 below, the responses to a 5! are 0 --5:--> 0 and x? 0 --5?--> 0. Law 2 complements Proposition 11. If ↓φ v = τ, a v: on the left is matched by a v? on the right.

Proposition 14 (Some weak bisimulation laws).
(1) 0 ≈ x? 0
(2) φ(x? P) ≈ X where X def= x? if ↓φ x = τ then X else φ(P[↓φ x/x])
(3) τ! v! P ≈ v! P
(4) P ≈ τ! P + P
(5) τ! P ≈ τ! P + P + x? τ! P
(6) w! (P + τ! Q) + w! Q ≈ w! (P + τ! Q) and x? (P + τ! Q) + x? Q ≈ x? (P + τ! Q)

τ!'s are autonomous and unobservable as in CCS. But note that τ! P ≉ P in general. See Ex. 6 in Table 2, which also gives other examples of ≉. Law 3 generalises to τ! P ≈ P for any P with no v? actions. This is the closest CBS gets to the CCS law τ.P ≈ P. Laws 4 and 5 are the closest to the second CCS τ law, P + τ.P ≈ τ.P. Only the third CCS τ law holds for CBS (Law 6).

4.1. Weak congruence. ≈ is preserved by all the operators of CBS except +. Law 6 above is a congruence law, but not Laws 1-5 (see Ex. 1, 7, 8 of Table 2). The largest congruence in ≈, denoted ≈c, can be characterised operationally as for CCS by restricting matching options for the first transitions. This congruence is of questionable use, since no axiomatisation has been found yet. As with CCS, ≈ is often enough in practice, since specifications usually require a | context, not a + context. Further, if P ≈ Q then w! P ≈c w! Q and x? P ≈c x? Q. But most important are the developments of [Pra93b], which suggest that ≈c is a non-issue.

    A ≉ B                                                Test                   A      B
    1. x? 0 + 3! 0 ≉ 3! 0                                1! 3? ⊤                fails  may
    2. x? if x = 1 then 3! 0 else 0
       ≉ x? if x = 2 then 3! 0 else 0                    1! 3? ⊤                must   fails
    3. x? 3! P ≉ 3! P                                    3? ⊤                   fails  must
    4. 1! 2! 0 + 1! 3! 0 ≉ 1! (2! 0 + 3! 0)              1? (2? ⊤ ∧ 3? ⊤)       fails  may
       (equal by de Nicola-Hennessy testing)
    5. 1! x? 0 + 1! 3! 0 ≉ 1! (x? 0 + 3! 0)              1? 3? ⊤                may    must
                                                         1? (3? ⊥ ∨ 2! 3? ⊤)    may    fails
    6. τ! x? x! 0 ≉ x? x! 0                              3! 3? ⊤                may    must
    7. 1! P + x? 2! Q ≉ τ! 1! P + x? 2! Q                1? ⊥ ∨ 3! 1? ⊤         fails  may
    8. x? x! 0 + 5! 0 + τ! 5! 0 ≉ x? x! 0 + 5! 0         5? ⊤ ∧ 3! 3? ⊤         may    must
    9. x? x! 0 + τ! x? x! 0 ≉ τ! x? x! 0                 5! 5? ⊤                must   may

The atomic tests are ⊤ (success), ⊥ (failure), v! (may fail if the tested agent diverges, i.e., does an infinite sequence of transmissions), and v? (succeeds if the tested agent does a v!, possibly preceded and followed by τ!'s, and fails if it transmits any other value first, or diverges, or deadlocks).

    Table 2. Examples of testing for ≉

4.2. Hennessy-Milner Logic. The modal characterisation theorem [Mil89] of weak bisimulation in terms of Hennessy-Milner Logic goes through with the following definition of satisfaction for the ⟨w]⟩ operator.

    P ⊨ ⟨v!⟩ A   iff for some P′, P --τ!*--> --v!--> --τ!*--> P′ and P′ ⊨ A
    P ⊨ ⟨τ!⟩ A   iff for some P′, P --τ!*--> P′ and P′ ⊨ A
    P ⊨ ⟨v[1⟩ A  iff for some P′, P --τ!*--> --v[2--> --τ!*--> P′ and P′ ⊨ A

4.3. Alternative definitions of ≈. The weak bisimulation of [Pra91a] allows τ's only before a matching transmission, and none surrounding a matching response. ≈ then differs from the present relation in some cases (e.g., Laws 5 and 6 fail), but agrees in most. The advantage is that losses can be ignored in the reasoning (e.g., in Ex. 6 of Table 2). But are Laws 3 and 5 desirable? Characterising weak bisimulation as a testing equivalence [Abr87] would settle the issue formally. This is yet to be done. Table 2 is a start.

5. Channels, guards and sums

Let πi be patterns, and v/π be the substitution that results from matching v against π. Then the construct

    x? case π1: P1 or π2: P2 or ... else P end
        def= x? if x matches π1 then P1[x/π1] else if x matches π2 then P2[x/π2] else ... P

approximates the Σi πi? Pi of Patterned CBS, where π? P is a primitive.

    π? P --v?--> P[v/π]   if v matches π
    π? P --v:--> π? P     if v does not match π

Examples below show that the approximation is only ≈, not ≈c or ∼, but the greater expressive power of Patterned CBS seems not to be necessary for programming.
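The looping behaviour of the case encoding can be sketched in a few lines of Python (an illustrative model, not the paper's code): an agent wanting pattern v receives every value and only changes state on a match, so in a sum any summand may consume the input.

```python
# X(v, p) models x? case v: p else X(v, p) end: it receives everything,
# moving to p on a match and staying put otherwise. Names are mine.
def X(v, p):
    return lambda w: p if w == v else X(v, p)

def sum_inputs(summands, w):
    """Possible next states of a sum of input agents on input w:
    any one summand is the one that receives."""
    return [s(w) for s in summands]

# 3?P + 4?Q can only become P on a 3, but in the encoding the X(4, Q)
# summand may also consume the 3 and leave the sum as X(4, Q).
outcomes = sum_inputs([X(3, "P"), X(4, "Q")], 3)
```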


Channelled CBS [Pra91b] offers only the patterns ⟨a, v⟩ where a is a channel, and v a value. Pure CBS restricts further to patterns a; [Pra91b] interprets these traditionally as channels without values. A better interpretation is that 3? P awaits a 3 on a single (unnamed) channel, and loses everything until then.

Example 15 (Input guards). Below, X(3, P) ≈ 3? P and Y ≈ 3? P + 4? Q.

    X(v, p) def= x? case v: p else X(v, p) end
    Y def= x? case 3: P or 4: Q else Y end

But 3? P + 4? Q --3?--> R implies R ≡ P, whereas X(3, P) + X(4, Q) --3?--> X(4, Q). Also Y + 2! 0 --5?--> Y but 3? P + 4? Q + 2! 0 --5:-->. The approximation is not ≈c.

Example 16 (Channels). Suppose V consists of two types of integers, "ordinary" ⟨Ord, n⟩ and "tagged" ⟨Tag, n⟩. Y below is a Patterned CBS agent.

    Y def= ⟨Tag, x⟩? ⟨Tag, 2 × x⟩! Q
    X def= x? case ⟨Tag, n⟩: ⟨Tag, 2 × n⟩! Q else X end
    P def= x? (2 × x)! Q′
    ↑φ def= {n ↦ ⟨Tag, n⟩}        ↓φ def= {⟨Tag, n⟩ ↦ n, ⟨Ord, n⟩ ↦ τ}

Then X ≈ Y. The agent φP offers a better solution under a constraint: φP ≈ Y, but only if φQ′ ≈ Q. Translating functions are specified by their graphs. They are taken to leave values unchanged except if otherwise specified.

5.1. A simplified CBS. [Pra93b], which deals with an implementation of CBS, also replaces + by a guarded sum construct with the input pattern x? and a finite number of output guards. This permits loss and reception to be identified. The x? case ... end construct then approximates a sum of patterns up to ≈. Further, the definition of ≈ is then exactly that of CCS, and Laws 1 and 6 of Proposition 14 hold for ≈. Lastly, ≈ is preserved by guarded sums, and is a congruence. These radical advantages depend on the assumption, supported by experience with CBS so far, that x? P + v! Q (timeout) is the only kind of sum needed in programming. Sums of output guards are still needed for reasoning; consider the expansion of 3! P | 4! Q. The non-determinism in 3! P | 4! Q can be seen as arising from different speeds of computation in parallel components. By contrast, the non-determinism in x? P + y? Q seems hard to motivate. It is not clear whether both are needed in a specification language.

6. Programming examples

6.1. Milner's scheduler. Agents Pi, i ∈ 1..n, each perform a task repeatedly, and are to be scheduled cyclically by signals ⟨Go, i⟩. The end of each task is signalled by ⟨Done, i⟩. The specification of the scheduler is

    S(i, X) def= x? case ⟨Done, j⟩: S(i, X − {j}) else S(i, X) end          if i ∈ X
    S(i, X) def= x? case ⟨Done, j⟩: S(i, X − {j}) else S(i, X) end
               + ⟨Go, i⟩! S(i + 1, X ∪ {i})                                 if i ∉ X

Here i says whose turn it is, and X is the set of active agents; i + 1 and i − 1 below are calculated modulo n. This is close to Milner's specification [Mil89].


The scheduler can be implemented as a set of cells Ai, which schedule their respective wards and then wait for ⟨Done, i⟩ and ⟨Go, i − 1⟩ to happen in either order. To start with, only A1 is ready to schedule its ward; the others wait for scheduling signals. Since no agents are active as yet, there cannot be any termination signals.

    Ai def= ⟨Go, i⟩! Bi
    Bi def= x? case ⟨Done, i⟩: Di or ⟨Go, i − 1⟩: Ci else Bi end
    Ci def= x? case ⟨Done, i⟩: Ai else Ci end
    Di def= x? case ⟨Go, i − 1⟩: Ai else Di end
    Sched def= φ(A1 | Π(i≠1) Di)   where ↓φ = {⟨Go, v⟩ ↦ τ}

Π(i∈I) Pi is a standard abbreviation for the parallel composition of Pi for i ∈ I. The difference from the CCS implementation is that the ⟨Go, i⟩ can be heard both by Pi and by Bi or Di, so there is no need to relay the information by new signals. The following relation is a strong bisimulation, and so Sched ∼ S(1, ∅).

    { ⟨S(i, X), φ(Ci | Π(j∈X, j≠i) Bj | Π(j∉X) Dj)⟩ | i ∈ X }
    ∪ { ⟨S(i, X), φ(Ai | Π(j∈X) Bj | Π(j∉X, j≠i) Dj)⟩ | i ∉ X }

Because of the relaying, CCS has only a weak bisimulation here, and a proof by bisimulation up to bisimulation.
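The cells Ai, Bi, Ci and Di can be simulated directly. The driver below is an illustrative sketch (it fixes one particular schedule of Done signals; all names are mine) and checks that the Go signals come out cyclically.

```python
# Simulation of the scheduler cells; state names A/B/C/D follow the
# definitions above, the choice of transmitter at each step is a driver.
def run_scheduler(n, steps):
    go_order = []
    state = {i: ("A" if i == 1 else "D") for i in range(1, n + 1)}
    active = set()                      # wards currently running
    for _ in range(steps):
        sender = next((i for i in range(1, n + 1) if state[i] == "A"), None)
        if sender is not None:          # a cell schedules its ward: <Go, i>
            msg = ("Go", sender)
            state[sender] = "B"
            go_order.append(sender)
            active.add(sender)
        elif active:                    # otherwise a ward finishes: <Done, i>
            i = min(active)
            msg = ("Done", i)
            active.discard(i)
        else:
            break
        for i in range(1, n + 1):       # every cell hears the broadcast
            prev = n if i == 1 else i - 1
            if state[i] == "B":         # awaiting Done(i) and Go(i-1)
                if msg == ("Done", i):
                    state[i] = "D"
                elif msg == ("Go", prev):
                    state[i] = "C"
            elif state[i] == "C" and msg == ("Done", i):
                state[i] = "A"
            elif state[i] == "D" and msg == ("Go", prev):
                state[i] = "A"
    return go_order
```

For n = 3 the Go signals come out as 1, 2, 3, 1, 2, 3, ... whatever order the Done signals arrive in.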

6.2. Broadcast sort. The specification for a sorting machine below is given in terms of lists xs of numbers. ⊤ is the end marker for the input list.

    S(xs) def= x? if x = ⊤ then H(xs) else S(sort(x :: xs))
    H([]) def= S([])
    H(xs) def= (hd xs)! H(tl xs)

    S([]) --5?--> S([5]) --8?--> S([5, 8]) --7?--> S([5, 7, 8]) --⊤?--> H([5, 7, 8])
          --5!--> H([7, 8]) --7!--> H([8]) --8!--> S([])

We consider only the case where all the numbers input by the user are distinct. The more general case needs a little more detail but is almost as easy.
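In aggregate, the cells In(l, u) and Out(l, u) defined below behave like the following Python sketch. The broadcast is modelled by letting every cell see each input; the sentinel values and the driver are illustrative assumptions.

```python
# Sequential model of broadcast sort: the input phase splits one cell per
# broadcast number, the output phase is a chain started by the BOT cell.
# BOT and TOP stand for the sentinels written ⊥ and ⊤ in the text.
BOT, TOP = float("-inf"), float("inf")

def broadcast_sort(xs):
    cells = [(BOT, TOP)]                 # one BOT cell, one TOP cell
    for x in xs:                         # every cell hears each input
        nxt = []
        for (l, u) in cells:
            if l < x < u:                # In(l,u) -> In(l,x) | In(x,u)
                nxt += [(l, x), (x, u)]
            else:                        # all other cells lose x
                nxt.append((l, u))
        cells = nxt
    out, heard = [], BOT                 # output starts at the cell with link BOT
    while True:
        l, u = next(c for c in cells if c[0] == heard)
        if u == TOP:
            break
        out.append(u)                    # the cell transmits its u ...
        heard = u                        # ... waking the cell linked to it
    return out
```

Then `broadcast_sort([5, 8, 7])` returns `[5, 7, 8]`, mirroring the derivation of S above.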

    Sorter def= φ(In(⊥, ⊤))   where ↑φ = {⟨out, u⟩ ↦ u} and ↓φ = {⟨out, u⟩ ↦ τ}
    In(l, u) def= x? if x = ⊤ then Out(l, u)
                    else if l < x and x < u then In(l, x) | In(x, u)
                    else In(l, u)
    Out(⊥, ⊤) def= In(⊥, ⊤)
    Out(⊥, u) def= ⟨out, u⟩! 0
    Out(l, u) def= x? case ⟨out, l⟩: Out(⊥, u) else Out(l, u) end

Broadcast sort is a parallelised insertion sort. The input so far is held in a sorted list, maintained by cells each holding a number u and a "link" l, the next lower number. Let ⊥ and ⊤ be sentinel values, respectively less than and greater than all numbers. There is always exactly one cell with l = ⊥, and exactly one with u = ⊤.

The next input number splits exactly one cell into two. At the end of input, output is initiated by the cell with ⊥ transmitting its u. Each cell (l, u) changes to (⊥, u) when it hears l, thus continuing output. Let augxs = [x0, x1, ..., xn, xn+1] be any list with at least two elements, and

    Inaug(augxs) def= φ(In(x0, x1) | In(x1, x2) | ... | In(xn, xn+1))
    Outaug(augxs) def= φ(Out(x0, x1) | Out(x1, x2) | ... | Out(xn, xn+1))
    aug(xs) def= [⊥] ++ sort(xs) ++ [⊤]

where ++ is list concatenation. The function aug sorts a list of numbers and decorates it with ⊥ and ⊤. Then the relation

    { ⟨S(xs), Inaug(aug(xs))⟩ | all xs } ∪ { ⟨H(xs), Outaug(aug(xs))⟩ | all xs }

is a strong bisimulation. It proves that S([]) ∼ Sorter.

6.3. The alternating bit protocol. Signals can be lost in CBS if the intended receiver is not listening. An alternating bit protocol can be used to achieve a measure of synchronisation between agents that do not wait for each other, rather than to deal with lossy media. In the CCS formulation, agents wait for a while and then timeout autonomously. In CBS, this is the natural behaviour of competing transmissions. The program below is a simplified version of that in [Mil89], with no media or timeouts, but the sequences of (re)transmission and acknowledgement are similar. Further, the sender manufactures the messages itself, rather than receive them from outside.

    S(b, n) def= ⟨b, n⟩! S(b, n) + x? case b: ⟨b̄, n + 1⟩! S(b̄, n + 1) else S(b, n) end
    R(b) def= b! R(b) + x? case ⟨b̄, n⟩: ⟨relay, n⟩! b̄! R(b̄) else R(b) end
    SYS(b, n) def= φ(⟨b, n⟩! S(b, n) | b̄! R(b̄))
        where ↑φ = {⟨relay, n⟩ ↦ n, x ↦ τ} and ↓φ = {x ↦ τ}
    Spec(n) def= n! Spec(n + 1)

Then SYS(parity(n), n) ≈ Spec(n). This is an easy exercise, with the system settling down rapidly into a 4-state loop.
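The 4-state loop can be seen in a small deterministic simulation. The driver below is an illustrative sketch in which sender and receiver broadcast alternately; neither side ever waits, and all names are mine.

```python
# Alternating-bit sketch: the sender rebroadcasts <bit, n> until it hears
# the matching acknowledgement; the receiver rebroadcasts its current
# acknowledgement and relays each newly accepted number.
def run_abp(n0, steps):
    delivered = []
    sb, sn = n0 % 2, n0        # sender: current bit and message number
    rb = sb                    # receiver: the bit it is waiting for
    for t in range(steps):
        if t % 2 == 0:                 # sender broadcasts <sb, sn>
            if rb == sb:               # receiver was waiting for this bit
                delivered.append(sn)   # relayed to the outside world
                rb = 1 - rb            # now acknowledging sb
        else:                          # receiver broadcasts its ack
            if 1 - rb == sb:           # sender hears its bit acknowledged
                sb, sn = 1 - sb, sn + 1
    return delivered
```

`run_abp(0, 10)` delivers `[0, 1, 2, 3, 4]`, matching Spec(0); each number takes one send step and one acknowledge step.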

7. Conclusions, Related work, Future work

[Pra91b] had ports (channels), input guards, summation, restriction/hiding distinct from translation, and dealt with value passing only by encoding to the pure calculus. These concepts, carried over from CCS, are foreign to the physical model underlying CBS. [Pra91b] therefore gave unsatisfactory accounts of hiding/restriction; worse, it failed to discover the programming power in CBS. This paper uses the natural model for broadcast that eludes [Pra91b], the single unnamed channel. This makes for elegance of notation: e! P and π? P. Channels are seen as special kinds of structured values, and Pure CBS as the sub-calculus that uses only constant patterns (3? P). More general patterns, and sums of input guards (i.e., 3? P + 4? Q), can be mimicked up to ≈ by case analysis following the


single pattern x?. Thus the absence of channels as a basic concept enables a smooth interface to functional programming. Other benefits are a natural generalisation to a higher order calculus (yet to be studied), and the formulation of hiding and restriction as translation to τ, thus simplifying the calculus. [Pra93b], as yet seen as experimental, completes the development, permitting only sums with one input summand, x? P, and a finite number of output summands. This permits loss and reception to be identified. ≈ is then defined exactly as for CCS, and is a congruence. General patterns π? P can be mimicked up to ≈. CBS is natural and powerful; it expresses concisely several programs that would be tedious in CCS. It can handle problems, such as sorting, not tailor made for process calculi. More, it suggests a new paradigm of programming, though its range is unclear. [Pra93b] reports several other examples. Shortage of space precludes a comparison of CCS and CBS. Please see [Pra91b]. [Pra91b] had no definition of weak bisimulation. This has now been defined, and produces testing results that justify describing ≈ as observational equivalence.

Implementations. Several implementations of CBS exist [Pet93, Pra93b, Jon92]. Several examples, including those in this paper, have been run on them. All the implementations are small, and none needs any change to the language they use, Lazy ML [AJ92]. That of [Pra93b] is less than two pages long, and seems capable of parallel implementation. It types the calculus simply but satisfactorily, within the ML type system. CBS thus compares well with the many other attempts [Hol83, GMP89, BMT92] to combine functional and concurrent programming.

Message priority. CBS separates autonomous actions from controlled ones, rather than manufacture autonomous τ's out of controlled actions as in CCS. Because of this, a prioritised version of CBS is easy to develop [Pra93a]. This compares favourably with the difficulty of putting priority into CCS [CH88].

I/O Automata. Input/output automata [LT87] use a model of computation tantalisingly similar to that of CBS. One difference is that I/O automata are input enabled. So are agents in [Pra93b], so that paper provides a process calculus formulation of I/O automata [Vaa91]. A technical difference is that models of I/O automata use quiescent and fair traces, not bisimulation.

Process calculi. The handshake model predominates overwhelmingly even though it appears incapable of distributed implementation [Sjo91] and is a low level primitive entailing very detailed code. Nor has it proved a fruitful paradigm for new algorithms. There is increasing willingness to look at other models. CBS offers one.

Algorithms. Broadcast has almost always been treated as something to be implemented rather than to be used. This is true of literature on hardware, on distributed systems and on algorithms. Even literature that describes it as a primitive [BC91] gives no examples of use. I had therefore re-invented the sorting algorithm in this paper, and several others, when I saw [HT92] and then discovered [YLC90] and [DK86]. This is clearly only a small field of research, but its neglect is sobering.


Future work. More examples are needed to establish the applicability of CBS and to test whether the language of [Pra93b] suffices. A parallel implementation has to be explored. Theoretical work includes the formulation of bisimulation as testing, efficient methods for checking bisimulation, axiomatisations, a study of higher order CBS and timed CBS, and the relation to other models such as I/O automata.

Acknowledgements. CBS has been developed over several years, and owes something to almost everyone I know in the field of concurrency, particularly at Chalmers.

References

[Abr87] Samson Abramsky. Observation equivalence as a testing equivalence. Theoretical Computer Science, 53, 1987.
[AJ92] Lennart Augustsson and Thomas Johnsson. Lazy ML user's manual. Technical report, Department of Computer Science, Chalmers University of Technology, 1992.
[BC91] Kenneth Birman and Robert Cooper. The ISIS project: Real experience with a fault tolerant programming system. Operating Systems Review, 25(2), April 1991.
[BMT92] Dave Berry, Robin Milner, and David Turner. A semantics for ML concurrency primitives. In Symposium on Principles of Programming Languages. ACM, 1992.
[CH88] Rance Cleaveland and Matthew Hennessy. Priorities in process algebras. In Symposium on Logic in Computer Science. IEEE, 1988.
[DK86] Rina Dechter and Leonard Kleinrock. Broadcast communications and distributed algorithms. IEEE Transactions on Computers, 35(3):418, March 1986.
[GMP89] Alessandro Giacalone, Prateek Mishra, and Sanjeev Prasad. Facile: A symmetric integration of functional and concurrent programming. International Journal of Parallel Programming, 18(2), 1989.
[Gro90] J. F. Groote. Transition system specifications with negative premises. In CONCUR '90, 1990. Springer Verlag LNCS 458.
[Hen91] Matthew Hennessy. A proof system for communicating processes with value-passing. Formal Aspects of Computing, 3:346-366, 1991.
[Hol83] Sören Holmström. PFL: A functional language for parallel programming. Technical Report 7, Department of Computer Sciences, Chalmers University of Technology, 1983.
[HT92] Tzung-Pei Hong and Shian-Shyong Tseng. Parallel perceptron learning on a single-channel broadcast communication model. Parallel Computing, 18:133-148, 1992.
[Jon92] Simon Jones. Translating CBS to LML. Technical report, Department of Computer Science, University of Stirling, 1992.
[LT87] Nancy Lynch and Mark Tuttle. Hierarchical correctness proofs for distributed algorithms. Technical Report MIT/LCS/TR-387, Laboratory for Computer Science, Massachusetts Institute of Technology, 1987.
[Mil89] Robin Milner. Communication and Concurrency. Prentice Hall, 1989.
[Pet93] Jenny Petersson. Tools for CBS. Licentiate thesis, Department of Computer Science, Chalmers University of Technology, 1993. In preparation.
[Pra91a] K. V. S. Prasad. Bisimulations induced by preorders on action sequences. In Chalmers Workshop on Concurrency, May 1991.
[Pra91b] K. V. S. Prasad. A calculus of broadcasting systems. In TAPSOFT'91, Volume 1: CAAP, April 1991. Springer Verlag LNCS 493.
[Pra93a] K. V. S. Prasad. Broadcasting with priority. Technical report, Department of Computer Science, Chalmers University of Technology, 1993.
[Pra93b] K. V. S. Prasad. Programming with broadcasts. Technical report, Department of Computer Science, Chalmers University of Technology, 1993.
[Sjo91] Peter Sjödin. From LOTOS specifications to distributed implementations. PhD thesis, Uppsala University, December 1991.
[Vaa91] Frits Vaandrager. On the relationship between process algebra and input/output automata. In 6th Annual Symposium on Logic in Computer Science, 1991.
[YLC90] Chang-Biau Yang, R. C. T. Lee, and Wen-Tsuen Chen. Parallel graph algorithms based upon broadcast communications. IEEE Transactions on Computers, 39(12):1468, December 1990.