WHY SOMETIMES PROBABILISTIC ALGORITHMS CAN BE MORE EFFECTIVE

Farid M. Ablaev
Department of Mathematics
Kazan State University
ul. Lenina 18
420000 Kazan, USSR

Rūsiņš Freivalds
Computing Center
Latvian State University
Blvd. Raiņa 29
226250 Riga, USSR

For several problems there exist probabilistic algorithms which are more effective than any deterministic algorithms solving these problems. For other problems probabilistic algorithms do not have such advantages. We are interested in understanding why it is so and how to tell one kind of problems from another.

Of course, we are not able to present a final answer to these questions. We have just chosen a sequence of examples that can increase our ability to make right predictions whether or not the advantages of probabilistic algorithms can be proved for a given problem.

Since probabilistic algorithms are widely used, the question arises naturally whether probabilistic algorithms can solve a problem such that no deterministic algorithm can solve it. The very first answer to such a question was found by de Leeuw et al. [20], and it was negative. They proved that under reasonable restrictions on the probabilistic parameters of the machine, probabilistic machines can compute only recursive functions, i.e. functions computable by deterministic machines. Later positive answers were found as well [28, 4, 7].

Probabilistic machines used in our paper are the same as their deterministic counterparts, but the probabilistic machines can in addition make use at each step of a generator of random numbers with a finite alphabet, yielding its output values with equal probabilities and in accordance with a Bernoulli distribution, i.e. independently of the values yielded at other instants.

We say that the probabilistic machine M recognizes a language L with probability p if M, when working on an arbitrary x, yields a 1 if x is in L with a probability no less than p, and yields a 0 if x is not in L with a probability no less than p. We say in the case p > 1/2 that L is recognized with an isolated cut-point.
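The isolation of the cut-point (p > 1/2) is exactly what makes error reduction possible: repeating the computation and taking a majority vote drives the error down exponentially in the number of runs. A minimal Python sketch of this standard amplification argument; the machine `noisy_membership` (deciding evenness with single-run correctness 0.7) is our own illustrative stand-in, not a construction from the paper:

```python
import random

def majority_vote(machine, x, runs, rng):
    # Run the probabilistic machine independently several times and output
    # the majority answer.  With single-run correctness p > 1/2, the
    # majority answer is wrong with probability exponentially small in runs.
    ones = sum(machine(x, rng) for _ in range(runs))
    return 1 if 2 * ones > runs else 0

def noisy_membership(x, rng, p=0.7):
    # Hypothetical probabilistic machine: decides "is x even?" and gives
    # the correct 0/1 answer with probability p = 0.7 on every run.
    correct = 1 if x % 2 == 0 else 0
    return correct if rng.random() < p else 1 - correct

rng = random.Random(0)
assert majority_vote(noisy_membership, 4, 1001, rng) == 1   # 4 is even
assert majority_vote(noisy_membership, 7, 1001, rng) == 0   # 7 is odd
```

With 1001 runs at p = 0.7 the majority errs with probability below 10^-40, so the assertions hold for all practical purposes regardless of the random seed.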

1. PROBLEMS ALLOWING MULTIPLE-VALUED RESULTS

The very first results on advantages of probabilistic algorithms over deterministic ones were proved for problems where the result is not determined uniquely but can be chosen from a certain set. Let us consider some examples.

S.V.Yablonskij [28] proved that a probabilistic Turing machine can produce an infinite sequence of Boolean functions such that: a) the running time to produce the n-th element of the sequence is polynomial, and b) the probability of the following event equals 1: all but a finite number of the elements in the sequence are Boolean functions with nearly maximal circuit complexity. (Different performances of the probabilistic machine produce different sequences of the Boolean functions.)

J.M.Barzdin [4] constructed a recursively enumerable set such that no deterministic Turing machine can enumerate an infinite subset of its complement but there is a probabilistic Turing machine which can enumerate with arbitrarily high probability 1-e infinite subsets in the complement of the given set. (Different performances of the probabilistic machine produce different subsets.)

A.V.Vaiser [27] proved that the function loglog|x| can be approximated by a probabilistic Turing machine in time |x| loglog|x|, i.e. in less time than any deterministic Turing machine can do this. R.Freivalds [10] improved this upper bound and showed that log|x| can be approximated by a probabilistic Turing machine in time |x| loglog|x|, the function loglog|x| can be approximated in time |x| logloglog|x|, etc. (Different performances of the probabilistic machine can produce different results, but with high probability they are nearly the same.)

R.Freivalds [13] considered probabilistic counterparts of the notion of m-reducibility of sets. Let A and B be subsets of N.

We say that A is m-reducible to B (A <=_m B) if there is a total recursive function f of one argument such that x in A <=> f(x) in B holds for all x in N. We say that A is probabilistically m-reducible to B (A <=_m-prob B) if there is a probabilistic Turing machine such that for arbitrary x and i it produces a result y so that the probability of the event (x in A <=> y in B) exceeds 1-1/i. (Different performances of the probabilistic machine may produce different y's for the same x and i.)

Let z be in {m, m-prob}. We say that a set L is z-complete if: 1) L is recursively enumerable, and 2) A <=_z L for an arbitrary recursively enumerable set A. Since A <=_m B implies A <=_m-prob B, every m-complete set is also m-prob-complete but not vice versa.

THEOREM 1.1. (R.Freivalds, [13]). There is an m-prob-complete set which is not m-complete.

For readers familiar with the notions and the standard notation of the recursive function theory (see [24]) we additionally note a new result by R.Freivalds: m-prob-complete sets are always tt-complete but they can fail to be btt-complete.

Putting the matter in a highly imprecise way, we claim: if your kind of problems is such that the result is not determined uniquely and it can be chosen from an infinite set, then there is a problem of your kind which can be solved by a probabilistic algorithm more easily than by any deterministic algorithm.

Of course, this claim is not a precise mathematical statement. Moreover, there are clear counterexamples, and we produce some of them below in this Section. Nevertheless we are sure that precise theorems can be proved which say nearly the same thing as our claim. Now the promised counterexamples.

An inductive inference machine as defined by E.M.Gold [19] is essentially any algorithmic device which attempts to infer rules from examples. In this model, a "rule" is any partial recursive function f, and an "example" is a pair (x, f(x)) for some x in the domain of f. A predictive explanation for the rule f is simply a program p which computes f. Thus the machine takes as input the values of some partial recursive function f, and attempts to output a program p which computes f, based on the examples it has seen. Note that if after seeing some finite number of examples the machine guesses the program p, the very next example might be inconsistent with p. For this reason, the inference is seen as an infinite process, which occurs "in the limit". It is required that the sequence of guesses of the machine converge to a single program computing f.

We say that the set of functions U is identifiable if there is an inductive inference machine which identifies correctly in the limit every function from the set U. We say that the set of functions U is identifiable with probability r if there is a probabilistic inductive inference machine which identifies correctly in the limit every function from the set U with probability at least r (different correct programs are allowed for the function at different performances of the probabilistic inductive inference machine).

THEOREM 1.2. (R.Freivalds, [12]). If U is a set of total recursive functions, and U is identifiable with probability r > 1/2, then U is identifiable by a deterministic inductive inference machine.

THEOREM 1.3. (L.Pitt, [22]). If U is a set of partial recursive functions, and U is identifiable with probability r > 2/3, then U is identifiable by a deterministic inductive inference machine.

THEOREM 1.4. (R.Freivalds, new result). (1) If n/(n+1) < r_1 < r_2 < (n+1)/(n+2) and U is identifiable with probability r_1, then U is identifiable with probability r_2 as well. (2) If r_1 < n/(n+1) < r_2, then there is a set of partial recursive functions identifiable with probability r_1 but not identifiable with probability r_2.

A result similar to Theorem 1.4 was proved by R.Freivalds for the so called finite identification of functions in [11].

2. THRESHOLD BOOLEAN OPERATIONS

The following theorem holds for all natural types of automata and machines (e.g., for multitape real time machines, 1-way or 2-way finite automata, 1-way or 2-way k-counter or multicounter machines, 1-way or 2-way pushdown machines, etc.).

THEOREM 2.1. (R.Freivalds, [8]). Let L_1 and L_2 be two languages recognizable by deterministic automata of a type W. Then there is a probabilistic automaton of the type W which recognizes the language L_1 ∩ L_2 with isolated cut-point.

Intersection in this Theorem can be replaced by union and some other Boolean set operations. What operations namely?

A Boolean function f(x_1, ..., x_n) is called threshold if there are real numbers a_1, ..., a_n, b such that

    f(x_1, ..., x_n) = 1, if a_1 x_1 + ... + a_n x_n > b;
    f(x_1, ..., x_n) = 0, otherwise.

A Boolean set operation is called threshold if it corresponds to a Boolean threshold function as intersection corresponds to conjunction and union corresponds to disjunction.
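As a concrete illustration of this definition: conjunction is threshold with weights (1, 1) and threshold 1.5, disjunction with the same weights and threshold 0.5, while XOR (corresponding to symmetric difference of sets) is the classical example of a Boolean function that no choice of real weights can represent. A short Python sketch of our own:

```python
def threshold_fn(weights, b):
    # Boolean threshold function: 1 iff a_1*x_1 + ... + a_n*x_n > b.
    def f(*xs):
        return 1 if sum(a * x for a, x in zip(weights, xs)) > b else 0
    return f

# intersection corresponds to conjunction: weights (1, 1), threshold 1.5
AND = threshold_fn([1, 1], 1.5)
# union corresponds to disjunction: weights (1, 1), threshold 0.5
OR = threshold_fn([1, 1], 0.5)

# truth tables over inputs (0,0), (0,1), (1,0), (1,1)
assert [AND(x, y) for x in (0, 1) for y in (0, 1)] == [0, 0, 0, 1]
assert [OR(x, y) for x in (0, 1) for y in (0, 1)] == [0, 1, 1, 1]
```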

THEOREM 2.2. (R.Freivalds, [16]). Let L_1, ..., L_n be languages recognizable by deterministic automata of a type W and f be a Boolean threshold set operation. Then there is a probabilistic automaton of the type W which recognizes the language f(L_1, ..., L_n) with isolated cut-point.

Of course, to make Theorems 2.1 and 2.2 precise mathematical statements we need a formal notion of the type of automata. Such a notion is introduced in [16]. For the sake of brevity we omit the definition here because the two proofs are merely combinatorial constructions valid for all the natural types. A more formal notion is needed to prove that only threshold set operations can be used in Theorem 2.2. [16] contains such a proof, but the theorem proved there has a rather unpleasant additional restriction. It is still an open problem to prove that only threshold set operations can be used in Theorem 2.2.

We shall see in the subsequent Sections more complicated cases when the language recognizable by the probabilistic automaton or machine is not representable by a Boolean set operation from a constant number of languages recognizable by deterministic automata. Surprisingly enough, in all these cases there is a similar though a bit more complicated representation. One can informally say that Boolean threshold operations from a growing number of languages are used. Unfortunately we are not able to make this assertion precise.

3. MORE SERIOUS ADVANTAGES OF PROBABILISTIC AUTOMATA

We consider multihead 1-way finite automata and 1-way real time multicounter automata in this Section.

We define the following sequence of languages. A word x in {0,1}* is in D_b if it is a palindrome, and it contains exactly 2b-1 occurrences of the symbol 1.

THEOREM 3.1. (R.Freivalds, [9]). (1) Given any integer b and any e > 0, there is a probabilistic 2-head 1-way finite automaton which accepts every x in D_b with probability 1, and rejects every x not in D_b with probability 1-e. (2) No deterministic b-head 1-way finite automaton can recognize D_b.

PROOF OF (1). Let n > b/e. We consider the following family of deterministic 2-head 1-way finite automata. They differ only in the value of a in {1, 2, ..., n}. At first, the head h_1 goes to the (b+1)-th block of zeroes. Then the heads are performing special steps alternately. If the automaton is processing a string

    0^{i_1} 1 0^{i_2} 1 ... 1 0^{i_b} 1 0^{j_b} 1 ... 1 0^{j_2} 1 0^{j_1},

then the heads are performing a^0 steps on every symbol of the blocks 0^{i_1} and 0^{j_1}, a steps on every symbol of 0^{i_2} and 0^{j_2}, ..., a^{b-1} steps on every symbol of the blocks 0^{i_b} and 0^{j_b}. The string is accepted if and only if

    1*i_1 + a*i_2 + a^2*i_3 + ... + a^{b-1}*i_b = a^{b-1}*j_b + ... + a^2*j_3 + a*j_2 + 1*j_1.

If a string is accepted by b distinct automata from this family, and these automata are characterized by values a_1, a_2, ..., a_b, then

    1*(i_1-j_1) + a_1*(i_2-j_2) + a_1^2*(i_3-j_3) + ... + a_1^{b-1}*(i_b-j_b) = 0,
    1*(i_1-j_1) + a_2*(i_2-j_2) + a_2^2*(i_3-j_3) + ... + a_2^{b-1}*(i_b-j_b) = 0,
    ...
    1*(i_1-j_1) + a_b*(i_2-j_2) + a_b^2*(i_3-j_3) + ... + a_b^{b-1}*(i_b-j_b) = 0.

The determinant of this system of equations is the Vandermonde determinant. For pairwise distinct a_1, a_2, ..., a_b it is not equal to 0. Hence the system has only the solution i_1-j_1 = ... = i_b-j_b = 0, and the string is in D_b. Thus if x is not in D_b then fewer than b distinct automata accept this string.
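The counting argument can be replayed numerically: for fixed block lengths each automaton of the family tests a polynomial identity in a, and a non-palindrome can fool at most b-1 of the n possible values of a, so a random choice of a errs with probability below (b-1)/n < e. A Python sketch of the acceptance test (the concrete block lengths are our own illustrative choice):

```python
def weighted_sum(blocks, a):
    # i_1 + a*i_2 + a^2*i_3 + ... + a^(b-1)*i_b
    return sum(length * a ** k for k, length in enumerate(blocks))

def accepts(i_blocks, j_blocks, a):
    # The automaton of the family with parameter a accepts iff the two
    # weighted sums of block lengths coincide.
    return weighted_sum(i_blocks, a) == weighted_sum(j_blocks, a)

b, n = 4, 100                 # n > b/e
i_blocks = [3, 1, 4, 1]       # lengths i_1, ..., i_b
palindromic = [3, 1, 4, 1]    # j_k = i_k for all k: the word is in D_b
broken = [3, 1, 4, 2]         # j_b differs: not a palindrome

# every automaton of the family accepts a word of D_b ...
assert all(accepts(i_blocks, palindromic, a) for a in range(1, n + 1))
# ... while at most b-1 of the n automata accept a non-palindrome
fooled = sum(accepts(i_blocks, broken, a) for a in range(1, n + 1))
assert fooled <= b - 1
```

The second assertion is the Vandermonde argument in miniature: the difference of the two weighted sums is a nonzero polynomial of degree at most b-1 in a, so it has at most b-1 roots.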

THEOREM 3.2. (R.Freivalds, [17]). There is a language E such that: 1) given any e > 0, there is a probabilistic 3-head 1-way finite automaton which accepts every x in E with probability 1, and rejects every x not in E with probability 1-e; 2) no deterministic multihead 1-way finite automaton can recognize E.

IDEA OF PROOF. The language E is a modification of languages in the family D_b from Theorem 3.1. Blocks Code(b), C(b, k, i_k) are defined, and it is demanded that strings are of the form Code(b) * C(b, 1, i_1) * C(b, 2, i_2) * ... * C(b, b, i_b) ** C(b, b, j_b) * ... * C(b, 2, j_2) * C(b, 1, j_1), where b is an arbitrary natural number, and i_1 = j_1, i_2 = j_2, ..., i_b = j_b.

OPEN PROBLEM. Can a probabilistic 2-head 1-way finite automaton recognize with isolated cut-point a language not recognizable by deterministic multihead 1-way finite automata?

The same language E can be used to prove that a probabilistic 3-counter 1-way automaton can recognize in real time with arbitrarily high probability 1-e a language not recognizable in real time by any deterministic multicounter 1-way automaton. The language E can be modified to prove

THEOREM 3.3. (R.Freivalds, [15]). There is a language F such that: 1) given any e > 0, there is a probabilistic 2-counter 1-way automaton which accepts every x in F with probability 1, and rejects every x not in F with probability 1-e; 2) no deterministic multicounter 1-way automaton can recognize F.

OPEN PROBLEM. Can a probabilistic 1-counter 1-way automaton recognize with isolated cut-point a language not recognizable by deterministic multicounter 1-way automata?

Why are probabilistic multicounter or multihead automata more powerful than their deterministic counterparts? A partial answer to this problem is given by the following theorems. F.M.Ablaev [2] proved several theorems on the number of equivalence classes of words modulo languages recognizable by deterministic, nondeterministic and probabilistic automata of rather general type. We formulate here only very much restricted cases of these theorems. Let L be a language, L a subset of X*.

For arbitrary n >= 1 we consider the equivalence v ~ v', which holds if vr and v'r are or are not in the language L simultaneously for all possible r in X^i, i <= n. Let f_L(n) denote the number of such equivalence classes.
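For small parameters f_L(n) can be counted by brute force, which makes the bounds in the theorems below tangible. A Python sketch of our own (the two sample languages and the prefix-length cut-off are illustrative assumptions, and the count is approximated over bounded-length words):

```python
from itertools import product

def f_L_approx(n, in_L, alphabet="01", max_prefix_len=6):
    # Count equivalence classes v ~ v' (vr in L iff v'r in L for all
    # suffixes r with |r| <= n), approximated over short prefixes v.
    suffixes = ["".join(t) for i in range(n + 1)
                for t in product(alphabet, repeat=i)]
    signatures = set()
    for plen in range(max_prefix_len + 1):
        for t in product(alphabet, repeat=plen):
            v = "".join(t)
            signatures.add(tuple(in_L(v + r) for r in suffixes))
    return len(signatures)

even_ones = lambda w: w.count("1") % 2 == 0   # regular: only 2 classes
palindromes = lambda w: w == w[::-1]          # many more classes

assert f_L_approx(2, even_ones) == 2
assert f_L_approx(2, palindromes) > 2
```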

THEOREM 3.4. (F.M.Ablaev, [2]). If a language L is recognizable in real time by a deterministic k-counter 1-way automaton then f_L(n) <= const * n^k.

THEOREM 3.5. (F.M.Ablaev, [2]). If a language L is recognizable in real time by a probabilistic k-counter 1-way automaton with isolated cut-point then f_L(n) <= (const)^n.

THEOREM 3.6. (F.M.Ablaev, [2]). If a language L is recognizable in real time by a nondeterministic k-counter 1-way automaton then f_L(n) <= (const)^{n^k}.

These theorems and examples of languages recognizable by probabilistic multicounter automata and having large functions f_L(n) help to prove advantages of one type of automata over another type. For instance, the following hierarchy theorems are proved.

THEOREM 3.7. (F.M.Ablaev, [2]). For arbitrary positive integer k there is a language L_k recognizable in real time by a probabilistic (k+1)-counter 1-way automaton with non-isolated cut-point but not recognizable in real time by a probabilistic k-counter 1-way automaton with non-isolated cut-point.

Counterparts of Theorems 3.4, 3.5, 3.6, 3.7 are proved also for multihead finite automata.

OPEN PROBLEM. Prove or disprove Theorem 3.7 for probabilistic multicounter automata with isolated cut-point.

OPEN PROBLEM. Prove or disprove Theorem 3.7 for probabilistic multihead finite automata with isolated cut-point.

4. PALINDROMES AND SIMILAR LANGUAGES

More complicated probabilistic algorithms are invented for recognition of palindromes. These algorithms can be modified to recognize several other languages of the same kind. Strangely enough, some languages very much similar to palindromes nevertheless stay unaffected: no effective probabilistic algorithms are known for recognition of these languages.

THEOREM 4.1. (R.Freivalds, [7, 10]). For arbitrary e > 0, there is a probabilistic off-line Turing machine recognizing palindromes with probability 1-e in const n log n time.

const n log n time. IDEA OF THE PROOF. The input word and its mirror image are compared modulo m where m is a specially chosen random number.

The number m contains

bits where n is the length of the input word and c is an absolute using the prime number theorem by Cebishev.

independence

constant

found by

The c log n random bits are generated

rather many times until the number m tuens out to be prime. to ensure statistical

c log n random

of different possible

(The primality

comparisons

is needed

modulo random

m).
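The fingerprinting idea in this proof can be sketched directly: read the word and its mirror image as binary numbers and compare them modulo a random prime of about c log n bits. A Python sketch of our own (the constant 4 below and the trial-division primality test stand in for the machine's actual use of its random bits):

```python
import random

def is_prime(m):
    # naive trial division; adequate for the small moduli of this sketch
    if m < 2:
        return False
    return all(m % d for d in range(2, int(m ** 0.5) + 1))

def random_prime(bits, rng):
    # draw `bits` random bits again and again until the number is prime
    while True:
        m = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_prime(m):
            return m

def fingerprint(word, m):
    # the 0/1 word read as a binary number, reduced modulo m
    v = 0
    for bit in word:
        v = (2 * v + int(bit)) % m
    return v

def probably_palindrome(word, trials=5, rng=None):
    # compare the word and its mirror image modulo several random primes
    # of roughly c*log n bits (the constant 4 below is an assumption)
    rng = rng or random.Random(0)
    bits = max(8, 4 * len(word).bit_length())
    for _ in range(trials):
        m = random_prime(bits, rng)
        if fingerprint(word, m) != fingerprint(word[::-1], m):
            return False
    return True

assert probably_palindrome("0110110")        # a palindrome always passes
assert not probably_palindrome("0010110")    # a non-palindrome is caught
```

A palindrome is accepted with certainty; a non-palindrome slips through a single comparison only if the random prime divides the (nonzero) difference of the two values, which happens with small probability, further reduced by repetition.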

A rather old result by J.Barzdin [3] shows that every deterministic off-line Turing machine recognizing palindromes needs at least const n^2 running time. For probabilistic off-line Turing machines Theorem 4.1 cannot be improved because of the following

THEOREM 4.2. (R.Freivalds, [7, 10]). Every probabilistic off-line Turing machine recognizing palindromes with isolated cut-point uses at least const n log n running time.

Essentially the same results as for palindromes can be obtained for the language {xbx} where x is a word and b is a special separating symbol.

Let M denote the language {xby} where x and y are words of the same length and it is required that y is a larger number, i.e. x = x(1)x(2)...x(n), y = y(1)y(2)...y(n), and there is a bit x(i) such that x(1) = y(1), x(2) = y(2), ..., x(i-1) = y(i-1), x(i) < y(i).

THEOREM 4.3. (R.Freivalds, [10]). For arbitrary e > 0, there is a probabilistic off-line Turing machine recognizing M with probability 1-e in const n (log n)^2 time.

It is not known whether this upper bound can be improved. Most probably, it can.

Let K denote the language {xby} where x and y are words of the same length and it is required that for every i, y(i) > x(i). No probabilistic off-line Turing machine is known to recognize K in o(n^2) time.

Let S denote the language {xby} where |x| = 2^{|y|}, and it is required that the y-th bit of the word x equals 1. We do not know a precise reference, but rumours have reached us about a conference paper by A.Yao with a lower bound for the running time for this language which exceeds the upper bound in Theorem 4.1. This result would show that every effective probabilistic algorithm must use some global characteristics of the words under comparison which are not reduced to a bit-to-bit comparison.

On the other hand, there is a type of machines for which probabilistic machines can have advantages over deterministic ones when recognizing languages of the type S. We consider 1-way finite automata and use the number of states as the measure of complexity.

Let S_n denote the language {xby} where |y| = n, |x| = 2^n, and it is required that the y-th bit of the word x equals 1.

THEOREM 4.4. (F.M.Ablaev, new result). (1) Every deterministic 1-way finite automaton recognizing S_n has at least 2^{2^n} states; (2) For arbitrary e > 0, there is a probabilistic 1-way finite automaton with o(2^{2^n}) states recognizing S_n with probability 1-e.

IDEA OF THE PROOF. The deterministic automaton has to remember all the word x. The probabilistic automaton remembers only a segment of the word x, where the length of the segment is determined by e and all possible start- and end-points are equiprobable.

Let T denote the language {0^n 1^n : n >= 1}.
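The trick behind Theorem 4.5 below is that a 2-way probabilistic automaton can compare the two block lengths of T by coin tossing: the event "every coin tossed on the 0-block came up heads" has probability 2^-n, on the 1-block probability 2^-m, and the two frequencies over many traversals agree only when n = m. A Monte Carlo sketch of this idea in Python (the round count, the acceptance threshold and the restriction to inputs of the form 0^a 1^b are our own simplifications; the actual automaton of [14] differs in detail):

```python
import random

def freivalds_decide(word, rounds=40000, rng=None):
    # word is assumed to be of the form 0^a 1^b; accept iff a appears to
    # equal b, judged from coin-tossing statistics alone.
    rng = rng or random.Random(0)
    zeros = word.count("0")
    ones = len(word) - zeros
    heads_0 = heads_1 = 0
    for _ in range(rounds):
        # one traversal: toss a fair coin on every symbol of each block
        if all(rng.random() < 0.5 for _ in range(zeros)):
            heads_0 += 1      # all coins on the 0-block came up heads
        if all(rng.random() < 0.5 for _ in range(ones)):
            heads_1 += 1      # all coins on the 1-block came up heads
    # the frequencies estimate 2^-zeros and 2^-ones; accept when they
    # agree up to a constant factor
    return min(heads_0, heads_1) * 5 >= max(heads_0, heads_1)

assert freivalds_decide("0000" + "1111")            # n = m: accept
assert not freivalds_decide("0000" + "1111111111")  # n != m: reject
```

With these parameters the expected counts differ by a large constant factor whenever the blocks differ in length, so the verdict is wrong only with negligible probability.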

THEOREM 4.5. (R.Freivalds, [14]). For arbitrary e > 0, there is a probabilistic 2-way finite automaton recognizing T with probability 1-e.

We note for the contrast that no deterministic, nondeterministic or alternating 2-way finite automaton can do this. Moreover, it is proved in [25] that deterministic 2-way Turing machines need at least const log n tape to recognize T.

A further development of the proof of Theorem 4.5 allows us to prove the following theorem.

We consider finite automata on binary trees. Such an automaton can move in 3 directions: up, left-hand-down, right-hand-down. The automaton can distinguish between the ordinary vertices in the tree and the extremal ones (the root, the leaves). The automaton starts at the root, moves up and down and finally stops either in an accepting or in a rejecting state.

THEOREM 4.6. (R.Freivalds, new result). For arbitrary e > 0, there is a probabilistic 3-way finite automaton on binary trees recognizing with probability 1-e whether or not the given tree is complete, i.e. whether all the leaves are equidistant from the root. No deterministic, nondeterministic or alternating automata can do this.

5. ω-WORDS

Infinite behaviour of automata and machines is usually regarded as too abstract to have something to do with the real life. On the other hand, there are algorithms which are intended never to stop, for instance, operating systems. Automata on ω-words are rather similar to corresponding automata on finite words, and the most important results are likewise similar for automata on ω-words and for automata on finite words. Nevertheless the properties of the two kinds of automata differ strikingly.

Deterministic, nondeterministic, alternating and probabilistic (with isolated cut-point) finite automata on finite words accept the same class of languages, the so called regular languages. This makes us believe that a similar result can be true for automata on ω-words. However, for ω-words probabilistic finite automata are more powerful than deterministic ones, and this advantage can be proved in a very strong form, namely, even in the case of probability 1 of the right result (so called Las Vegas algorithms).

A deterministic finite ω-automaton differs from a deterministic finite automaton on finite words only in the mechanism of acceptance. Instead of the set of accepting states it has the accepting set of sets of states. An ω-word is accepted by the automaton if the set of those states which are entered infinitely many times is in the accepting set of sets of states.

We say that the automaton accepts the ω-language L with probability p (p > 1/2) if the automaton accepts every ω-word in L with probability no less than p, and accepts every ω-word not in L with probability at most 1-p.

Consider the following probabilistic finite ω-automaton M with input alphabet {0, 1}, the set of states {q1, q2, q3, q4}, the initial state q1, the accepting set of sets of states {{q1, q3}, {q1, q2, q3}, {q2, q3}, {q1, q3, q4}, {q1, q2, q3, q4}, {q2, q3, q4}} (i.e. all the sets containing q3), and the following transition probabilities.

    input letter 0:
        from \ to   q1    q2    q3    q4
        q1          1/2   1/2   0     0
        q2          0     1     0     0
        q3          1/2   1/2   0     0
        q4          1/2   1/2   0     0

    input letter 1:
        from \ to   q1    q2    q3    q4
        q1          0     0     1     0
        q2          0     0     0     1
        q3          0     0     0     1
        q4          0     0     0     1

THEOREM 5.1. (R.Freivalds, D.Taimiņa, new result). There is an ω-language L such that: (1) the probabilistic automaton M accepts L with probability 1, (2) there is no deterministic finite ω-automaton which accepts L.

The proof of this theorem is based on a classical result by Borel and Cantelli. Let A_1, A_2, ... be an infinite sequence of independent Bernoulli events. Let a_k be the probability of A_k.

LEMMA 5.2. (Borel, Cantelli, [6]). If the sum Σ a_k converges then with probability 1 only a finite number of the events A_k take place. If the sum Σ a_k diverges then with probability 1 infinitely many events A_k take place.
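The dichotomy in the lemma is easy to observe numerically. A small Python simulation of our own (the particular sequences a_k = 2^-k and a_k = 1/k are chosen to exhibit the convergent and the divergent case):

```python
import random

def count_events(probs, rng):
    # one joint realization of the independent events A_k: how many occur
    return sum(rng.random() < p for p in probs)

rng = random.Random(1)
trials = 2000

# convergent case: sum of a_k = 2^-k is finite, so every run sees only a
# few events (with probability 1, only finitely many ever take place)
convergent = [2.0 ** -k for k in range(1, 40)]
conv_counts = [count_events(convergent, rng) for _ in range(trials)]
assert max(conv_counts) <= 10

# divergent case: sum of a_k = 1/k grows like log; among the first 1000
# events about H_1000 ~ 7.5 occur on average, and more as k grows
divergent = [1.0 / k for k in range(1, 1001)]
div_counts = [count_events(divergent, rng) for _ in range(trials)]
assert sum(div_counts) / trials > 4
```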

If the input ω-word has only a finite number of ones then the state q3 can be entered only a finite number of times and the ω-word is rejected. If the input ω-word has only a finite number of zeros then only the state q4 is repeated infinitely often and again the ω-word is rejected. For all the other possible ω-words let n_i denote the length of the i-th segment of zeros. If Σ 2^{-n_i} converges then the ω-word is rejected, and if the sum diverges then the ω-word is accepted. No deterministic, nondeterministic or alternating finite ω-automaton can make this distinction.

Theorem 5.1 exposes a distinction between automata on ω-words and automata on finite words. This theorem came rather unexpected since there were rather general results which were interpreted as an impossibility of such a theorem. When analyzing the proof of Rabin's theorem on regularity of languages recognizable by probabilistic finite automata [23], R.G.Bukharaev proved the following general reduction theorem for automata A = < Σ, S, δ(s, σ), s_0 > over a finite input alphabet Σ, with a (possibly infinite) set of states S, transition function δ: S × Σ → S, and the initial state s_0.

THEOREM 5.3. (R.G.Bukharaev, [5]). Let the automaton A have the properties:
1. S is a compact subset of a metric space with metric ρ;
2. the transition function δ is such that ρ(S_1', S_2') <= ρ(S_1, S_2), where S_i' = δ(S_i, σ), i = 1, 2;
3. R, a subset of S, is such that ∀(S_1, S_2) ((S_1 in R) & (S_2 not in R) → ρ(S_1, S_2) > ε).
Then the language {p in Σ* : δ(s_0, p) in R} is regular.

Rabin's theorem is a special case of this theorem.

The Rabin's theorem is a special case of this theorem. N.R.Nigmatullin

[21] and F.M.Ablaev

of Theorem 5.3 for &)-languages.

F.M.Ablaev's

[I] proved a bit different theorem provides

of the number of states of the minimal deterministic nizes the language. a probabilistic

Comparison

finite

counterparts

even an upper bound

finite automaton which recog-

of this theorem and Theorem 5.1 shows that there is

W-automaton

which is not a netric automaton

in the sense of

[1]. OPEN PROBLEM.

Does Turakainen's

theorem

[26] hold for V - a u t o m a t a ?

6. SINGLE-LETTER ALPHABET

A serious obstacle for advantages of probabilistic machines and automata is a restriction of the size of the input alphabet. It is very difficult to prove any advantages of probabilistic machines in the case of recognition of languages in a single-letter alphabet. On the other hand, until recently there were no results on such difficulties.

The single-letter alphabet is a serious obstacle for advantages of nondeterministic machines as well. There is a well-known result in the formal language theory showing that context-free languages in a single-letter alphabet are regular languages [18]. Hence nondeterministic 1-way pushdown machines are no more powerful than deterministic finite automata for single-letter alphabet languages. (For languages in an alphabet of at least 2 letters, pushdown machines are much more powerful than finite automata.)

THEOREM 6.1. (J.Kaņeps, new result). If a language in a single-letter alphabet is recognized by a probabilistic 1-way pushdown machine with an isolated cut-point then the language is regular.

This result can be mistakenly seen as trivial, but it is not. The probabilistic pushdown machines are very near to the ability to recognize a non-regular language. This can be shown by considering approximate computation of functions by probabilistic 1-way pushdown machines. We consider 1-way pushdown machines with output. The function is computed by transforming the input word in a single-letter alphabet into the output word in a single-letter alphabet, using the work tape (being a pushdown) in an arbitrary alphabet. If the machine is deterministic or nondeterministic then only linear functions can be computed.

THEOREM 6.2. (R.Freivalds, new result). There is a function f(n) of the order of magnitude log n which can be approximately computed by a probabilistic 1-way pushdown machine with output.

THEOREM 6.3. (J.Kaņeps, new result). If a language in a single-letter alphabet is recognized by a probabilistic 2-way finite automaton with an isolated cut-point then the language is regular.

This latter result contrasts nicely with Theorem 4.5 above.

REFERENCES

1. F.M.Ablaev, On the problem of reduction of automata, Izvestija VUZ. Matematika, 1980, No.3, 75-77 (Russian).
2. F.M.Ablaev, Capabilities of probabilistic machines to represent languages in real time, Izvestija VUZ. Matematika, 1985, No.7, 32-40 (Russian).
3. J.M.Barzdin, Complexity of recognition of palindromes by Turing machines, Problemy kibernetiki, v.15, Moscow, Nauka, 1965, 245-248 (Russian).
4. J.M.Barzdin, On computability by probabilistic machines, Doklady AN SSSR, 1969, v.189, No.4, 699-702 (Russian).
5. R.G.Bukharaev, Foundations of the theory of probabilistic automata, Moscow, Nauka, 1985.
6. W.Feller, An introduction to probability theory and its applications, v.1, New York et al., John Wiley, 1957.
7. R.Freivalds, Fast computation by probabilistic Turing machines, Ucenye Zapiski Latvijskogo Gosudarstvennogo Universiteta, 1975, v.233, 201-205 (Russian).
8. R.Freivalds, Probabilistic machines can use less running time, in: Information Processing '77, IFIP (North-Holland, 1977), 839-842.
9. R.Freivalds, Language recognition with high probability by various classes of automata, Doklady AN SSSR, 1978, v.239, No.1, 60-62 (Russian).
10. R.Freivalds, Speeding of recognition of languages by usage of random number generators, Problemy kibernetiki, v.36, Moscow, Nauka, 1979, 209-224 (Russian).
11. R.Freivalds, Finite identification of general recursive functions by probabilistic strategies, Proc. Conference FCT, 1979, 138-145.
12. R.Freivalds, On principal capabilities of probabilistic algorithms in inductive inference, Semiotika i informatika, 1979, No.12, 137-140 (Russian).
13. R.Freivalds, A probabilistic reducibility of sets, Proc. USSR Conference on Mathematical Logics, 1979, 137 (Russian).
14. R.Freivalds, Probabilistic two-way machines, Lecture Notes in Computer Science, Springer, 1981, v.118, 33-45.
15. R.Freivalds, Capabilities of various models of probabilistic one-way automata, Izvestija VUZ. Matematika, 1981, No.5, 26-34 (Russian).
16. R.Freivalds, Characterization of capabilities of the simplest method for proving the advantages of probabilistic automata over deterministic ones, Latvijskij matematiceskij ezegodnik, 1983, v.27, 241-251 (Russian).
17. R.Freivalds, Advantages of probabilistic 3-head finite automata over deterministic multi-head ones, Latvijskij matematiceskij ezegodnik, 1985, v.29, 155-163 (Russian).
18. A.V.Gladkij, Formal grammars and languages, Moscow, Nauka, 1973.
19. E.M.Gold, Language identification in the limit, Information and Control, 1967, v.10, 447-474.
20. K. de Leeuw, E.F.Moore, C.E.Shannon and N.Shapiro, Computability by probabilistic machines, Automata Studies, Princeton University Press, 1956, 183-212.
21. N.R.Nigmatullin, Towards the problem of reduction of ω-automata, Verojatnostnye metody i kibernetika, Kazan University Press, 1979, 61-67 (Russian).
22. L.Pitt, Probabilistic inductive inference, Yale University, YALEU/DCS/TR-400, June 1985.
23. M.O.Rabin, Probabilistic automata, Information and Control, 1963, v.6, No.3, 230-245.
24. H.Rogers Jr., Theory of recursive functions and effective computability, New York, McGraw-Hill, 1967.
25. R.E.Stearns, J.Hartmanis and P.M.Lewis II, Hierarchies of memory limited computation, Proc. IEEE Symposium on Switching Circuit Theory and Logical Design, 1965, 179-190.
26. P.Turakainen, On probabilistic automata and their generalizations, Suomalais. tiedeakat. toimituks., 1968, v.53.
27. A.V.Vaiser, Notes on complexity measures in probabilistic computations, in: Control systems, v.1, Tomsk University Press, 1975, 182-196 (Russian).
28. S.V.Yablonskij, On algorithmic difficulties in minimal circuit synthesis, Problemy kibernetiki, v.2, Moscow, Fizmatgiz, 75-121 (Russian).