Orthogonal Evolution and Anticipation


by Hans-Rudolf Thomann, October 2008

Author address

Im Altried 1h, 8051 ZΓΌrich, [email protected]

All rights reserved 1/32

CONTENTS

0  Introduction .......................................................................... 4
1  Orthogonal Evolution ................................................................. 6
   1.1  Spectrum ........................................................................ 6
   1.2  Fundamental Frequency Bound ..................................................... 7
   1.3  Representation of Anticipation Amplitudes ....................................... 8
2  Random Sampling ..................................................................... 10
   2.1  Sampling Scheme ................................................................ 10
   2.2  Distribution of the Spectral Difference ........................................ 12
   2.3  Anticipation Statistics ........................................................ 12
   2.4  Multiple Periods ............................................................... 13
3  Anticipation (Point Spectrum) ....................................................... 15
   3.1  Minimum Anticipation ........................................................... 15
   3.2  Maximum Anticipation ........................................................... 16
   3.3  General Case ................................................................... 16
4  Anticipation (Continuous Spectrum) .................................................. 19
   4.1  Minimum Anticipation ........................................................... 19
   4.2  Maximum Anticipation ........................................................... 19
   4.3  General Case ................................................................... 20
5  Conclusions ......................................................................... 23
   5.1  Summary ........................................................................ 23
   5.2  Statistical Premises ........................................................... 23
   5.3  Features and Limitations ....................................................... 23
   5.4  Applications ................................................................... 24
   5.5  Open Problems .................................................................. 24
6  Bibliography ........................................................................ 26
7  Appendix: Proofs of Proposition 9 ................................................... 27
   7.1  Mean and Variance of $P_\kappa$ ................................................ 27
   7.2  Mean and Variance of $P_B$ ..................................................... 30


Abstract

Quantum states evolving, for some $0 < p \le \infty$, into a sequence of mutually orthogonal states at times $t + \kappa\tau$ $(\kappa = 0, \ldots, p-1)$ exhibit an interesting physical effect. The analysis of the anticipation probabilities $P_\kappa = |\langle \psi_{t+\tau/2}, \psi_{t+\kappa\tau} \rangle|^2$ shows that for randomly chosen states $\psi$, measurements of the state at time $t + \frac{\tau}{2}$ reveal information about the states at times $t \pm \kappa\tau$ $(\kappa = 0, \ldots, p-1)$, anticipating future states and reflecting past states with significant probability. For $p < \infty$ and any $0 < \delta < 1$, the probability to measure at time $t + \frac{\tau}{2}$ a state from the set $\{\psi_{t+\kappa\tau} : |\kappa - \frac{p}{2}| < \frac{p}{2}\delta\}$ exceeds $s^2\delta$ with certainty as $p \to \infty$, where $s^2$ is a constant independent of the period. For $p = \infty$ and any $B$, the probability to measure at time $t + \frac{\tau}{2}$ a state from the set $\{\psi_{t \pm \kappa\tau} : \kappa > B\}$ is $\ge s^2$. We characterize the spectrum, establish an analog of Planck's relation, define a random sampling scheme, analyze the resulting distribution of the anticipation probabilities and point out applications.


0 Introduction

Under a Hamiltonian $H$, a quantum state $\psi$ evolves into an orbit $\psi_t = U_t \psi$, where $U_t = e^{-\frac{iH}{\hbar}t}$. Due to the unitarity of $U_t$, the amplitude $\hat{\varphi}(s) = \langle \psi_t, \psi_{t+s} \rangle = \int e^{-i\lambda s}\, d\mu(\lambda)$ is independent of $t$. If for some integer $0 < p \le \infty$, $\hat{\varphi}(n\tau) = \delta_{n \bmod p}$ $(n \in \mathbb{Z})$, then $\{\psi_{t+\kappa\tau} : \kappa = 0, \ldots, p-1\}$ is a set of mutually orthogonal states, for every $t$.

The subject of our study are states with this property, named orthogonal evolutions of period $p$ and step size $\tau$, together with the anticipation amplitudes $\alpha_\kappa = \hat{\varphi}\!\left(\left(\kappa - \frac{1}{2}\right)\tau\right)$, i.e. the inner products of the states at times $\kappa\tau$ and $\frac{\tau}{2}$, and the anticipation probabilities $P_\kappa = |\alpha_\kappa|^2$. It turns out that for randomly chosen states $\psi$, measurements of the state at time $\frac{\tau}{2}$ reveal information about the states at times $\pm\kappa\tau$ $(\kappa = 0, \ldots, p-1)$: future states are anticipated and past states reflected with significant probability. These probabilities and the strength of this effect, which we name quantum-mechanical anticipation, are quantified in this paper.

Chapter 1 shows that orthogonally evolving states have pure point or absolutely continuous spectrum $d\mu(\lambda)$, but neither mixed nor singular continuous spectrum. The fundamental upper bound $\hbar\tau^{-1} \le \frac{2}{\pi} \inf_{\lambda_0} \langle H - \lambda_0 \rangle$ in terms of the spectral quantity $\langle H \rangle$ is established, which is an analog of Planck's relation. The spectral difference is defined in terms of the reduced spectrum and used to represent the anticipation amplitudes.

In chapter 2 we justify, from the symmetries of the set of self-adjoint Hamiltonians, the random sampling scheme underlying our analysis, derive the statistical distribution of the spectral difference as $\psi$ is randomly sampled from all orthogonally evolving states with period $p$ and step size $\tau$, identify the statistical quantities used in the subsequent chapters to assess anticipation strength, and show that multiples of $p$ are unlikely to occur by chance.

In chapter 3 the anticipation effect is analyzed under point spectrum. The distribution of the $P_\kappa$ and of various observables is derived for some model states as well as under random sampling. Theorem 1 states that for any $0 < \delta < 1$, the probability to measure at time $\frac{\tau}{2}$ a state from the set $\{\varphi_{\kappa\tau} : |\kappa - \frac{p}{2}| < \frac{p}{2}\delta\}$ exceeds $s^2\delta$ with certainty as $p \to \infty$, where $s^2$ is a constant independent of the period. The operator $N_q$ (defined in section 2.3) has expectation $\Theta(p^q)$.

Chapter 4 determines anticipation under absolutely continuous spectrum. Theorem 2 states the constant lower bound $s^2$ for the probability to measure at time $\frac{\tau}{2}$ a state from the set $\{\varphi_{\pm\kappa\tau} : \kappa > B\}$, and infinite expectation for the operator $N_q$. The statistics in the continuous case are the limiting values of those in the periodic case as $p \to \infty$.

In the final chapter we draw conclusions, discuss some open questions and point out applications. Anticipation in general type quantum evolution will be analyzed in a future paper.


1 Orthogonal Evolution

After introducing some terminology, we characterize the spectrum of orthogonally evolving states, derive necessary and sufficient conditions for their existence, establish a fundamental bound on the frequency, and represent the anticipation amplitudes in terms of the reduced spectrum.

Definition 1
Let $H$ be a Hamiltonian, $\psi$ a state and $0 < p \le \infty$ an integer. Throughout the paper we will assume that $\|\psi\|_2 = 1$.
a) $\psi$ evolves under $H$ into $\psi_t = U_t \psi$, where $U_t = e^{-\frac{iH}{\hbar}t}$, and has spectral measure $d\mu(\lambda)$.
b) $\psi$ has orthogonal evolution of period $p$ and step size $\tau$ iff $\hat{\varphi}(n\tau) = \delta_{n \bmod p}$ $(n \in \mathbb{Z})$. Here $\hat{\varphi}(s) = \langle \psi_t, \psi_{t+s} \rangle = \int e^{-i\lambda s}\, d\mu(\lambda)$ is the Fourier transform of $d\mu(\lambda)$.
c) $\alpha_\kappa = \hat{\varphi}\!\left(\left(\kappa - \frac{1}{2}\right)\tau\right)$ is the anticipation amplitude of $d\mu$.
d) $P_\kappa = |\alpha_\kappa|^2$ is the anticipation probability of $d\mu$.
e) The reduction of $d\mu(\lambda)$ modulo $m$ is $d\mu_m(\lambda) = \sum_{\lambda' \equiv \lambda\ (\mathrm{mod}\ m)} d\mu(\lambda')$.
f) The $\kappa$-cut of $d\mu(\lambda)$ modulo $m$ is the function $c_{\kappa,m}(\lambda)$ defined by $d\mu(\kappa m + \lambda) = c_{\kappa,m}(\lambda)\, d\mu_m(\lambda)$ $(\kappa \in \mathbb{Z})$.
g) The standard scale is defined to make $\tau = 1$.

1.1 Spectrum

Orthogonal evolution constrains the spectrum to either pure point or absolutely continuous, as stated by the following proposition.

Proposition 1: Characterization of the Spectrum (Orthogonal Evolution Constraint)
Let $\psi$ have orthogonal evolution with period $p$ and step size $\tau$. In standard scale,
a) If $p < \infty$, then $\mu_{2\pi}\!\left(\left\{\frac{2\pi\kappa}{p}\right\}\right) = \frac{1}{p}$ $(\kappa = 0, \ldots, p-1)$: the spectrum is pure point.
b) If $p = \infty$, then $d\mu_{2\pi}(\lambda) = \frac{d\lambda}{2\pi}$ $(0 \le \lambda \le 2\pi)$: the spectrum is absolutely continuous.

b) If π = β, then πππ,2π π = 2π 0 β€ π β€ 2π is absolutely continuous. Proof: ππ»

The spectrum of π» and ππ are related by ππ‘ = π β β π‘ . ππ is a bounded operator with spectrum on the complex unit circle [RS80]. Its spectral measure is uniquely determined by π ππ π0 , π0 π β πΆ β β . These functions in turn are uniquely determined by the monomials π§ π , which due to mutual orthogonality satisfy πππ π0 , π0 = πππ , π0 = πΏπ πππ π . Thus the spectral measure of ππ is equally distributed, in the periodic case over the unit roots, in the infinite case over the whole complex unit circle. Singular continuous spectrum is excluded. Mixed spectrum cannot occur, as in the periodic case the initial state must recur, whereas in the aperiodic case πππ^ π‘ β π 1 (see [YL96]). The numerical values (in standard scale) follow from

2π 0

π βπππ πππ,2π π = πΏπ πππ

definition 1b) and the normalization of π. The integral becomes π 2π 0

ππ π βπππ 2π

β1

π πβ1 β2ππππ π π=0 π

(see and

for π < β and π = β, respectively, which is easily verified to yield the correct

values.

q.e.d. All rights reserved 6/32
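As a quick numerical illustration of proposition 1a) and definition 1b) (a sketch, not part of the paper): a state supported with equal weights $\frac{1}{p}$ on $p$ equidistant eigenvalues $E_j = j$ (with $\hbar = 1$) has $|\hat{\varphi}(n\tau)| = \delta_{n \bmod p}$ for $\tau = \frac{2\pi}{p}$, i.e. orthogonal evolution of period $p$.

```python
import cmath

def autocorrelation(eigenvalues, weights, t):
    """phi_hat(t) = sum_j w_j * exp(-i * E_j * t)   (hbar = 1, Definition 1b)."""
    return sum(w * cmath.exp(-1j * E * t) for E, w in zip(eigenvalues, weights))

p = 5
tau = 2 * cmath.pi / p            # step size matching the eigenvalues E_j = j
E = list(range(p))                # p equidistant eigenvalues
w = [1.0 / p] * p                 # equal spectral weights (Proposition 1a)

# |phi_hat(n * tau)| should equal delta_{n mod p}: orthogonal evolution of period p
vals = [abs(autocorrelation(E, w, n * tau)) for n in range(2 * p)]
```

The same construction proves the "sufficient" direction of the existence statement below for $p < \infty$.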

A more elaborate proof, facing the various intricacies, and the explicit determination of the Fourier expansion of a continuous function $f$ on $[0, 2\pi]$ such that $d\mu_{2\pi}(\lambda) = D^2 f$ will be given in a future paper.

Proposition 1 has obvious consequences for the existence of orthogonal evolution.

Corollary 1: Existence
Necessary and sufficient for the existence of a state $\psi$ evolving orthogonally with period $p$ under Hamiltonian $H$ is that either $p < \infty$ and the reduced point spectrum of $H$ contains $p$ equidistant points (such as for the number operator), or $p = \infty$ and $H$ has non-empty absolutely continuous spectrum (such as the $-\Delta$ operator).

Proof: In either case states meeting proposition 1 are easily constructed.

1.2 Fundamental Frequency Bound

In terms of the spectral quantity $\langle H \rangle$ we derive here the fundamental frequency bound $\hbar\tau^{-1} \le \frac{2}{\pi}\langle H \rangle$, an analog of Planck's $E = h\nu$.

Definition 2
For any Hamiltonian $H$ and state $\psi$, $\langle H - \lambda_0 \rangle \equiv \int |\lambda - \lambda_0|\, d\mu(\lambda)$, and $\langle H \rangle \equiv \langle H - 0 \rangle$.

Proposition 2
Let $\psi$ be any state evolving under Hamiltonian $H$. Then
a) $\langle H \rangle < \infty$ iff both $\int_0^\infty \lambda\, d\mu(\lambda)$ and $-\int_{-\infty}^0 \lambda\, d\mu(\lambda)$ are finite.
b) $\langle H \rangle > 0$ iff $\psi \ne 0$ and $\psi$ is not an eigenstate with zero eigenvalue.
c) $|\langle \psi, \psi_t \rangle| \ge \mathrm{Re}\, \langle \psi, \psi_t \rangle \ge 1 - \langle H \rangle\, t/\hbar$.
d) If $\hat{\varphi}(T) = 0$, then $T \ge \sup_{\lambda_0} \hbar / \langle H - \lambda_0 \rangle$.
e) The supremum is attained when $\lambda_0$ equals the median of $\mu$, i.e. $\mu([\lambda_0, \infty)) = \mu((-\infty, \lambda_0])$.
f) $\inf_{\lambda_0} \langle H - \lambda_0 \rangle = 0$ iff $\psi$ is an eigenstate.

Proof:
a) By elementary measure theory, $\langle H \rangle = \int_0^\infty \lambda\, d\mu(\lambda) - \int_{-\infty}^0 \lambda\, d\mu(\lambda)$ exists iff both parts are finite.
b) As $\langle H \rangle$ is a sum of two non-negative quantities, it vanishes only if $\mu = 0$ or $\mathrm{supp}\,\mu = \{0\}$.
c) By the Schrödinger equation and the triangle inequality,
$$\frac{d}{dt}\|\psi_t - \psi\|_2^2 = -\left[\left\langle \tfrac{-iH}{\hbar}\psi_t, \psi \right\rangle + \left\langle \psi, \tfrac{-iH}{\hbar}\psi_t \right\rangle\right] \le \frac{2}{\hbar}\left|\int \lambda\, e^{-i\lambda t/\hbar}\, d\mu(\lambda)\right| \le \frac{2}{\hbar}\langle H \rangle.$$
Therefore $\|\psi_t - \psi\|_2^2 \le \frac{2}{\hbar}\langle H \rangle\, t$. The result follows from $\mathrm{Re}\,\langle \psi, \psi_t \rangle = 1 - \frac{1}{2}\|\psi_t - \psi\|_2^2$.
d) By orthogonality, $\|\psi_T - \psi\|_2^2 = 2$, whence c) yields $T \ge \hbar/\langle H \rangle$. As $|\hat{\varphi}|$ is the same for $H - \lambda_0$, the claimed result follows.


e) From
$$\frac{d}{d\lambda_0}\langle H - \lambda_0 \rangle = \frac{d}{d\lambda_0}\left[\int_{\lambda_0}^{\infty} (\lambda - \lambda_0)\, d\mu(\lambda) + \int_{-\infty}^{\lambda_0} (\lambda_0 - \lambda)\, d\mu(\lambda)\right] = \mu((-\infty, \lambda_0)) - \mu((\lambda_0, \infty)) = 0$$
at the median.
f) Elementary.
q.e.d.

Proposition 2c) yields a lower bound on the modulus of the anticipation amplitude at $t = \frac{\tau}{2}$. 2d) shows that at given $\langle H \rangle$ the frequency of the orthogonal evolution cannot be arbitrarily high, or equivalently the step size arbitrarily small. A bound for the passage time based on $\langle H^2 \rangle$ has been established by Fleming [Fl73] and discussed by various authors. The bound given in 2d) for the passage time can be strengthened to a bound for the step size of orthogonal evolution:

Corollary 2: Fundamental Frequency Bound
The frequency of an orthogonal evolution is bounded by
$$\hbar\tau^{-1} \le \frac{2}{\pi}\langle H \rangle.$$
Proof: Proposition 2d) and 2e) imply $\hbar\tau^{-1} \le \langle H - \lambda_0 \rangle$ at the median $\lambda_0$. We conclude from proposition 1 that among all Hamiltonians providing orthogonal evolution those with spectrum evenly distributed in $[-\pi, \pi)$ minimize $\langle H \rangle$, which is easily found to be $\frac{\pi}{2}$. q.e.d.
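The value $\frac{\pi}{2}$ appearing in the proof of corollary 2 can be checked numerically (a sketch under the stated assumptions: standard scale, spectrum evenly distributed over $[-\pi, \pi)$):

```python
import math

def mean_abs_energy(p):
    """<H> = (1/p) * sum |lambda_k| over p evenly spaced spectral points in [-pi, pi)."""
    pts = [-math.pi + 2 * math.pi * (k + 0.5) / p for k in range(p)]
    return sum(abs(x) for x in pts) / p

approx = mean_abs_energy(10000)   # converges to pi/2 as p grows
```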

1.3 Representation of Anticipation Amplitudes

From now on all spectra are in standard scale. In proposition 1 we have determined the reduced spectrum $d\mu_{2\pi}(\lambda)$. Using a notation for the reduced spectrum on the doubled interval $[0, 4\pi)$, we establish a representation of $\alpha_\kappa$ which we use throughout the paper.

Definition 3: Spectral difference
a) Let $p < \infty$. Then $x_{j,i} = \mu_{4\pi}\!\left(\left\{\frac{2\pi j}{p} + 2\pi i\right\}\right)$ $(i = 0, 1;\ j = 0, \ldots, p-1)$. $y_j = x_{j,0} - x_{j,1}$ is the spectral difference.
b) Let $p = \infty$. Then $x_{\lambda,i}$ is defined by $x_{\lambda,i}\, d\lambda = d\mu_{4\pi}(2\pi i + \lambda)$ $(i = 0, 1;\ 0 \le \lambda \le 2\pi)$. $y_\lambda = x_{\lambda,0} - x_{\lambda,1}$ is the spectral difference.

Proposition 3: Representation of anticipation amplitudes
a) If $p < \infty$, then $0 \le x_{j,i} \le \frac{1}{p}$, $-\frac{1}{p} \le y_j \le \frac{1}{p}$, and $x_{j,0} + x_{j,1} = \frac{1}{p}$.
b) If $p = \infty$, then $0 \le x_{\lambda,i} \le \frac{1}{2\pi}$, $-\frac{1}{2\pi} \le y_\lambda \le \frac{1}{2\pi}$, and $x_{\lambda,0} + x_{\lambda,1} = \frac{1}{2\pi}$.
c) If $p < \infty$, then $\alpha_\kappa = \sum_{j=0}^{p-1} y_j\, e^{-2\pi i\, j(\kappa - \frac{1}{2})/p}$.
d) If $p = \infty$, then $\alpha_\kappa = \int_0^{2\pi} y_\lambda\, e^{-i\lambda(\kappa - \frac{1}{2})}\, d\lambda$.

Proof:
a) From proposition 1a) and definition 3a).
b) From proposition 1b) and definition 3b).
c) By definition 1c) and 3a),
$$\alpha_\kappa = \sum_{j=0}^{p-1} \left( x_{j,0}\, e^{-2\pi i\, j(\kappa - \frac{1}{2})/p} + x_{j,1}\, e^{-i\left(\frac{2\pi j}{p} + 2\pi\right)(\kappa - \frac{1}{2})} \right) = \sum_{j=0}^{p-1} (x_{j,0} - x_{j,1})\, e^{-2\pi i\, j(\kappa - \frac{1}{2})/p},$$
since $e^{-2\pi i(\kappa - \frac{1}{2})} = -1$. The claim follows from definition 3a).
d) By definition 1c) and 3b),
$$\alpha_\kappa = \int_0^{2\pi} \left( x_{\lambda,0} + x_{\lambda,1}\, e^{-2\pi i(\kappa - \frac{1}{2})} \right) e^{-i\lambda(\kappa - \frac{1}{2})}\, d\lambda = \int_0^{2\pi} (x_{\lambda,0} - x_{\lambda,1})\, e^{-i\lambda(\kappa - \frac{1}{2})}\, d\lambda.$$
The claim follows from definition 3b).
q.e.d.
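Proposition 3c) is a finite Fourier sum and can be evaluated directly. The following sketch (not part of the paper) computes the anticipation probabilities for an arbitrary admissible spectral difference; the Parseval-type identity $\sum_\kappa P_\kappa = p \sum_j y_j^2$ used in the assertion is an easy consequence of 3c).

```python
import cmath

def anticipation_probabilities(y):
    """P_k = |alpha_k|^2 with alpha_k = sum_m y_m * exp(-2*pi*i*m*(k - 1/2)/p)  (Prop. 3c)."""
    p = len(y)
    return [abs(sum(y[m] * cmath.exp(-2j * cmath.pi * m * (k - 0.5) / p)
                    for m in range(p))) ** 2 for k in range(p)]

p = 6
y = [((-1) ** m) * 0.4 / p for m in range(p)]   # any sequence with |y_j| <= 1/p (Prop. 3a)
P = anticipation_probabilities(y)
P_tot = sum(P)                                   # equals p * sum(y_j^2)
```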


2 Random Sampling

Corollary 1 implies that orthogonally evolving states exist under infinitely many dynamics. In the remaining chapters we investigate how likely such states are to exhibit anticipation of significant strength. To this end, we define in this chapter a random sampling scheme which will underlie all our subsequent analyses. We justify this scheme by the symmetry properties of the set of self-adjoint Hamiltonians. Under this scheme we derive the statistical distribution of the spectral difference, which determines the anticipation amplitudes. We define the set of statistical quantities used in the subsequent chapters to assess anticipation strength, and finally show that multiples of the period are unlikely to occur by chance.

2.1 Sampling Scheme

In order to investigate the likelihood of physical states to exhibit anticipation of a certain strength we must consider the set of all orthogonally evolving states and the way such states arise. We focus here on a laboratory situation, where the experimentalist prepares an experimental setup and an orthogonally evolving target, and then studies its evolution under the laws of nature. The setup is modeled by a Hamiltonian and the target by a physical state. What Hamiltonians and states are preparable depends on the experimentalist's skills and capabilities and the constituents available in nature. Abstracting from these practical limitations, we only constrain the experimentalist to prepare self-adjoint Hamiltonians and states exhibiting orthogonal evolution, and analyze the outcome if these are randomly chosen, letting all degrees of freedom fluctuate freely except those fixed by the orthogonal evolution constraint (proposition 1). We are going to define a sampling scheme where, for a given step size and period, first a self-adjoint Hamiltonian is chosen from the set of all self-adjoint Hamiltonians admitting orthogonal evolution, and then a state is chosen from the set of states orthogonally evolving under that Hamiltonian.

Proposition 3 shows that orthogonal evolution and anticipation are completely determined by the spectral properties of the Hamiltonians and states involved, more specifically by the spectral difference. Thus, for our purpose, only constraints on and degrees of freedom of spectra need consideration. Inverse theory tells us that, for any bounded real measure $\mu$ and any positive integer $n$, there are uncountably many self-adjoint operators $H$ and states $\psi \in L^2(\mathbb{R}^n)$ with spectral measure $\mu$. This holds as well for at least one Schrödinger operator $-\Delta + V$ with self-adjoint extension $H$. This enables us to define the sampling scheme in terms of probability spaces of spectral measures.

The core feature of spectral measures of orthogonally evolving states are the $\kappa$-cuts modulo $2\pi$ (see definition 1f), related to the spectral measure $\mu$ by $d\mu(2\pi\kappa + \lambda) = c_{\kappa,2\pi}(\lambda)\, d\mu_{2\pi}(\lambda)$. The quantities $x_{j,i}$ from definition 3 are $x_{j,i} = \sum_{\kappa \equiv i\ (\mathrm{mod}\ 2)} c_{\kappa,2\pi}\!\left(\frac{2\pi j}{p}\right) \mu_{2\pi}\!\left(\left\{\frac{2\pi j}{p}\right\}\right)$. Whereas the reduced spectral measure is fixed by the orthogonal evolution constraint, the $\kappa$-cuts are arbitrary doubly-infinite sequences with non-negative values, summing up to 1. Let us first consider their supports $\mathrm{supp}\, c_{\kappa,2\pi}$, i.e. the patterns of points $\kappa \in \mathbb{Z}$ where $c_{\kappa,2\pi}(\lambda) \ne 0$. This pattern is a property of the Hamiltonian, whereas the size of $c_{\kappa,2\pi}(\lambda)$ is determined by the state.


Self-adjointness allows for any such pattern at any index $\kappa$, and for choosing patterns at different indices $\kappa$ independently. This holds for point spectrum as well as for continuous spectrum, with the only constraint that for fixed $\kappa$ the dependence on $\lambda$ must be absolutely continuous. While the following arguments will be given for point spectrum, they are easily extended to continuous spectrum as the limiting case. This symmetry of the set of self-adjoint Hamiltonians allows us to require the sampling scheme to make the $\mathrm{supp}\, c_{\kappa,2\pi}$ a family of independent identically distributed random variables. As these symmetries hold as well when constraining the Hamiltonians to admit orthogonal evolution, we demand the same property in this case.

Given any Hamiltonian, the value of $c_{\kappa,2\pi}(\lambda)$ on $\mathrm{supp}\, c_{\kappa,2\pi}$ depends only on the state $\psi$. When randomly sampling states $\psi \in L^2(\mathbb{R}^n)$ meeting the orthogonal evolution constraint, $c_{\kappa,2\pi}$ can attain any non-negative function $f \in L^1(\mathrm{supp}\, c_{\kappa,2\pi})$ meeting this constraint, independently for any index $\kappa$, because orthogonal components of these states belonging to different indices can be sampled separately in the discrete case. As by the preceding requirement the supports are i.i.d., we impose the same requirement on the sampling distribution of the $\kappa$-cuts. As all distributions as well as the reduced spectrum are index-independent, we will omit indices in the following.

Let $D_p$ be the sampling distribution of the $\kappa$-cuts $c_{\kappa,2\pi}$ at period $p$. Notice that $D_p$ is not obtained by conditioning from another, unconditional distribution. The orthogonal evolution constraint is met by normalization of the $\kappa$-cuts, not by conditioning of some distribution. Practical sampling procedures will be designed to meet the constraint approximately, but not by selection of "good" states from unconstrained samples. Only the latter procedure would yield a conditional distribution, whereas the real procedure is sufficiently modeled by a procedure where orthogonal components are sampled and normalized independently (index by index).

The period $p$ only determines the orthogonal evolution constraint. As this constraint is met by normalization, it only enters into $D_p$ as a scaling factor: $D_p(c_{\kappa,2\pi}) = D_1(p\, c_{\kappa,2\pi})$ for any $p$, $\kappa$ and $c_{\kappa,2\pi}$. As our reasoning did not otherwise depend on the period $p$, we finally postulate the sampling distribution of $\kappa$-cuts to be independent of $p$, up to the orthogonal evolution constraint and the scaling factor. Our requirements are summarized in the following definition.

Definition 4: Random Samples
A family of sets $S_p$ of measures $\mu$ is a family of random samples of measures of orthogonally evolving states with period $p \le \infty$ iff the following holds for all $p$:
a) The reduced spectrum $\mu_{2\pi}$ meets proposition 1 above.
b) For $p < \infty$, the $\kappa$-cut-valued random variables $c_\kappa$ are i.i.d. under a probability measure $D_p(c) = D_1(pc)$ independent of $\kappa$.
c) For $p = \infty$, the $\kappa$-cut-valued random variables $c_\kappa$ are i.i.d. under a probability measure $D_\infty(c) = D_1(2\pi c)$ independent of $\kappa$.


From the preceding arguments and definition 4 follows:

Corollary 3: Existence
The class of self-adjoint Hamiltonians admits sampling schemes yielding random samples as defined above.

Further evidence justifying our sampling scheme will be provided by the main results and discussed in the final chapter.

2.2 Distribution of the Spectral Difference

Random sampling according to the above scheme yields a probability distribution of the spectral difference function $y_j$. It is no surprise that these are again i.i.d.

Proposition 4
Consider the sampling distribution as $\psi$ is randomly sampled under the sampling scheme of definition 4. Then for some distribution function $F(z)$ on $[-1, +1]$ the following holds:
a) If $p < \infty$, then $y_j$ $(j = 0, \ldots, p-1)$ is a sequence of i.i.d. random variables with distribution $F(py)$ and density $p f(py)$ for $-\frac{1}{p} \le y \le \frac{1}{p}$.
b) If $p = \infty$, then $y_\lambda$ $(0 < \lambda \le 2\pi)$ is a family of i.i.d. random variables with distribution $F(2\pi y)$ and density $2\pi f(2\pi y)$ for $-\frac{1}{2\pi} \le y \le \frac{1}{2\pi}$.
c) If the random sampling scheme is invariant under translations of the energy scale, then the density $f$ is an even function.

Proof:
a) For $p < \infty$ the distribution of the $x_{j,i}$, and hence of $y_j$, is induced by $D_p$ via $x_{j,i} = \sum_{\kappa \equiv i\ (\mathrm{mod}\ 2)} c_{\kappa,2\pi}\!\left(\frac{2\pi j}{p}\right) \mu_{2\pi}\!\left(\left\{\frac{2\pi j}{p}\right\}\right)$, from which $F$ is obtained. The scaling is due to proposition 1. The i.i.d. property and the period dependence are direct consequences of definition 4.
b) For $p = \infty$ the distribution of the $x_{\lambda,i}$ is induced analogously, from which $F$ is obtained. The scaling is due to proposition 1. The i.i.d. property and the period dependence are direct consequences of definition 4.
c) Translations of the energy scale by $2\pi n$ effect a shift by $-n$ on the $\kappa$-cuts, transforming them to $c_{\kappa - n, 2\pi}(\lambda)$. If $n$ is odd, the sign of the spectral difference is toggled.
q.e.d.

2.3 Anticipation Statistics

We identify here the statistical quantities used to assess anticipation strength in the subsequent chapters, and recall some statistical terminology and basic facts. The following anticipation statistics will be analyzed in the subsequent chapters, with the folded index $\tilde{\kappa}$ as by definitions 5a and 5b below.

a) $P_\kappa$, the probability to measure $\varphi_{\kappa\tau}$ at time $\frac{\tau}{2}$, and $P_B \equiv \sum_{\tilde{\kappa} > B} P_\kappa$.
b) $P_{tot} = \sum_\kappa P_\kappa$, the probability to measure one of the states $\varphi_{\kappa\tau}$ at time $\frac{\tau}{2}$.
c) The observable $N_q$ $(q \ge 0)$, with value $\tilde{\kappa}^q$ in state $\varphi_{\kappa\tau}$ and expected value $\langle N_q \rangle \equiv \sum_\kappa P_\kappa\, \tilde{\kappa}^q$.

We perform the analysis for minimum and maximum anticipation model states as well as under biased and unbiased random sampling according to definition 4 above. In the sampling scenarios, we provide the sample mean and variance as well as stochastic asymptotics of the probabilities. The following definitions will be used:

Definition 5
a) If $p < \infty$, then $\tilde{\kappa} = \kappa \bmod p$ if $\kappa \bmod p \le \frac{p+1}{2}$, and $\tilde{\kappa} = (p + 1) - (\kappa \bmod p)$ otherwise.
b) If $p = \infty$, then $\tilde{\kappa} = \kappa$.
c) $m_q = E(z^q)$ $(q = 0, 1, \ldots)$ is the $q$-th moment.
d) $s^2(z) = E((z - m_1)^2)$ is the variance.
e) $z \in O_P(1)$ iff $\forall \varepsilon > 0\ \exists M: \limsup P(|z| > M) < \varepsilon$.
f) $z \in o_P(1)$ iff $\forall \varepsilon > 0: \lim P(|z| > \varepsilon) = 0$.

$O_P$ and $o_P$ are Pratt's stochastic asymptotics [Pr59], the former expressing convergence in distribution, or stochastic boundedness, the latter stochastic convergence to zero [BFH75]. Recall the following facts:

Proposition 5
a) If the probability density is an even function, then $m_{2q+1} = 0$.
b) $E((cz)^q) = c^q m_q$.
c) $s^2 = m_2 - m_1^2$.
d) $P(|z - E(z)| > h) \le E(|z - E(z)|^q)/h^q$ $(q > 0,\ h > 0)$.
e) $P(|z - \bar{z}| > h) < s^2/h^2$, where $\bar{z} = m_1$.
f) $z = \bar{z} + O_P(s)$.
g) If $s \in o(\bar{z})$, then $z/\bar{z} = 1 + o_P(1)$.

Proof: Parts a)-c) are elementary. d) is the Bienaymé-Chebyshev inequality [TJ71], e) its special case $q = 2$, f) and g) direct consequences of e) and definitions 5e-f).

2.4 Multiple Periods

Random sampling of orthogonally evolving states with period $p$ does admit states with period $np$ for some $n > 1$. Such states have $y_j = 0$ $\forall j$, thus $\alpha_\kappa = 0$. However, they are not likely to occur by chance in random samples.


Proposition 6
Let $\psi$ be a state with orthogonal evolution of period $p$.
a) $p < \infty$. For any $\varepsilon > 0$, let $n$ be the cardinality of the set $\{j : |y_j| < \varepsilon,\ 0 \le j < p\}$. Then $n$ has the binomial distribution with parameters $p$ and $P(|y_j| < \varepsilon)$.
b) $p = \infty$. For any $\varepsilon > 0$, the set $\{\lambda : |y_\lambda| < \varepsilon,\ 0 < \lambda \le 2\pi\}$ has Lebesgue measure $2\pi\, P(|y_\lambda| < \varepsilon)$.

Proof:
a) Immediately from proposition 4.
b) From proposition 4, as the variables are i.i.d.
q.e.d.

This result rules out the occurrence by chance of period $np$ or, equivalently, step size $\frac{\tau}{n}$, when randomly sampling states with period $p$ and step size $\tau$. However it does not guarantee strong anticipation. This will be provided by the proof of constant probabilistic lower bounds on $P_\kappa$ in the next two chapters.


3 Anticipation (Point Spectrum)

Here we determine, for model and random states with $p < \infty$, the values and distributions of the anticipation statistics identified and defined in section 2.3, and prove theorem 1.

3.1 Minimum Anticipation

We consider here the extremal case of constant $y_j$. Notice that these model states have zero occurrence probability.

Proposition 7
Let $p y_j = y$ $(j = 0, \ldots, p-1)$ for some $-1 \le y \le 1$. Then
a) $P_\kappa = \dfrac{y^2}{p^2 \sin^2\!\left(\frac{\pi(\kappa - \frac{1}{2})}{p}\right)}$, with maximum value $\approx \frac{4y^2}{\pi^2}$ at $\kappa = 0$ and $\kappa = 1$, and minimum value $\approx \frac{y^2}{p^2}$ at $\kappa = \frac{p}{2}$.
b) $P_B \in O(B^{-1})$ $\left(B \le \frac{p}{2}\right)$.
c) $P_{tot} = y^2$.
d) $\langle N_1 \rangle = O(\ln p)$, $\langle N_q \rangle = O(p^{q-1})$ $(q > 1)$.

Proof:
a) By proposition 3c) and the premise,
$$\alpha_\kappa = \sum_{j=0}^{p-1} \frac{y}{p}\, e^{-2\pi i\, j(\kappa - \frac{1}{2})/p} = \frac{y}{p} \cdot \frac{1 - e^{-2\pi i(\kappa - \frac{1}{2})}}{1 - e^{-2\pi i(\kappa - \frac{1}{2})/p}} = \frac{2y/p}{1 - e^{-2\pi i(\kappa - \frac{1}{2})/p}},$$
whence $|\alpha_\kappa| = \dfrac{|y|}{p \left|\sin\!\left(\frac{\pi(\kappa - \frac{1}{2})}{p}\right)\right|}$. The maxima, minima and asymptotics are elementary.
b) From a) by the Euler summation formula.
c) From Lemma 1 below.
d) From a) by the Euler summation formula.
q.e.d.

Lemma 1
$$\sum_{\kappa=0}^{p-1} p^{-2} \sin^{-2}\!\left(\frac{\pi\left(\kappa - \frac{1}{2}\right)}{p}\right) = 1 \qquad (p \ge 1).$$
Proof: Let $\psi$ be a superposition of exactly $p$ eigenfunctions. Then $\mathrm{span}\{\psi_t : t \in \mathbb{R}\} = \mathrm{span}\{\varphi_{\kappa\tau} : \kappa \in \mathbb{Z}\}$. Therefore $P_{tot} = \|\psi\|_2^2 = 1$. As in this case $y = p y_j = 1$, the claimed result follows. Alternatively, the direct summation is easily achieved by adapting theorem 4.9a of [He88] to periodic functions.
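Lemma 1 can be confirmed numerically for small and moderate periods (a sketch, not part of the paper):

```python
import math

def lemma1_sum(p):
    """sum_{k=0}^{p-1} 1 / (p * sin(pi*(k - 1/2)/p))^2; Lemma 1 asserts the value 1."""
    return sum(1.0 / (p * math.sin(math.pi * (k - 0.5) / p)) ** 2 for k in range(p))

values = [lemma1_sum(p) for p in (1, 2, 5, 16, 101)]
```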


3.2 Maximum Anticipation

We consider here the extremal case of alternating $p y_j = (-1)^j y$. Notice that these model states have zero occurrence probability.

Proposition 8
Let $p$ be even and $p y_j = (-1)^j y$ $(j = 0, \ldots, p-1)$ for some $-1 \le y \le 1$. Then
a) $P_\kappa = \dfrac{y^2}{p^2 \cos^2\!\left(\frac{\pi(\kappa - \frac{1}{2})}{p}\right)}$, which has maximum value $\approx \frac{4y^2}{\pi^2}$ at $\kappa = \frac{p}{2}$, and minimum value $\approx \frac{y^2}{p^2}$ at $\kappa = 0$ and $\kappa = 1$.
b) $P_B = P_{tot} - O\!\left(\left(\frac{p}{2} - B\right)^{-1}\right)$ $\left(B < \frac{p}{2}\right)$.
c) $P_{tot} = y^2$.
d) $\langle N_q \rangle = \left(\frac{p}{2}\right)^q y^2 + O(p^{q-1} \ln p)$.
For odd $p$, this evolution degenerates to an orthogonal evolution with period $p$ and step size $\frac{\tau}{2}$.

πΌπ = π

β1

β1 π π

π¦

1 β2ππ πβ π π 2

π=0

=

π

β1

π¦

1 β2ππ π β 2 1β β1 π π 1 β2ππ π β 2 1+π

π

=

2π β1 π¦ 1 β2ππ π β 2 1+π

π

=π¦

π

βππ π β

1 2

π 1 2

p cos π πβ

π

0

π ππ£ππ

.

π πππ

The maxima, minima and asymptotics are elementary. For odd π and π¦ = Β±1, the support of πππ,4π π consists of π equidistant points. Thus the evolution is of minimum anticipation type with period π and step size π 2. b) From a) by the Euler summation formula. c) By Lemma 1. d) From a) and the Binomial theorem one obtains ππ β

π ,π

π πβπ 2

βπ

π

π β2 π , π!

from

which the result follows by the Euler summation formula. The notation ππ denotes the π π‘π falling power of π. q.e.d. 3.3 General Case We consider now sampling distributions such that π¦π π = 0, β¦ , π β 1 is a sequence of i.i.d. 1 1 random variables with distribution πΉπ π¦ β π β€ π¦ β€ π with moments πβπ ππ .


Proposition 9
a) $E(P_\kappa) = p^{-1} s^2 + \dfrac{m_1^2}{p^2 \sin^2\!\left(\frac{\pi(\kappa - \frac{1}{2})}{p}\right)}$. $E(P_\kappa)$ has maximum value $\approx p^{-1} s^2 + \frac{4 m_1^2}{\pi^2}$ at $\kappa = 0$ and $\kappa = 1$, and minimum value $\approx p^{-1} s^2 + \frac{m_1^2}{p^2}$ at $\kappa = \frac{p}{2}$.
b) $P_\kappa = E(P_\kappa) + O_P(p^{-1})$.
c) $E(P_{tot}) = m_2$.
d) $E(P_B) = \left(1 - \frac{2B}{p}\right) s^2 + O\!\left(\frac{m_1^2}{2B - 1}\right)$.
e) $P_B = E(P_B) + O_P(p^{-1/2})$.
f) $E\langle N_q \rangle = \frac{s^2}{q+1}\, p^q + O(p^{q-1} \ln p)$.

Proof:
a) From proposition 3c) and the premises, as detailed in the appendix.
b) From proposition 5f), where $\mathrm{var}(P_\kappa)$ is calculated in the appendix.
c) See appendix.
d) See appendix.
e) From proposition 5f), where $\mathrm{var}(P_B)$ is calculated in the appendix.
f) From proposition 9a) and the Euler summation formula.
q.e.d.
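Proposition 9a) with $m_1 = 0$ predicts $E(P_\kappa) = s^2/p$. A seeded Monte Carlo sketch (the uniform base distribution on $[-1, 1]$, for which $s^2 = m_2 = \frac{1}{3}$, is an assumption for illustration):

```python
import cmath, random

def P0(y):
    """P_kappa for kappa = 0 via the Fourier representation of Proposition 3c)."""
    p = len(y)
    return abs(sum(y[m] * cmath.exp(-2j * cmath.pi * m * (0 - 0.5) / p)
                   for m in range(p))) ** 2

rng = random.Random(42)
p, trials = 8, 3000
mc = sum(P0([rng.uniform(-1, 1) / p for _ in range(p)]) for _ in range(trials)) / trials
expected = (1 / 3) / p   # E(P_kappa) = s^2/p since m1 = 0 for the symmetric base law
```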

Proposition 9 shows that anticipation is fairly strong. Whereas the expected probability is independent of the period, the standard deviation is $O(p^{-1/2})$. We summarize these findings by


Theorem 1: Anticipation (Point Spectrum)
a) When randomly sampling with mean $m_1$ and variance $s^2$, then, for any $0 < \delta < 1$, measurements at time $\frac{\tau}{2}$ yield a state from the set $\{\varphi_{\kappa\tau} : B < \kappa < p + 1 - B\}$ with probability $P_B = s^2 \delta + o_P(1)$ for $B = \frac{p}{2}(1 - \delta)$.
b) For any $0 < \delta < s^2$ and suitable $B$, the probability $P_B > \delta$ holds with certainty as $p \to \infty$.
c) The operator $N_q$ has expectation $\Theta(p^q)$.

Proof:
a) By propositions 9d) and 9e).
b) By a) and definition 5f).
c) By proposition 9f).
q.e.d.

Notice that for unbiased sampling $m_1 = 0$ and $s^2 = m_2$, which by proposition 4c) is easily achievable.


4 Anticipation (Continuous Spectrum)

Here we determine, for model and random states with $p = \infty$, the values and distributions of the anticipation statistics identified and defined in section 2.3, and prove theorem 2.

4.1 Minimum Anticipation

We consider here the extremal case of constant $y_\lambda$. Notice that these model states have zero occurrence probability.

Proposition 10
Let $2\pi y_\lambda = y$ $(0 \le \lambda \le 2\pi)$ for some $-1 \le y \le 1$. Then
a) $P_\kappa = \dfrac{y^2}{\pi^2 \left(\kappa - \frac{1}{2}\right)^2}$, with maximum value $\frac{4y^2}{\pi^2}$ at $\kappa = 0$ and $\kappa = 1$.
b) $P_B \in O(B^{-1})$ $(B \to \infty)$.
c) $P_{tot} = y^2$.
d) $\langle N_q \rangle = \infty$ $(q \ge 1)$.
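The Euler sum behind $P_{tot} = y^2$, namely $\sum_{\kappa \in \mathbb{Z}} \left(\kappa - \frac{1}{2}\right)^{-2} = \pi^2$, can be approximated numerically (a sketch; partial sums converge at rate $O(1/N)$):

```python
import math

def half_integer_sum(N):
    """Partial sum of sum_{kappa in Z} (kappa - 1/2)^(-2) = 2 * sum_{k>=0} (k + 1/2)^(-2)."""
    return 2 * sum((k + 0.5) ** -2 for k in range(N))

approx = half_integer_sum(200000)   # full sum equals pi^2 (Euler); tail is about 2/N
```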

Proof:
a) By proposition 3d) and the premise,
$$\alpha_\kappa = \frac{y}{2\pi} \int_0^{2\pi} e^{-i\lambda(\kappa - \frac{1}{2})}\, d\lambda = \frac{y}{\pi i \left(\kappa - \frac{1}{2}\right)}.$$
The maxima, minima and asymptotics are elementary.
b) From a) by the Euler summation formula.
c) By a well known result, first discovered by Euler. See also [He88].
d) From a).
q.e.d.

4.2 Maximum Anticipation

We consider here the extremal case of $y_\lambda$ alternating between $y$ and $-y$ in $n$ equidistant intervals. Notice that these model states have zero occurrence probability.

Proposition 11

Let $n$ be even and $y_\kappa = (-1)^j y$ for $\frac{2\pi j}{n} \le \kappa < \frac{2\pi (j+1)}{n}$ $(j = 0, \dots, n-1)$, for some $-1 \le y \le 1$. Then

a) $P_k = y^2\,\pi^{-2}\left(k-\tfrac{1}{2}\right)^{-2} \tan^2\!\left(\frac{\pi\left(k-\frac{1}{2}\right)}{n}\right)$
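Proposition 10 lends itself to a direct numerical check. The sketch below (an illustration with arbitrary $y$ and grid resolution, not taken from the paper) evaluates the integral of proposition 3d by the trapezoidal rule, compares $|\alpha_k|^2$ with the closed form of 10a, and verifies that the $P_k$, summed over all integers $k$, recover $P_{tot} = y^2$ as in 10c:

```python
import numpy as np

y = 0.8
kappa = np.linspace(0.0, 2.0 * np.pi, 200001)
dk = kappa[1] - kappa[0]

errs = []
for k in range(6):
    # alpha_k = (1/2pi) integral_0^{2pi} y e^{-i kappa (k - 1/2)} dkappa  (prop. 3d)
    f = y * np.exp(-1j * kappa * (k - 0.5)) / (2.0 * np.pi)
    alpha_k = np.sum((f[:-1] + f[1:]) * dk / 2.0)    # trapezoidal rule
    P_k = abs(alpha_k) ** 2
    closed = y ** 2 / (np.pi ** 2 * (k - 0.5) ** 2)  # proposition 10a
    errs.append(abs(P_k - closed))
    print(k, round(P_k, 6), round(closed, 6))

# proposition 10c: summing P_k over all integer k recovers P_tot = y^2
ks = np.arange(-100000, 100001).astype(float)
P_tot = (y ** 2 / np.pi ** 2) * np.sum((ks - 0.5) ** -2)
print(round(P_tot, 5))   # ~ y^2 = 0.64
```

The sum over $k \in \mathbb{Z}$ converging to $y^2$ reflects Euler's identity $\sum_{k \in \mathbb{Z}} (k-\tfrac{1}{2})^{-2} = \pi^2$ cited in the proof of 10c.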
