Orthogonal Evolution and Anticipation

by Hans-Rudolf Thomann
October 2008

Author address: Im Altried 1h, 8051 Zürich, [email protected]

All rights reserved

CONTENTS

0  Introduction
1  Orthogonal Evolution
   1.1  Spectrum
   1.2  Fundamental Frequency Bound
   1.3  Representation of Anticipation Amplitudes
2  Random Sampling
   2.1  Sampling Scheme
   2.2  Distribution of the Spectral Difference
   2.3  Anticipation Statistics
   2.4  Multiple Periods
3  Anticipation (Point Spectrum)
   3.1  Minimum Anticipation
   3.2  Maximum Anticipation
   3.3  General Case
4  Anticipation (Continuous Spectrum)
   4.1  Minimum Anticipation
   4.2  Maximum Anticipation
   4.3  General Case
5  Conclusions
   5.1  Summary
   5.2  Statistical Premises
   5.3  Features and Limitations
   5.4  Applications
   5.5  Open Problems
6  Bibliography
7  Appendix: Proofs of Proposition 8
   7.1  Mean and Variance of $p_n$
   7.2  Mean and Variance of $p_N$

Abstract

Quantum states evolving, for some $0 < p \le \infty$, into a sequence of mutually orthogonal states at times $t + nT$ $(n = 0, \dots, p-1)$ exhibit an interesting physical effect. The analysis of the anticipation probabilities $p_n = |\langle q_{t+T/2}, q_{t+nT} \rangle|^2$ shows that for randomly chosen states $q$, measurements of the state at time $t + T/2$ reveal information about the state at times $t \pm nT$ $(n = 0, \dots, p-1)$, anticipating future states and reflecting past states with significant probability. For $p < \infty$ and any $0 < \delta < 1$, the probability to measure at time $t + \tfrac{T}{2}$ a state from the set $\{q_{t+nT} : |n - \tfrac{p}{2}| < \tfrac{p}{2}\delta\}$ exceeds $\sigma^2 \delta$ with certainty as $p \to \infty$, where $\sigma^2$ is a constant independent of the period. For $p = \infty$ and any $N$, the probability to measure at time $t + \tfrac{T}{2}$ a state from the set $\{q_{t \pm nT} : n > N\}$ is $\ge \sigma^2$. We characterize the spectrum, establish an analog of Planck's relation, define a random sampling scheme, analyze the resulting distribution of the anticipation probabilities and point out applications.

0 Introduction

Under a Hamiltonian $H$, a quantum state $q$ evolves into an orbit $q_t = U_t q$, where $U_t = e^{-\frac{iH}{\hbar}t}$. Due to the unitarity of $U_t$, the amplitude $\widehat{d\mu_q}(T) = \langle q_t, q_{t+T} \rangle = \int e^{-i\lambda T/\hbar}\, d\mu_q(\lambda)$ is independent of $t$. If for some integer $0 < p \le \infty$, $\widehat{d\mu_q}(nT) = \delta_{n \bmod p}$ $(n \in \mathbb{Z})$, then $\{q_{t+nT} : n = 0, \dots, p-1\}$ is a set of mutually orthogonal states, for every $t$. The subject of our study are states with this property, named orthogonal evolution of period $p$ and step size $T$, and the anticipation amplitudes $\alpha_n = \widehat{d\mu_q}\big((n - \tfrac{1}{2})T\big)$, i.e. the inner products of the states at times $nT$ and $\tfrac{T}{2}$, as well as the anticipation probabilities $p_n = |\alpha_n|^2$. It turns out that for randomly chosen states $q$, measurements of the state at time $\tfrac{T}{2}$ reveal information about the states at $\pm nT$ $(n = 0, \dots, p-1)$. Future states are anticipated and past states reflected with significant probability. These probabilities and the strength of this effect, which we name quantum-mechanical anticipation, are quantified in this paper.

Chapter 1 shows that orthogonally evolving states have pure point or absolutely continuous spectrum $d\mu_q(\lambda)$, but neither mixed nor singular continuous spectrum. The fundamental upper bound $\hbar T^{-1} \le \frac{2}{\pi} \inf_{\lambda_0} \langle |H - \lambda_0| \rangle$ in terms of the spectral quantity $\langle |H| \rangle$ is established, which is an analog of Planck's relation. The spectral difference is defined in terms of the reduced spectrum and used to represent the anticipation amplitudes.

In chapter 2 we justify from the symmetries of the set of self-adjoint Hamiltonians the random sampling scheme underlying our analysis, derive the statistical distribution of the spectral difference as $q$ is randomly sampled from all orthogonally evolving states with period $p$ and step size $T$, identify the statistical quantities used in the subsequent chapters to assess anticipation strength, and show that multiples of $p$ are unlikely to occur by chance.

In chapter 3 the anticipation effect is analyzed under point spectrum. The distribution of the $p_n$ and of various observables is derived for some model states as well as under random sampling. Theorem 1 states that for any $0 < \delta < 1$, the probability to measure at time $\tfrac{T}{2}$ a state from the set $\{q_{nT} : |n - \tfrac{p}{2}| < \tfrac{p}{2}\delta\}$ exceeds $\sigma^2 \delta$ with certainty as $p \to \infty$, where $\sigma^2$ is a constant independent of the period. The operator $n^r$ (defined in section 2.3) has expectation $\approx p^r$.

Chapter 4 determines anticipation under absolutely continuous spectrum. Theorem 2 states the constant lower bound $\sigma^2$ for the probability to measure at time $\tfrac{T}{2}$ a state from the set $\{q_{\pm nT} : n > N\}$, and infinite expectation for the operator $n^r$. The statistics in the continuous case are the limiting values of those in the periodic case as $p \to \infty$.

In the final chapter we draw conclusions, discuss some open questions and point out applications. Anticipation in general type quantum evolution will be analyzed in a future paper.
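The orthogonality condition above is easy to probe numerically. The following sketch (assuming the standard scale $\hbar = T = 1$ and an equal-weight state on $p$ equidistant eigenvalues; this particular state is an illustrative choice, not taken from the paper) verifies $\widehat{d\mu_q}(nT) = \delta_{n \bmod p}$ and computes the anticipation probabilities $p_n = |\widehat{d\mu_q}((n-\tfrac12)T)|^2$:

```python
import numpy as np

# Point spectrum: p equidistant eigenvalues 2*pi*k/p (standard scale, hbar = T = 1),
# equal weights 1/p -- a hypothetical example state, one of many admitted here.
p = 8
lam = 2 * np.pi * np.arange(p) / p          # eigenvalues
w = np.full(p, 1.0 / p)                     # spectral weights

def mu_hat(t):
    """Fourier transform of the spectral measure: <q_s, q_{s+t}>."""
    return np.sum(w * np.exp(-1j * lam * t))

# Orthogonal evolution: <q_{nT}, q_0> = delta_{n mod p}
overlaps = np.array([mu_hat(n) for n in range(2 * p)])
assert np.allclose(np.abs(overlaps), np.tile([1] + [0] * (p - 1), 2), atol=1e-12)

# Anticipation probabilities p_n = |mu_hat((n - 1/2) T)|^2
p_n = np.array([abs(mu_hat(n - 0.5)) ** 2 for n in range(p)])
print(np.round(p_n, 4))          # largest near n = 0 and n = 1
assert np.isclose(p_n.sum(), 1.0)
```

For this equal-weight state the measurement at time $T/2$ assigns nonzero probability to every $q_{nT}$, with the total summing to 1, which is the minimum-anticipation behavior analyzed in chapter 3.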

1 Orthogonal Evolution

After introducing some terminology, we characterize the spectrum of orthogonally evolving states, derive necessary and sufficient conditions for their existence, establish a fundamental bound on the frequency and represent the anticipation amplitudes in terms of the reduced spectrum.

Definition 1
Let $H$ be a Hamiltonian, $q$ a state and $0 < p \le \infty$ an integer. Throughout the paper we will assume that $\|q\|_2^2 = 1$.

a) $q$ evolves under $H$ into $q_t = U_t q$, where $U_t = e^{-\frac{iH}{\hbar}t}$, and has spectral measure $d\mu_q(\lambda)$.
b) $q$ has orthogonal evolution of period $p$ and step size $T$ iff $\widehat{d\mu_q}(nT) = \delta_{n \bmod p}$ $(n \in \mathbb{Z})$. Here $\widehat{d\mu_q}(T) = \langle q_t, q_{t+T} \rangle = \int e^{-i\lambda T/\hbar}\, d\mu_q(\lambda)$ is the Fourier transform of $d\mu_q(\lambda)$.
c) $\alpha_n = \widehat{d\mu_q}\big((n - \tfrac{1}{2})T\big)$ is the anticipation amplitude of $q_{nT}$.
d) $p_n = |\alpha_n|^2$ is the anticipation probability of $q_{nT}$.
e) The reduction of $d\mu_q(\lambda)$ modulo $m$ is $d\nu_{q,m}(\kappa) \triangleq \int_{\kappa = \lambda T/\hbar \bmod m} d\mu_q(\lambda)$.
f) The $\kappa$-cut of $d\mu_q(\lambda)$ modulo $m$ is the function $c_{\kappa,m}(n) = d\mu_q\big((nm + \kappa)\tfrac{\hbar}{T}\big) \big/ d\nu_{q,m}(\kappa)$ $(n \in \mathbb{Z})$.
g) The standard scale is defined to make $\tfrac{T}{\hbar} = 1$.

1.1 Spectrum

Orthogonal evolution constrains the spectrum to be either pure point or absolutely continuous, as stated by the following proposition.

Proposition 1: Characterization of the Spectrum (Orthogonal Evolution Constraint)
Let $q$ have orthogonal evolution with period $p$ and step size $T$. In standard scale,

a) If $p < \infty$, then $\nu_{q,2\pi}\big(\tfrac{2\pi k}{p}\big) = \tfrac{1}{p}$ $(k = 0, \dots, p-1)$ is pure point.
b) If $p = \infty$, then $d\nu_{q,2\pi}(\kappa) = \tfrac{d\kappa}{2\pi}$ $(0 \le \kappa \le 2\pi)$ is absolutely continuous.

Proof:
The spectra of $H$ and $U_T$ are related by $U_t = e^{-\frac{iH}{\hbar}t}$. $U_T$ is a bounded operator with spectrum on the complex unit circle [RS80]. Its spectral measure is uniquely determined by $\langle f(U_T) q_0, q_0 \rangle$ $(f \in C^\infty(\mathbb{C}))$. These functions in turn are uniquely determined by the monomials $z^k$, which due to mutual orthogonality satisfy $\langle U_T^n q_0, q_0 \rangle = \langle q_{nT}, q_0 \rangle = \delta_{n \bmod p}$. Thus the spectral measure of $U_T$ is equally distributed, in the periodic case over the $p$-th roots of unity, in the infinite case over the whole complex unit circle. Singular continuous spectrum is excluded. Mixed spectrum cannot occur, as in the periodic case the initial state must recur, whereas in the aperiodic case $\widehat{d\mu_q}(t) \in o(1)$ (see [YL96]). The numerical values (in standard scale) follow from $\int_0^{2\pi} e^{-in\kappa}\, d\nu_{q,2\pi}(\kappa) = \delta_{n \bmod p}$, definition 1b) and the normalization of $q$. The integral becomes $p^{-1} \sum_{k=0}^{p-1} e^{-2\pi i n k / p}$ for $p < \infty$ and $\int_0^{2\pi} e^{-in\kappa}\, \tfrac{d\kappa}{2\pi}$ for $p = \infty$, respectively, which is easily verified to yield the correct values.
q.e.d.

A more elaborate proof facing the various intricacies, and the explicit determination of the Fourier expansion of a continuous function $f$ on $[0, 2\pi]$ such that $d\nu_{q,2\pi}(\kappa) = D^2 f$, will be given in a future paper.

Proposition 1 has obvious consequences for the existence of orthogonal evolution.

Corollary 1: Existence
Necessary and sufficient for the existence of a state $q$ evolving orthogonally with period $p$ under Hamiltonian $H$ is that either $p < \infty$ and the reduced point spectrum of $H$ contains $p$ equidistant points (such as for the $X^2$ operator), or $p = \infty$ and $H$ has non-empty absolutely continuous spectrum (such as for the $-\Delta$ operator).

Proof: In either case states meeting proposition 1 are easily constructed.

1.2 Fundamental Frequency Bound

In terms of the spectral quantity $\langle |H| \rangle$ we derive here the fundamental frequency bound $\hbar T^{-1} \le \frac{2}{\pi} \langle |H| \rangle$, an analog of Planck's $E = h\nu$.

Definition 2
For any Hamiltonian $H$, state $q$ and real $\lambda_0$, $\langle |H - \lambda_0| \rangle \triangleq \int |\lambda - \lambda_0|\, d\mu_q(\lambda)$; $\langle |H| \rangle$ denotes the case $\lambda_0 = 0$.

Proposition 2
Let $q$ be any state evolving under Hamiltonian $H$. Then

a) $\langle H \rangle$ exists iff $\langle |H| \rangle < \infty$.
b) $\langle |H| \rangle > 0$ iff $q \ne 0$ and $q$ is not an eigenstate with zero eigenvalue.
c) $|\langle q, q_t \rangle| \ge \mathrm{Re}\, \langle q, q_t \rangle \ge 1 - \langle |H| \rangle\, |t| / \hbar$.
d) If $\widehat{d\mu_q}(T) = 0$, then $T \ge \sup_{\lambda_0} \hbar / \langle |H - \lambda_0| \rangle$.
e) The supremum is attained when $\lambda_0$ equals the median of $\mu$, i.e. $\mu\big((\lambda_0, \infty)\big) = \mu\big((-\infty, \lambda_0)\big)$.
f) $\inf_{\lambda_0} \langle |H - \lambda_0| \rangle = 0$ iff $q$ is an eigenstate.

Proof:
a) By elementary measure theory, $\langle H \rangle = \int \lambda\, d\mu_q(\lambda)$ exists iff $\int_0^\infty \lambda\, d\mu_q(\lambda) < \infty$ and $-\int_{-\infty}^0 \lambda\, d\mu_q(\lambda) < \infty$, i.e. iff $\langle |H| \rangle < \infty$.
b) As $\langle |H| \rangle$ is a sum of two non-negative quantities, it vanishes only if $d\mu_q(\lambda) = 0$ or $\mathrm{supp}\, \mu_q = \{0\}$.
c) By the Schrödinger equation and the triangle inequality,
$$\frac{d}{dt} \|q_t - q\|_2^2 = 2 \int \frac{\lambda}{\hbar} \sin\frac{\lambda t}{\hbar}\, d\mu_q(\lambda) \le \frac{2}{\hbar} \langle |H| \rangle .$$
Therefore $\|q_t - q\|_2^2 \le \frac{2}{\hbar} \langle |H| \rangle\, |t|$. The result follows from $\mathrm{Re}\, \langle q, q_t \rangle = 1 - \tfrac{1}{2} \|q_t - q\|_2^2$.
d) By orthogonality, $\|q_T - q\|_2^2 = 2$, whence $T \ge \hbar / \langle |H| \rangle$ by c). As $T$ is the same for $H - \lambda_0$, the claimed result follows.

e) From $\dfrac{\partial}{\partial \lambda_0} \langle |H - \lambda_0| \rangle = \displaystyle -\int_{\lambda_0}^{\infty} d\mu_q(\lambda) + \int_{-\infty}^{\lambda_0} d\mu_q(\lambda) = 0$.
f) Elementary.
q.e.d.

2c) yields a lower bound on the modulus of the anticipation amplitude at $t = \tfrac{T}{2}$. 2d) shows that at given $\langle |H| \rangle$ the frequency of the orthogonal evolution cannot be arbitrarily high, or equivalently the step size arbitrarily small. A bound for the passage time based on $\langle H^2 \rangle$ has been established by Fleming [Fl73] and discussed by various authors. The bound given in 2d) for the passage time can be strengthened to a bound for the step size of orthogonal evolution:

Corollary 2: Fundamental Frequency Bound
The frequency of an orthogonal evolution is bounded by
$$\hbar T^{-1} \le \frac{2}{\pi} \langle |H| \rangle .$$

Proof: Propositions 2d) and 2e) imply $\hbar T^{-1} \le \langle |H - \lambda_0| \rangle$. We conclude from proposition 1 that among all Hamiltonians providing orthogonal evolution those with spectrum evenly distributed in $[-\pi, \pi]$ minimize $\langle |H| \rangle$, which is easily found to be $\tfrac{\pi}{2}$.
q.e.d.
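Corollary 2 can be checked numerically. The sketch below (in standard scale $\hbar = T = 1$; the equal-weight equidistant spectrum in $[-\pi, \pi)$ is an illustrative choice) evaluates $\inf_{\lambda_0} \langle |H - \lambda_0| \rangle$ on a grid and confirms that the bound is attained with equality at the value $\pi/2$:

```python
import numpy as np

# Equal-weight state on p eigenvalues equidistant in [-pi, pi), standard scale.
p = 8
lam = -np.pi + 2 * np.pi * np.arange(p) / p
w = np.full(p, 1.0 / p)

def mean_abs_dev(lam0):
    """<|H - lam0|> = integral of |lambda - lam0| against the spectral measure."""
    return np.sum(w * np.abs(lam - lam0))

grid = np.linspace(-np.pi, np.pi, 20001)
inf_dev = min(mean_abs_dev(x) for x in grid)   # infimum attained near the median
print(inf_dev, np.pi / 2)
assert 1.0 <= (2 / np.pi) * inf_dev + 1e-9     # hbar T^{-1} = 1 <= (2/pi) <|H|>
```

The infimum is flat on an interval around the median of the measure, as proposition 2e) predicts for this symmetric discrete spectrum.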

1.3 Representation of Anticipation Amplitudes

From now on all spectra are in standard scale. In proposition 1 we have determined the reduced spectrum $d\nu_{q,2\pi}(\kappa)$. Using a notation for the reduced spectrum on the doubled interval $[0, 4\pi]$ we establish a representation of $\alpha_n$ which we use throughout the paper.

Definition 3: Spectral difference
a) Let $p < \infty$. Then $x_{k,s} = \nu_{q,4\pi}\big(2\pi(s + \tfrac{k}{p})\big)$ $(s = 0, 1;\ k = 0, \dots, p-1)$. $y_k = x_{k,0} - x_{k,1}$ is the spectral difference.
b) Let $p = \infty$. Then $x_{\kappa,s} = \frac{d}{d\kappa}\, \nu_{q,4\pi}(2\pi s + \kappa)$ $(s = 0, 1;\ 0 \le \kappa \le 2\pi)$. $y_\kappa = x_{\kappa,0} - x_{\kappa,1}$ is the spectral difference.

Proposition 3: Representation of anticipation amplitudes
a) If $p < \infty$, then $0 \le x_{k,s} \le \tfrac{1}{p}$, $-\tfrac{1}{p} \le y_k \le \tfrac{1}{p}$, $x_{k,0} + x_{k,1} = \tfrac{1}{p}$.
b) If $p = \infty$, then $0 \le x_{\kappa,s} \le \tfrac{1}{2\pi}$, $-\tfrac{1}{2\pi} \le y_\kappa \le \tfrac{1}{2\pi}$, $x_{\kappa,0} + x_{\kappa,1} = \tfrac{1}{2\pi}$.
c) If $p < \infty$, then $\alpha_n = \sum_{k=0}^{p-1} y_k\, e^{-2\pi i (n - \frac{1}{2}) k / p}$ $(n$ integer$)$.
d) If $p = \infty$, then $\alpha_n = \int_0^{2\pi} y_\kappa\, e^{-i (n - \frac{1}{2}) \kappa}\, d\kappa$ $(n$ integer$)$.

Proof:
a) From proposition 1a) and definition 3a).
b) From proposition 1b) and definition 3b).
c) By definitions 1c) and 3a),
$$\alpha_n = \sum_{k=0}^{p-1} \Big( x_{k,0}\, e^{-2\pi i (n - \frac{1}{2}) \frac{k}{p}} + x_{k,1}\, e^{-2\pi i (n - \frac{1}{2}) (1 + \frac{k}{p})} \Big) = \sum_{k=0}^{p-1} (x_{k,0} - x_{k,1})\, e^{-2\pi i (n - \frac{1}{2}) \frac{k}{p}},$$
since $e^{-2\pi i (n - \frac{1}{2})} = -1$. The claim follows from definition 3a).
d) By definitions 1c) and 3b),
$$\alpha_n = \int_0^{2\pi} \Big( x_{\kappa,0}\, e^{-i (n - \frac{1}{2}) \kappa} + x_{\kappa,1}\, e^{-i (n - \frac{1}{2}) (2\pi + \kappa)} \Big)\, d\kappa = \int_0^{2\pi} (x_{\kappa,0} - x_{\kappa,1})\, e^{-i (n - \frac{1}{2}) \kappa}\, d\kappa .$$
The claim follows from definition 3b).
q.e.d.
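The representation of proposition 3c) can be verified directly: place the $1/p$ weight of each reduced spectral point arbitrarily between the two lifts $2\pi k/p$ and $2\pi + 2\pi k/p$ and compare the direct Fourier transform with the $y_k$-sum. The random split below is a hypothetical choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 6
k = np.arange(p)
split = rng.uniform(0, 1, p)            # hypothetical split of each 1/p weight
x0 = split / p                          # weight at kappa = 2*pi*k/p
x1 = (1 - split) / p                    # weight at kappa = 2*pi + 2*pi*k/p
y = x0 - x1                             # spectral difference (definition 3a)

lam = np.concatenate([2 * np.pi * k / p, 2 * np.pi + 2 * np.pi * k / p])
w = np.concatenate([x0, x1])

for n in range(p):
    direct = np.sum(w * np.exp(-1j * lam * (n - 0.5)))            # mu_hat((n-1/2)T)
    via_y = np.sum(y * np.exp(-2j * np.pi * (n - 0.5) * k / p))   # proposition 3c)
    assert np.isclose(direct, via_y)

# The split leaves the orthogonal evolution constraint intact: mu_hat(n) = delta_{n mod p}
for n in range(1, p):
    assert np.isclose(np.sum(w * np.exp(-1j * lam * n)), 0, atol=1e-12)
print("proposition 3c verified")
```

The check also illustrates why only the differences $y_k$ matter: the two lifts differ by the phase $e^{-2\pi i (n - \frac{1}{2})} = -1$, so their sum cancels out of every $\alpha_n$.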

2 Random Sampling

Corollary 1 implies that orthogonally evolving states exist under infinitely many dynamics. In the remaining chapters we investigate how likely such states are to exhibit anticipation of significant strength. To this end, we define in this chapter a random sampling scheme which will underlie all our subsequent analyses. We justify this scheme by the symmetry properties of the set of self-adjoint Hamiltonians. Under this scheme we derive the statistical distribution of the spectral difference, which determines the anticipation amplitudes. We define the set of statistical quantities used in the subsequent chapters to assess anticipation strength, and finally show that multiples of the period are unlikely to occur by chance.

2.1 Sampling Scheme

In order to investigate the likelihood of physical states to exhibit anticipation of a certain strength we must consider the set of all orthogonally evolving states and the way such states arise. We focus here on a laboratory situation, where the experimentalist prepares an experimental setup and an orthogonally evolving target, and then studies its evolution under the laws of nature. The setup is modeled by a Hamiltonian and the target by a physical state. Which Hamiltonians and states are preparable depends on the experimentalist's skills and capabilities and on the constituents available in nature. Abstracting from these practical limitations, we only constrain the experimentalist to prepare self-adjoint Hamiltonians and states exhibiting orthogonal evolution, and analyze the outcome if these are randomly chosen, letting all degrees of freedom fluctuate freely except those fixed by the orthogonal evolution constraint (proposition 1). We are going to define a sampling scheme where, for a given step size and period, first a self-adjoint Hamiltonian is chosen from the set of all self-adjoint Hamiltonians admitting orthogonal evolution, and then a state is chosen from the set of states orthogonally evolving under that Hamiltonian.

Proposition 3 shows that orthogonal evolution and anticipation are completely determined by the spectral properties of the Hamiltonians and states involved, more specifically by the spectral difference. Thus, for our purpose, only constraints on and degrees of freedom of spectra need consideration. Inverse theory tells us that, for any bounded real measure $\mu$ and any positive integer $d$, there are uncountably many self-adjoint operators $H$ and states $q \in L^2(\mathbb{R}^d)$ with spectral measure $\mu$. This holds as well for at least one Schrödinger operator $-\Delta + V$ with self-adjoint extension $H$. This enables us to define the sampling scheme in terms of probability spaces of spectral measures.

The core feature of spectral measures of orthogonally evolving states are the $\kappa$-cuts modulo $2\pi$ (see definition 1f), related to the spectral measure $\mu$ by $d\mu(2\pi n + \kappa) = c_{\kappa,2\pi}(n)\, d\nu_{q,2\pi}(\kappa)$. The quantities $x_{k,s}$ from definition 3 are $x_{k,s} = c_{\kappa,2\pi}(2n + s)\, d\nu_{q,2\pi}(\kappa)$. Whereas the reduced spectral measure is fixed by the orthogonal evolution constraint, the $\kappa$-cuts are arbitrary doubly-infinite sequences with non-negative values, summing up to 1.

Let us first consider their supports $\mathrm{supp}\, c_{\kappa,2\pi}$, i.e. the patterns of points $n \in \mathbb{Z}$ where $c_{\kappa,2\pi}(n) \ne 0$. This pattern is a property of the Hamiltonian, whereas the size of $c_{\kappa,2\pi}(n)$ is determined by the state.

Self-adjointness allows for any such pattern at any index $\kappa$, and for choosing patterns at different indices $\kappa$ independently. This holds for point spectrum as well as for continuous spectrum, with the only constraint that for fixed $n$ the dependence on $\kappa$ must be absolutely continuous. While the following arguments will be given for point spectrum, they are easily extended to continuous spectrum as the limiting case. This symmetry of the set of self-adjoint Hamiltonians allows us to require the sampling scheme to make the $\mathrm{supp}\, c_{\kappa,2\pi}$ a family of independent identically distributed random variables. As these symmetries hold as well when constraining the Hamiltonians to admit orthogonal evolution, we demand the same property in this case.

Given any Hamiltonian, the value of $c_{\kappa,2\pi}(n)$ on $\mathrm{supp}\, c_{\kappa,2\pi}$ depends only on the state $q$. When randomly sampling states $q \in L^2(\mathbb{R}^d)$ meeting the orthogonal evolution constraint, $c_{\kappa,2\pi}$ can attain any non-negative function $f \in l^1(\mathrm{supp}\, c_{\kappa,2\pi})$ meeting this constraint, independently for any index $\kappa$, because orthogonal components of these states belonging to different indices can be sampled separately in the discrete case. As by the preceding requirement the supports are i.i.d., we impose the same requirement on the sampling distribution of the $\kappa$-cuts. As all distributions as well as the reduced spectrum are index-independent, we will omit indices in the following.

Let $\theta_p$ be the sampling distribution of the $\kappa$-cuts $c_{\kappa,2\pi}$ at period $p$. Notice that $\theta_p$ is not obtained by conditioning from another, unconditional distribution. The orthogonal evolution constraint is met by normalization of the $\kappa$-cuts, not by conditioning of some distribution. Practical sampling procedures will be designed to meet the constraint approximately, but not by selection of "good" states from unconstrained samples. Only the latter procedure would yield a conditional distribution, whereas the real procedure is sufficiently modeled by a procedure where orthogonal components are sampled and normalized independently (index by index).

The period $p$ only determines the orthogonal evolution constraint. As this constraint is met by normalization, it only enters into $\theta_p$ as a scaling factor: $\theta_p(c_{\kappa,2\pi}) = \theta_q\big(\tfrac{p}{q}\, c_{\kappa,2\pi}\big)$ for any $p$, $q$ and $c_{\kappa,2\pi}$. As our reasoning did not otherwise depend on the period $p$, we finally postulate the sampling distribution of $\kappa$-cuts to be independent of $p$, up to the orthogonal evolution constraint and the scaling factor. Our requirements are summarized in the following definition.

Definition 4: Random Samples
A family of sets $S_p$ of measures $\mu$ is a family of random samples of measures of orthogonally evolving states with period $p \le \infty$ iff the following holds for all $p$:

a) The reduced spectrum $\nu_{q,2\pi}$ meets proposition 1 above.
b) For $p < \infty$, the $\kappa$-cut-valued random variables $c_\kappa$ are i.i.d. under a probability measure $\theta_p(c) = \theta_1(pc)$ independent of $\kappa$.
c) For $p = \infty$, the $\kappa$-cut-valued random variables $c_\kappa$ are i.i.d. under a probability measure $\theta_\infty(c) = \theta_1(2\pi c)$ independent of $\kappa$.

From the preceding arguments and definition 4 follows:

Corollary 3: Existence
The class of self-adjoint Hamiltonians admits sampling schemes yielding random samples as defined above.

Further evidence justifying our sampling scheme will be provided by the main results and discussed in the final chapter.

2.2 Distribution of the Spectral Difference

Random sampling according to the above scheme yields a probability distribution of the spectral difference function $y_k$. It is no surprise that these are again i.i.d.

Proposition 4
Consider the sampling distribution as $q$ is randomly sampled under the sampling scheme of definition 4. Then for some distribution function $F(z)$ on $[-1, +1]$ the following holds:

a) If $p < \infty$, then $y_k$ $(k = 0, \dots, p-1)$ is a sequence of i.i.d. random variables with distribution $F(py)$ and density $p f(py)$ for $-\tfrac{1}{p} \le y \le \tfrac{1}{p}$.
b) If $p = \infty$, then $y_\kappa$ $(0 < \kappa \le 2\pi)$ is a family of i.i.d. random variables with distribution $F(2\pi y)$ and density $2\pi f(2\pi y)$ for $-\tfrac{1}{2\pi} \le y \le \tfrac{1}{2\pi}$.
c) If the random sampling scheme is invariant under translations of the energy scale, then the density $f$ is an even function.

Proof:
a) For $p < \infty$ the distribution of $x_{k,s}$ is $X_s(x) = \theta_1\big(\{c_{2\pi} : c_{2\pi}(2n + s) \le px\}\big)$, from which $F$ is obtained. The scaling is due to proposition 1. The i.i.d. property and the period dependence are direct consequences of definition 4.
b) For $p = \infty$ the distribution of $x_{\kappa,s}$ is $X_s(x) = \theta_1\big(\{c_{2\pi} : c_{2\pi}(2n + s) \le 2\pi x\}\big)$, from which $F$ is obtained. The scaling is due to proposition 1. The i.i.d. property and the period dependence are direct consequences of definition 4.
c) Translations of the energy scale by $2\pi m$ effect a shift by $-m$ on the $\kappa$-cuts, transforming them to $c_{\kappa,2\pi}(n - m)$. If $m$ is odd, the sign of the spectral difference is toggled.
q.e.d.
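The scaling in proposition 4a) can be illustrated by simulation. The sketch below assumes a uniform base distribution on $[-1, 1]$ for the normalized difference $py$ (a hypothetical choice of $F$, not prescribed by the paper), so $m_1 = 0$ and $\sigma^2 = m_2 = \tfrac{1}{3}$; the sampled $y_k$ then have mean $m_1/p$ and variance $\sigma^2/p^2$:

```python
import numpy as np

# Draw the base variable z ~ F on [-1, 1] (uniform here, a hypothetical choice),
# then y_k = z / p at period p, per definition 4 / proposition 4a.
rng = np.random.default_rng(1)
p, n_samples = 32, 200_000
y = rng.uniform(-1, 1, size=n_samples) / p

print(y.mean(), y.var())
assert abs(y.mean()) < 1e-3                              # m1 / p = 0
assert np.isclose(y.var(), (1 / 3) / p**2, rtol=0.05)    # sigma^2 / p^2
```

The same base distribution is reused at every period; only the $1/p$ scaling changes, which is exactly the content of definition 4b).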

2.3 Anticipation Statistics

We identify here the statistical quantities used to assess anticipation strength in the subsequent chapters, and recall some statistical terminology and basic facts. The following anticipation statistics will be analyzed in the subsequent chapters, with $\langle n \rangle$ as by definitions 5a and 5b below.

a) $p_n$, the probability to measure $q_{nT}$ at time $\tfrac{T}{2}$, and $p_N \triangleq P(\langle n \rangle > N)$.
b) $p_{tot} = \sum_n p_n$, the probability to measure one of the states $q_{nT}$ at time $\tfrac{T}{2}$.
c) The observable $n^r$ $(r \ge 0)$, with value $\langle n \rangle^r$ in state $q_{nT}$ $(n \ge 0)$ and expected value $\langle n^r \rangle \triangleq \sum_n \langle n \rangle^r p_n$.

We perform the analysis for minimum and maximum anticipation model states as well as under biased and unbiased random sampling according to definition 4 above. In the sampling scenarios, we provide the sample mean and variance as well as stochastic asymptotics of the probabilities. The following definitions will be used:

Definition 5
a) If $p < \infty$, then $\langle n \rangle = n \bmod p$ if $n \bmod p \le \tfrac{p+1}{2}$, and $\langle n \rangle = p + 1 - (n \bmod p)$ otherwise.
b) If $p = \infty$, then $\langle n \rangle = |n|$.
c) $m_n = E(z^n)$ $(n = 0, 1, \dots)$ is the $n$-th moment.
d) $\sigma^2(z) = E\big((z - m_1)^2\big)$ is the variance.
e) $z \in O_p(1)$ iff $\forall \varepsilon > 0\ \exists c: \limsup P(|z| > c) < \varepsilon$.
f) $z \in o_p(1)$ iff $\forall \varepsilon > 0: \lim P(|z| > \varepsilon) = 0$.

$O_p$ and $o_p$ are Pratt's stochastic asymptotics [Pr59], the former expressing convergence in distribution, or stochastic boundedness, the latter stochastic convergence to zero [BFH75]. Recall the following facts:

Proposition 5
a) If the probability density is an even function, then $m_{2n+1} = 0$.
b) $E\big((cz)^n\big) = c^n m_n$.
c) $\sigma^2 = m_2 - m_1^2$.
d) $P\big(|z| > \Delta\, E(|z|^r)^{1/r}\big) < \tfrac{1}{\Delta^r}$ $(r > 0, \Delta > 0)$.
e) $P\big(\big|\tfrac{z - \bar{z}}{\sigma}\big| > \Delta\big) < \tfrac{1}{\Delta^2}$, where $\bar{z} = m_1$.
f) $z = \bar{z} + O_p(\sigma)$.
g) If $\sigma \in o(\bar{z})$ then $\tfrac{z}{\bar{z}} = 1 + o_p(1)$.

Proof: Parts a)-c) are elementary. d) is the Bienaymé-Chebyshev inequality [TJ71], e) its special case $r = 2$, f) and g) are direct consequences of e) and definitions 5e-f).

2.4 Multiple Periods

Random sampling of orthogonally evolving states with period $p$ does admit states with period $np$ for some $n > 1$. Such states have $y_k = 0$ $\forall k$, thus $\alpha_n = 0$. However, they are not likely to occur by chance in random samples.

Proposition 6
Let $q$ be a state with orthogonal evolution of period $p$.
a) $p < \infty$. For any $\varepsilon > 0$, let $n$ be the cardinality of the set $\{k : |p y_k| < \varepsilon,\ 0 \le k < p\}$. Then $n$ has the binomial distribution with parameters $p$ and $F\big((-\varepsilon, +\varepsilon)\big)$.
b) $p = \infty$. For any $\varepsilon > 0$, the set $\{\kappa : |y_\kappa| < \varepsilon,\ 0 < \kappa \le 2\pi\}$ has Lebesgue measure $F\big((-\varepsilon, +\varepsilon)\big)$.

Proof:
a) Immediately from proposition 4.
b) From proposition 4, as the variables are i.i.d.
q.e.d.

This result rules out the occurrence by chance of period $np$ or, equivalently, step size $\tfrac{T}{n}$ when randomly sampling states with period $p$ and step size $T$. However, it does not guarantee strong anticipation. That will be provided by the proof of constant probabilistic lower bounds on $p_N$ in the next two chapters.

3 Anticipation (Point Spectrum)

Here we determine, for model and random states with $p < \infty$, the values and distributions of the anticipation statistics identified and defined in section 2.3, and prove theorem 1.

3.1 Minimum Anticipation

We consider here the extremal case of constant $y_k$. Notice that these model states have zero occurrence probability.

Proposition 7
Let $p y_k = y$ $(k = 0, \dots, p-1)$ for some $-1 \le y \le 1$. Then

a) $p_n = \dfrac{y^2}{p^2 \sin^2\big(\pi (n - \frac{1}{2}) / p\big)}$, with maximum value $\approx \tfrac{4 y^2}{\pi^2}$ at $n = 0$ and $n = 1$, minimum value $\approx p^{-2} y^2$ at $n = \tfrac{p}{2}$, and $p_n \in O(\langle n \rangle^{-2})$.
b) $p_N \in O(N^{-1})$ $\big(N \to \tfrac{p}{2}\big)$.
c) $p_{tot} = y^2$.
d) $\langle n^1 \rangle = \ln p + O(1)$, $\langle n^r \rangle = O(p^{r-1})$ $(r > 1)$.

Proof:
a) By proposition 3c) and the premise,
$$\alpha_n = p^{-1} y \sum_{k=0}^{p-1} e^{-2\pi i (n - \frac{1}{2}) k / p} = p^{-1} y\, \frac{1 - e^{-2\pi i (n - \frac{1}{2})}}{1 - e^{-2\pi i (n - \frac{1}{2}) / p}} = \frac{2 p^{-1} y}{1 - e^{-2\pi i (n - \frac{1}{2}) / p}} = \frac{-i\, p^{-1} y\, e^{\pi i (n - \frac{1}{2}) / p}}{\sin\big(\pi (n - \frac{1}{2}) / p\big)} .$$
The maxima, minima and asymptotics are elementary.
b) From a) by the Euler summation formula.
c) From Lemma 1 below.
d) From a) by the Euler summation formula.
q.e.d.

Lemma 1
$$p^{-2} \sum_{n=0}^{p-1} \sin^{-2}\big(\pi (n - \tfrac{1}{2}) / p\big) = 1 .$$

Proof: Let $q$ be a superposition of exactly $p$ eigenfunctions. Then $\mathrm{Span}\{q_t : t \in \mathbb{R}\} = \mathrm{Span}\{q_{kT} : k \in \mathbb{Z}\}$. Therefore $p_{tot} = \|q_{T/2}\|_2^2 = 1$. As in this case $y = p y_k = 1$, the claimed result follows. The alternative, direct summation, is easily achieved by adapting theorem 4.9a of [He88] to periodic functions.

3.2 Maximum Anticipation

We consider here the extremal case of alternating $y_k$. Notice that these model states have zero occurrence probability.

Proposition 8
Let $p$ be even and $p y_k = (-1)^k y$ $(k = 0, \dots, p-1)$ for some $-1 \le y \le 1$. Then

a) $p_n = \dfrac{y^2}{p^2 \cos^2\big(\pi (n - \frac{1}{2}) / p\big)}$, which has maximum value $\approx \tfrac{4 y^2}{\pi^2}$ at $n = \tfrac{p}{2}$, minimum value $\approx p^{-2} y^2$ at $n = 0$ and $n = 1$, and $p_n \in O(p^{-2})$ $\big(\langle n \rangle \in o(p)\big)$.
b) $p_N = p_{tot} - O(p^{-1})$ $\big(N \in o(p)\big)$.
c) $p_{tot} = y^2$.
d) $\langle n^r \rangle = O\big(p^r + p^{r-1} \ln p\big)$.

For odd $p$, this evolution degenerates to an orthogonal evolution with period $p$ and step size $\tfrac{T}{2}$.

Proof:
a) By proposition 3c) and the premise,
$$\alpha_n = p^{-1} y \sum_{k=0}^{p-1} (-1)^k e^{-2\pi i (n - \frac{1}{2}) k / p} = p^{-1} y\, \frac{1 - (-1)^p e^{-2\pi i (n - \frac{1}{2})}}{1 + e^{-2\pi i (n - \frac{1}{2}) / p}} = \begin{cases} \dfrac{y\, e^{\pi i (n - \frac{1}{2}) / p}}{p \cos\big(\pi (n - \frac{1}{2}) / p\big)} & p \text{ even} \\[1ex] 0 & p \text{ odd} \end{cases}$$
The maxima, minima and asymptotics are elementary. For odd $p$ and $y = \pm 1$, the support of $d\nu_{q,4\pi}(\kappa)$ consists of $p$ equidistant points. Thus the evolution is of minimum anticipation type with period $p$ and step size $\tfrac{T}{2}$.
b) From a) by the Euler summation formula.
c) By Lemma 1.
d) From a) and the binomial theorem one obtains $\langle n^r \rangle \approx \sum_{j,n} \big(\tfrac{p}{2}\big)^{r-j} (-n)^j \tfrac{r^{\underline{j}}}{j!}$, from which the result follows by the Euler summation formula. The notation $a^{\underline{b}}$ denotes the $b$-th falling power of $a$.
q.e.d.

3.3 General Case

We consider now sampling distributions such that $y_k$ $(k = 0, \dots, p-1)$ is a sequence of i.i.d. random variables with distribution $F_p(y)$ $\big(-\tfrac{1}{p} \le y \le \tfrac{1}{p}\big)$ with moments $p^{-n} m_n$.
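Before treating the general case, the alternating model of proposition 8 can be checked numerically in the same way as proposition 7 (a sketch; $p = 16$, $y = 0.7$ are arbitrary illustrative choices):

```python
import numpy as np

# Maximum anticipation (proposition 8): alternating p*y_k = (-1)^k y, p even.
p, y = 16, 0.7
k = np.arange(p)
yk = (-1.0) ** k * y / p

n = np.arange(p)
alpha = np.array([np.sum(yk * np.exp(-2j * np.pi * (m - 0.5) * k / p)) for m in n])
p_n = np.abs(alpha) ** 2

closed = y**2 / (p**2 * np.cos(np.pi * (n - 0.5) / p) ** 2)   # proposition 8a
assert np.allclose(p_n, closed)
assert np.isclose(p_n.sum(), y**2)              # p_tot = y^2
assert np.isclose(p_n[p // 2], p_n.max())       # mass sits at the far end n ~ p/2
print(np.round(p_n[p // 2], 4))
```

In contrast to proposition 7, the probability mass now peaks near $n = p/2$, i.e. half a period into the future, which is what makes this the maximum-anticipation model.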

Proposition 9
a) $E(p_n) = p^{-1} \sigma^2 + \dfrac{m_1^2}{p^2 \sin^2\big(\pi (n - \frac{1}{2}) / p\big)} = p^{-1} \sigma^2 + \begin{cases} O(p^{-2}) & n \in \Omega(p) \\ O(n^{-2}) & n \in o(p) \end{cases}$

$E(p_n)$ has maximum value $p^{-1} \sigma^2 + \tfrac{4}{\pi^2} m_1^2$ at $n = 0$ and $n = 1$, and minimum value $p^{-1} \sigma^2 + \tfrac{m_1^2}{p^2}$ at $n = \tfrac{p}{2}$.

b) $p_n = E(p_n) + O_p(p^{-1})$.
c) $E(p_{tot}) = m_2$.
d) $E(p_N) = (1 - 2 p^{-1} N)\, \sigma^2 + m_1^2 \cdot \begin{cases} 1 & N = 0 \\ 2 \pi^{-2} N^{-1} & N \in o(p),\ N \ne 0 \end{cases}$
e) $p_N = E(p_N) + O_p(p^{-1/2})$.
f) $E(\langle n^r \rangle) = \dfrac{(p/2)^r}{r+1}\, m_2 + O(p^{r-1})$.

Proof:
a) From proposition 3c) and the premises, as detailed in the appendix.
b) From proposition 5f), where $\mathrm{Var}(p_n)$ is calculated in the appendix.
c) See appendix.
d) See appendix.
e) From proposition 5f), where $\mathrm{Var}(p_N)$ is calculated in the appendix.
f) From a) and the Euler summation formula.
q.e.d.

Proposition 9 shows that anticipation is fairly strong. Whereas the expected probability is independent of the period, the standard deviation is $O(p^{-1/2})$. We summarize these findings by

Theorem 1: Anticipation (Point Spectrum)
a) When randomly sampling with mean $m_1$ and variance $\sigma^2$, then, for any $0 < \delta < 1$, measurements at time $\tfrac{T}{2}$ yield a state from the set $\{q_{nT} : N < n < p + 1 - N\}$ with probability $p_N = \sigma^2 \delta + o_p(1)$ for $N = \tfrac{p}{2}(1 - \delta)$.
b) For any $0 < \delta < \sigma^2$ and any $N$, the probability $p_N > \delta$ with certainty as $p \to \infty$.
c) The operator $n^r$ has expectation $O(p^r)$.

Proof:
a) By proposition 9b).
b) By a) and definition 5f).
c) By proposition 9f).
q.e.d.

Notice that for unbiased sampling $m_1 = 0$ and $\sigma^2 = m_2$, which by proposition 4c) is easily achievable.
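Theorem 1a) can be illustrated by Monte-Carlo simulation. The sketch below assumes unbiased sampling with a uniform base distribution (so $m_1 = 0$ and $\sigma^2 = \tfrac{1}{3}$; both the distribution and the parameters $p$, $\delta$ are illustrative choices) and checks that $p_N$ concentrates at $\sigma^2 \delta$ for $N = \tfrac{p}{2}(1 - \delta)$:

```python
import numpy as np

# Monte-Carlo check of theorem 1a: p_N ~ sigma^2 * delta for N = (p/2)(1 - delta).
rng = np.random.default_rng(2)
p, delta, trials = 200, 0.5, 400
N = int(p / 2 * (1 - delta))
k = np.arange(p)
n = np.arange(p)
phase = np.exp(-2j * np.pi * np.outer(n - 0.5, k) / p)   # proposition 3c kernel

vals = []
for _ in range(trials):
    yk = rng.uniform(-1, 1, p) / p              # i.i.d. spectral differences
    p_n = np.abs(phase @ yk) ** 2
    vals.append(p_n[N + 1 : p - N + 1].sum())   # p_N = P(N < n < p + 1 - N)
est = float(np.mean(vals))

print(est, delta / 3)
assert np.isclose(est, delta / 3, rtol=0.1)     # sigma^2 * delta = delta / 3
```

The empirical average settles near $\sigma^2 \delta$, and the spread across trials shrinks with growing $p$, consistent with the $O_p(p^{-1/2})$ fluctuation in proposition 9e).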

4 Anticipation (Continuous Spectrum)

Here we determine, for model and random states with $p = \infty$, the values and distributions of the anticipation statistics identified and defined in section 2.3, and prove theorem 2.

4.1 Minimum Anticipation

We consider here the extremal case of constant $y_\kappa$. Notice that these model states have zero occurrence probability.

Proposition 10
Let $2\pi y_\kappa = y$ $(0 \le \kappa \le 2\pi)$ for some $-1 \le y \le 1$. Then

a) $p_n = y^2 \pi^{-2} \big(n - \tfrac{1}{2}\big)^{-2}$, with maximum value $\approx \tfrac{4 y^2}{\pi^2}$ at $n = 0$ and $n = 1$.
b) $p_N \in O(N^{-1})$ $(N \to \infty)$.
c) $p_{tot} = y^2$.
d) $\langle n^r \rangle = \infty$ $(r \ge 1)$.

Proof:
a) By proposition 3d) and the premise,
$$\alpha_n = \int_0^{2\pi} y\, e^{-i (n - \frac{1}{2}) \kappa}\, \frac{d\kappa}{2\pi} = -i y\, \pi^{-1} \big(n - \tfrac{1}{2}\big)^{-1} \quad (n \text{ integer}).$$
The maxima, minima and asymptotics are elementary.
b) From a) by the Euler summation formula.
c) By a well known result, first discovered by Euler. See also [He88].
d) From a).
q.e.d.

4.2 Maximum Anticipation

We consider here the extremal case of $y_\kappa$ alternating between $y$ and $-y$ in $M$ equidistant intervals. Notice that these model states have zero occurrence probability.

Proposition 11
Let $2\pi y_\kappa = (-1)^j y$ $\big(\tfrac{2\pi j}{M} \le \kappa < \tfrac{2\pi (j+1)}{M};\ j = 0, \dots, M-1\big)$.