## QM: G1.3 Probability and Expectation


### QM: G1.3 Probability and Expectation

In the discrete case:

Let j label the samples and N(j) the population of objects found in sample j. There is a trivial case where the population of objects per sample is just 1, so N(j) = 1 for all j that have objects and N(j) = 0 for all j without objects. The formalism is a little cumbersome, as formalisms usually are.

The total number of objects for all samples:
$N=\sum_{j=0 }^{\infty}N(j)$

$\left \langle j \right \rangle = \frac{\sum jN(j)}{N} = \sum_{j=0}^{\infty }jP(j)$

For the non-trivial case, the probability of drawing value j is
$P(j)=\frac{N(j)}{N}$
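These discrete formulas are easy to check numerically. Here is a quick sketch; the histogram `N_of_j` below is hypothetical, chosen only for illustration:

```python
# Sketch: discrete expectation <j> = sum_j j P(j), with P(j) = N(j)/N.
# The histogram N_of_j is hypothetical, not taken from the post above.
N_of_j = {14: 1, 15: 1, 16: 3, 22: 2, 24: 2, 25: 5}  # value j -> count N(j)

N = sum(N_of_j.values())                                  # N = sum_j N(j)
P = {j: n / N for j, n in N_of_j.items()}                 # P(j) = N(j) / N
expectation = sum(j * n for j, n in N_of_j.items()) / N   # <j> = sum_j j P(j)

print(N, expectation)
```

The probabilities automatically sum to 1, which is the discrete version of the normalization constraint used later in the continuous case.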

But it gets a little unwieldy in the continuous case. So I'll work through it a bit.

Suppose we have a clock that runs from 0 to 5 seconds. What is the expected value over all possible time measurements in that time frame? Well, a discrete approximation is just taking the time readings from start to finish at 1-second intervals:

j = 0, 1, 2, 3, 4, 5 and N(j) = 1 for all j

Using the above formula (0+1+2+3+4+5) * (1/6) = 15/6 = 2.5
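The same arithmetic in a few lines of Python, as a sanity check (a sketch, assuming equal weights N(j) = 1 for every reading):

```python
# Discrete approximation of the clock: readings at 1 s intervals,
# each with N(j) = 1, so P(j) = 1/6 for every reading.
readings = [0, 1, 2, 3, 4, 5]
P = 1 / len(readings)                      # uniform probability 1/6
expected = sum(j * P for j in readings)    # sum_j j P(j)
print(expected)
```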

Before going into the formalism, let's first try this in the continuous case, where we run the clock from 0 to 5 seconds. What is the average time? (Only in this special uniform case is the simple average also the expectation value.) I will use Psi notation to drive the point home a bit. I begin with the Born normalization constraint (which is really true of all probability densities):

$\int_{0}^{5}\left | \Psi(t) \right |^2dt=\int_{0}^{5}P(t)dt=1$

Now, by inspection and assumption, the probability density P(t) is independent of t, so we can, thankfully, solve this integral:

$\left | \Psi(t) \right |^2\int_{0}^{5}dt=P(t)\int_{0}^{5}dt=1$

$\left | \Psi \right |^2\,t\,\Big|_{t=0}^{t=5}=P\,t\,\Big|_{t=0}^{t=5}=1$

Which implies

$\left | \Psi \right |^2=P = \frac{1}{5}$ for all t.

So now we can state the definition of expectation in the continuous case and then solve an example:

$1=\int_{-\infty }^{+\infty }\rho (x)dx$

$\left \langle x \right \rangle=\int_{-\infty }^{+\infty }x\rho (x)dx$

$\left \langle f(x) \right \rangle=\int_{-\infty }^{+\infty }f(x)\rho (x)dx$

$\delta^2\equiv \left \langle \left ( \Delta x \right )^2 \right \rangle=\left \langle x^2 \right \rangle-\left \langle x \right \rangle^2$

So what is the expected value of t as we run the clock from 0 to 5 seconds?

$\left \langle t \right \rangle=\int_{-\infty }^{+\infty }t\,\rho (t)\,dt=\int_{0}^{5}t\left ( \frac{1}{5} \right )dt=\left ( \frac{1}{5} \right )\left ( \frac{1}{2} \right )t^2\,\Big|_{t=0}^{t=5}=2.5$
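These integrals can also be checked numerically. The sketch below uses a simple midpoint rule; the helper `integrate` is my own, not from the post:

```python
# Numerical check of the continuous uniform case rho(t) = 1/5 on [0, 5]:
# normalization should give 1, <t> should give 2.5, and the variance
# <t^2> - <t>^2 should give 25/12.
def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

rho = lambda t: 1 / 5                                 # uniform density
norm = integrate(rho, 0, 5)                           # normalization, ~1
mean = integrate(lambda t: t * rho(t), 0, 5)          # <t>, ~2.5
second = integrate(lambda t: t**2 * rho(t), 0, 5)     # <t^2>, ~25/3
variance = second - mean**2                           # ~25/12

print(norm, mean, variance)
```

The variance 25/12 is the standard result for a uniform density of width 5, i.e. width squared over 12.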

So bear in mind, even for trivial problems the formal rigor can feel like rigor mortis!
Last edited by stcordova on Mon Feb 12, 2018 1:39 pm, edited 2 times in total.
stcordova

Posts: 447
Joined: Wed Mar 05, 2014 1:41 am

### Re: QM: G1.3 Probability and Expectation

A useful little theorem on variances:
$\delta ^2=\left \langle \left ( \Delta j \right )^2 \right \rangle=\sum \left ( \Delta j \right )^2P\left ( j \right )=\sum \left ( j-\left \langle j \right \rangle \right )^2P\left ( j \right )$

$=\sum \left ( j^2-2j\left \langle j \right \rangle +\left \langle j \right \rangle ^2 \right )P\left ( j \right )$

$=\sum j^2P\left ( j \right )-2\left \langle j\right \rangle\sum jP\left ( j \right )+\left \langle j \right \rangle^2\sum P\left ( j \right )$

$=\left \langle j^2 \right \rangle-2\left \langle j \right \rangle \left \langle j \right \rangle+\left \langle j \right \rangle^2=\left \langle j^2 \right \rangle-\left \langle j \right \rangle^2$
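The identity is easy to confirm numerically on a discrete distribution; the P(j) values below are made up purely for illustration:

```python
# Verify <(Δj)^2> = <j^2> - <j>^2 on a hypothetical discrete distribution.
P = {0: 0.1, 1: 0.2, 2: 0.4, 3: 0.3}    # hypothetical P(j), sums to 1

mean = sum(j * p for j, p in P.items())                       # <j>
direct = sum((j - mean) ** 2 * p for j, p in P.items())       # sum (j - <j>)^2 P(j)
shortcut = sum(j ** 2 * p for j, p in P.items()) - mean ** 2  # <j^2> - <j>^2

print(mean, direct, shortcut)
```

The two computations agree, which is the whole content of the theorem: the shortcut form avoids computing the mean first inside the sum.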

### Re: QM: G1.3 Probability and Expectation

The continuous case theorem is the direct analogue:

$\delta ^2=\left \langle \left ( \Delta x \right )^2 \right \rangle=\int_{-\infty }^{+\infty }\left ( x-\left \langle x \right \rangle \right )^2\rho (x)\,dx=\left \langle x^2 \right \rangle-\left \langle x \right \rangle^2$

[Attachment: slide_49.jpg, a slide showing the continuous-case variance theorem]

### Re: QM: G1.3 Probability and Expectation

Incidentally:

$\left \langle j^2 \right \rangle=\sum_{j=0}^{\infty }j^2P\left ( j \right )$