(1)

Small-Scale Fading II

(and basics about random processes)

PROF. MICHAEL TSAI 2014/3/31

(2)

Random processes

[Figure: several realizations x₁(t), x₂(t), x₃(t), … of a random process X(t); each curve is one realization of X(t).]

At a fixed time τ, X(τ) is a random variable with pdf $f_{X(\tau)}(x)$.

(3)

Joint CDF for a random process

• If we sample X(t) at times $t_0, \dots, t_n$, we can define a joint cdf of the samples at those times:

$$P_{X(t_0),\dots,X(t_n)}(x_0,\dots,x_n) = p\big(X(t_0)\le x_0,\, X(t_1)\le x_1,\,\dots,\, X(t_n)\le x_n\big)$$

[Figure: one realization of X(t) with the samples X(t₀), X(t₁), …, X(tₙ) marked along the time axis.]

(4)

Stationary Random Process (Strict-sense)

• A random process X(t) is stationary if for all T, all n, and all sets of sample times 𝒕𝟎, 𝒕𝟏, … , 𝒕𝒏 we have:

$$p\big(X(t_0)\le x_0,\, X(t_1)\le x_1,\,\dots,\, X(t_n)\le x_n\big) = p\big(X(t_0+T)\le x_0,\, X(t_1+T)\le x_1,\,\dots,\, X(t_n+T)\le x_n\big)$$

If time shifts do not matter, then the process is stationary.

(5)

Mean (First Moment)

[Figure: realizations x₁(t), x₂(t), x₃(t), … of X(t); averaging over all realizations at each time gives the mean E[X(t)]. At a fixed time τ, the mean is E[X(τ)].]

(6)

Autocorrelation (Second Moment)

• “How similar a random process is to a time-shifted version of itself”

• Autocorrelation of a random process is defined as:

$$A_X(t, t+\tau) \triangleq E\big[X(t)\,X(t+\tau)\big]$$

[Figure: a realization $x_i(t)$ multiplied by a realization $x_j(t+\tau)$ shifted by τ; averaging over all possible combinations of realizations gives $A_X(t, t+\tau)$ as a function of τ for a particular t.]
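A minimal sketch (not from the slides) of how this ensemble average can be estimated numerically: simulate many realizations of a toy random process (an assumed randomly scaled, randomly phased cosine) and average X(t)X(t+τ) across realizations for a fixed t.

```python
# Estimating A_X(t, t+tau) by averaging over many simulated realizations
# of a toy process X(t) = A*cos(2*pi*5*t + phi), A ~ N(0,1), phi ~ U[-pi, pi).
import numpy as np

rng = np.random.default_rng(0)
n_real, n_samp, dt = 5000, 200, 0.01          # realizations, samples, time step
t = np.arange(n_samp) * dt

A = rng.standard_normal((n_real, 1))
phi = rng.uniform(-np.pi, np.pi, (n_real, 1))
X = A * np.cos(2 * np.pi * 5 * t + phi)

# Ensemble-average estimate of A_X(t, t+tau) for a fixed t index
t_idx, max_lag = 50, 100
lags = np.arange(max_lag)
A_est = np.array([np.mean(X[:, t_idx] * X[:, t_idx + k]) for k in lags])

# For this toy process A_X(tau) = 0.5*cos(2*pi*5*tau), independent of t
print(A_est[:5])
print(0.5 * np.cos(2 * np.pi * 5 * lags[:5] * dt))
```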

(7)

For stationary random processes…

• Mean:
$$E[X(t)] = E[X(t-t)] = E[X(0)] = \mu_X$$
Constant. Does not change with t.

• Autocorrelation:
$$A_X(t, t+\tau) = E[X(t-t)\,X(t+\tau-t)] = E[X(0)\,X(\tau)] \triangleq A_X(\tau)$$

(8)

Two random processes

• Two random processes X(t) and Y(t) defined on the same underlying probability space have a joint cdf
$$P_{X(t_0),\dots,X(t_n),\,Y(t_0),\dots,Y(t_m)}(x_0,\dots,x_n,\,y_0,\dots,y_m) = p\big(X(t_0)\le x_0,\dots,X(t_n)\le x_n,\ Y(t_0)\le y_0,\dots,Y(t_m)\le y_m\big)$$
for all possible sets of sample times $t_0,\dots,t_n$ and $t_0,\dots,t_m$.
(Similar to how you can define a joint cdf for two random variables.)

• Two random processes are independent if
$$P_{X(t_0),\dots,X(t_n),\,Y(t_0),\dots,Y(t_m)}(x_0,\dots,x_n,\,y_0,\dots,y_m) = p\big(X(t_0)\le x_0,\dots,X(t_n)\le x_n\big)\; p\big(Y(t_0)\le y_0,\dots,Y(t_m)\le y_m\big)$$

(9)

Cross-correlation

• The cross-correlation between two random processes X(t) and Y(t) is defined as
$$A_{XY}(t, t+\tau) \triangleq E\big[X(t)\,Y(t+\tau)\big]$$

• Two random processes are uncorrelated if
$$E\big[X(t)\,Y(t+\tau)\big] = E\big[X(t)\big]\,E\big[Y(t+\tau)\big]$$
for all t and τ.

• If both random processes are stationary, we have
$$A_{XY}(t, t+\tau) = E\big[X(t)\,Y(t+\tau)\big] = E\big[X(0)\,Y(\tau)\big] = A_{XY}(\tau)$$

(10)

Wide-Sense Stationary (WSS)

• A process is wide-sense stationary if
$$E[X(t)] = \mu_X$$
and
$$A_X(t, t+\tau) = E[X(t)\,X(t+\tau)] = A_X(\tau)$$

• $A_X(\tau)$ has its maximum value at $\tau = 0$:
$$A_X(\tau) \le A_X(0) = E[X^2(t)]$$
A random process is always “the most similar” to the version of itself without shifting.

(11)

Ergodicity

[Figure: realizations x₁(t), x₂(t), x₃(t), … of X(t).]

• For an ergodic process, the expectation over time along one realization, $E_t[\cdot]$, is the same as the expectation over all possible realizations, $E_i[\cdot]$.

(12)

Power Spectral Density

• The power spectral density (PSD) of a WSS process is defined as the Fourier transform of its autocorrelation function with respect to τ:
$$S_X(f) = \int_{-\infty}^{\infty} A_X(\tau)\, e^{-j2\pi f\tau}\, d\tau$$

• The PSD takes its name from the fact that the expected power of a random process X(t) is the integral of its PSD:
$$E[X^2(t)] = A_X(0) = \int_{-\infty}^{\infty} S_X(f)\, df$$
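To make the power relation concrete, here is a minimal numerical sketch (an assumed example, not from the slides): for the exponential autocorrelation $A_X(\tau) = e^{-|\tau|/T}$, the PSD is the Lorentzian $2T/(1+(2\pi f T)^2)$, and integrating it over frequency recovers $A_X(0) = 1$.

```python
# Check E[X^2(t)] = A_X(0) = integral of S_X(f) for an exponential autocorrelation.
import numpy as np

T = 0.5
f = np.linspace(-200, 200, 400_001)              # frequency grid (Hz)
S = 2 * T / (1 + (2 * np.pi * f * T) ** 2)       # PSD in closed form

power = np.sum(S) * (f[1] - f[0])                # numerical integral of the PSD
print(power)                                     # ~1.0 = A_X(0) = E[X^2(t)]
```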

(13)

Gaussian random processes

• A random process X(t) is a Gaussian process if, for all values of T and all functions g(t), the random variable
$$X_g = \int_0^T g(t)\, X(t)\, dt$$
has a Gaussian distribution. ($X_g$ is a linear combination of the samples of X(t).)

• We usually use this to model the noise for a communication receiver.

• Mean & variance:
$$E[X_g] = \int_0^T g(t)\, E[X(t)]\, dt$$
$$\mathrm{Var}(X_g) = \int_0^T\!\!\int_0^T g(t)\, g(s)\, E[X(t)X(s)]\, dt\, ds \;-\; E^2[X_g]$$
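A minimal Monte Carlo sketch (assumed discretization and toy process, not from the slides) of these formulas: approximate $X_g$ by a Riemann sum over a discretized zero-mean Gaussian process and compare the empirical variance with the double-integral (here double-sum) formula.

```python
import numpy as np

rng = np.random.default_rng(1)
n_real, n_samp, dt = 20_000, 100, 0.01

# Discrete zero-mean Gaussian process: moving average of white Gaussian noise
w = rng.standard_normal((n_real, n_samp + 4))
X = (w[:, :-4] + w[:, 1:-3] + w[:, 2:-2] + w[:, 3:-1] + w[:, 4:]) / np.sqrt(5)

g = np.ones(n_samp)                       # g(t) = 1 on [0, T]
Xg = (X * g).sum(axis=1) * dt             # Riemann-sum approximation of X_g

R = (X.T @ X) / n_real                    # empirical E[X(t) X(s)] matrix

print(Xg.mean())                          # ~0: E[X_g] = ∫ g(t) E[X(t)] dt = 0 here
print(Xg.var())                           # empirical Var(X_g)
print(dt ** 2 * (g @ R @ g))              # double-sum version of ∫∫ g(t)g(s)E[X(t)X(s)] dt ds
                                          # (E^2[X_g] ~ 0 here, so the two values agree)
```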

(14)

Gaussian random processes

• Samples of a random process, $X(t_i)$, $i = 0,\dots,n$, are jointly Gaussian random variables, if we let $g(t) = \delta(t - t_i)$:
$$X_g = \int_0^T g(t)\, X(t)\, dt$$
$$X_g = \int_0^T \delta(t - t_i)\, X(t)\, dt = X(t_i)$$

(15)

Recap: 2 important aspects (of channel time variation)


(16)

Example: white noise

• White noise is a zero-mean WSS random process with a PSD that is constant over all frequencies:
$$E[X(t)] = 0, \qquad S_X(f) = \frac{N_0}{2}\ \text{ for some constant } N_0$$

• $N_0$ is often called the one-sided white noise PSD.

• By the inverse Fourier transform, the autocorrelation can be obtained:
$$A_X(\tau) = \frac{N_0}{2}\,\delta(\tau)$$

White noise is not correlated with any shifted version of itself.
(Not similar at all after ANY time shift.)
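A minimal sketch (assumed discrete-time example) of this property: samples of white Gaussian noise with per-sample variance $N_0/2$ are uncorrelated at every nonzero lag, the discrete-time counterpart of $A_X(\tau) = (N_0/2)\,\delta(\tau)$.

```python
import numpy as np

rng = np.random.default_rng(2)
N0 = 2.0
x = rng.normal(0.0, np.sqrt(N0 / 2), 100_000)    # variance N0/2 per sample

lags = range(5)
A = [np.mean(x[: len(x) - k] * x[k:]) for k in lags]
print(A)   # ~[1.0, 0, 0, 0, 0]: uncorrelated with any nonzero shift
```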

(17)

[Figure: channel impulse response $h_b(t, \tau)$ observed at times $t_0, t_1, t_2, t_3$, illustrating the two main aspects of the wireless channel.]

(18)

Doppler Effect

• Difference in path lengths: $\Delta l = d\cos\theta = v\,\Delta t\,\cos\theta$

• Phase change: $\Delta\phi = \dfrac{2\pi\,\Delta l}{\lambda} = \dfrac{2\pi v\,\Delta t}{\lambda}\cos\theta$

• Frequency change, or Doppler shift:
$$f_d = \frac{1}{2\pi}\,\frac{\Delta\phi}{\Delta t} = \frac{v}{\lambda}\cos\theta$$

(19)

Example

• Consider a transmitter which radiates a sinusoidal carrier frequency of 1850 MHz. For a vehicle moving 60 mph, compute the received carrier frequency if the mobile is moving

1. directly toward the transmitter.

2. directly away from the transmitter

3. in a direction which is perpendicular to the direction of arrival of the transmitted signal.

• Ans:
Wavelength: $\lambda = \dfrac{c}{f_c} = \dfrac{3\times 10^8}{1850\times 10^6} = 0.162\ \mathrm{m}$

Vehicle speed: $v = 60\ \mathrm{mph} = 26.82\ \mathrm{m/s}$

1. $f_d = \dfrac{26.82}{0.162}\cos 0 \approx 165\ \mathrm{Hz}$
2. $f_d = \dfrac{26.82}{0.162}\cos\pi \approx -165\ \mathrm{Hz}$
3. Since $\cos\dfrac{\pi}{2} = 0$, there is no Doppler shift!

$$f_d = \frac{1}{2\pi}\,\frac{\Delta\phi}{\Delta t} = \frac{v}{\lambda}\cos\theta$$
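A minimal sketch verifying the numbers above straight from $f_d = (v/\lambda)\cos\theta$ (the constants below are just the example's parameters):

```python
import numpy as np

c = 3e8                      # speed of light (m/s)
fc = 1850e6                  # carrier frequency (Hz)
lam = c / fc                 # wavelength ~0.162 m
v = 60 * 0.44704             # 60 mph in m/s (~26.82 m/s)

for name, theta in [("toward", 0.0), ("away", np.pi), ("perpendicular", np.pi / 2)]:
    fd = (v / lam) * np.cos(theta)          # Doppler shift, ~+/-165 Hz or 0
    print(f"{name}: f_d = {fd:+.1f} Hz, received carrier = {fc + fd:.1f} Hz")
```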

(20)

Doppler Effect

• If the car (mobile) is moving toward the direction of the arriving wave, the Doppler shift is positive.

• Different Doppler shifts for different θ (incoming angle).

• Multi-path: all different angles.

• Many Doppler shifts → Doppler spread

(21)

Narrow-band Fading Model

• Sending an unmodulated carrier wave with random phase offset $\phi_0$:
$$s(t) = \mathrm{Re}\{\exp\!\big(j(2\pi f_c t + \phi_0)\big)\} = \cos(2\pi f_c t + \phi_0)$$

• Received signal becomes
$$r(t) = \mathrm{Re}\left\{\left[\sum_{n=1}^{N(t)} \alpha_n(t)\, e^{-j\phi_n(t)}\right] e^{j2\pi f_c t}\right\} = r_I(t)\cos(2\pi f_c t) - r_Q(t)\sin(2\pi f_c t)$$
(the term in brackets is the sum of many MPCs; the outer exponential is the carrier with frequency $f_c$)

(22)

$$r(t) = \mathrm{Re}\left\{\left[\sum_{n=1}^{N(t)} \alpha_n(t)\, e^{-j\phi_n(t)}\right] e^{j2\pi f_c t}\right\} = r_I(t)\cos(2\pi f_c t) - r_Q(t)\sin(2\pi f_c t)$$

$$r_I(t) = \sum_{n=1}^{N(t)} \alpha_n(t)\cos\phi_n(t), \qquad r_Q(t) = \sum_{n=1}^{N(t)} \alpha_n(t)\sin\phi_n(t)$$

$$\phi_n(t) = \underbrace{2\pi f_c \tau_n(t)}_{\text{phase shift due to delay}} - \underbrace{\phi_{D_n}}_{\text{Doppler shift}} - \underbrace{\phi_0}_{\text{carrier phase shift (same for all MPCs)}}$$

Since N(t) is large and we assume $\alpha_n(t)$ and $\phi_n(t)$ are independent for different MPCs, we can approximate $r_I(t)$ and $r_Q(t)$ as jointly Gaussian random processes.
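A minimal simulation sketch (assumed path-gain model and parameters, not from the slides) of this Gaussian approximation: draw many realizations of the sums $r_I$ and $r_Q$ with uniform phases and check that they are zero-mean, have equal variance, and are uncorrelated; a histogram of either one is approximately bell-shaped.

```python
import numpy as np

rng = np.random.default_rng(3)
n_real, N = 50_000, 50                        # realizations, MPCs per realization

alpha = rng.rayleigh(scale=1.0, size=(n_real, N)) / np.sqrt(N)   # toy path gains
phi = rng.uniform(-np.pi, np.pi, size=(n_real, N))               # uniform phases

rI = np.sum(alpha * np.cos(phi), axis=1)
rQ = np.sum(alpha * np.sin(phi), axis=1)

print(rI.mean(), rQ.mean())                   # ~0, ~0 (zero mean)
print(np.corrcoef(rI, rQ)[0, 1])              # ~0 (uncorrelated)
print(rI.std(), rQ.std())                     # ~equal (both are sigma)
```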

(23)

Some assumptions

• No dominant LOS component

• $\alpha_n(t)$, $f_{D_n}(t)$, and $\tau_n(t)$ change slowly over time

• $2\pi f_c \tau_n$ changes rapidly relative to all other phase terms

• $\phi_n(t)$ uniformly distributed on $[-\pi, \pi]$

• $\alpha_n$ and $\phi_n$ are independent of each other

$$\phi_n(t) = \underbrace{2\pi f_c \tau_n(t)}_{\text{very large}} - \phi_{D_n} - \phi_0$$

(24)

Zero-mean

$$E[r_I(t)] = E\left[\sum_n \alpha_n \cos\phi_n(t)\right] = \sum_n E[\alpha_n]\, E[\cos\phi_n(t)] = 0$$

• Similarly, $E[r_Q(t)] = 0$.

• So E[r(t)] = 0, and it is a zero-mean Gaussian process.

• If there is a dominant LOS component, then this is no longer true.

(25)

Un-correlated

$$E[r_I(t)\, r_Q(t)] = E\left[\sum_n \alpha_n\cos\phi_n(t)\ \sum_m \alpha_m\sin\phi_m(t)\right]$$
$$= \sum_n\sum_m E[\alpha_n\alpha_m]\, E[\cos\phi_n(t)\sin\phi_m(t)] \qquad (\alpha_n \text{ and } \phi_n \text{ are not correlated})$$
$$= \sum_{\substack{n,m\\ n\ne m}} E[\alpha_n]\,E[\alpha_m]\,E[\cos\phi_n(t)]\,E[\sin\phi_m(t)] \;+\; \sum_n E[\alpha_n^2]\, E[\cos\phi_n(t)\sin\phi_n(t)]$$
(different MPCs' $\alpha_n$ and $\phi_n$ are independent, and $\phi_n$ is uniformly distributed over $[-\pi, \pi]$, so the first sum is 0)
$$= \sum_n E[\alpha_n^2]\, E[\cos\phi_n(t)\sin\phi_n(t)] = \sum_n E[\alpha_n^2]\, \frac{E[\sin 2\phi_n(t)]}{2} = 0$$

$r_I(t)$ and $r_Q(t)$ are uncorrelated, and they are Gaussian processes → they are independent.

(26)

Autocorrelation

$$A_{r_I}(t, t+\tau) = E[r_I(t)\, r_I(t+\tau)] = \sum_n E[\alpha_n^2]\, E[\cos\phi_n(t)\cos\phi_n(t+\tau)]$$

With $\phi_n(t) = 2\pi f_c\tau_n - 2\pi f_{D_n} t - \phi_0$ and $\phi_n(t+\tau) = 2\pi f_c\tau_n - 2\pi f_{D_n}(t+\tau) - \phi_0$:
$$E[\cos\phi_n(t)\cos\phi_n(t+\tau)] = E\big[\tfrac12\cos(\phi_n(t+\tau)-\phi_n(t)) + \tfrac12\cos(\phi_n(t+\tau)+\phi_n(t))\big]$$
$$= \tfrac12 E[\cos(2\pi f_{D_n}\tau)] + \underbrace{\tfrac12 E[\cos(4\pi f_c\tau_n - 4\pi f_{D_n}t - 2\pi f_{D_n}\tau - 2\phi_0)]}_{\text{large and uniformly distributed over } [-\pi,\pi]\ \Rightarrow\ 0}$$

$$A_{r_I}(t, t+\tau) = \tfrac12\sum_n E[\alpha_n^2]\, E[\cos(2\pi f_{D_n}\tau)] = \tfrac12\sum_n E[\alpha_n^2]\, E\!\left[\cos\!\frac{2\pi v\tau\cos\theta_n}{\lambda}\right]$$

Only depends on τ, so WSS!
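A minimal numerical sketch (assumed parameters, not from the slides) of the last expression: for equal-power MPCs with total power normalized to 1 and arrival angles $\theta_n$ drawn uniformly (an assumption), the expectation over $\theta$ is approximated by averaging over many random angles.

```python
import numpy as np

rng = np.random.default_rng(4)
v, lam = 26.82, 0.162                       # speed (m/s) and wavelength (m), as in the example
theta = rng.uniform(-np.pi, np.pi, 10_000)  # assumed uniform arrival angles
tau = np.linspace(0, 0.02, 200)             # time lags (s)

# A_rI(tau) = 0.5 * E[cos(2*pi*v*tau*cos(theta)/lambda)] with sum_n E[alpha_n^2] = 1
A = 0.5 * np.mean(np.cos(2 * np.pi * v * tau[:, None] * np.cos(theta) / lam), axis=1)

print(A[0])     # 0.5 = A_rI(0): half of the (unit) received power sits in the I branch
print(A[-1])    # decays and oscillates as tau grows: decorrelation over time
```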

(27)

Autocorrelation

• Finally,
$$A_{r_I, r_Q}(t, t+\tau) = A_{r_I, r_Q}(\tau) = E[r_I(t)\, r_Q(t+\tau)] = -\tfrac12\sum_n E[\alpha_n^2]\, E\!\left[\sin\!\frac{2\pi v\tau\cos\theta_n}{\lambda}\right] = -E[r_Q(t)\, r_I(t+\tau)]$$

$$r(t) = r_I(t)\cos(2\pi f_c t) - r_Q(t)\sin(2\pi f_c t)$$
(the received signal, representing how the channel changes over time)

$$A_r(\tau) = E[r(t)\, r(t+\tau)] = A_{r_I}(\tau)\cos(2\pi f_c\tau) + A_{r_I, r_Q}(\tau)\sin(2\pi f_c\tau)$$
Also only depends on τ, so WSS!

(28)

Amplitude distribution - Rayleigh

• $z(t) = |r(t)| = \sqrt{r_I^2(t) + r_Q^2(t)}$

• $r_I(t)$ and $r_Q(t)$ are both zero-mean Gaussian random processes (so at a given time, two Gaussian random variables).

• z(t)'s distribution, i.e., the amplitude distribution of r(t):
$$p_Z(z) = \frac{2z}{P_r}\exp\!\left(-\frac{z^2}{P_r}\right) = \frac{z}{\sigma^2}\exp\!\left(-\frac{z^2}{2\sigma^2}\right), \qquad z \ge 0$$

This is the famous Rayleigh distribution!
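A minimal sketch (assumed example, not from the slides) confirming this numerically: the magnitude of two independent zero-mean Gaussians follows the Rayleigh CDF $1 - \exp(-z^2/2\sigma^2)$.

```python
import numpy as np

rng = np.random.default_rng(5)
sigma = 1.0
rI = rng.normal(0, sigma, 200_000)
rQ = rng.normal(0, sigma, 200_000)
z = np.hypot(rI, rQ)                                # amplitude sqrt(rI^2 + rQ^2)

# Compare the empirical CDF with the Rayleigh CDF 1 - exp(-z^2 / (2*sigma^2))
for zq in (0.5, 1.0, 2.0):
    emp = np.mean(z <= zq)
    theory = 1 - np.exp(-zq**2 / (2 * sigma**2))
    print(zq, emp, theory)                          # empirical ~ theoretical
```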

(29)

2-variable joint Gaussian distribution

• PDF for 2-variable joint Gaussian distribution:

• 𝝆: X and Y’s correlation (in our case, 0)

• 𝝁𝑿 and 𝝁𝒀: X and Y’s mean

• 𝝈𝑿𝟐 and 𝝈𝒀𝟐: X and Y’s variance (in our case, both are 𝝈𝟐)

$$f(X,Y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\exp\!\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(X-\mu_X)^2}{\sigma_X^2} - \frac{2\rho(X-\mu_X)(Y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(Y-\mu_Y)^2}{\sigma_Y^2}\right]\right)$$

With $\rho = 0$, $\mu_X = \mu_Y = 0$, and $\sigma_X^2 = \sigma_Y^2 = \sigma^2$ (our case), this reduces to
$$f(X,Y) = \frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{1}{2}\,\frac{X^2+Y^2}{\sigma^2}\right)$$

The rest of the derivation can be found here:
http://www.dsplog.com/2008/07/17/derive-pdf-rayleigh-random-variable/

(30)

Power distribution: Rayleigh

• We can obtain the power distribution by making the change of variables $z^2(t) = |r(t)|^2$ to obtain
$$p_{Z^2}(x) = \frac{1}{P_r}\exp\!\left(-\frac{x}{P_r}\right) = \frac{1}{2\sigma^2}\exp\!\left(-\frac{x}{2\sigma^2}\right), \qquad x \ge 0$$

(31)

Example: Rayleigh fading

• Consider a channel with Rayleigh fading (no LOS!) and average received power 𝑷𝒓 = 𝟐𝟎 dBm. Find the probability that the received power is below 10 dBm.

• We want to find the probability that 𝒁𝟐 < 𝟏𝟎 𝒅𝑩𝒎 = 𝟏𝟎 𝒎𝑾.

$$p(Z^2 < 10) = \int_0^{10} \frac{1}{100}\exp\!\left(-\frac{x}{100}\right) dx = 0.095$$
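A minimal sketch double-checking this value, both from the closed-form exponential CDF and by Monte Carlo (powers in mW):

```python
import numpy as np

Pr = 100.0                                  # average received power: 20 dBm = 100 mW
threshold = 10.0                            # target: 10 dBm = 10 mW

# Closed form: P(Z^2 < x) = 1 - exp(-x / Pr)
print(1 - np.exp(-threshold / Pr))          # ~0.095

# Monte Carlo: exponentially distributed received power with mean Pr
rng = np.random.default_rng(6)
power = rng.exponential(scale=Pr, size=1_000_000)
print(np.mean(power < threshold))           # ~0.095
```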

(32)

With a LOS component – Ricean (or Rician)

• If the channel has a fixed LOS component then 𝒓𝑰(𝒕) and 𝒓𝑸(𝒕) are no longer zero-mean variables.

• The received signal becomes the superposition of a complex Gaussian component and a LOS component.


Rayleigh: lots of random nLOS components

Ricean: Rayleigh + one strong static component

(LOS or strong reflection nLOS)

(33)

Ricean distribution

• Amplitude distribution:
$$p_Z(z) = \frac{z}{\sigma^2}\exp\!\left(-\frac{z^2+s^2}{2\sigma^2}\right) I_0\!\left(\frac{z s}{\sigma^2}\right), \qquad z \ge 0$$

• $2\sigma^2 = \sum_{n,\, n\ne 0} E[\alpha_n^2]$ is the average power in the nLOS MPCs.

• $s^2 = \alpha_0^2$ is the power in the dominant strong component.

• $I_0(x)$: the modified Bessel function of zeroth order.

(34)

Ricean distribution

• The average power in Ricean fading is
$$P_r = \int_0^\infty z^2\, p_Z(z)\, dz = s^2 + 2\sigma^2$$

• The Ricean distribution is often described in terms of a fading parameter K, defined by
$$K = \frac{s^2}{2\sigma^2}$$

• K is the ratio of the power in the dominant component to the power in the other random MPCs.
  • K = 0: the Ricean distribution degenerates to Rayleigh.
  • K = ∞: the channel becomes a non-fading LOS channel.
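A minimal sketch (assumed example) relating K, s, and σ, using scipy.stats.rice; with scipy's parameterization, shape $b = s/\sigma$ and scale $= \sigma$ reproduce the pdf above.

```python
import numpy as np
from scipy import stats

K = 4.0                        # ratio of dominant-component power to scattered power
sigma = 1.0                    # scattered (nLOS) power is 2*sigma^2
s = np.sqrt(2 * K * sigma**2)  # dominant amplitude from K = s^2 / (2*sigma^2)

z = stats.rice(b=s / sigma, scale=sigma).rvs(size=500_000, random_state=7)

print(np.mean(z**2))           # ~ s^2 + 2*sigma^2 = 10.0 (the average power P_r)
print(s**2 + 2 * sigma**2)
```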

(35)

Ricean and Rayleigh


(36)

Example: Intra-car Wireless Channel Measurements

Which distribution fits the empirical amplitude distribution function the best?
Candidates: Lognormal, Nakagami, Rician, Rayleigh, and Weibull.

[Figure: empirical amplitude CDFs and fitted distributions for two links with the car parked: engine compartment → center, and engine compartment → under the engine.]

Parked: Weibull fits best.

(37)

[Figure: empirical amplitude CDFs and fitted distributions for two links while driving: engine compartment → trunk, and under the engine → engine compartment.]

Driving: Rician/Nakagami fit best.

(38)

Channel model

• No Line-of-Sight component → Rayleigh
• With a Line-of-Sight component → Rician

(39)

Coherence Time

• Coherence time is a statistical measure of the range of time over which the channel can be considered “static”.

• 90% coherence time:
$$T_{c,0.9} = \arg\min_\tau \left\{\tau : \frac{A_r(\tau)}{A_r(0)} < 0.9 \right\}$$
The first time interval at which the normalized autocorrelation drops below the threshold.

• We can define the 50% coherence time in a similar way too.
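A minimal sketch (assuming uniform arrival angles and the earlier example's v and λ, not values from the slides) of computing a 90% coherence time numerically: evaluate the normalized autocorrelation on a grid of lags and take the first lag where it drops below 0.9.

```python
import numpy as np

rng = np.random.default_rng(8)
v, lam = 26.82, 0.162
theta = rng.uniform(-np.pi, np.pi, 10_000)       # assumed uniform arrival angles
tau = np.linspace(0, 0.01, 10_000)               # candidate lags (s)

A = np.mean(np.cos(2 * np.pi * v * tau[:, None] * np.cos(theta) / lam), axis=1)
A_norm = A / A[0]                                # normalized autocorrelation

Tc_90 = tau[np.argmax(A_norm < 0.9)]             # first lag below the 0.9 threshold
print(Tc_90)                                     # ~0.6 ms for these assumed parameters
```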

(40)

[Figure: normalized correlation vs. time spacing (10⁻³ to 10⁴ seconds) for the IE/UE channel (labeled the worst channel), Scenario 1 (Parked) and Scenario 2 (Driving). The 50% correlation points fall at roughly 2.5 s and roughly 60 s.]

Coherence time is always on the order of seconds.

(41)

Fast and slow fading channel

[Figure: the autocorrelation $A_r(\tau)$ and its width, the coherence time $T_c$, compared with the symbol period $T_s$ in the time domain, and the Doppler spread $B_D$ compared with the signal bandwidth $B_s$ in the frequency domain.]

• $T_s$: symbol period; $B_s$: signal bandwidth; $B_D$: Doppler spread
• Slow fading: $T_c \gg T_s$ (equivalently $B_D \ll B_s$)
• Fast fading: $T_s > T_c$ (equivalently $B_D > B_s$)
