
6.2 Detailed Derivations for Three Terms in (6.24)

6.2.3 Third Term

Recalling the definition of $E_k$ in (6.25), we lower-bound the fourth term on the RHS of (6.24) by the chain of steps (6.89)–(6.96).

Here, (6.90) follows because the first and the last term in (6.89) are greater than or equal to zero and because $H_b(E_k \mid Y_{k-\kappa}^{k-1}, G_k = 0) \le H_b(E_k) = H_b(\beta_k)$; in (6.92), we introduce $\{\Theta_k\}$, which is IID $\sim \mathcal{U}((-\pi, \pi])$ and independent of $Y_k$. Because $\{\Theta_k\}$ is uniformly distributed, it destroys the phase of $\{H_k\}$ and makes $\{H_k e^{i\Theta_k}\}$ circularly symmetric. (6.94) follows because we drop $e^{i\Theta_k}$ on both sides of the mutual information.
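The role of the uniform phase can be checked empirically. The following sketch (a minimal illustration of ours, not part of the proof; all variable names are hypothetical) rotates a non-circularly-symmetric fading sample by an independent $\Theta \sim \mathcal{U}((-\pi,\pi])$ and verifies that the result behaves circularly symmetrically:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# A deliberately non-circularly-symmetric fading variable: complex
# Gaussian with a nonzero mean (a Rician-style line-of-sight component).
H = 1.0 + (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# Multiply by an independent uniform phase Theta ~ U((-pi, pi]).
Theta = rng.uniform(-np.pi, np.pi, n)
H_rot = H * np.exp(1j * Theta)

def phase_counts(z: np.ndarray) -> np.ndarray:
    counts, _ = np.histogram(np.angle(z), bins=8, range=(-np.pi, np.pi))
    return counts

# Before rotation the phase is concentrated around 0; after rotation it
# is (empirically) uniform and decorrelated from the magnitude -- the
# hallmarks of circular symmetry.
print("phase histogram of H:    ", phase_counts(H))
print("phase histogram of H_rot:", phase_counts(H_rot))
print("corr(|H_rot|, phase):",
      np.corrcoef(np.abs(H_rot), np.angle(H_rot))[0, 1])
\end{verbatim}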

By Appendix C.1, we have
\[
  \beta_k I\Bigl( H_k |X_k| e^{i\phi_k} + Z_k ;\, Z_{k-\kappa}^{k-1} \Bigm| \bigl\{ H_l |X_l| e^{i\phi_l} + Z_l \bigr\}_{l=k-\kappa}^{k-1},\, E_k = 1,\, G_k = 0 \Bigr)
  \le \delta_1(\kappa, \xi_{\min}) + H_b(\beta_k),
\]
so we further bound (6.96) via the chain of inequalities (6.98)–(6.100).


Here, in (6.100), we drop $Z_{k-\kappa}^{k-1}$, which can only make the mutual information smaller.

By Appendix C.2, we have
\[
  \beta_k I\Bigl( \bigl\{ H_l |X_l| e^{i\phi_l} \bigr\}_{l=k-\kappa}^{k-1} ;\, Z_k \Bigm| H_k |X_k| e^{i\phi_k} + Z_k,\, E_k = 1,\, G_k = 0 \Bigr)
  \le \delta_2(\kappa, \xi_{\min}) + H_b(\beta_k),
\]
and we further bound (6.101) through the chain of steps ending in (6.110).


Here, (6.105) follows from taking the magnitude of $H_k |X_k| e^{i\phi_k}$; (6.106) follows because we drop some terms in the mutual information; (6.107) follows from the definition of differential entropy for unit vectors (see Section 3.1.2); and (6.110) follows because dropping conditioning increases entropy.
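Two of these steps are instances of generic information inequalities, which we state here for completeness (in generic notation; the specific arguments are as in (6.106) and (6.110)):
\[
  I(A;B) \le I(A;B,C), \qquad h(X \mid Y) \le h(X),
\]
both following from the nonnegativity of (conditional) mutual information: $I(A;C \mid B) \ge 0$ and $I(X;Y) = h(X) - h(X \mid Y) \ge 0$.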

Chapter 7

Discussion and Conclusion

In this thesis, we have shown that the asymptotic capacity of general regular SIMO fading channels with memory remains unchanged even if one allows causal noiseless feedback.

This once again shows the extremely unattractive behavior of regular fading channels at high SNR: besides the double-logarithmic capacity growth [8] and the very poor performance in a multiple-user setup (where the maximum sum-rate can only be achieved if all users apart from one remain switched off at all times [16]), we now see that no type of feedback increases capacity, in spite of the memory in the channel.
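To put the double-logarithmic growth into perspective, the following purely illustrative sketch (toy numbers of ours, not results from this thesis) compares logarithmic with double-logarithmic capacity growth over a range of SNRs:

\begin{verbatim}
import numpy as np

snr_db = np.array([0, 20, 40, 60, 80, 100])
snr = 10 ** (snr_db / 10)

# Coherent-style capacity growth: log2(1 + SNR) bits per channel use.
c_log = np.log2(1 + snr)

# Regular-fading high-SNR behavior: capacity grows only double-
# logarithmically (plus a constant, the fading number); we evaluate the
# double-log term alone, which dominates asymptotically.
c_loglog = np.log2(1 + np.log(1 + snr))

for db, c1, c2 in zip(snr_db, c_log, c_loglog):
    print(f"SNR = {db:3d} dB: log-growth {c1:7.2f} bits, "
          f"double-log {c2:5.2f} bits")
\end{verbatim}

Between 0 dB and 100 dB the logarithmic curve gains roughly 32 bits, while the double-logarithmic one gains barely 4.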

Possible future work on general regular fading channels with memory and feedback includes the following:

• Considering the multiple-input single-output case, i.e., several mobile phones (each with one antenna) communicating with one base station (also with only one antenna). The difficulty here lies in the fact that we not only need to optimize the phase and magnitude of the inputs, but also their direction.

• Considering the case with multiple-input multiple-output.

• The situation where both transmitter and receiver have access to causal partial side-information $S_k$ about the fading, where by partial we mean that
\[
  \lim_{n\to\infty} \frac{1}{n} I\bigl(S_1^n; H_1^n\bigr) < \infty. \tag{7.1}
\]
(A concrete example follows this list.)
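For instance (an example of ours, not from the thesis), if $S_k$ is a $B$-bit quantization of $H_k$ made available to both terminals, then
\[
  \frac{1}{n} I\bigl(S_1^n; H_1^n\bigr) \le \frac{1}{n} H\bigl(S_1^n\bigr) \le B \text{ bits},
\]
so condition (7.1) is satisfied.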

Appendix A

Upper Bound (6.46)

In this appendix, we derive the upper bound (6.46) on
\[
  I\bigl(X_k, H_1^{k-1}; Y_k \bigm| E_k = 0, G_k = 0\bigr)
  - h_\lambda\bigl(\hat{H}_k e^{i\phi_k} \bigm| \hat{H}_l e^{i\phi_l}, E_k = 0, G_k = 0\bigr)
  - n_{\mathrm{R}}\, \mathsf{E}\bigl[\log \|H_k\|^2 \bigm| E_k = 0, G_k = 0\bigr].
\]
We bound the first term by the chain of inequalities (A.1)–(A.7), where in (A.7), $C_{\text{IID}}(\cdot)$ denotes the capacity without feedback or memory for a given power.

Because $C_{\text{IID}}(\cdot)$ is nondecreasing and, conditional on $E_k = 0$, we have $|X_k| \le \xi_{\min}$, the first term is bounded by $C_{\text{IID}}(\xi_{\min}^2 \mid G_k = 0)$.
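Written out (in simplified notation, assuming the standard definition of the memoryless capacity under a peak constraint; the exact conditioning follows the thesis), this step reads
\[
  I\bigl(X_k; Y_k \bigm| E_k = 0, G_k = 0\bigr)
  \le \sup_{Q_X :\, |X| \le \xi_{\min}} I(X; Y \mid G_k = 0)
  = C_{\text{IID}}\bigl(\xi_{\min}^2 \bigm| G_k = 0\bigr),
\]
where the feasible set of input laws grows with the peak value $\xi$, so $C_{\text{IID}}(\cdot)$ is nondecreasing.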


Here, (A.9) follows because conditioning reduces entropy; and (A.10) follows because, conditional on $H_1^{k-1}$, $H_k$ is independent of $E_k$.

We first have the inequality (A.11). Since both $h^+(\cdot)$ and $h^-(\cdot)$ are nonnegative (see Section 3.1.1), we further bound the first term in (A.11) as in (A.12)–(A.15). Here, (A.14) follows from Lemma A.12 in [10, Appendix A.4.2]; and (A.15) follows from a bound on the remaining expectation term.


Choosing $\xi = 1$, we then get from (A.15) and (A.11)
\[
  \mathsf{E}\bigl[\log \|H_k\|^2 \bigm| E_k = 0, G_k = 0\bigr]
  \ge h\bigl(H_k \bigm| E_k = 0, G_k = 0\bigr)
  - \frac{n_{\mathrm{R}} + 1}{e}
  - n_{\mathrm{R}} \log^{+}\!\left( \frac{\pi e}{n_{\mathrm{R}}} \cdot \frac{\mathsf{E}\bigl[\|H_0\|^2 \bigm| G_0 = 0\bigr]}{1 - \beta_k} \right)
  - \Delta(n_{\mathrm{R}}, 1). \tag{A.18}
\]

We put this back into (A.10) and get
\begin{align}
  & I\bigl(X_k, H_1^{k-1}; Y_k \bigm| E_k = 0, G_k = 0\bigr)
    - h_\lambda\bigl(\hat{H}_k e^{i\phi_k} \bigm| \hat{H}_l e^{i\phi_l}, E_k = 0, G_k = 0\bigr)
    - n_{\mathrm{R}}\, \mathsf{E}\bigl[\log \|H_k\|^2 \bigm| E_k = 0, G_k = 0\bigr] \notag\\
  &\le C_{\text{IID}}\bigl(\xi_{\min}^2 \bigm| G_k = 0\bigr)
    + h\bigl(H_k \bigm| E_k = 0, G_k = 0\bigr)
    - h_\lambda\bigl(\hat{H}_k e^{i\phi_k} \bigm| H_1^{k-1}, G_k = 0\bigr)
    - h\bigl(H_k \bigm| H_1^{k-1}, G_k = 0\bigr) \notag\\
  &\qquad - n_{\mathrm{R}}\, h\bigl(H_k \bigm| E_k = 0, G_k = 0\bigr)
    + \frac{n_{\mathrm{R}}(n_{\mathrm{R}} + 1)}{e}
    + n_{\mathrm{R}}^2 \log^{+}\!\left( \frac{\pi e}{n_{\mathrm{R}}} \cdot \frac{\mathsf{E}\bigl[\|H_0\|^2 \bigm| G_0 = 0\bigr]}{1 - \beta_k} \right)
    + n_{\mathrm{R}}\, \Delta(n_{\mathrm{R}}, 1) \tag{A.19}\\
  &\le C_{\text{IID}}\bigl(\xi_{\min}^2 \bigm| G_0 = 0\bigr)
    - (n_{\mathrm{R}} - 1)\, h\bigl(H_k \bigm| E_k = 0, G_k = 0\bigr)
    - h_\lambda\bigl(\hat{H}_0 e^{i\phi_0} \bigm| H_{-\infty}^{-1}, G_0 = 0\bigr)
    - h\bigl(H_k \bigm| H_1^{k-1}, G_k = 0\bigr) \notag\\
  &\qquad + \frac{n_{\mathrm{R}}(n_{\mathrm{R}} + 1)}{e}
    + n_{\mathrm{R}}^2 \log^{+}\!\left( \frac{\pi e}{n_{\mathrm{R}}} \cdot \frac{\mathsf{E}\bigl[\|H_0\|^2 \bigm| G_0 = 0\bigr]}{1 - \beta_k} \right)
    + n_{\mathrm{R}}\, \Delta(n_{\mathrm{R}}, 1) \tag{A.20}\\
  &\le C_{\text{IID}}\bigl(\xi_{\min}^2 \bigm| G_0 = 0\bigr)
    - (n_{\mathrm{R}} - 1)\, h\bigl(H_0 \bigm| H_{-\infty}^{-1}, G_0 = 0\bigr)
    - h_\lambda\bigl(\hat{H}_0 e^{i\phi_0} \bigm| H_{-\infty}^{-1}, G_0 = 0\bigr)
    - h\bigl(H_0 \bigm| H_{1-k}^{-1}, G_0 = 0\bigr) \notag\\
  &\qquad + \frac{n_{\mathrm{R}}(n_{\mathrm{R}} + 1)}{e}
    + n_{\mathrm{R}}^2 \log^{+}\!\left( \frac{\pi e}{n_{\mathrm{R}}} \cdot \frac{\mathsf{E}\bigl[\|H_0\|^2 \bigm| G_0 = 0\bigr]}{1 - \beta_k} \right)
    + n_{\mathrm{R}}\, \Delta(n_{\mathrm{R}}, 1), \tag{A.21}
\end{align}

where (A.20) follows because we shift the time index in $h_\lambda(\cdot)$ by $k$ using the stationarity of $\{H_k\}$ and then add more terms to its conditioning; we also shift $G_k$ to $G_0$ in $C_{\text{IID}}(\cdot)$, since the channel law underlying $C_{\text{IID}}(\cdot)$ is time-invariant.

(A.21) follows because $\{H_k\}$ is a stationary process and $h(H_k \mid H_1^{k-1})$ is nonincreasing in $k$; therefore, we have
\begin{align}
  h(H_k \mid E_k = 0, G_k = 0)
  &\ge h(H_k \mid H_1^{k-1}, E_k = 0, G_k = 0) \tag{A.22}\\
  &= h(H_k \mid H_1^{k-1}, G_k = 0) \tag{A.23}\\
  &\ge h(H_k \mid H_{-\infty}^{k-1}, G_k = 0) \tag{A.24}\\
  &= h(H_0 \mid H_{-\infty}^{-1}, G_0 = 0). \tag{A.25}
\end{align}
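The monotonicity of $h(H_k \mid H_1^{k-1})$ invoked in (A.22)–(A.25) can be made concrete with a toy stationary Gaussian fading model (our illustration, not the thesis's general process), where the conditional entropy is determined by the one-step prediction-error variance:

\begin{verbatim}
import numpy as np

# Toy stationary fading model (ours, purely illustrative): a real,
# zero-mean stationary Gaussian process with autocovariance
# r(m) = 0.9^|m| * cos(0.3 m), evaluated for 12 consecutive samples.
m = np.arange(12)
r = 0.9 ** m * np.cos(0.3 * m)
lags = np.abs(np.subtract.outer(m, m))
T = r[lags]  # Toeplitz covariance matrix of (H_1, ..., H_12)

# For a Gaussian vector, h(H_k | H_1^{k-1}) = 0.5 * log(2 pi e s2_k),
# where s2_k = det(T_k) / det(T_{k-1}) is the one-step prediction-error
# variance given k-1 past samples.
prev_det = 1.0
for k in range(1, len(m) + 1):
    det_k = np.linalg.det(T[:k, :k])
    var_pred = det_k / prev_det
    prev_det = det_k
    h = 0.5 * np.log(2 * np.pi * np.e * var_pred)
    print(f"k = {k:2d}: h(H_k | past {k - 1} samples) = {h:8.4f} nats")
# The printed entropies are nonincreasing in k, mirroring (A.22)-(A.25).
\end{verbatim}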

Appendix B

Upper Bound (6.81)

In this appendix, we derive the following upper bound:

\begin{align}
  & I\bigl(X_k; \|H_k\| |X_k| e^{i\phi_k} \bigm| E_k = 1, G_k = 0\bigr)
    + I\bigl(X_k; \hat{H}_k e^{i(\Phi_k + \phi_k)} \bigm| \|H_k\| |X_k|, e^{i\phi_k}, E_k = 1, G_k = 0\bigr) \tag{B.1}\\
  &\le -\log 2 - h\bigl(H_k \bigm| X_k, E_k = 1, G_k = 0\bigr)
    + (2 n_{\mathrm{R}} - 1)\, \mathsf{E}\bigl[\log \|H_k\| \bigm| E_k = 1, G_k = 0\bigr]
    - \mathsf{E}\bigl[\log \|H_k\| \bigm| E_k = 1, G_k = 0\bigr] \notag\\
  &\qquad + \mu \log \eta + \log \Gamma\!\left(\mu, \frac{\nu}{\eta}\right)
    + (1 - \mu)\, \mathsf{E}\bigl[\log \|H_k\|^2 \bigm| E_k = 1, G_k = 0\bigr]
    - \mu\, \mathsf{E}\bigl[\log |X_k|^2 \bigm| E_k = 1, G_k = 0\bigr] + \epsilon_{\nu,k} \notag\\
  &\qquad + \frac{1}{\eta}\, \mathsf{E}\bigl[\|H_k\|^2 |X_k|^2 \bigm| E_k = 1, G_k = 0\bigr]
    + \frac{\nu}{\eta}
    + h_\lambda\bigl(\hat{H}_k e^{i\phi_k} \bigm| E_k = 1, G_k = 0\bigr), \tag{B.2}
\end{align}
using a similar approach as in [9, Appendix D].

First, we apply Lemma 11 in [9] to the first term in (B.1), i.e., we choose $S = X_k$ and $T = \|H_k\| |X_k| e^{i\phi_k}$. Note that we need to condition everything on the events $E_k = 1$ and $G_k = 0$.

\begin{align}
  & I\bigl(X_k; \|H_k\| |X_k| e^{i\phi_k} \bigm| E_k = 1, G_k = 0\bigr) \notag\\
  &\le -h\bigl(\|H_k\| |X_k| e^{i\phi_k} \bigm| X_k, E_k = 1, G_k = 0\bigr)
    + \log \pi + \mu \log \eta + \log \Gamma\!\left(\mu, \frac{\nu}{\eta}\right) \notag\\
  &\qquad + (1 - \mu)\, \mathsf{E}\bigl[\log\bigl(\|H_k\|^2 |X_k|^2 + \nu\bigr) \bigm| E_k = 1, G_k = 0\bigr]
    + \frac{1}{\eta}\, \mathsf{E}\bigl[\|H_k\|^2 |X_k|^2 \bigm| E_k = 1, G_k = 0\bigr]
    + \frac{\nu}{\eta}, \tag{B.3}
\end{align}

where $\mu, \eta > 0$ and $\nu \ge 0$ can be chosen freely.
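Since $\mu$, $\eta$, and $\nu$ are free, they can be tuned numerically. The sketch below (a toy stand-in of ours: samples of $\|H_k\|^2 |X_k|^2$ drawn from an Exponential(1) law, which is not the thesis's channel) evaluates the parameter-dependent terms of a (B.3)-style bound on a small grid and picks the minimum:

\begin{verbatim}
import numpy as np
from scipy.special import gammaln, gammaincc

rng = np.random.default_rng(2)
t2 = rng.exponential(1.0, 100_000)  # stand-in for ||H_k||^2 |X_k|^2

def duality_terms(mu: float, eta: float, nu: float) -> float:
    """Parameter-dependent terms of a (B.3)-style duality upper bound."""
    # log of the upper incomplete gamma function Gamma(mu, nu/eta)
    log_gamma_upper = gammaln(mu) + np.log(gammaincc(mu, nu / eta))
    return (mu * np.log(eta) + log_gamma_upper
            + (1 - mu) * np.mean(np.log(t2 + nu))
            + np.mean(t2) / eta + nu / eta)

best = min(
    (duality_terms(mu, eta, nu), mu, eta, nu)
    for mu in (0.25, 0.5, 0.75)
    for eta in (0.5, 1.0, 2.0, 4.0)
    for nu in (0.0, 0.5, 1.0)
)
print("min value %.4f at (mu, eta, nu) = (%.2f, %.2f, %.2f)" % best)
\end{verbatim}

Any grid point yields a valid bound; the optimization merely tightens it.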

Note that from a conditional version of Lemma 2 in [9] with $m = 1$ it follows that
\begin{align}
  & h\bigl(\|H_k\| |X_k| e^{i\phi_k} \bigm| X_k = x_k, E_k = 1, G_k = 0\bigr) \notag\\
  &= h\bigl(\Theta_k \bigm| X_k = x_k, E_k = 1, G_k = 0\bigr)
    + h\bigl(\|H_k\| |X_k| \bigm| e^{i\phi_k}, X_k = x_k, E_k = 1, G_k = 0\bigr) \notag\\
  &\qquad + \mathsf{E}\bigl[\log \|H_k\| |X_k| \bigm| X_k = x_k, E_k = 1, G_k = 0\bigr] \tag{B.4}\\
  &= \log 2\pi + h\bigl(\|H_k\| |X_k| \bigm| X_k = x_k, E_k = 1, G_k = 0\bigr)
    + \mathsf{E}\bigl[\log \|H_k\| |X_k| \bigm| X_k = x_k, E_k = 1, G_k = 0\bigr], \tag{B.5}
\end{align}


where we have used that $\Theta_k$ is independent of all other random quantities and uniformly distributed on the unit circle. Taking the expectation $\mathsf{E}_{X_k}[\cdot]$ conditional on $E_k = 1$, $G_k = 0$, and using the law of total expectation, we obtain the chain (B.6)–(B.9), where (B.8) follows from the scaling property of differential entropy with a real argument.
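As a sanity check of this magnitude–phase decomposition (a worked example of ours, not part of the derivation), let $T = R e^{i\Theta}$ be circularly symmetric Gaussian with $\mathsf{E}[|T|^2] = 2s^2$, so that $R$ is Rayleigh with scale $s$ and $\Theta$ an independent uniform phase. Using $h(\Theta) = \log 2\pi$, $h(R \mid \Theta) = h(R) = 1 + \log(s/\sqrt{2}) + \gamma/2$, and $\mathsf{E}[\log R] = \bigl(\log(2s^2) - \gamma\bigr)/2$, with $\gamma$ denoting Euler's constant, the decomposition gives
\[
  h(T) = \log 2\pi + 1 + \log\frac{s}{\sqrt{2}} + \frac{\gamma}{2}
       + \frac{\log(2s^2) - \gamma}{2}
       = \log\bigl(2\pi e\, s^2\bigr)
       = \log\bigl(\pi e\, \mathsf{E}[|T|^2]\bigr),
\]
exactly the differential entropy of a circularly symmetric Gaussian.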

We choose $0 < \mu < 1$ (recall that $\mu$ is a free parameter) such that $1 - \mu > 0$. This leads to the chain of bounds ending in (B.16); plugging (B.16) into (B.3) yields the bound (B.18) on the first term in (B.1).

Next, we continue with the second term in (B.1), which we bound analogously, arriving at (B.21).

Hence, using (B.21) and (B.18), we get the following upper bound for (B.1):
\begin{align}
  & I\bigl(X_k; \|H_k\| |X_k| e^{i\phi_k} \bigm| E_k = 1, G_k = 0\bigr)
    + I\bigl(X_k; \hat{H}_k e^{i(\Phi_k + \phi_k)} \bigm| \|H_k\| |X_k|, e^{i\phi_k}, E_k = 1, G_k = 0\bigr) \notag\\
  &\le -\log 2 - h\bigl(H_k \bigm| X_k, E_k = 1, G_k = 0\bigr)
    + (2 n_{\mathrm{R}} - 1)\, \mathsf{E}\bigl[\log \|H_k\| \bigm| E_k = 1, G_k = 0\bigr]
    - \mathsf{E}\bigl[\log \|H_k\| \bigm| E_k = 1, G_k = 0\bigr] \notag\\
  &\qquad + \mu \log \eta + \log \Gamma\!\left(\mu, \frac{\nu}{\eta}\right)
    + (1 - \mu)\, \mathsf{E}\bigl[\log \|H_k\|^2 \bigm| E_k = 1, G_k = 0\bigr]
    - \mu\, \mathsf{E}\bigl[\log |X_k|^2 \bigm| E_k = 1\bigr] + \epsilon_{\nu,k} \notag\\
  &\qquad + \frac{1}{\eta}\, \mathsf{E}\bigl[\|H_k\|^2 |X_k|^2 \bigm| E_k = 1\bigr]
    + \frac{\nu}{\eta}
    + h_\lambda\bigl(\hat{H}_k e^{i\phi_k} \bigm| E_k = 1, G_k = 0\bigr). \tag{B.23}
\end{align}
Here, (B.23) follows from a conditional version of Lemma 2 in [9], similar to (B.4)–(B.9), which allows us to combine the second and the last term in (B.22).

Appendix C

Upper Bounds (6.97) and (6.102)

In this appendix, we derive the upper bounds (6.97) and (6.102).

C.1 δ1(κ, ξmin)

We first derive the upper bound (6.97):
\[
  \beta_k I\Bigl( H_k |X_k| e^{i\phi_k} + Z_k ;\, Z_{k-\kappa}^{k-1} \Bigm| \bigl\{ H_l |X_l| e^{i\phi_l} + Z_l \bigr\}_{l=k-\kappa}^{k-1},\, E_k = 1,\, G_k = 0 \Bigr)
  \le \delta_1(\kappa, \xi_{\min}) + H_b(\beta_k). \tag{C.1}
\]
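Before going through the derivation, we record the generic identity behind the step of dropping $Z_k$ below (stated in our notation): if $Z_k$ is independent of $(Z_{k-\kappa}^{k-1}, W)$, then
\[
  h\bigl(Z_{k-\kappa}^{k-1} \bigm| W, Z_k\bigr) = h\bigl(Z_{k-\kappa}^{k-1} \bigm| W\bigr),
\]
because $I(Z_{k-\kappa}^{k-1}; Z_k \mid W) = 0$ under this independence.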

We start as follows:
\begin{align}
  & \beta_k I\Bigl( H_k |X_k| e^{i\phi_k} + Z_k ;\, Z_{k-\kappa}^{k-1} \Bigm| \bigl\{ H_l |X_l| e^{i\phi_l} + Z_l \bigr\}_{l=k-\kappa}^{k-1},\, E_k = 1,\, G_k = 0 \Bigr) \notag\\
  &= \beta_k h\Bigl( Z_{k-\kappa}^{k-1} \Bigm| \bigl\{ H_l |X_l| e^{i\phi_l} + Z_l \bigr\}_{l=k-\kappa}^{k-1},\, E_k = 1,\, G_k = 0 \Bigr)
   - \beta_k h\Bigl( Z_{k-\kappa}^{k-1} \Bigm| \bigl\{ H_l |X_l| e^{i\phi_l} + Z_l \bigr\}_{l=k-\kappa}^{k},\, E_k = 1,\, G_k = 0 \Bigr) \tag{C.2}\\
  &\le \beta_k h\bigl( Z_{k-\kappa}^{k-1} \bigm| E_k = 1, G_k = 0 \bigr)
   - \beta_k h\Bigl( Z_{k-\kappa}^{k-1} \Bigm| \bigl\{ H_l |X_l| e^{i\phi_l} + Z_l \bigr\}_{l=k-\kappa}^{k},\, \{|X_l|\}_{l=k-\kappa}^{k},\, Z_k,\, E_k = 1,\, G_k = 0 \Bigr) \tag{C.3}\\
  &= \beta_k h\bigl( Z_{k-\kappa}^{k-1} \bigm| E_k = 1, G_k = 0 \bigr)
   - \beta_k h\left( Z_{k-\kappa}^{k-1} \,\middle|\, \left\{ H_l e^{i\phi_l} + \frac{Z_l}{|X_l|} \right\}_{l=k-\kappa}^{k-1},\, H_k e^{i\phi_k},\, \{|X_l|\}_{l=k-\kappa}^{k},\, E_k = 1,\, G_k = 0 \right). \tag{C.4}
\end{align}
Here, (C.3) follows because conditioning reduces entropy. The reason why we do not drop $E_k$ is the factor $\beta_k$ in front of the mutual information: if we dropped $E_k$ now, we would not be able to get rid of $\beta_k$ later. In (C.4) we drop $Z_k$, since the $\{Z_k\}$ are IID. In order to get rid of the dependence on the input, we take an infimum: the difference

\[
  \beta_k h\bigl( Z_{k-\kappa}^{k-1} \bigm| E_k = 1, G_k = 0 \bigr)
  - \beta_k h\left( Z_{k-\kappa}^{k-1} \,\middle|\, \left\{ H_l e^{i\phi_l} + \frac{Z_l}{|X_l|} \right\}_{l=k-\kappa}^{k-1},\, H_k e^{i\phi_k},\, \{|X_l|\}_{l=k-\kappa}^{k},\, E_k = 1,\, G_k = 0 \right)
\]
is bounded from above by replacing each $|X_l|$ with the infimum of the admissible input magnitudes: the smaller $|X_l|$ is, the more $Z_l$ can reflect in $H_l e^{i\phi_l} + Z_l/|X_l|$, and thus the smaller the entropy of $Z_{k-\kappa}^{k-1}$ would be. From this stage, the dependence on the input inside the mutual information is gone (except for $E_k$), but we still have $\beta_k$ in front of the mutual information; therefore, we add $1 - \beta_k$ to get rid of $\beta_k$.

The resulting chain of bounds, ending in (C.13), uses in its final steps that $\{H_k\}$ and $\{Z_k\}$ are stationary processes.

If $\xi_{\min}$ goes to infinity, the terms $Z_l/\xi_{\min}$ vanish and the conditional entropy converges to the unconditional entropy of $Z_{k-\kappa}^{k-1}$, since $\{H_l e^{i\phi_l}\}$ is independent of the noise.

C.2 δ2(κ, ξmin)

Next, we derive the upper bound (6.102):
\[
  \beta_k I\Bigl( \bigl\{ H_l |X_l| e^{i\phi_l} \bigr\}_{l=k-\kappa}^{k-1} ;\, Z_k \Bigm| H_k |X_k| e^{i\phi_k} + Z_k,\, E_k = 1,\, G_k = 0 \Bigr)
  \le \delta_2(\kappa, \xi_{\min}) + H_b(\beta_k). \tag{C.14}
\]

The derivation is similar to (C.2)–(C.13).

The first steps, (C.15)–(C.16), parallel (C.2)–(C.3). Here, (C.16) follows because conditioning reduces entropy, and for the same reason as in Section C.1 we keep $E_k = 1$. In order to get rid of the dependence on the input, we take an infimum, arriving at (C.19).


Here, (C.19) follows because the smaller $\gamma_k$ is, the more $Z_k$ can reflect in $H_k e^{i\phi_k} + Z_k/\gamma_k$, and thus the smaller the entropy of $Z_k$ would be. Next, we want to get rid of $\beta_k$.

This yields a chain of bounds ending in (C.26): one step uses that the $\{Z_k\}$ are independent of each other, and (C.26) follows because $\{H_k\}$ and $\{Z_k\}$ are stationary processes.

If $\xi_{\min}$ goes to infinity, the entropy in question converges to $h\bigl(\{H_l e^{i\phi_l}\}_{l=-\kappa}^{-1} \bigm| \cdots\bigr)$, in analogy to Section C.1.

Appendix D

Causal Interpretations for Independence

Using the causal-interpretation graphs in Figures D.3–D.5, we prove the following independence claims, used in (6.21), (6.73), and (6.84):

• $(M, Y_1^{k-1}) \perp\!\!\!\perp Y_k$ when conditioned on $(X_k, H_1^{k-1})$;

• $X_k \perp\!\!\!\perp Y_k$ when conditioned on $H_k X_k$ (sanity-checked numerically after this list);

• $H_1^{k-1} \perp\!\!\!\perp Y_k$ when conditioned on $(X_k, H_k)$.
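These graph-based claims can also be sanity-checked by simulation. The sketch below (a toy linear-Gaussian instance of $Y_k = H_k X_k + Z_k$; all names and distributions are ours) tests the second claim: within narrow bins of the conditioning variable $S = H_k X_k$, the empirical correlation between $X_k$ and $Y_k$ essentially vanishes, although unconditionally the two are clearly correlated:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# Toy scalar instance of Y_k = H_k X_k + Z_k; X stands in for a
# feedback-dependent input -- only its nondegenerate distribution
# matters for checking the k-th relation.
H = rng.standard_normal(n) + 0.5           # fading
X = np.tanh(rng.standard_normal(n)) + 1.0  # stand-in input
Z = rng.standard_normal(n)                 # additive noise
S = H * X                                  # conditioning variable H_k X_k
Y = S + Z

print("unconditional corr(X, Y):", np.corrcoef(X, Y)[0, 1])

# Inside each narrow quantile bin of S, Y = S + Z with Z independent of
# X, so the conditional correlation should be near zero.
edges = np.quantile(S, np.linspace(0, 1, 51))
idx = np.digitize(S, edges[1:-1])
cond_corrs = []
for b in range(50):
    mask = idx == b
    if mask.sum() > 1000:
        cond_corrs.append(np.corrcoef(X[mask], Y[mask])[0, 1])
print("mean |corr(X, Y)| within bins of S:", np.mean(np.abs(cond_corrs)))
\end{verbatim}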

Figure D.3: The relevant subgraph of V showing the independence of $(M, Y_1^{k-1})$ and $Y_k$ when conditioned on $(X_k, H_1^{k-1})$.


Figure D.4: The relevant subgraph of V showing the independence of $X_k$ and $Y_k$ when conditioned on $H_k X_k$.

Figure D.5: The relevant subgraph of V showing the independence of $H_1^{k-1}$ and $Y_k$ when conditioned on $(X_k, H_k)$.

Bibliography

[1] İ. E. Telatar, “Capacity of multi-antenna Gaussian channels,” European Transactions on Telecommunications, vol. 10, no. 6, pp. 585–595, November–December 1999.

[2] G. Durisi, U. G. Schuster, H. Bölcskei, and S. Shamai (Shitz), “Noncoherent capacity of underspread fading channels,” IEEE Transactions on Information Theory, vol. 56, no. 1, pp. 367–395, January 2010.

[3] T. L. Marzetta and B. M. Hochwald, “Capacity of a mobile multiple-antenna communication link in Rayleigh flat fading,” IEEE Transactions on Information Theory, vol. 45, no. 1, pp. 139–157, January 1999.

[4] L. Zheng and D. N. C. Tse, “Communicating on the Grassmann manifold: a geometric approach to the noncoherent multiple-antenna channel,” IEEE Transactions on Information Theory, vol. 48, no. 2, pp. 359–383, February 2002.

[5] Y. Liang and V. V. Veeravalli, “Capacity of noncoherent time-selective Rayleigh-fading channels,” IEEE Transactions on Information Theory, vol. 50, no. 12, pp. 3095–3110, December 2004.

[6] T. Koch and A. Lapidoth, “Degrees of freedom in non-coherent stationary MIMO fading channels,” in Proceedings Winter School on Coding and Information Theory, Bratislava, Slovakia, February 20–25, 2005, pp. 91–97.

[7] ——, “The fading number and degrees of freedom in non-coherent MIMO fading channels: a peace pipe,” in Proceedings IEEE International Symposium on Information Theory (ISIT), Adelaide, Australia, September 4–9, 2005, pp. 661–665.

[8] A. Lapidoth and S. M. Moser, “Capacity bounds via duality with applications to multiple-antenna systems on flat fading channels,” IEEE Transactions on Information Theory, vol. 49, no. 10, pp. 2426–2467, October 2003.

[9] S. M. Moser, “The fading number of multiple-input multiple-output fading channels with memory,” IEEE Transactions on Information Theory, vol. 55, no. 6, pp. 2716–2755, June 2009.


[10] ——, “Duality-based bounds on channel capacity,” Ph.D. dissertation, ETH Zurich, October 2004, Diss. ETH No. 15769. [Online]. Available: http://moser.cm.nctu.edu.tw/publications.html

[11] R. G. Gallager, Low-Density Parity-Check Codes. Cambridge, MA: MIT Press, 1963.

[12] J. L. Massey, “Causal interpretations of random variables,” Problemy Peredachi Informatsii (Problems of Information Transmission), vol. 32, no. 1, pp. 131–136, January–March 1996.

[13] ——, “Correction to ‘Causal interpretations of random variables’,” Problemy Peredachi Informatsii (Problems of Information Transmission), May 16, 1997.

[14] R. L. Dobrušin, “General formulation of Shannon’s main theorem in information theory,” in American Mathematical Society Translations, ser. 2, vol. 33, pp. 323–438, 1963.

[15] A. Lapidoth and S. M. Moser, “The fading number of single-input multiple-output fading channels with memory,” IEEE Transactions on Information Theory, vol. 52, no. 2, pp. 437–453, February 2006.

[16] G.-R. Lin and S. M. Moser, “The fading number of a multiple-access Rician fading channel,” IEEE Transactions on Information Theory, vol. 57, no. 8, pp. 4983–4991, August 2011.
