Accelerated Cross-Reference Maximum Likelihood Estimates for PET Image Reconstruction

Project number: NSC 88-2213-E-002-016
Duration: August 1, 1998 to July 31, 1999
Principal Investigator: Chung-Ming Chen, Institute of Biomedical Engineering, College of Medicine, National Taiwan University

1. Chinese Abstract (translated)

Positron emission tomography (PET) is an imaging method that provides the distribution, within the human body, of chemicals labeled with positron-emitting radioisotopes. Unlike CT and MRI, which provide anatomical data, PET reveals functional information on in vivo physiology and metabolism. Clinically, early diagnosis before morphological changes appear can be achieved by studying physiological or metabolic disorders in PET images; PET has therefore become one of the most important imaging tools in modern diagnosis. In PET, the intensity of metabolic activity is observed indirectly through detectors placed outside the body, and reconstructing the actual image from these indirect observations is a typical statistical inverse problem. Because such problems are ill posed, PET images reconstructed without regularization exhibit noise and edge artifacts. This is an inherent limitation of PET that cannot be removed by improved instrument design. To obtain better reconstructions, we therefore need to draw on expert knowledge or on related information provided by other tomographic systems, such as X-ray CT and MRI scanners.

Correlated boundary information can provide useful cues. However, because anatomical structures do not coincide exactly with actual metabolic activity, the boundary information may be incomplete or incorrect, so careful cross-referencing is essential. We consider PET in the presence of accidental coincidence events and attenuation, study cross-reference maximum likelihood reconstruction, and solve it with a modified EM algorithm. In particular, we investigate fast image reconstruction algorithms comprising both sequential and parallel processing steps. In this project, we use an IBM SP2 and a network of workstations as the development platform for the parallel algorithms. The goal is to exploit correlated but incomplete boundary information to find fast, efficient, and practical methods for reconstructing PET images on one or more computers. These methods can improve reconstructed PET images and can also serve to integrate different tomographic systems into a complete expert system.

Keywords: positron emission tomography (PET), statistical inverse problems, maximum likelihood estimate, EM algorithm, regularization, parallel algorithms.

Abstract

Positron emission tomography (PET) is an imaging modality giving the distribution of chemicals labeled with positron-emitting isotopes in the human body. Unlike X-ray CT and MRI, which provide anatomical data, PET reveals functional information on in vivo physiology and metabolism of the human body. Clinically, early detection of a disease, before it is morphologically distinguishable, may be achieved through PET by studying physiological or metabolic disorders. Hence, PET has become one of the most important imaging tools in modern diagnosis. The intensity of metabolic activity is observed indirectly through scintillation detectors outside the body, and reconstructing the target image from these indirect observations is a typical statistical inverse problem. Due to the inherent ill-posedness of statistical inverse problems, PET images reconstructed without regularization will have noise and edge artifacts. This is a limitation of PET that cannot be resolved by improving instrument design. To obtain better reconstructed images, it is necessary to borrow strength from related information provided by expertise or by other tomography systems, such as X-ray CT, MRI, and so forth.

Correlated boundary information may help reduce the noise and edge artifacts. However, the boundary information may be incomplete or incorrect, since anatomical boundaries differ from functional ones. Thus, cross-referencing is important to make use of the boundary information wisely. In this project, we study cross-reference reconstruction methods for the maximum likelihood estimate with an adapted EM algorithm for PET in the presence of accidental coincidence (AC) events and attenuation. In particular, fast reconstruction algorithms for both sequential and parallel approaches are investigated, which is very important for practical use of the proposed PET reconstruction algorithms. In this project, we use a cluster of computers as the platform for the parallel reconstruction algorithms. The aim is to find fast, efficient, and reliable approaches that can reconstruct PET images from related but incomplete boundary information on single or multiple computers. The proposed approaches will not only improve the quality of the reconstructed PET images but also establish a bridge to an expert system spanning various tomography systems.

Keywords: positron emission tomography (PET), statistical inverse problems, maximum likelihood estimator, EM algorithm, regularization, parallel algorithms.

2. Background and Aims

The reconstruction of a target image from indirect observations is a typical statistical inverse problem. Due to the inherent ill-posedness of statistical inverse problems, positron emission tomography (PET) images reconstructed without regularization will have noise and edge artifacts. On the other hand, correlated boundary information may help reduce these artifacts. However, the boundary information may be incomplete or incorrect, since anatomical boundaries differ from functional ones. Thus, cross-referencing is important to make use of the boundary information wisely. In this project, we study cross-reference reconstruction methods for the maximum likelihood estimate with an adapted EM algorithm for PET in the presence of accidental coincidence (AC) events and attenuation. In particular, fast reconstruction algorithms for both sequential and parallel approaches are investigated, which is very important for practical use of the proposed PET reconstruction algorithms. The aim is to find fast, efficient, and reliable approaches that can reconstruct PET images from related but incomplete boundary information on single or multiple computers. The proposed approaches will not only improve the quality of the reconstructed PET images but also establish a bridge to an expert system spanning various tomography systems.

3. Materials and Methods

The maximum likelihood estimate (MLE) with the expectation maximization (EM) algorithm has been investigated for reconstruction of positron emission tomography (PET) images in the literature [1-2]. Two cross-reference-based approaches have been proposed in our recent studies. One is the cross-reference weighted least square estimate (CRWLSE) with the algebraic reconstruction technique [3]; AC events and attenuation were included in this model, and the distribution of differences between the prompt and delay windows is approximated by a normal distribution. The other is the cross-reference MLE (CRMLE) with a modified EM algorithm for PET without AC events and attenuation [4].
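The baseline MLE-EM reconstruction of [1-2] reduces to the familiar multiplicative MLEM update. The following NumPy sketch shows that classical update only, without AC events, attenuation, or the cross-reference penalty; the toy system matrix and function names are illustrative, not the project's implementation.

```python
import numpy as np

def mlem(A, y, n_iter=500, eps=1e-12):
    """Classical MLEM update for emission tomography (Shepp & Vardi [1]).

    A : (n_bins, n_pixels) system matrix, a_ij = P(detection in bin i | emission in j)
    y : (n_bins,) measured coincidence counts
    """
    lam = np.ones(A.shape[1])          # nonnegative initial image
    sens = A.sum(axis=0)               # sensitivity image, sum_i a_ij
    for _ in range(n_iter):
        proj = A @ lam                 # forward projection
        ratio = y / np.maximum(proj, eps)
        # multiplicative update: preserves nonnegativity by construction
        lam = lam / np.maximum(sens, eps) * (A.T @ ratio)
    return lam
```

Each iteration costs one forward and one backprojection, which is the linear-complexity, monotone-likelihood behavior the report refers to; the slow convergence of exactly this loop is what motivates the accelerated variants studied below.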

Based on these studies, this project investigates the following problems.

1. What is the MLE for the exact model of PET with AC events and attenuation?
2. How can we incorporate the correlated but incomplete boundary information?
3. What is the proper numerical algorithm for finding the solution?
4. How can we speed up the convergence rate of the cross-reference method?
5. How can the penalty parameter be chosen more quickly?
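To make question 2 concrete without reproducing the project's CRMLE formulation, one generic way to fold boundary information into an EM-type update is Green's one-step-late (OSL) penalized EM. The sketch below is only that generic illustration: the weight matrix `w` is a hypothetical stand-in for the cross-reference idea, with the coupling switched off across pixel pairs an anatomical boundary map separates, so incomplete or incorrect boundaries influence the estimate only softly.

```python
import numpy as np

def osl_em_step(lam, A, y, beta, w, eps=1e-12):
    """One one-step-late (OSL) penalized EM step in the style of Green (1990).

    NOTE: illustrative only, not the project's CRMLE. w is symmetric;
    w[j, k] > 0 encourages smoothness between pixels j and k, and
    w[j, k] = 0 disables the coupling across a (hypothetical) boundary.
    Penalty: U(lam) = 0.25 * sum_{j,k} w[j,k] * (lam[j] - lam[k])**2.
    """
    sens = A.sum(axis=0)                                     # sum_i a_ij
    grad = (w * (lam[:, None] - lam[None, :])).sum(axis=1)   # dU/dlam_j
    proj = A @ lam                                           # forward projection
    back = A.T @ (y / np.maximum(proj, eps))                 # backprojected ratio
    # penalty gradient enters the denominator "one step late"
    return lam * back / np.maximum(sens + beta * grad, eps)
```

With `beta = 0` this reduces exactly to the plain EM step, so the penalty parameter interpolates between unregularized MLEM and a boundary-aware estimate; choosing `beta` from the data is precisely question 5.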

4. Results and Discussion

The expectation maximization (EM) algorithm is a row-operation iterative approach that finds the maximum likelihood estimate with monotonic convergence, linear complexity, and preservation of nonnegativity; it is also parallelizable [5-9]. However, its convergence rate is slow. A variety of accelerated methods have been proposed in the literature. Among them, the space-alternating generalized expectation maximization (SAGE) algorithm [10] and the alternating expectation/conditional maximization (AECM) algorithm [11] accelerate convergence effectively by alternating the complete data space. They are monotonically convergent, nonnegativity preserving, and of linear complexity, but they are not parallelizable. We propose a hybrid SAGE (HSAGE) algorithm and a hybrid ECM (HECM) algorithm to further speed up convergence. The new algorithms retain monotonic convergence, linear complexity, and preservation of nonnegativity. Furthermore, they are easily parallelizable, which makes them even more practically appealing; the parallel versions are termed the parallel HSAGE (PHSAGE) and parallel HECM (PHECM) algorithms, respectively. The incomplete boundary information can be incorporated through a constructed penalty function, so these new fast algorithms can compute the CRMLE quickly while preserving the merits of EM algorithms. The penalty parameter can be selected from the data rapidly by applying the generalized approximate cross-validation (GACV) method [12].

Table 1 reports the time and number of iterations required for convergence by the SAGE, HSAGE, ECM, and HECM algorithms on a personal computer with a Pentium II 300 MHz CPU running Linux. The convergence rates of these algorithms and their parallel versions, run on a cluster of SUN SPARC workstations with the Message Passing Interface (MPI) [13] to simulate a message-passing interconnection network, are reported in Figures 1 and 2. These results confirm the advantages of the HSAGE and HECM algorithms and their parallel versions.
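The reason EM-type updates parallelize well on a message-passing cluster is that both the sensitivity image and the backprojected ratio are sums over detector bins, so each node can process a block of projection rows and the partial sums are then combined by a global reduction (the role MPI's allreduce plays in [5-9]). The following NumPy sketch imitates that row partitioning without actual message passing, purely to show that the partitioned update is algebraically identical to the sequential one; the worker count and names are illustrative.

```python
import numpy as np

def em_step(A, y, lam, eps=1e-12):
    """One sequential EM update."""
    sens = A.sum(axis=0)
    return lam / np.maximum(sens, eps) * (A.T @ (y / np.maximum(A @ lam, eps)))

def em_step_partitioned(A, y, lam, n_workers=4, eps=1e-12):
    """Same update with the detector-bin sums computed block by block,
    mimicking the reduction a message-passing implementation performs."""
    rows = np.array_split(np.arange(A.shape[0]), n_workers)
    # each "worker" backprojects its own rows; the sums play the allreduce role
    back = sum(A[r].T @ (y[r] / np.maximum(A[r] @ lam, eps)) for r in rows)
    sens = sum(A[r].sum(axis=0) for r in rows)
    return lam / np.maximum(sens, eps) * back
```

Because the per-block partial sums are exact, the parallel decomposition changes only where the arithmetic happens, not the iterates, which is consistent with the report's observation that the parallel versions converge at rates comparable to the sequential ones.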

5. Conclusions

Whether a reconstruction algorithm can be deployed in a clinical environment depends strongly on its computational time and on the quality of the reconstruction. The computational time can be reduced in two directions: one is to accelerate the convergence rate, thereby reducing the overall computational time of the sequential reconstruction algorithm; the other is to design a highly efficient parallel algorithm.

Our new approaches, the HSAGE and HECM algorithms, preserve monotonic convergence while accelerating the convergence rate, and nonnegativity is also preserved. No numerically complex implementation is needed: only a simple modification of the code, plus a simple search routine that incurs little overhead, is required. The computational complexity remains linear, O(BD).

In addition, the HSAGE and HECM algorithms are easily parallelizable. The convergence rates of parallel versions are comparable to those of sequential ones. The gains in the total computational time depend on the architectures and loading of the parallel systems.

Hence, the success of the hybrid accelerators not only represents an improvement over the SAGE and ECM algorithms but also makes them a potential tool for improving other iterative methods.

6. References

[1] L.A. Shepp and Y. Vardi, "Maximum likelihood reconstruction for emission tomography," IEEE Trans. Med. Imaging, vol. MI-1, pp. 113-122, Oct. 1982.

[2] K. Lange and R. Carson, "EM reconstruction algorithms for emission and transmission tomography," J. Comput. Assist. Tomogr., vol. 8, pp. 306-316, 1984.

[3] H.H.-S. Lu, C.-M. Chen, and I.-H. Yang, "Cross-reference weighted least square estimates for positron emission tomography," IEEE Trans. Med. Imaging, vol. 17, pp. 1-8, 1998.

[4] C.-M. Chen, H.H.-S. Lu, and Y.-P. Hsu, "Cross-reference maximum likelihood estimates for positron emission tomography," Technical Report, 1998.

[5] Z.H. Cho, C.M. Chen, and S.Y. Lee, "Incremental algorithm: a new fast backprojection scheme for parallel beam geometries," IEEE Trans. Med. Imaging, vol. 9, pp. 207-217, June 1990.

[6] C.M. Chen, S.Y. Lee, and Z.H. Cho, "A parallel implementation of 3-D CT image reconstruction on hypercube multiprocessor," IEEE Trans. Nucl. Sci., vol. NS-37, pp. 1333-1346, June 1990.

[7] C.M. Chen, S.-Y. Lee, and Z.H. Cho, "Parallelization of the EM algorithm for 3D PET image reconstruction," IEEE Trans. Med. Imaging, vol. 10, pp. 513-522, Dec. 1991.

[8] C.M. Chen and S.-Y. Lee, "On parallelizing the EM algorithm for PET image reconstruction," IEEE Trans. Parallel and Distributed Systems, vol. 5, pp. 860-873, Aug. 1994.

[9] C.M. Chen and S.-Y. Lee, "Optimal data replication: A new approach to optimizing parallel EM algorithms on a mesh-connected multiprocessor for 3D PET image reconstruction," IEEE Trans. Nucl. Sci., vol. 42, pp. 1235-1245, Aug. 1995.

[10] J.A. Fessler and A.O. Hero, "Space-alternating generalized expectation-maximization algorithm," IEEE Trans. Signal Processing, vol. 42, pp. 2664-2677, 1994.

[11] X.-L. Meng and D.A. van Dyk, "The EM algorithm - an old folk-song sung to a fast new tune," J. R. Statist. Soc. B, vol. 59, no. 3, pp. 511-567, 1997.

[12] D. Xiang and G. Wahba, "A generalized approximate cross validation for smoothing splines with non-Gaussian data," Statistica Sinica, vol. 6, pp. 675-692, 1996.

[13] W. Gropp, E. Lusk, and A. Skjellum, Using MPI: Portable Parallel Programming with the Message-Passing Interface, MIT Press, 1994.

7. Figures and Tables

Table 1: Required time and iteration numbers for convergence by the SAGE, HSAGE, ECM, and HECM algorithms on a personal computer with a Pentium II 300 MHz CPU running Linux.

Algorithm   Iteration No.   Total Time (sec.)   Time per Iteration (sec.)
SAGE        17              17.5                1.029
HSAGE       13              13.7                1.054
ECM         31              20.2                0.65
HECM        13              10.6                0.82

Figure 1: The convergence rates of the SAGE, HSAGE, and PHSAGE algorithms with respect to iteration numbers.

Figure 2: The convergence rates of the ECM, HECM, and PHECM algorithms with respect to iteration numbers.
