Knowledge-Based Automated Boundary Detection for Quantifying LV Function in Low-Contrast Angiographic Images
Yang Hee Yee*, Chun Kee Jeon**, Sang-Rok Oh*** and Mignon Park****
*Department of Automation Engineering, Korea Polytechnic University 3Ga-101, Jungwang-Dong, Shihung Kyonggi-Do, 429-450, Korea
**Department of Mechatronics, Kyonggi Institute of Technology 3Ga-102, Jungwang-Dong, Shihung Kyonggi-Do, 429-450, Korea
*** Korea Institute of Science and Technology 39-1, Hawalgok-Dong Sungbuk-Gu, Seoul, 136-791, Korea
****ICS Lab, Dept. of Electronic Engineering, Yonsei University Shinchon-Dong, Seodaemun-Gu, Seoul, 120-749, Korea
Cardiac function is evaluated quantitatively by analyzing shape changes of the heart-wall boundaries in angiographic images. Boundary detection of the end-systolic left ventricle (ESLV) and end-diastolic left ventricle (EDLV) is therefore an essential first step in the quantitative analysis of cardiac function. Conventional boundary-detection methods are mostly semi-automatic and still require intervention by a knowledgeable human operator; manual tracing of the boundaries is currently used for subsequent analysis and diagnosis. However, these methods do not eliminate the excessive time, labor, and subjectivity associated with manual intervention. In general, EDLV images have noncontiguous and ambiguous edge signals in some boundary regions. In this paper, we propose a new method for automated detection of left ventricle (LV) boundaries in such noncontiguous and ambiguous EDLV images. The proposed boundary-detection scheme is based on a priori knowledge and consists of two steps: the first step detects the EDLV boundary using the ESLV boundary, and the second step corrects the detected EDLV boundary using LV shape information. We compared the proposed method with manual tracing of the EDLV boundary, and through experiments we verified the usefulness of the method.
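The two-step scheme described above can be sketched in code. The following is a minimal, hypothetical illustration only (the paper does not specify these functions or parameters): step 1 is approximated by dilating the ESLV contour about its centroid to seed the EDLV boundary, and step 2 is approximated by a moving-average smoothing along the closed contour standing in for the LV shape-knowledge correction. The function names, the `scale` factor, and the `window` size are all assumptions for illustration.

```python
import numpy as np

def detect_edlv_from_eslv(eslv_boundary, scale=1.3):
    # Step 1 (hypothetical stand-in): seed the EDLV boundary by dilating
    # the ESLV contour about its centroid, since the diastolic wall lies
    # outside the systolic wall. `scale` is an illustrative assumption.
    centroid = eslv_boundary.mean(axis=0)
    return centroid + scale * (eslv_boundary - centroid)

def correct_with_shape_knowledge(boundary, window=5):
    # Step 2 (hypothetical stand-in): enforce LV shape plausibility with a
    # circular moving average along the closed contour, bridging the
    # noncontiguous/ambiguous edge segments mentioned in the abstract.
    n = len(boundary)
    offsets = np.arange(-(window // 2), window // 2 + 1)
    idx = (np.arange(n)[:, None] + offsets[None, :]) % n  # wrap around
    return boundary[idx].mean(axis=1)
```

For example, applying both steps to an ESLV contour sampled as an (N, 2) array of (x, y) points yields a smoothed EDLV estimate of the same shape, which could then be used for area- or volume-based indices of cardiac function.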
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.