Image segmentation by EM-based adaptive pulse coupled neural networks in brain magnetic resonance imaging. Academic Article

Overview

abstract

  • We propose an automatic hybrid image segmentation model that integrates the statistical expectation maximization (EM) model and the spatial pulse coupled neural network (PCNN) for brain magnetic resonance imaging (MRI) segmentation. In addition, an adaptive mechanism is developed to fine-tune the PCNN parameters. The EM model serves two functions: evaluation of the PCNN image segmentation and adaptive adjustment of the PCNN parameters for optimal segmentation. To evaluate the performance of the adaptive EM-PCNN, we use it to segment MR brain images into gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF). The performance of the adaptive EM-PCNN is compared with that of the non-adaptive EM-PCNN, EM, and Bias Corrected Fuzzy C-Means (BCFCM) algorithms. The result is four sets of boundaries for the GM and the brain parenchyma (GM+WM), the two regions of greatest interest in medical research and clinical applications. Each set of boundaries is compared with the gold standard to evaluate the segmentation performance. The adaptive EM-PCNN significantly outperforms the non-adaptive EM-PCNN, EM, and BCFCM algorithms in gray matter segmentation. In brain parenchyma segmentation, the adaptive EM-PCNN significantly outperforms only the BCFCM, although it is still better than the non-adaptive EM-PCNN and EM on average. We conclude that of the four approaches, the adaptive EM-PCNN yields the best results for gray matter and brain parenchyma segmentation.
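To give a concrete sense of the PCNN component described above, the following is a minimal sketch of a simplified single-channel pulse coupled neural network in Python/NumPy. Each pixel-neuron's internal activity is its intensity modulated by the pulses of its 8-neighbours, and a decaying dynamic threshold determines when it fires; pixels of similar intensity pulse in the same wave, which is what makes the PCNN usable for segmentation. All parameter names and values here (`beta`, `alpha`, `v`, the linking weights) are illustrative assumptions, not the parameters the paper's EM model adaptively tunes, and the EM evaluation step is omitted.

```python
import numpy as np

def pcnn_first_fire(img, beta=0.2, alpha=0.3, v=20.0, iters=10):
    """Simplified PCNN: return the iteration at which each pixel first
    pulses (0 = never fired). Parameters are illustrative only."""
    Y = np.zeros_like(img, dtype=float)            # pulse outputs
    theta = np.full(img.shape, float(img.max()))   # dynamic thresholds
    order = np.zeros(img.shape, dtype=int)         # first-firing epoch
    # 8-neighbour linking weights (diagonals weighted lower)
    wts = {(-1, -1): .5, (-1, 0): 1, (-1, 1): .5, (0, -1): 1,
           (0, 1): 1, (1, -1): .5, (1, 0): 1, (1, 1): .5}
    H, W = img.shape
    for n in range(1, iters + 1):
        P = np.pad(Y, 1)                           # zero-padded pulses
        # linking input: weighted sum of neighbouring pulses
        L = sum(w * P[1 + di:1 + di + H, 1 + dj:1 + dj + W]
                for (di, dj), w in wts.items())
        U = img * (1.0 + beta * L)                 # modulated activity
        Y = (U > theta).astype(float)              # pulse when U beats theta
        order[(Y > 0) & (order == 0)] = n          # record first firing time
        theta = np.exp(-alpha) * theta + v * Y     # decay + refractory boost
    return order
```

On a synthetic two-intensity image the brighter region pulses in an earlier wave than the darker one, so thresholding the firing-epoch map yields a coarse two-class segmentation; in the paper's hybrid scheme, it is the EM model that judges such segmentations and steers the PCNN parameters.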

publication date

  • December 29, 2009

Research

keywords

  • Algorithms
  • Brain
  • Image Interpretation, Computer-Assisted
  • Imaging, Three-Dimensional
  • Magnetic Resonance Imaging
  • Nerve Net
  • Pattern Recognition, Automated

Identity

Scopus Document Identifier

  • 77952241858

Digital Object Identifier (DOI)

  • 10.1016/j.compmedimag.2009.12.002

PubMed ID

  • 20042313

Additional Document Info

volume

  • 34

issue

  • 4