Synthesizing longitudinal cortical thickness estimates with a flexible and hierarchical multivariate measurement-error model
MRI-based entorhinal cortical thickness (eCT) measurements predict cognitive decline in Alzheimer's disease (AD) with low cost and minimal invasiveness. Two prominent imaging paradigms, FreeSurfer (FS) and Advanced Normalization Tools (ANTs), feature multiple pipelines for extracting region-specific eCT measurements from raw MRI, but the sheer complexity of these pipelines makes it difficult to choose among them, compare their results, and characterize uncertainty in their estimates. Worse yet, the entorhinal cortex (EC) is particularly difficult to image, leading to variations in thickness estimates between pipelines that overwhelm the physiological variations predictive of AD. We examine the eCT outputs of seven different pipelines on MRIs from the Alzheimer's Disease Neuroimaging Initiative. Because of both theoretical and practical limitations, we have no gold standard by which to evaluate them. Instead, we use a Bayesian hierarchical model to combine the estimates. The resulting posterior distribution yields high-probability idealized eCT values that account for inherent uncertainty through a flexible multivariate error model that supports different constant offsets, standard deviations, tailedness, and correlation structures between pipelines. Our hierarchical model directly relates idealized eCTs to clinical outcomes in a way that propagates eCT estimation uncertainty to clinical estimates while accounting for longitudinal structure in the data. Surprisingly, even though it incorporates greater uncertainty in the predictor and regularization provided by the prior, the combined model reveals a stronger association between eCT and cognitive capacity than do nonhierarchical models based on data from single pipelines alone.
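To make the structure of this kind of model concrete, the following PyMC sketch illustrates a minimal hierarchical measurement-error model of the sort described above: latent "idealized" eCT values per subject, pipeline-specific offsets, scales, and Student-t tailedness on the observed measurements, and a downstream regression of a clinical score on the latent eCT so that measurement uncertainty propagates into the clinical association. All variable names, priors, and the simulated data are illustrative assumptions, not the authors' implementation, and the sketch omits the between-pipeline correlation structure and the longitudinal terms for brevity.

```python
import numpy as np
import pymc as pm

# Hypothetical toy data: n_subj subjects, each measured by n_pipe pipelines.
rng = np.random.default_rng(0)
n_subj, n_pipe = 50, 7
true_ect = rng.normal(3.0, 0.3, size=n_subj)              # latent idealized eCT (mm)
offsets = rng.normal(0.0, 0.2, size=n_pipe)                # pipeline-specific bias
obs = true_ect[:, None] + offsets[None, :] + rng.normal(0, 0.15, size=(n_subj, n_pipe))
cognition = 10 - 4 * true_ect + rng.normal(0, 1.0, size=n_subj)  # toy clinical score

with pm.Model() as model:
    # Population distribution of the latent idealized eCT per subject.
    mu_ect = pm.Normal("mu_ect", 3.0, 1.0)
    sigma_ect = pm.HalfNormal("sigma_ect", 1.0)
    ect = pm.Normal("ect", mu_ect, sigma_ect, shape=n_subj)

    # Pipeline-specific offsets, scales, and tailedness (Student-t errors).
    delta = pm.Normal("delta", 0.0, 0.5, shape=n_pipe)
    sigma_pipe = pm.HalfNormal("sigma_pipe", 0.5, shape=n_pipe)
    nu = pm.Gamma("nu", 2.0, 0.1, shape=n_pipe)
    pm.StudentT("y", nu=nu, mu=ect[:, None] + delta[None, :],
                sigma=sigma_pipe[None, :], observed=obs)

    # Clinical outcome regressed on the latent eCT, so eCT uncertainty
    # propagates directly into the estimated association.
    alpha = pm.Normal("alpha", 0.0, 10.0)
    beta = pm.Normal("beta", 0.0, 10.0)
    sigma_c = pm.HalfNormal("sigma_c", 2.0)
    pm.Normal("cog", alpha + beta * ect, sigma_c, observed=cognition)

    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)
```

In this toy setup, the posterior for `beta` reflects both the spread of pipeline measurements around each subject's latent eCT and the prior regularization, which is the mechanism by which the combined model can still sharpen, rather than dilute, the estimated eCT-cognition association.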