UCL Discovery

Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application

Azimbagirad, Mehran; Murta Junior, Luiz Otavio; (2021) Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application. Neuroscience Informatics , 1 (1-2) , Article 100002. 10.1016/j.neuri.2021.100002. Green open access

1-s2.0-S2772528621000029-main.pdf - Accepted Version

Abstract

Among statistical models, Gaussian Mixture Models (GMMs) have been used in numerous applications to model data that are well fitted by a mixture of Gaussian curves. Several methods have been introduced to estimate the optimal parameters of a GMM fitted to data, and the accuracy of such estimators is crucial for interpreting the data. In this paper, we propose a new approach that estimates the parameters of a GMM using the critical points of Tsallis entropy to adjust the accuracy of each parameter. To evaluate the proposed method, seven GMMs of simulated random (noisy) samples generated in MATLAB were used; each simulated model was repeated 1000 times to generate 1000 random values obeying the GMM. In addition, five GMM-shaped samples extracted from magnetic resonance brain images were used, aiming at an image segmentation application. For comparison, Expectation-Maximization (EM), K-means, and Shannon's estimator were applied to the same dataset, and the four estimation methods were evaluated using accuracy, the Akaike information criterion (AIC), the Bayesian information criterion (BIC), and Mean Squared Error (MSE). The mean accuracies of the Tsallis estimator for the simulated data, i.e., for the mean values, variances, and proportions, were 99.9(±0.1), 99.8(±0.2), and 99.7(±0.3)%, respectively. For both datasets, the accuracies of the Tsallis estimator were significantly higher than those of EM, K-means, and Shannon. By increasing the accuracy of the estimated parameters, the Tsallis estimator can be used in statistical approaches and machine learning.
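The abstract's estimator is built on Tsallis generalized entropy, S_q(p) = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers Shannon entropy as q → 1. As a minimal illustrative sketch (not the paper's estimator, whose critical-point derivation is given in the article itself), the definition and its Shannon limit can be checked numerically for a discrete distribution:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1).

    p: sequence of probabilities summing to 1; q: entropic index.
    """
    if abs(q - 1.0) < 1e-12:
        # The q -> 1 limit recovers Shannon entropy (natural log).
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Example distribution (hypothetical, for illustration only):
probs = [0.5, 0.3, 0.2]
shannon = tsallis_entropy(probs, 1.0)
near_one = tsallis_entropy(probs, 1.001)  # approaches the Shannon value
```

For q slightly above 1, `near_one` agrees with `shannon` to a few decimal places, and a degenerate distribution such as `[1.0]` has zero entropy for any q, matching the definition above.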

Type: Article
Title: Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
Open access status: An open access version is available from UCL Discovery
DOI: 10.1016/j.neuri.2021.100002
Publisher version: https://doi.org/10.1016/j.neuri.2021.100002
Language: English
Additional information: © 2021 The Author(s). Published by Elsevier Masson SAS. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Keywords: Tsallis entropy, Shannon entropy, Expectation-Maximization, K-means, Gaussian Mixture Model
UCL classification: UCL
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science
URI: https://discovery.ucl.ac.uk/id/eprint/10177528
