Abstract
Data fusion, within the evidence theory framework, consists in obtaining a single belief function by combining several belief functions induced from various information sources. Considerable attention has been paid to combination rules dealing with beliefs induced from non-distinct information sources. The most popular such rule is the cautious conjunctive rule proposed by Denœux, which has the empty set, also called the conflict, as an absorbing element. Indeed, the mass assigned to the conflict tends toward 1 as the cautious conjunctive operator is applied repeatedly, so the conflict loses its initial role as an alarm signal indicating disagreement between sources. This problem led to the introduction of the normalized cautious rule, which entirely discards the mass assigned to the conflict. An intermediate rule between the cautious conjunctive and the normalized cautious rules, named the cautious Combination With Adaptive Conflict (cautious CWAC) rule, has been proposed to preserve the initial alarm role of the conflict. Despite this diversification, little effort has so far been devoted to identifying the most suitable combination rule. In this paper, we therefore evaluate and compare the cautious conjunctive, normalized cautious and cautious CWAC rules in order to single out the most appropriate one within the classifier fusion framework.
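The conflict-absorption phenomenon described above can be illustrated numerically. The sketch below uses the basic unnormalized conjunctive rule rather than the cautious rule itself, since it exhibits the same absorbing behaviour of the empty set; the frame, mass values and helper names are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# Mass functions (bbas) over the frame Theta = {'a', 'b'}, with focal
# sets represented as frozensets. Values here are illustrative only.

def conjunctive(m1, m2):
    """Unnormalized conjunctive rule: product masses land on A & B.
    Mass accumulating on the empty set measures the conflict."""
    out = {}
    for (A, mA), (B, mB) in product(m1.items(), m2.items()):
        out[A & B] = out.get(A & B, 0.0) + mA * mB
    return out

def normalize(m):
    """Analogue of the normalized rule: drop m(empty), rescale the rest."""
    k = m.get(frozenset(), 0.0)
    return {A: v / (1.0 - k) for A, v in m.items() if A}

# Two partially conflicting sources
ma = {frozenset('a'): 0.7, frozenset('ab'): 0.3}
mb = {frozenset('b'): 0.6, frozenset('ab'): 0.4}

m = {frozenset('ab'): 1.0}            # vacuous bba (total ignorance)
for src in [ma, mb] * 3:              # combine the sources repeatedly
    m = conjunctive(m, src)
    print(round(m.get(frozenset(), 0.0), 4))  # conflict climbs toward 1

print(normalize(m))                   # conflict discarded, masses rescaled
```

Running this, the printed conflict mass grows monotonically toward 1, which is exactly why repeated combination turns the conflict from an alarm signal into an absorbing artefact, while normalization hides the disagreement entirely.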
References
Al-Ani A, Deriche M (2002) A new technique for combining multiple classifiers using the Dempster–Shafer theory of evidence. J Artif Intell Res 17:333–361
Bertoni A, Folgieri R, Valentini G (2005) Biomolecular cancer prediction with random subspace ensembles of support vector machines. Neurocomputing 63:535–539
Bi Y, Guan J, Bell D (2008) The combination of multiple classifiers using an evidential reasoning approach. Artif Intell 172(15):1731–1751
Boubaker J, Elouedi Z, Lefevre E (2013) Conflict management with dependent information sources in the belief function framework. In: 14th International symposium of computational intelligence and informatics (CINTI). IEEE, vol 52, pp 393–398
Breiman L (1996) Bagging predictors. Mach Learn 24(2):123–140
Cattaneo MEGV (2003) Combining belief functions issued from dependent sources. In: 3rd International symposium on imprecise probabilities and their applications (ISIPTA), vol 3
Cho S-B, Kim JH (1995) Combining multiple neural networks by fuzzy integral for robust classification. IEEE Trans Syst Man Cybern 25(2):380–384
Dempster AP (1967) Upper and lower probabilities induced by a multivalued mapping. Ann Math Stat 38:325–339
Denoeux T (1999) Reasoning with imprecise belief structures. Int J Approx Reason 20(1):79–111
Denoeux T (2006) The cautious rule of combination for belief functions and some extensions. In: 9th International conference on information fusion (FUSION'2006), pp 1–8
Denoeux T (2008) Conjunctive and disjunctive combination of belief functions induced by nondistinct bodies of evidence. Artif Intell 172(2):234–264
Denoeux T, Masson M-H (2012) Belief functions: theory and applications. In: 2nd International conference on belief functions, vol 164. Springer, New York
Dietterich T (2000) An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Mach Learn 40(2):139–157
Dubois D, Prade H (1988) Representation and combination of uncertainty with belief functions and possibility measures. Comput Intell 4(3):244–264
Freund Y, Schapire RE (1997) A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Syst Sci 55(1):119–139
Geurts P, Ernst D, Wehenkel L (2006) Extremely randomized trees. Mach Learn 63(1):3–42
Goebel K, Yan W (2004) Choosing classifiers for decision fusion. In: 7th International conference on information fusion, vol 1, pp 563–568
Hall M, Frank E, Holmes G, Pfahringer B, Reutemann P, Witten IH (2009) The WEKA data mining software: an update. ACM SIGKDD Explor Newsl 11(1):10–18
Hansen LK, Salamon P (1990) Neural network ensembles. IEEE Trans Pattern Anal Mach Intell 12(10):993–1001
Ho TK (1998) The random subspace method for constructing decision forests. IEEE Trans Pattern Anal Mach Intell 20(8):832–844
Huang YS, Liu K, Suen CY (1995) The combination of multiple classifiers by a neural network approach. Int J Pattern Recognit Artif Intell 9(03):579–597
Johansson R, Boström H, Karlsson A (2008) A study on class-specifically discounted belief for ensemble classifiers. In: IEEE international conference on multisensor fusion and integration for intelligent systems, pp 614–619
Jousselme A, Grenier D, Bossé E (2001) A new distance between two bodies of evidence. Inf Fusion 2(2):91–101
Kittler J, Hatef M, Duin RP, Matas J (1998) On combining classifiers. IEEE Trans Pattern Anal Mach Intell 20(3):226–239
Kuncheva L, Rodríguez J, Plumpton C, Linden D, Johnston S (2010) Random subspace ensembles for FMRI classification. IEEE Trans Med Imaging 29(2):531–542
Kuncheva L, Skurichina M, Duin RP (2002) An experimental study on diversity for bagging and boosting with linear classifiers. Inf Fusion 3(4):245–258
Kuncheva L, Whitaker CJ (2003) Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Mach Learn 51(2):181–207
Le CA, Huynh V-N, Shimazu A, Nakamori Y (2007) Combining classifiers for word sense disambiguation based on Dempster–Shafer theory and OWA operators. Data Knowl Eng 63(2):381–396
Lefevre E, Colot O, Vannoorenberghe P (2002) Belief function combination and conflict management. Inf Fusion 3(2):149–162
Lefevre E, Elouedi Z (2013) How to preserve the conflict as an alarm in the combination of belief functions? Decis Support Syst 56:326–333
Mercier D, Cron G, Denoeux T, Masson M (2005) Fusion of multi-level decision systems using the Transferable Belief Model. In: 8th International conference on information fusion (FUSION'2005), vol 2, pp 655–658
Murphy CK (2000) Combining belief functions when evidence conflicts. Decis Support Syst 29(1):1–9
Murphy P, Aha D (1996) UCI repository of machine learning databases. http://www.ics.uci.edu/mlear
Pizzi NJ, Pedrycz W (2010) Aggregating multiple classification results using fuzzy integration and stochastic feature selection. Int J Approx Reason 51(8):883–894
Quost B, Denoeux T, Masson M-H (2007) Pairwise classifier combination using belief functions. Pattern Recogn Lett 28(5):644–653
Quost B, Denoeux T, Masson M-H (2008) Adapting a combination rule to non-independent information sources. In: 12th Information processing and management of uncertainty in knowledge-based systems (IPMU), pp 448–455
Quost B, Masson M-H, Denoeux T (2011) Classifier fusion in the Dempster–Shafer framework using optimized t-norm based combination rules. Int J Approx Reason 52(3):353–374
Reformat M, Yager RR (2008) Building ensemble classifiers using belief functions and OWA operators. Soft Comput 12(6):543–558
Rodriguez JJ, Kuncheva LI, Alonso CJ (2006) Rotation forest: a new classifier ensemble method. IEEE Trans Pattern Anal Mach Intell 28(10):1619–1630
Ruta D, Gabrys B (2005) Classifier selection for majority voting. Inf Fusion 6(1):63–81
Shafer G (1976) A mathematical theory of evidence, vol 1. Princeton University Press, Princeton
Sharkey AJ, Sharkey NE (1997) Combining diverse neural nets. Knowl Eng Rev 12(03):231–247
Smets P (1988) The Transferable Belief Model for quantified belief representation. In: Handbook of defeasible reasoning and uncertainty management systems, vol 1, pp 267–301
Smets P (1990) The combination of evidence in the transferable belief model. IEEE Trans Pattern Anal Mach Intell 12(5):447–458
Smets P (1995) The canonical decomposition of a weighted belief. In: 14th International joint conference on artificial intelligence (IJCAI), vol 95, pp 1896–1901
Smets P (1998) The application of the transferable belief model to diagnostic problems. Int J Intell Syst 13:127–157
Trabelsi A, Elouedi Z, Lefevre E (2015a) Belief function combination: comparative study in classifier fusion framework. In: 1st International symposium on advanced intelligent systems and informatics (AISI), vol 407, pp 425–435
Trabelsi A, Elouedi Z, Lefevre E (2015b) Classifier fusion within the belief function framework using dependent combination rules. In: 22nd International symposium on methodologies for intelligent systems (ISMIS), vol 9384, pp 133–138
Wolpert DH (1992) Stacked generalization. Neural Netw 5(2):241–259
Woods K, Kegelmeyer WP Jr, Bowyer K (1997) Combination of multiple classifiers using local accuracy estimates. IEEE Trans Pattern Anal Mach Intell 19:405–410
Xu L, Krzyzak A, Suen CY (1992) Methods of combining multiple classifiers and their applications to handwriting recognition. IEEE Trans Syst Man Cybern A Syst Humans 22(3):418–435
Xu P, Davoine F, Denoeux T (2014) Evidential logistic regression for binary SVM classifier calibration. In: 3rd International conference on belief functions (BELIEF). Springer, New York, pp 49–57
Xu P, Davoine F, Zha H, Denoeux T (2016) Evidential calibration of binary SVM classifiers. Int J Approx Reason 72:55–70
Yager RR (1987) On the Dempster–Shafer framework and new combination rules. Inf Sci 41(2):93–137
Yen J (1990) Generalizing the Dempster–Shafer theory to fuzzy sets. IEEE Trans Syst Man Cybern 20(3):559–570
Zadeh LA (1986) A simple view of the Dempster–Shafer theory of evidence and its implication for the rule of combination. AI Mag 7(2):85–90
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Communicated by A. Di Nola.
Cite this article
Trabelsi, A., Elouedi, Z. & Lefevre, E. Comparing dependent combination rules under the belief classifier fusion framework. Soft Comput 21, 6919–6932 (2017). https://doi.org/10.1007/s00500-016-2402-9