. 2005 Nov;16(6):1513-30.
doi: 10.1109/TNN.2005.853337.

Connectionist-based Dempster-Shafer evidential reasoning for data fusion


Otman Basir et al. IEEE Trans Neural Netw. 2005 Nov.

Abstract

Dempster-Shafer evidence theory (DSET) is a popular paradigm for dealing with uncertainty and imprecision. Its corresponding evidential reasoning framework is theoretically attractive. However, there are outstanding issues that hinder its use in real-life applications. Two prominent issues in this regard are 1) the assignment of basic probability masses and 2) dependence among information sources. This paper attempts to deal with these issues by utilizing neural networks in the context of pattern classification applications. First, a multilayer perceptron neural network with the mean squared error as a cost function is implemented to calculate, for each information source, posterior probabilities for all classes. Second, an evidence structure construction scheme is developed for transforming the estimated posterior probabilities into a set of masses along with the corresponding focal elements, from a Bayesian decision point of view. Third, a network realization of Dempster-Shafer evidential reasoning is designed and analyzed, and it is further extended to a DSET-based neural network, referred to as DSETNN, to manipulate the evidence structures. In order to tackle the issue of dependence between sources, DSETNN is tuned for optimal performance through a supervised learning process. To demonstrate the effectiveness of the proposed approach, we apply it to three benchmark pattern classification problems. Experiments reveal that DSETNN outperforms DSET and provides encouraging results in terms of classification accuracy and speed of learning convergence.
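The combination step the abstract refers to is Dempster's rule, which fuses two basic probability assignments (masses over focal elements) by intersecting their focal elements and renormalizing away the conflicting mass. The paper realizes this rule as a trainable network (DSETNN); the sketch below is only a plain, standalone illustration of the rule itself, not the paper's implementation, and the two-class frame and mass values are invented for the example.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements to mass values
    summing to 1. Returns the normalized combined assignment.
    """
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # Mass flows to the intersection of the two focal elements.
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            # Empty intersection: this product is conflicting mass.
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    k = 1.0 - conflict  # normalization constant
    return {s: v / k for s, v in combined.items()}

# Hypothetical two-class frame of discernment {A, B}.
A, B = frozenset("A"), frozenset("B")
AB = A | B  # total ignorance
m1 = {A: 0.6, AB: 0.4}   # source 1 leans toward class A
m2 = {B: 0.3, AB: 0.7}   # source 2 weakly supports class B
fused = dempster_combine(m1, m2)
```

Note that Dempster's rule assumes independent sources; the supervised tuning of DSETNN described above is precisely what compensates when that assumption fails.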
