This package implements a generalization of the above methods, in which each training example can be labeled by an arbitrary belief function on the set of classes. An example thus has the form (xi,mi), where xi is a feature vector and mi is a bba describing one's partial knowledge of the class of example i. (Internally, the label of example i is represented by the commonality function associated with mi; it can easily be computed from mi using the mtoq function of the FMT package.) If the labels are crisp (the class of each training example is known with certainty), it is preferable to use the other two packages above, which are then much faster.
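The conversion from a bba to its commonality function can be sketched as follows. This is a minimal Python illustration, not the FMT package itself; the subset encoding used here (bit k of index A set iff class k belongs to subset A) is an assumption, and the real mtoq routine may order subsets differently.

```python
import numpy as np

def mtoq(m):
    """Commonality function of a bba: q(A) = sum of m(B) over all B containing A.

    Subsets of the K-class frame are encoded as integers 0..2**K-1,
    with bit k set iff class k belongs to the subset (an assumed
    encoding; the actual FMT package may differ).
    """
    q = np.asarray(m, dtype=float).copy()
    K = len(q).bit_length() - 1  # len(q) == 2**K
    # fast Moebius transform: process one class (bit) at a time
    for k in range(K):
        bit = 1 << k
        for A in range(len(q)):
            if not A & bit:
                # fold in the contribution of supersets that contain class k
                q[A] += q[A | bit]
    return q
```

For example, on a two-class frame, the certain label m = (0, 1, 0, 0) (all mass on the singleton {class 0}) yields q = (1, 1, 0, 0), while the vacuous bba m = (0, 0, 0, 1) yields q = (1, 1, 1, 1).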
The basic method is described in
T. Denoeux. A k-nearest neighbor classification rule based on Dempster-Shafer theory. IEEE Transactions on Systems, Man and Cybernetics, 25(05):804-813, 1995.
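In the rule described in that paper, each of the k nearest neighbors of a test point induces a simple bba allocating mass alpha * exp(-gamma * d**2) to its class singleton and the remainder to the whole frame, and these bbas are pooled by Dempster's rule of combination. The sketch below is an illustrative simplification (the parameter names alpha and gamma, and the bit-vector subset encoding, are assumptions; the paper also allows class-dependent parameters), exploiting the fact that conjunctive combination multiplies commonality functions pointwise.

```python
import numpy as np

def eknn_commonality(dists, classes, K, alpha=0.95, gamma=1.0):
    """Unnormalized combined commonality over a K-class frame.

    dists, classes: distances and class indices of the k nearest neighbors.
    Subsets are encoded as integers, bit c set iff class c is in the subset
    (an assumed encoding used only for this sketch).
    """
    n = 2 ** K
    q = np.ones(n)  # commonality of the vacuous bba (total ignorance)
    for d, c in zip(dists, classes):
        s = alpha * np.exp(-gamma * d ** 2)
        # simple bba focused on {c}: q_i(A) = 1 if A is a subset of {c},
        # else q_i(A) = 1 - s; conjunctive combination multiplies q's
        for A in range(n):
            if A & ~(1 << c):
                q[A] *= 1.0 - s
    return q

def classify(q, K):
    """Pick the class whose singleton has maximal commonality
    (for singletons this equals plausibility up to normalization)."""
    return int(np.argmax([q[1 << c] for c in range(K)]))
```

With two close neighbors of class 0 and one distant neighbor of class 1, the combined evidence favors class 0, as expected.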
The application of the method with possibilistic labels (a special case of the training data considered in this package) and error criterion 1 used by edcfit are discussed in
T. Denoeux and L. M. Zouhal. Handling possibilistic labels in pattern classification using evidential reasoning. Fuzzy Sets and Systems, 122(3):47-62, 2001.
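A possibilistic label can be supplied to this kind of method by converting the possibility distribution into the corresponding consonant bba, whose nested focal sets are the level cuts of the distribution. The helper below is a hypothetical illustration of that standard construction (the function name and the bit-vector subset encoding are assumptions, not part of the package).

```python
import numpy as np

def poss_to_bba(pi):
    """Consonant bba induced by a possibility distribution pi over K classes.

    Focal sets are the level cuts {c : pi[c] >= level}; the mass of each cut
    is the gap between consecutive possibility levels.  Subsets are encoded
    as integers with bit c set iff class c is in the subset (an assumed
    encoding used only for this sketch).
    """
    K = len(pi)
    m = np.zeros(2 ** K)
    levels = sorted(set(pi), reverse=True) + [0.0]
    for lo, hi in zip(levels[1:], levels[:-1]):
        A = 0
        for c, p in enumerate(pi):
            if p >= hi:
                A |= 1 << c
        m[A] += hi - lo
    return m
```

For instance, the possibilistic label pi = (1, 0.4) on a two-class frame gives mass 0.6 to the singleton {class 0} and mass 0.4 to the whole frame, and the fully ignorant label pi = (1, 1) gives the vacuous bba.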
The two criteria used by edcfit are discussed in
The routines in this package use the FMT package developed by Philippe Smets.