
Relative entropy of Z-numbers

2021 
Abstract

Real-world information is characterized by uncertainty and partial reliability. In order to model this information, Zadeh introduced the concept of Z-numbers. It has been a challenge to construct a mathematical model for handling Z-number-based information comparable to probability theory. One of the basic concepts in probability theory is relative entropy, also known as Kullback-Leibler divergence and information divergence, which is a directed divergence between two probability distributions. In this work, we propose an approach for developing the concept of relative entropy of Z-numbers. It is based on the essence of Z-numbers, the maximum entropy method, and the relative entropy of probability distributions. Based on the proposed relative entropy, we construct a novel Technique for Order of Preference by Similarity to Ideal Solution for Z-numbers (Z-TOPSIS), which computes directly with Z-numbers instead of converting them to fuzzy numbers. A case study of supplier selection is then used to illustrate the effectiveness of the proposed Z-TOPSIS method.
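The abstract builds on the classical relative entropy of probability distributions. As a point of reference (the notation P, Q, p_i, q_i below is ours, not taken from the paper), its standard definition for two discrete distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n), with q_i > 0 wherever p_i > 0, is

    D_{KL}(P \| Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}

This quantity is non-negative, asymmetric in P and Q, and equals zero only when P = Q. The paper's contribution, as described above, is to extend this directed divergence to Z-numbers by combining it with the maximum entropy method, which selects, among the probability distributions consistent with the given constraints, the one of greatest Shannon entropy.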