The Kullback-Leibler Divergence Used in Machine Learning Algorithms for Health Care Applications and Hypertension Prediction: A Literature Review.

2018 
Abstract: The Kullback-Leibler divergence, or relative entropy, is a special case of a broader class of divergences. It measures how one probability distribution diverges from a second, expected probability distribution, and it has found many recent applications. Although the field of medicine has advanced considerably, it still requires statistical analysis to support emerging requirements. This paper discusses the use of the Kullback-Leibler divergence as a possible technique for predicting hypertension from chest sound recordings with machine learning algorithms, an approach that could be of particular benefit in emergency healthcare. Interpreting chest sound patterns provides broad and varied insight into abnormalities and health conditions of medical interest. The proposed technique estimates blood pressure through chest sound analysis: a recording is made of the sounds produced by the contracting heart, which arise from the vibration of the valves and associated vessels, and the recording is then analyzed with the Kullback-Leibler divergence and a machine learning algorithm. Analysis based on the Kullback-Leibler divergence makes it possible to quantify differences between chest sound recordings, which can then be evaluated by a machine learning algorithm. The paper also proposes a procedure for analyzing chest sound recordings within the Kullback-Leibler divergence framework.
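The abstract describes the Kullback-Leibler divergence only verbally. For a discrete distribution P compared against a reference distribution Q, it is defined as D_KL(P || Q) = sum_i P(i) log(P(i) / Q(i)). The Python sketch below illustrates, under stated assumptions, how such a comparison might be applied to heart-sound data: the function names (kl_divergence, spectrum_distribution), the synthetic signals, and the 2 kHz sampling rate are illustrative choices, not details taken from the paper.

import numpy as np


def kl_divergence(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D_KL(P || Q)."""
    p = np.asarray(p, dtype=float) + eps  # smooth to avoid log(0) and division by zero
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()  # renormalize after smoothing
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))


def spectrum_distribution(signal):
    """Normalize a signal's power spectrum into a probability
    distribution over frequency bins, so that two recordings can be
    compared with the KL divergence."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    return power / power.sum()


rng = np.random.default_rng(0)
fs = 2000  # Hz; assumed sampling rate for heart-sound recordings
t = np.arange(0.0, 2.0, 1.0 / fs)

# Hypothetical stand-ins for two chest sound recordings; the paper works
# with real phonocardiogram data, which is not reproduced here.
reference = np.sin(2 * np.pi * 30 * t) + 0.1 * rng.standard_normal(t.size)
candidate = np.sin(2 * np.pi * 45 * t) + 0.1 * rng.standard_normal(t.size)

d = kl_divergence(spectrum_distribution(reference),
                  spectrum_distribution(candidate))
print(f"D_KL(reference || candidate) = {d:.4f}")

Note that the KL divergence is asymmetric, D_KL(P || Q) != D_KL(Q || P), so the choice of which recording serves as the reference matters. In a pipeline like the one the paper proposes, the resulting divergence value could serve as one input feature for a downstream machine learning classifier.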