Deep feature weighting in Naive Bayes for Chinese text classification

2016 
Naive Bayes (NB) remains one of the most popular methods for text categorization because of its simplicity, efficiency, and efficacy. In existing feature weighting approaches, the learned feature weights are incorporated only into the classification formula of naive Bayes. In this paper, we propose a highly efficient method called deep feature weighting naive Bayes (DFWNB) [1]. DFWNB incorporates the learned weights into both the classification formula and the conditional probability estimates. The weight of each feature is defined by the TF-IDF feature weighting method. In the field of data mining, there are numerous studies on English text categorization, but Chinese text classification has received less attention from researchers. Thus, we apply deep feature weighting naive Bayes to Chinese text classification and obtain better performance than ordinary feature weighting naive Bayes (OFWNB).
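The following is a minimal sketch, not the authors' implementation, of the idea described above: TF-IDF-derived feature weights are used twice, once when estimating the per-class conditional probabilities and again as per-term exponents in the classification formula. The function names, the averaging of TF-IDF into a single weight vector W, and the Laplace smoothing constant are all assumptions made for illustration.

```python
import numpy as np

def train_dfwnb(X, y, W, alpha=1.0):
    """Sketch of deep-feature-weighted training.
    X: term-frequency matrix (docs x terms), y: class labels,
    W: per-term weights (e.g. averaged TF-IDF), alpha: Laplace smoothing.
    The weights enter the conditional probability estimates here."""
    classes = np.unique(y)
    n_terms = X.shape[1]
    log_prior = np.zeros(len(classes))
    log_cond = np.zeros((len(classes), n_terms))
    for k, c in enumerate(classes):
        Xc = X[y == c]
        log_prior[k] = np.log(len(Xc) / len(X))
        # weighted term counts instead of raw counts
        weighted_counts = (Xc * W).sum(axis=0) + alpha
        log_cond[k] = np.log(weighted_counts / weighted_counts.sum())
    return classes, log_prior, log_cond

def predict_dfwnb(x, W, classes, log_prior, log_cond):
    """The same weights also enter the classification formula,
    acting as exponents on each term's conditional probability
    (i.e. multipliers in log space)."""
    scores = log_prior + (log_cond * (W * x)).sum(axis=1)
    return classes[np.argmax(scores)]

# Toy usage with hypothetical data
X = np.array([[2, 0, 1], [0, 3, 1], [1, 1, 0]], dtype=float)
y = np.array([0, 1, 0])
W = np.array([1.2, 0.8, 1.0])  # assumed TF-IDF-derived term weights
model = train_dfwnb(X, y, W)
print(predict_dfwnb(np.array([1.0, 0.0, 2.0]), W, *model))
```

In an ordinary feature weighting scheme the weights would appear only in predict_dfwnb; the distinguishing step of the "deep" variant, as the abstract states, is that train_dfwnb also weights the counts used for the conditional probability estimates.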