On Optimization of Multi-Class Logistic Regression Classifier

2013 
The classical multi-class logistic regression classifier uses Newton's method to optimize its loss function, which incurs expensive computations and an unstable iteration process. In this work, we apply two state-of-the-art optimization techniques, conjugate gradient (CG) and BFGS, to train the multi-class logistic regression classifier, and compare them experimentally with Newton's method on the classification accuracy of 20 datasets. The results show that CG and BFGS achieve better classification accuracy than Newton's method. Moreover, CG and BFGS have lower time complexity than Newton's method. Finally, we also observe that CG and BFGS demonstrate similar performance.