Feedforward Back Propagation Neural Network (FFBPNN) Based Approach for the Identification of Handwritten Math Equations

2020 
The demand for recognizing handwritten mathematical equations is increasing day by day. Despite this interest, recognition remains a challenging task because of ambiguity among symbols, two-dimensional layout, touching symbols, and the complexity of mathematical equations. Statistical as well as complex features such as skew, kurtosis, entropy, mean, variance, and standard deviation have been considered. Classification and training have been performed using neural networks (NN), and the recognition rate depends on the classifier used as well as on the features extracted. Execution speed, efficiency, and recognition rate have been enhanced by using a feed-forward back-propagation neural network (FFBPNN) with a gradient-descent training function and a learning rule based on momentum and adaptive learning. The system takes scanned images of handwritten mathematical equations, from simple to complex, and classifies them by type of equation, e.g. straight-line equation, law of indices, law of gravity, roots of a quadratic expression, area of a circle, convolution summation, and convolution integration.
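The abstract names the feature set (skew, kurtosis, entropy, mean, variance, standard deviation) and the classifier (a feed-forward network trained by back-propagation with gradient descent and momentum) but gives no implementation details. The sketch below is a minimal illustration of that pipeline, not the authors' code: it assumes grayscale images normalized to [0, 1], a 32-bin histogram for the entropy estimate, a single tanh hidden layer with a softmax output, and hypothetical class names (`extract_features`, `FFBPNN`).

```python
import numpy as np

def extract_features(img):
    """Statistical feature vector for a grayscale image with values in [0, 1]."""
    x = img.ravel().astype(float)
    mean = x.mean()
    var = x.var()
    std = x.std()
    # Skewness and kurtosis, with a small epsilon to guard zero variance.
    skew = ((x - mean) ** 3).mean() / (std ** 3 + 1e-12)
    kurt = ((x - mean) ** 4).mean() / (var ** 2 + 1e-12)
    # Shannon entropy over a 32-bin intensity histogram (an assumed bin count).
    hist, _ = np.histogram(x, bins=32, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -(p * np.log2(p)).sum()
    return np.array([skew, kurt, entropy, mean, var, std])

class FFBPNN:
    """Feed-forward net trained by back-propagation: gradient descent with momentum."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.1, momentum=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr, self.mom = lr, momentum
        # One velocity buffer per parameter, for the momentum update.
        self.v = [np.zeros_like(p) for p in (self.W1, self.b1, self.W2, self.b2)]

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)          # hidden activations
        z = self.h @ self.W2 + self.b2
        e = np.exp(z - z.max(axis=1, keepdims=True))     # stable softmax
        self.p = e / e.sum(axis=1, keepdims=True)
        return self.p

    def train_step(self, X, y_onehot):
        """One back-propagation step on a batch (softmax + cross-entropy loss)."""
        p = self.forward(X)
        n = X.shape[0]
        dz2 = (p - y_onehot) / n                         # output-layer error
        dW2 = self.h.T @ dz2
        db2 = dz2.sum(axis=0)
        dh = dz2 @ self.W2.T * (1.0 - self.h ** 2)       # tanh derivative
        dW1 = X.T @ dh
        db1 = dh.sum(axis=0)
        params = (self.W1, self.b1, self.W2, self.b2)
        grads = (dW1, db1, dW2, db2)
        for i, (pm, g) in enumerate(zip(params, grads)):
            self.v[i] = self.mom * self.v[i] - self.lr * g   # momentum update
            pm += self.v[i]
```

A classifier for the equation types listed in the abstract would map each scanned image through `extract_features` and train the network on one-hot labels for the seven classes; the adaptive-learning part of the rule (adjusting `lr` as training progresses) is omitted here for brevity.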