Downsizing training data with weighted FCM for predicting the evolution of specific grinding energy with RNNs

2017 
Abstract Grinding plays a prominent role in modern manufacturing due to its capacity for producing parts of high accuracy and precision. Among the various grinding variables, the specific grinding energy (e_c) is key because it measures the energy required to remove a unit volume of part material and therefore gives information about the performance of the grinding process. In addition, a measure of the specific grinding energy is also useful for estimating the power requirements of the grinding machine. Thus, a Recurrent Neural Network (RNN) is used for predicting e_c in a more realistic manner that captures the evolution of e_c over time. Moreover, since performing grinding experiments is a highly time- and resource-consuming task, it would be very useful to downsize the dataset required to train the RNN, since this could substantially reduce the time and costs involved in carrying out the experiments to generate the dataset, as well as in training the RNNs. Therefore, in this work a methodology combining Fuzzy C-Means (FCM) and RNNs for downsizing the dataset and predicting specific grinding energy is proposed. Unlike other approaches that reduce the dataset using FCM, in the current work the inputs are weighted. To achieve this, the knowledge is extracted from the weights of a satisfactorily trained RNN obtained in previous work. The results show that, across reduced training datasets (weighted and non-weighted FCM inputs) and non-reduced datasets (all available experiments), the RNNs obtained with the weighted approach yielded superior results. In fact, in some cases the error is halved for the reduced training dataset (weighted). Furthermore, the results show that it is more advantageous to use a reduced training dataset obtained after FCM, since this reduces the costs associated with experimental time, as well as the training time required for the RNNs.
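The core of the proposed downsizing step can be illustrated with a short sketch. The idea is to scale each input feature by an importance weight (in the paper, derived from the weights of a previously trained RNN), run Fuzzy C-Means in the weighted space, and keep only the experiments closest to each cluster center as the reduced training set. The function names, the number of clusters, and the way representatives are selected below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def weighted_fcm(X, w, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Fuzzy C-Means on feature-weighted inputs (sketch).

    X: (n_samples, n_features) experiment matrix.
    w: (n_features,) importance weights; here assumed to come from
       a previously trained RNN, as in the paper's weighted approach.
    m: fuzzifier (m > 1); 2.0 is the common default.
    """
    rng = np.random.default_rng(seed)
    Xw = X * w  # stretch important feature dimensions
    n = Xw.shape[0]
    # random initial membership matrix, each row sums to 1
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # cluster centers as membership-weighted means
        centers = (Um.T @ Xw) / Um.sum(axis=0)[:, None]
        # squared distances of every sample to every center
        d2 = ((Xw[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        d2 = np.fmax(d2, 1e-12)  # avoid division by zero
        # standard FCM membership update
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

def reduce_dataset(X, w, n_clusters=3):
    """Keep, per cluster, the experiment closest to the center;
    the indices returned form the downsized training set."""
    centers, _ = weighted_fcm(X, w, n_clusters)
    Xw = X * w
    d2 = ((Xw[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.unique(d2.argmin(axis=0))
```

Scaling the inputs by `w` before clustering is equivalent to using a weighted Euclidean distance inside FCM, so features the RNN found influential dominate the cluster geometry, and the retained experiments cover the regions that matter most for predicting e_c.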