Application of deep learning method to Reynolds stress models of channel flow based on reduced-order modeling of DNS data

2019 
Recently, deep learning has been used to improve the accuracy of Reynolds-averaged Navier-Stokes (RANS) models. In this paper, a neural network is designed to predict the Reynolds stress of a channel flow at different Reynolds numbers. The soundness and high efficiency of the neural network are validated by comparison with results from direct numerical simulation (DNS), large eddy simulation (LES), and deep neural networks (DNNs) of other studies. To further enhance the prediction accuracy, three methods are developed using different algorithms and simplified models in the neural network. In method 1, regularization is introduced and is found to effectively suppress oscillation and overfitting of the results. In method 2, y+ is added to the input variables, while in method 3 the combination of invariants is simplified. The predicted results show that the first two methods reduce the errors, and method 3 shows clear advantages in following the DNS trend and in the smoothness of the predicted curves. Consequently, it is concluded that DNNs can effectively predict the anisotropic Reynolds stress and are a promising technique for computational fluid dynamics.
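As a rough illustration of the kind of model the abstract describes, the sketch below (not the authors' code) shows a small fully connected network that maps hypothetical mean-flow invariant features plus y+ to the independent components of the anisotropic Reynolds-stress tensor, with L2 regularization on each layer in the spirit of method 1 and y+ appended to the inputs as in method 2; the feature and layer sizes are illustrative assumptions.

```python
# Minimal sketch, assuming a generic feature set; not the paper's architecture.
import numpy as np
import tensorflow as tf

N_FEATURES = 6   # assumed number of invariant input features (illustrative)
N_OUTPUTS = 6    # independent components of the symmetric Reynolds-stress tensor


def build_model(l2_weight=1e-4):
    """Small DNN with L2 (weight-decay) regularization on every dense layer."""
    reg = tf.keras.regularizers.l2(l2_weight)
    model = tf.keras.Sequential([
        # Inputs: assumed invariants plus y+ as an extra feature.
        tf.keras.layers.Dense(64, activation="relu",
                              kernel_regularizer=reg,
                              input_shape=(N_FEATURES + 1,)),
        tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=reg),
        # Outputs: predicted Reynolds-stress components.
        tf.keras.layers.Dense(N_OUTPUTS),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model


# Placeholder random data standing in for DNS-derived training samples.
X = np.random.rand(1024, N_FEATURES + 1).astype("float32")
y = np.random.rand(1024, N_OUTPUTS).astype("float32")

model = build_model()
model.fit(X, y, epochs=5, batch_size=64, verbose=0)
```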