Study on Therapeutic Dropout Rates of a Pediatric Population in South-Eastern Central Europe, Dependent on Individual Particularities
Abstract:
Specialized studies conducted over the last decade have shown that the therapeutic success of specific treatment with tuberculostatic drugs, together with a reduced risk of relapse, can only be achieved by increasing patients' adherence to treatment. This is of paramount importance for pediatric populations in particular, as they are subject to a higher rate of therapeutic dropout, which depends on a number of individual particularities (socio-demographic characteristics, the incidence of depressive disorders, pathological personal history, etc.). When we speak of patient behavior, we refer to how patients comply with the specialist's recommendations regarding medication administration, following a diet, and lifestyle changes.
Keywords: Dropout (neural networks); Therapeutic effect; Overfitting
Dropout is typically interpreted as bagging a large number of models sharing parameters. We show that using dropout in a network can also be interpreted as a kind of data augmentation in the input space without domain knowledge. We present an approach to projecting the dropout noise within a network back into the input space, thereby generating augmented versions of the training data, and we show that training a deterministic network on the augmented samples yields similar results. Finally, we propose a new dropout noise scheme based on our observations and show that it improves dropout results without adding significant computational cost.
Keywords: Dropout (neural networks)
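The projection described in the abstract above, pulling dropout noise on a layer's activations back into the input space, can be sketched for the simplest case. A minimal illustration, assuming a single linear layer with more inputs than hidden units (so the pseudo-inverse reproduces the dropped activations exactly); all names and dimensions are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single linear layer: h = W @ x
n_in, n_hidden = 8, 5
W = rng.normal(size=(n_hidden, n_in))
x = rng.normal(size=n_in)

# Conventional dropout on the hidden activations.
p = 0.5
mask = rng.random(n_hidden) > p
h_dropped = mask * (W @ x)

# Project the dropout noise back into the input space via the
# pseudo-inverse, yielding an "augmented" input x_aug whose clean
# (noise-free) forward pass reproduces the dropped activations.
x_aug = np.linalg.pinv(W) @ h_dropped
h_from_aug = W @ x_aug

print(np.allclose(h_from_aug, h_dropped))  # True: noise realized as input augmentation
```

Because W has full row rank here, the augmented input is exact; for deeper nonlinear networks the paper's projection is necessarily more involved.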
Statistics at home and abroad show that the dropout rate among distance learners far surpasses that of learners in traditional education; in Asia the rate reaches up to 50%. Overseas, research on dropout began earlier and some established theories exist, but few of them address how to actually solve the problem. In our country, research on dropout started later, and relevant theory is lacking. Based on groundwork and actual practice, this paper analyzes the characteristics of distance learners' dropout and the main reasons for it according to the data, and then proposes a series of measures to address the problem. It is hoped that this will help solve the problem of distance-learner dropout.
Keywords: Dropout (neural networks); Statistic
We explore a recently proposed Variational Dropout technique that provided an elegant Bayesian interpretation to Gaussian Dropout. We extend Variational Dropout to the case when dropout rates are unbounded, propose a way to reduce the variance of the gradient estimator and report first experimental results with individual dropout rates per weight. Interestingly, it leads to extremely sparse solutions both in fully-connected and convolutional layers. This effect is similar to automatic relevance determination effect in empirical Bayes but has a number of advantages. We reduce the number of parameters up to 280 times on LeNet architectures and up to 68 times on VGG-like networks with a negligible decrease of accuracy.
Keywords: Dropout (neural networks)
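The Gaussian Dropout that Variational Dropout builds on can be illustrated numerically: multiplicative Bernoulli noise with drop rate p matches multiplicative Gaussian noise with variance alpha = p/(1-p) in both mean and variance. A hedged NumPy sketch (values are illustrative, not from the paper, which goes further by learning unbounded per-weight rates):

```python
import numpy as np

rng = np.random.default_rng(1)
w, x = 2.0, 3.0
p = 0.4                      # Bernoulli drop rate
alpha = p / (1.0 - p)        # matched Gaussian dropout variance

n = 200_000
# Inverted Bernoulli dropout: keep with prob 1-p, rescale by 1/(1-p).
bern = (rng.random(n) > p) / (1.0 - p)
# Gaussian dropout: multiplicative noise with mean 1, variance alpha.
gauss = rng.normal(1.0, np.sqrt(alpha), n)

out_b = w * x * bern
out_g = w * x * gauss
print(out_b.mean(), out_g.mean())   # both ≈ w*x = 6
print(out_b.var(), out_g.var())     # both ≈ (w*x)^2 * alpha = 24
```

In the paper's setting, a weight whose learned alpha grows very large is effectively pure noise, which is what produces the extreme sparsity reported.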
Study on pathological diagnosis of pathological changes of 457 patients with atypical squamous cells
Objective: To discuss the pathological diagnosis of cervical lesions. Methods: The 457 patients with changes of cervical atypical squamous cells were further investigated by colposcopy and multiple cervical biopsies for pathological analysis. Results: The rates of cervical intraepithelial neoplasia II (CIN II) in the ASC-US and ASC-H groups were 8.15% (34/417) and 35% (14/40) respectively, showing a significant difference between the two groups (P<0.01). Conclusion: Pathological analysis should be carried out for early diagnosis of cervical intraepithelial neoplasms.
Dropout prediction in MOOCs is a well-researched problem in which we classify which students are likely to persist in or drop out of a course. Most research into predictive models is based on student engagement data; why students drop out has only been studied through retroactive exit surveys. This identifies an important extension to dropout prediction: how can we interpret dropout predictions at the student and model level? We demonstrate how existing MOOC dropout prediction pipelines can be made interpretable while retaining predictive performance close to existing techniques. We examine each stage of the pipeline as a design component in the context of interpretability. Our end result is a layer which longitudinally interprets both predictions and entire classification models of MOOC dropout, providing researchers with in-depth insight into why a student is likely to drop out.
Keywords: Dropout (neural networks); Interpretability; Learning Analytics
This paper extends the Stacked Denoising Autoencoder to build a deep neural network whose weights are initialized from the encoder's weights, and uses Dropout to reduce the error rate in the fine-tuning stage. The network takes student records from recent years as input for training and predicts the likelihood of dropout for students during the semester. The predictions can be used to counsel and warn students likely to drop out, reducing unnecessary expenditure of school resources.
Keywords: Dropout (neural networks); Autoencoder
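The two mechanisms the abstract combines, denoising corruption of the input and inverted dropout during fine-tuning, can be sketched as a forward pass. This is illustrative only: the random stand-in weights below would come from encoder pretraining in the paper, and `corrupt`, `forward`, and all dimensions are hypothetical names:

```python
import numpy as np

rng = np.random.default_rng(2)

def corrupt(x, noise_rate):
    """Denoising-autoencoder corruption: zero a random fraction of inputs."""
    return x * (rng.random(x.shape) >= noise_rate)

def forward(x, W1, W2, drop_rate, train=True):
    """Two-layer net with inverted dropout on the hidden layer."""
    h = np.maximum(0.0, x @ W1)            # ReLU hidden layer
    if train:
        mask = rng.random(h.shape) >= drop_rate
        h = h * mask / (1.0 - drop_rate)   # rescale so E[h] is unchanged at test time
    return h @ W2

n_students, n_feats, n_hidden = 32, 10, 16
X = rng.normal(size=(n_students, n_feats))
W1 = rng.normal(scale=0.1, size=(n_feats, n_hidden))   # would come from SDAE pretraining
W2 = rng.normal(scale=0.1, size=(n_hidden, 1))

scores = forward(corrupt(X, 0.3), W1, W2, drop_rate=0.5)
print(scores.shape)  # (32, 1): one dropout-risk score per student
```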
Data dropout in networked control systems (NCSs) is unavoidable. The desired performance may not be achieved if the data dropout is not considered. Therefore, a controller in the NCS needs to be designed which takes the data dropout into account. In this paper, the data dropout is assumed to follow a stochastic process. For the NCS, a model predictive control (MPC) method is proposed for the design of an optimal input sequence. The effectiveness of the proposed method is verified through simulations and experiments.
Keywords: Dropout (neural networks); Model Predictive Control; Networked control system
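A stochastic packet-dropout loop of the kind the abstract assumes can be simulated on a scalar plant. This minimal sketch substitutes a simple deadbeat gain for the paper's MPC design and applies a zeroing strategy when a control packet is lost; every parameter is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)

# Scalar plant x_{k+1} = a*x_k + b*u_k with Bernoulli control-packet dropout.
a, b = 1.2, 1.0          # open-loop unstable plant
drop_prob = 0.3          # probability that a control packet is lost
K = a / b                # deadbeat gain (a stand-in for the paper's MPC)

x = 5.0
traj = [x]
for _ in range(30):
    u_sent = -K * x
    if rng.random() >= drop_prob:
        u = u_sent       # packet delivered
    else:
        u = 0.0          # zeroing strategy: a lost packet means no actuation
    x = a * x + b * u
    traj.append(x)

print(f"final state: {traj[-1]:.3g}")
```

With zeroing and deadbeat control, the state snaps to the origin at the first delivered packet and stays there; under hold-last-input or a genuine MPC cost, the analysis is richer, which is the paper's subject.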
Deep neural networks (DNNs), which show outstanding performance in various areas, consume considerable amounts of memory and time during training. Our research led us to propose a controlled dropout technique with the potential of reducing the memory space and training time of DNNs. Dropout is a popular algorithm that solves the overfitting problem of DNNs by randomly dropping units in the training process. The proposed controlled dropout intentionally chooses which units to drop compared to conventional dropout, thereby possibly facilitating a reduction in training time and memory usage. In this paper, we focus on validating whether controlled dropout can replace the traditional dropout technique to enable us to further our research aimed at improving the training speed and memory efficiency. A performance comparison between controlled dropout and traditional dropout is carried out by implementing an image classification experiment on data comprising handwritten digits from the MNIST dataset (Mixed National Institute of Standards and Technology dataset). The experimental results show that the proposed controlled dropout is as effective as traditional dropout. Furthermore, the experimental result implies that controlled dropout is more efficient when an appropriate dropout rate and number of hidden layers are used.
Keywords: Dropout (neural networks); Overfitting; MNIST database
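The contrast the abstract draws between conventional and controlled dropout can be sketched directly: a random mask still computes every unit and then zeroes some, while deliberately chosen kept-unit indices let the dropped columns be sliced away entirely, shrinking the matrix multiply. The index choice and sizes below are assumptions for illustration, not the paper's selection scheme:

```python
import numpy as np

rng = np.random.default_rng(4)

n_in, n_hidden, batch = 6, 8, 4
W = rng.normal(size=(n_in, n_hidden))
X = rng.normal(size=(batch, n_in))

# Conventional dropout: a fresh random mask; all columns still computed.
p = 0.5
mask = rng.random(n_hidden) >= p
h_conventional = (X @ W) * mask

# Controlled dropout: the kept units are chosen in advance, so the
# dropped columns need never be computed or stored at all.
kept = np.array([0, 2, 4, 6])           # deliberately chosen units
h_controlled = X @ W[:, kept]           # smaller matmul: (4,6) @ (6,4)

print(h_conventional.shape, h_controlled.shape)  # (4, 8) (4, 4)
```

The smaller multiply is where the claimed memory and training-time savings would come from.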
Growth has been seen in the higher-education sector in South Africa, and with this growth an increase in the dropout rate has been observed. This study explores the effectiveness of dimensionality reduction and extracts the significant information hidden in student data for the identification of students at risk of dropout. The study relies on educational data mining techniques to forecast students' dropout intention from courses. In the experiment, the researchers show promising results with data from the recognition courses of a University of Technology.
Keywords: Dropout (neural networks); Identification
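Dimensionality reduction of the kind the abstract describes can be sketched with PCA via SVD. The study's actual student records and method details are not reproduced here; every value below is a synthetic stand-in:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-in for student records: 200 students, 12 features.
X = rng.normal(size=(200, 12))
X[:, 0] = X[:, 1] * 2 + rng.normal(scale=0.1, size=200)  # correlated columns

# PCA via SVD: project onto the top-k principal components.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
X_reduced = Xc @ Vt[:k].T

explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(X_reduced.shape)        # (200, 5)
print(explained > 0.5)        # True: top components absorb the redundancy
```

A dropout-risk classifier would then be trained on `X_reduced` instead of the raw features, which is the "significant information hidden in the student data" the abstract refers to.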