Children with autism face challenges with various skills (e.g., communication and social skills) and may exhibit challenging behaviours. These behaviours are difficult for their families, therapists, and caregivers to manage, especially during therapy sessions. In this study, we investigated the potential of several machine learning techniques and data modalities, acquired using wearable sensors from children with autism during their interactions with social robots and toys, to detect challenging behaviours. Each child wore a wearable device that collected data. Video annotations of the sessions were used to identify occurrences of challenging behaviours. Extracted time-domain features (i.e., mean, standard deviation, minimum, and maximum), in conjunction with four machine learning techniques, were used to detect challenging behaviours. Changes in heart rate variability (HRV) were also investigated. The XGBoost algorithm achieved the best performance (an accuracy of 99%). Additionally, physiological features outperformed kinetic ones, with heart rate being the main contributor to prediction performance. One HRV parameter (RMSSD) was found to correlate with the occurrence of challenging behaviours. This work highlights the importance of developing tools and methods to detect challenging behaviours among children with autism during aided sessions with social robots.
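The feature-extraction and HRV steps above can be sketched as follows. This is a minimal illustration in NumPy: the window length and the RR-interval input are assumptions for illustration, not details taken from the study.

```python
import numpy as np

def window_features(signal, win=100):
    """Split a 1-D signal into fixed-size windows and compute the four
    time-domain features used in the study: mean, std, min, and max."""
    n = len(signal) // win
    wins = np.asarray(signal[: n * win]).reshape(n, win)
    return np.column_stack([wins.mean(1), wins.std(1), wins.min(1), wins.max(1)])

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of RR intervals (ms),
    a standard short-term HRV parameter."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))
```

The resulting per-window feature matrix (one row per window, four columns) is the kind of input a classifier such as XGBoost would be trained on.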
In strategic energy planning, human-oriented factors are uncertain and lead to unpredictable challenges; decision-makers must therefore contextualize the target society to address these uncertainties. More precisely, such uncertainties cause performance gaps between assumed and actual sustainability outcomes. This study proposed a new framework that considers vital elements, including occupant motivation, preference, socioeconomic characteristics, and building features (MPSEB). To operationalize this framework, a thorough face-to-face survey questionnaire was administered to measure these elements. The study explored how these elements affect patterns of residential energy consumption in a region with numerous expatriate communities of various ethnic and cultural backgrounds. In particular, it investigated the patterns of energy behaviors and human-building interactions among the residents of Qatar by collecting empirical evidence and conducting a subsequent survey analysis. Machine learning approaches were employed to explore the survey data and determine the interdependencies between features, as well as the significance of the fundamental factors influencing human-building interactions. The XGBoost method was used to conduct a feature importance analysis and identify factors contributing to residential energy consumption. The results revealed the primary behavioral and socioeconomic factors that affect residential energy consumption, and confirmed the influence of human factors in Qatar while accounting for its diverse population.
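The study uses XGBoost's built-in, gain-based feature importances. As a hedged, model-agnostic sketch of the same idea, permutation importance measures how much predictive accuracy drops when a single feature is shuffled; the toy "model" below is purely illustrative and is not the study's trained model.

```python
import numpy as np

def permutation_importance(predict, X, y, n_repeats=10, seed=0):
    """Accuracy drop when one feature column is randomly permuted.

    A model-agnostic analogue of the gain-based importances that XGBoost
    reports; larger drops indicate more influential features."""
    rng = np.random.default_rng(seed)
    base = np.mean(predict(X) == y)  # baseline accuracy
    scores = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break feature j only
            drops.append(base - np.mean(predict(Xp) == y))
        scores[j] = np.mean(drops)
    return scores

# Illustrative use: a toy classifier that only looks at feature 0.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(int)
predict = lambda X: (X[:, 0] > 0).astype(int)
importances = permutation_importance(predict, X, y)
```

Shuffling the informative feature destroys accuracy, while shuffling the irrelevant one leaves predictions unchanged, so its importance is zero.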
Could we detect anomalies during the run time of a program by learning from the analysis of its previous traces of normally completed executions? In this paper, we create a feature data set from program traces collected at run time, either during the program's regular life or during its testing phase. This data set represents execution traces of relevant variables, including inputs, outputs, intermediate variables, and invariant checks. During a learning (mining) step, we start from exhaustive random training input sets and map program traces to a minimal set of conceptual patterns. We employ formal concept analysis to do this incrementally, without losing dependencies between data set features. This set of patterns becomes a reference for checking the normality of future program executions, as it captures invariant functional dependencies between the variables that must be preserved during execution. During the learning step, we cover enough input classes corresponding to the different patterns by using random input selection until the set of patterns stabilizes (i.e., the set almost stops changing, and the few new patterns that are not reducible to it are negligible). Experimental results show that the generated patterns are significant in representing normal program executions. They also enable the detection of various forms of executable code contamination at early stages. The proposed method is general and modular; applied systematically, it enhances software resilience against abnormal and unpredictable events.
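The core of the formal concept analysis step is the closure operator: given a set of attributes, it returns every attribute shared by all objects exhibiting that set. A minimal sketch follows; the trace names and invariant labels in the example context are invented for illustration.

```python
def closure(attrs, context):
    """FCA closure: all attributes common to every object that has `attrs`.

    `context` maps each object (e.g., an execution trace) to its set of
    attributes. A pattern is "closed" when closure(attrs) == attrs."""
    extent = [o for o, a in context.items() if attrs <= a]  # objects with attrs
    if not extent:  # no object matches: closure is the full attribute set
        return set().union(*context.values()) if context else set()
    out = set(context[extent[0]])
    for o in extent[1:]:
        out &= context[o]  # intersect the attribute sets of the extent
    return out

# Illustrative context: three traces and the invariants they satisfy.
traces = {
    "t1": {"x>0", "x<y"},
    "t2": {"x>0", "x<y", "y<10"},
    "t3": {"x<y", "y<10"},
}
```

Here `closure({"x>0"}, traces)` yields `{"x>0", "x<y"}`: every trace satisfying `x>0` also satisfies `x<y`, i.e., the closure exposes a functional dependency of exactly the kind the method uses as a normality reference.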
Autism spectrum disorder is a neurodevelopmental disorder characterized by restricted patterns of behaviour and difficulties with social communication and interaction. Children on the spectrum exhibit atypical, restricted, repetitive, and challenging behaviours. In this study, we investigate the feasibility of integrating wearable sensors and machine learning techniques to detect the occurrence of challenging behaviours in real time. A session in which a child with autism interacted with different stimulus groups, including social robots, was annotated with observed challenging behaviours. The child wore a wearable device that captured different motion and physiological signals. Different features and machine learning configurations were investigated to identify the most effective combination. Our results showed that physiological signals, in addition to typical kinetic measures, led to more accurate predictions. The best combination of features and learning model achieved an accuracy of 97%. These findings motivate research toward methods for early detection of challenging behaviours, which may enable timely intervention by caregivers and possibly by social robots.
Accurately modelling and forecasting electricity consumption is a key prerequisite for strategic sustainable energy planning and development. In this study, we use four advanced econometric time series models (an AR with seasonality, ARX, ARFIMAX, and 3S-MSARX) and four machine learning (ML) and deep learning models (Prophet, XGBoost, LSTM, and SVR) to analyze and forecast electricity consumption during the COVID-19 pre-lockdown, lockdown, lockdown-easing, and post-lockdown phases. We use monthly data on Qatar's total electricity consumption from January 2010 to December 2021. The empirical findings demonstrate that both econometric and ML models can capture most of the important statistical features characterizing electricity consumption (e.g., seasonality, sudden changes, outliers, trend, and the potential long-lasting impact of shocks). In particular, we find that climate-related factors, e.g., temperature, rainfall, mean sea-level pressure, and wind speed, are key determinants of electricity consumption. In terms of forecasting, the results indicate that the ARFIMAX(1,d,0) and 3S-MSARX(1) models outperform all other models. Policy implications and energy-environmental recommendations are proposed and discussed.
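At the heart of the ARFIMAX model is the fractional differencing operator (1 − L)^d, whose binomial-expansion weights decay slowly and thereby capture long memory. A minimal NumPy sketch of the filter follows; the truncation length is an assumption, not a parameter from the study.

```python
import numpy as np

def frac_diff_weights(d, n_lags):
    """First n_lags weights of (1 - L)^d, via the standard recursion
    w_0 = 1, w_k = -w_{k-1} * (d - k + 1) / k."""
    w = np.empty(n_lags)
    w[0] = 1.0
    for k in range(1, n_lags):
        w[k] = -w[k - 1] * (d - k + 1) / k
    return w

def frac_diff(series, d, n_lags=100):
    """Apply (truncated) fractional differencing to a 1-D series:
    y_t = sum_k w_k * x_{t-k}."""
    x = np.asarray(series, dtype=float)
    w = frac_diff_weights(d, min(n_lags, len(x)))
    return np.array([w[: t + 1] @ x[t::-1] for t in range(len(x))])
```

For d = 1 the weights reduce to (1, −1, 0, …), i.e., ordinary first differencing; non-integer d between 0 and 1 interpolates between levels and differences, which is what lets ARFIMA-type models fit slowly decaying autocorrelations.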
Accurate forecasting of environmental pollution indicators holds significant importance in diverse fields, including climate modeling, environmental monitoring, and public health. In this study, we investigate a wide range of machine learning and deep learning models to enhance Aerosol Optical Depth (AOD) predictions for the Arabian Peninsula (AP), one of the world's main dust source regions. Additionally, we explore the impact of feature extraction, and of its different types, on the forecasting performance of each of the proposed models. Preprocessing of the data involves imputing missing values, deseasonalization, and normalization. Subsequently, hyperparameters are optimized for each model using grid search. The empirical results for the basic, hybrid, and combined models reveal that the convolutional long short-term memory and Bayesian ridge models significantly outperform the other basic models. Moreover, among the combined models, the weighted averaging scheme exhibits remarkable predictive accuracy, outperforming the individual models and demonstrating superior performance in longer-term forecasts. Our findings emphasize the efficacy of combining distinct models and highlight the potential of the convolutional long short-term memory and Bayesian ridge models for univariate time series forecasting, particularly in the context of AOD prediction. These accurate daily forecasts bear practical implications for policymakers in areas such as tourism, transportation, and public health, enabling better planning and resource allocation.
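Grid search, as used here for hyperparameter optimization, simply evaluates every combination in a parameter grid and keeps the best one. A self-contained sketch follows; the toy objective stands in for an actual train-and-validate routine, and the parameter names are illustrative.

```python
from itertools import product

def grid_search(evaluate, grid):
    """Exhaustive search: try every combination in `grid` and return
    (best_score, best_params), where lower scores are better."""
    keys = list(grid)
    best_score, best_params = float("inf"), None
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = evaluate(params)  # e.g., validation error of a fitted model
        if score < best_score:
            best_score, best_params = score, params
    return best_score, best_params

# Toy objective standing in for "fit model, return validation error".
score, params = grid_search(
    lambda p: (p["lr"] - 0.1) ** 2 + p["depth"],
    {"lr": [0.01, 0.1, 1.0], "depth": [0, 1, 2]},
)
```

The cost grows multiplicatively with the grid sizes (here 3 × 3 = 9 evaluations), which is why grid search is usually paired with a small, coarse grid per model, as in this study.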
Accurately modelling and forecasting electricity consumption remains a challenging task due to the large number of statistical properties that characterize this time series, such as seasonality, trend, sudden changes, and the slow decay of the autocorrelation function, among many others. This study contributes to the literature by using and comparing four advanced econometric time series models and four machine learning and deep learning models to analyze and forecast electricity consumption during the COVID-19 pre-lockdown, lockdown, lockdown-easing, and post-lockdown phases. Monthly data on Qatar's total electricity consumption from January 2010 to December 2021 are used. The empirical findings demonstrate that both econometric and machine learning models are able to capture most of the important statistical features characterizing electricity consumption. In particular, it is found that climate-related factors, e.g., temperature, rainfall, mean sea-level pressure, and wind speed, are key determinants of electricity consumption. In terms of forecasting, the results indicate that the autoregressive fractionally integrated moving average model and the three-state autoregressive Markov-switching model, both with exogenous variables, outperform all other models. Policy implications and energy-environmental recommendations are proposed and discussed.
Currently, during runtime, programs are mostly uncontrollable objects: they are vulnerable to transient or permanent contaminations of the program state, whether external or internal. This paper presents a new method that combines a theoretical approach to goal-oriented software fault tolerance with Intel SGX technology. Our approach consists in protecting program-critical information using enclaves from the SGX security technology, and in regularly checking that this information is preserved. Each module in the software is split into two parts: a trusted part that protects all operations on critical information, and an untrusted part for the remaining operations. The program contains two kinds of recovery routines: first, recovery from endless loops, and second, goal-oriented recovery by repeating each module call until consensus (i.e., two consecutive executions of the module give the same output). Applied to several programs, the method successfully recovered from various transient faults with a minor time-complexity overhead. The method proposes a new fault-tolerant program structure. Designers should use such an approach systematically to avoid software failures caused by several types of transient faults. By protecting the code that preserves critical information, we reinforce the initial fault tolerance of programs. This solution is well suited to critical applications requiring high security.
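The consensus-based recovery routine described above can be sketched language-agnostically; the following is a Python illustration rather than the SGX/C setting of the paper, with a deliberately flaky module standing in for one hit by a transient fault.

```python
def run_until_consensus(module, args, max_tries=10):
    """Goal-oriented recovery: re-execute `module` until two consecutive
    executions return the same output (consensus), masking transient faults."""
    prev = module(*args)
    for _ in range(max_tries - 1):
        cur = module(*args)
        if cur == prev:  # two consecutive runs agree: accept the result
            return cur
        prev = cur
    raise RuntimeError("no consensus: suspected permanent fault")

# Illustration: a module hit by a transient fault on its first execution.
calls = {"n": 0}
def flaky_square(x):
    calls["n"] += 1
    return -1 if calls["n"] == 1 else x * x  # transient fault, then correct
```

Here `run_until_consensus(flaky_square, (3,))` takes three executions: the faulty one, then two agreeing ones, after which the consensus value is accepted. A permanent fault would keep changing (or never agreeing) and trip the `max_tries` bound instead.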