Early detection of gait disorders may enable safer living for elderly people. In this paper, we propose an automatic method for detecting gait disorders using an RGB or RGB-D camera (e.g., MS Kinect, Asus Xtion PRO). We use the Gait Energy Image (GEI), which can be computed from different views, as our main feature. Our method consists of computing the GEI, learning representative features from it with a convolutional autoencoder, and applying an anomaly detection method to identify abnormal gait. We evaluated the proposed method on two public datasets that include normal and abnormal gait recorded from different views. Experimental results show that our method achieves high accuracy in detecting different gait disorders across views, which makes it general enough to be applied in home environments and takes a step toward convenient in-home automatic health care services.
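A minimal sketch of the pipeline described in this abstract: a GEI is computed by averaging aligned binary silhouettes, a convolutional autoencoder learns a compact representation, and reconstruction error is used as the anomaly score. The network layout, the 64x64 input size, and the reconstruction-error thresholding rule are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch only: GEI + convolutional autoencoder + reconstruction-error anomaly score.
import numpy as np
import torch
import torch.nn as nn

def gait_energy_image(silhouettes):
    """Average a sequence of aligned binary silhouettes of shape (T, H, W) -> (H, W)."""
    return np.mean(np.asarray(silhouettes, dtype=np.float32), axis=0)

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                       # 1x64x64 -> 16x16x16
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(                       # 16x16x16 -> 1x64x64
            nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(8, 1, 2, stride=2), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def anomaly_score(model, gei):
    """Reconstruction error of a single GEI; higher means more likely abnormal."""
    x = torch.from_numpy(gei).view(1, 1, *gei.shape)
    with torch.no_grad():
        return torch.mean((model(x) - x) ** 2).item()

# Usage (assumption): train the autoencoder on GEIs of normal gait only, then
# flag a test GEI as abnormal when its score exceeds a validation-chosen threshold.
```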
This paper proposes a new technique for action clustering-based human action representation built on optical flow analysis and the random sample consensus (RANSAC) method. The apparent motion of the human subject with respect to the background is detected using optical flow analysis, while the RANSAC algorithm is used to filter out unwanted interest points. From the remaining key interest points, the human subject is localized and the rectangular area surrounding the human body is segmented both horizontally and vertically. Next, the percentage change of interest points in each small block at the intersections of the horizontal and vertical segments is accumulated from frame to frame in matrix form for different persons performing the same action. The average of these matrices is used as the feature vector for that particular action. In addition, the changes in the person's position along the X- and Y-axes are accumulated over an action and included in the feature vector. For recognition using the extracted feature vectors, a distance-based similarity measure and a support vector machine (SVM) classifier are employed. Extensive experiments on benchmark motion databases show that the proposed method offers not only a very high degree of accuracy but also computational savings.
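An illustrative sketch, not the authors' implementation, of the main steps in this abstract: interest points are tracked with pyramidal Lucas-Kanade optical flow, RANSAC rejects points inconsistent with a fitted motion model (the concrete criterion here, a global affine model via cv2.estimateAffine2D, is an assumption), and the subject's bounding box is split into a grid so per-block interest-point counts can be compared frame to frame.

```python
# Sketch only: optical-flow tracking, RANSAC filtering, block-wise count matrix.
import cv2
import numpy as np

def track_and_filter(prev_gray, curr_gray, max_corners=300):
    """Return RANSAC-filtered (prev, curr) point pairs between two grayscale frames."""
    p0 = cv2.goodFeaturesToTrack(prev_gray, max_corners, 0.01, 5)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    p0, p1 = p0[status.ravel() == 1], p1[status.ravel() == 1]
    # RANSAC inlier mask w.r.t. a global affine model; points flagged as
    # outliers are treated here as unwanted/noisy matches and discarded.
    _, inliers = cv2.estimateAffine2D(p0, p1, method=cv2.RANSAC)
    keep = inliers.ravel() == 1
    return p0[keep].reshape(-1, 2), p1[keep].reshape(-1, 2)

def block_count_matrix(points, bbox, grid=(4, 4)):
    """Histogram interest points over a grid covering the subject's bounding box."""
    x0, y0, w, h = bbox
    counts = np.zeros(grid, dtype=np.float32)
    for x, y in points:
        col = min(int((x - x0) / w * grid[1]), grid[1] - 1)
        row = min(int((y - y0) / h * grid[0]), grid[0] - 1)
        if 0 <= row < grid[0] and 0 <= col < grid[1]:
            counts[row, col] += 1
    return counts

# Per-frame matrices are differenced and accumulated over a clip; the averaged
# matrix, plus cumulative X/Y displacement of the bounding box, would form the
# feature vector fed to a distance-based matcher or an SVM (e.g., sklearn.svm.SVC).
```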
Autism is a neuro-developmental condition characterized by a number of unconventional behaviors, such as restricted and repetitive activities, and it is largely marked by deficits in communication and social interaction. This makes it difficult for autistic individuals, especially children, to comply with research aimed at understanding the condition. However, with the availability of non-invasive eye-tracking technology, this problem has become easier to deal with. The present study examines visual face-scanning patterns and emotion recognition in 21 autistic children and 21 typically developing (TD) control children shown pictures of the six basic emotions (happy, sad, angry, disgusted, fearful, and surprised). A Tobii EyeX Controller was used to acquire the gaze data, which was processed and analyzed in MATLAB. The results revealed that children with autism look less at the core features of the face (eyes, nose, and mouth) while scanning faces and have more difficulty perceiving the correct emotion than typically developing children. This atypical face scanning and reduced preference for the core facial features may explain why autistic individuals have trouble understanding others' emotions and struggle more broadly with communication and social interaction. To explore this further, integrated eye-tracking, neuroimaging, and behavioral studies are needed.
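A small sketch of the kind of area-of-interest (AOI) measure implied by this abstract: the fraction of gaze samples that land on the core facial features (eyes, nose, mouth). The original analysis was performed in MATLAB; this Python version, and the rectangular AOI format, are assumptions used purely for illustration.

```python
# Sketch only: proportion of gaze samples inside core-feature AOIs.
from typing import Dict, List, Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height) in pixels

def core_feature_ratio(gaze: List[Tuple[float, float]],
                       aois: Dict[str, Rect]) -> float:
    """Fraction of gaze samples landing inside any core-feature AOI."""
    def inside(pt: Tuple[float, float], rect: Rect) -> bool:
        x, y = pt
        rx, ry, rw, rh = rect
        return rx <= x <= rx + rw and ry <= y <= ry + rh
    if not gaze:
        return 0.0
    hits = sum(any(inside(pt, r) for r in aois.values()) for pt in gaze)
    return hits / len(gaze)

# Usage (hypothetical AOI coordinates): compare this ratio between the autism
# and TD groups for each emotion image.
# core_feature_ratio(samples, {"eyes": (210, 150, 220, 60),
#                              "nose": (280, 210, 80, 90),
#                              "mouth": (255, 310, 130, 60)})
```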