    RFnet: Automatic Gesture Recognition and Human Identification Using Time Series RFID Signals
    18 Citations · 30 References · 10 Related Papers
    Keywords:
    Identification
    Interface
    Feature
    Signal
    This paper presents an overview of human hand gesture recognition, covering the general steps of a recognition pipeline, the common methods and techniques for each step, current research directions, and a summary of successful hand gesture models. We also apply this knowledge to analyze the problem and build a program that recognizes hand gestures. The program uses image processing techniques to let users interact with a computer through common gestures: the user performs hand gestures in front of a camera connected to the computer, and the program analyzes the recorded gesture and carries out the action the user intends. The application aims to replace the mouse and keyboard, giving users a more flexible and convenient way to operate the computer.
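
    As a rough illustration of the kind of image processing pipeline such a camera-based program might use, the sketch below segments a hand by skin color and extracts its contour with OpenCV. The HSV thresholds, camera index, and the "largest contour is the hand" rule are assumptions for illustration, not details from the paper.

    # Minimal sketch of camera-based hand segmentation (assumed HSV skin range, not from the paper).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                      # default webcam; index is an assumption
    lower_skin = np.array([0, 30, 60], np.uint8)   # illustrative HSV skin-color bounds
    upper_skin = np.array([20, 150, 255], np.uint8)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower_skin, upper_skin)          # binary skin mask
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        # OpenCV 4.x return signature: (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            hand = max(contours, key=cv2.contourArea)            # assume largest blob is the hand
            hull = cv2.convexHull(hand)                          # hull/defects could drive gesture rules
            cv2.drawContours(frame, [hull], -1, (0, 255, 0), 2)
        cv2.imshow("hand", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

    A real gesture-to-action mapping would sit on top of this, for example counting convexity defects of the hull to distinguish open-hand from fist gestures.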
    An EEG signal-based emotion recognition system was designed. The system was developed to operate as a user-independent system, based on MAPL (minority affective picture library), an EEG-signal database obtained from subjects of multiple ethnic minority groups. The system consists of preprocessing, feature extraction, and pattern classification stages; the preprocessing and feature extraction methods were devised so that emotion-specific characteristics could be extracted. A simple experiment was carried out, and the classification accuracy was about 56.4%, which indicates that the emotions of ethnic minorities can be studied with the MAPL-based emotion recognition system.
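
    The abstract does not give implementation details, but a user-independent pipeline of this shape (preprocessing, feature extraction, classification) might look roughly like the sketch below. The frequency bands, sampling rate, and choice of classifier are assumptions for illustration only.

    # Illustrative EEG emotion-recognition pipeline: band-pass filtering, band-power
    # features, and a classifier. Bands, sampling rate, and classifier are assumptions.
    import numpy as np
    from scipy.signal import butter, filtfilt, welch
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    FS = 128  # assumed sampling rate (Hz)
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands

    def preprocess(trial):
        """Band-pass filter one trial of shape (n_channels, n_samples)."""
        b, a = butter(4, [1, 45], btype="band", fs=FS)
        return filtfilt(b, a, trial, axis=-1)

    def band_power_features(trial):
        """Average power in each band per channel, flattened into one feature vector."""
        freqs, psd = welch(trial, fs=FS, nperseg=FS, axis=-1)
        feats = []
        for lo, hi in BANDS.values():
            idx = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, idx].mean(axis=-1))
        return np.concatenate(feats)

    def evaluate(trials, labels):
        """User-independent evaluation via cross-validation over all subjects' trials."""
        X = np.stack([band_power_features(preprocess(t)) for t in trials])
        return cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()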
    Data pre-processing
    Feature
    Signal
    Summary: In three-dimensional human-computer interaction, the identification of dynamic and static gestures is an important and challenging problem in machine vision. In this paper, we propose a new gesture recognition system based on the Leap Motion device, a sensor designed specifically for hand tracking that provides feature data for real-time gesture recognition. The system consists of two parts. For static gestures, we extract feature information based on the distance, direction, and bending degree of the fingertips, and train a support vector machine to perform static gesture recognition. For dynamic gestures, we use gesture length as a benchmark to reject non-key gestures and to preprocess frames with abnormal gesture sequences. The average recognition rate reaches 99.98% for static gestures and 96.20% for dynamic gestures. The experimental results show that the algorithm performs well on gesture recognition and is suitable for simple gesture-based interaction between people and for the daily communication of people with communication barriers.
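
    A minimal sketch of the static-gesture part (fingertip-derived features fed to a support vector machine) could look like the code below. The frame structure and the exact feature layout are hypothetical stand-ins for the authors' Leap Motion features.

    # Sketch: train an SVM on fingertip-derived features for static gestures.
    # The feature layout (distance, direction, bend per finger) mirrors the abstract's
    # description; the frame dictionary structure is an assumption.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split

    def fingertip_features(frame):
        """Hypothetical helper: per finger, fingertip-to-palm distance, direction
        (3 components), and bending angle -> 5 fingers * 5 values = 25 dimensions."""
        feats = []
        for finger in frame["fingers"]:                  # assumed frame structure
            feats.append(finger["tip_to_palm_distance"])
            feats.extend(finger["direction"])            # unit vector, 3 components
            feats.append(finger["bend_angle"])
        return np.array(feats)

    def train_static_classifier(frames, labels):
        X = np.stack([fingertip_features(f) for f in frames])
        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
        clf.fit(X_tr, y_tr)
        return clf, clf.score(X_te, y_te)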
    Benchmark
    Feature
    Citations (10)
    Our real-time continuous gesture recognition system addresses problems that have previously been neglected: handling both gestures that are characterized by distinct paths and gestures characterized by distinct hand poses; and determining how and when the system should respond to gestures. Our probabilistic recognition framework based on hidden Markov models (HMMs) unifies the recognition of the two forms of gestures. Using information from the hidden states in the HMM, we can identify different gesture phases: the pre-stroke, the nucleus and the post-stroke phases. This allows the system to respond appropriately to both gestures that require a discrete response and those needing a continuous response. Our system is extensible: in only a few minutes, users can define their own gestures by giving a few examples rather than writing code. We also collected a new gesture dataset that contains the two forms of gestures, and propose a new hybrid performance metric for evaluating gesture recognition methods for real-time interaction.
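
    A toy sketch of reading gesture phases off the decoded hidden-state sequence of an HMM is given below, using standard Viterbi decoding. The transition values and the state-to-phase mapping are made up for illustration and are not the paper's model.

    # Toy Viterbi decoding over a 3-state left-to-right HMM whose states are interpreted
    # as gesture phases (pre-stroke, nucleus, post-stroke). All parameters are illustrative.
    import numpy as np

    PHASES = ["pre-stroke", "nucleus", "post-stroke"]

    A = np.array([[0.8, 0.2, 0.0],   # pre-stroke -> nucleus
                  [0.0, 0.8, 0.2],   # nucleus -> post-stroke
                  [0.0, 0.0, 1.0]])
    pi = np.array([1.0, 0.0, 0.0])

    def viterbi(log_emission, A, pi):
        """log_emission: (T, n_states) log-likelihoods of each frame under each state."""
        T, n = log_emission.shape
        delta = np.full((T, n), -np.inf)
        back = np.zeros((T, n), dtype=int)
        delta[0] = np.log(pi + 1e-12) + log_emission[0]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + np.log(A + 1e-12)   # (from, to)
            back[t] = scores.argmax(axis=0)
            delta[t] = scores.max(axis=0) + log_emission[t]
        states = np.zeros(T, dtype=int)
        states[-1] = delta[-1].argmax()
        for t in range(T - 2, -1, -1):
            states[t] = back[t + 1, states[t + 1]]
        return states

    # Example: phase labels for a synthetic sequence of per-frame state log-likelihoods.
    log_em = np.log(np.random.dirichlet(np.ones(3), size=50))
    phases = [PHASES[s] for s in viterbi(log_em, A, pi)]

    Knowing which frames fall in the nucleus phase is what lets such a system respond discretely at the end of a gesture or continuously while the nucleus is still being performed.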
    Citations (35)
    Gesture recognition is an emerging technology that can change how we work and live. A gesture is a sign made by a human being, originating from the face, the hands, or any other part of the body. Gestures can be captured using scanning or video methods and processed into signals a machine can interpret. Gesture recognition aims at interpreting human gestures with the help of mathematical algorithms. A gesture-based user interface offers advantages in ease of access and in natural human-machine interaction.
    Interface
    Citations (0)
    Hand gesture detection is a project that recognizes hand gestures and responds accordingly. Hand gesture recognition is an important technique for creating user-friendly interfaces: robots, for example, can take commands from human gestures, and people who are deaf or unable to speak can use sign language for communication. In video games, hand gesture recognition could let players interact with the game through gestures rather than a controller. To account for the vast number of possible hand positions in three dimensions, such an algorithm must be robust, and it must also be capable of working with video rather than static images.
    American Sign Language
    Citations (0)
    This paper proposes the control of smart home environments such as lights and curtains using body gestures. We use a forward spotting scheme that executes gesture segmentation and recognition simultaneously. The start and end points of gestures are determined by zero crossings, from negative to positive (or from positive to negative), of a competitive differential observation probability, defined as the difference in observation probability between the maximal gesture and the non-gesture. We also use sliding windows and accumulative HMMs. We apply the proposed simultaneous gesture segmentation and recognition method to recognize upper-body gestures for controlling the curtains and lights in a smart home environment. Experimental results show that the proposed method achieves a good recognition rate of 95.42% for continuously changing gestures.
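
    A small sketch of the spotting rule described here is given below, assuming the per-frame observation probabilities of the best gesture model and of the non-gesture model are already available; the function names and the synthetic example are illustrative, not the authors' implementation.

    # Sketch of forward spotting via zero crossings of a competitive differential
    # observation probability (CDOP): a gesture starts where the CDOP crosses from
    # negative to positive and ends where it crosses back. Inputs are assumed to be
    # per-frame log-probabilities from the best gesture HMM and the non-gesture HMM.
    import numpy as np

    def spot_gestures(logp_gesture, logp_non_gesture):
        """Return (start, end) frame index pairs of spotted gesture segments."""
        cdop = np.asarray(logp_gesture) - np.asarray(logp_non_gesture)
        sign = np.sign(cdop)
        segments, start = [], None
        for t in range(1, len(cdop)):
            if sign[t - 1] <= 0 and sign[t] > 0:            # negative -> positive: start
                start = t
            elif sign[t - 1] > 0 and sign[t] <= 0 and start is not None:
                segments.append((start, t))                 # positive -> negative: end
                start = None
        if start is not None:                               # gesture still active at sequence end
            segments.append((start, len(cdop) - 1))
        return segments

    # Example with synthetic probabilities: one gesture roughly in frames 31..69.
    t = np.arange(100)
    cdop_demo = np.where((t > 30) & (t < 70), 1.0, -1.0)
    print(spot_gestures(cdop_demo, np.zeros_like(cdop_demo)))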
    Sliding window
    Home Automation
    Citations (16)