Time series matching for biometric visual passwords

2017 
User authentication through silent utterance of a secret phrase, a biometric visual password, has previously been attempted mainly using image-based features extracted from video. Using state-of-the-art face tracking, this problem can be framed as a high-dimensional time series matching problem over the motion of a select set of lip points. One major advantage is the small amount of training data needed. We deploy the time and shape correspondence (TSC) matching algorithm given its superior performance on multidimensional signals with shape and in the presence of noise. We report the results of a user study with 22 participants uttering the password "siggraph rocks". This database, along with other human action databases we created for gait and gestures, is made publicly available for comparison studies by other researchers.
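The abstract frames authentication as matching high-dimensional trajectories of tracked lip points across utterances. The TSC algorithm itself is not described in this section, so the sketch below uses ordinary multivariate dynamic time warping as a stand-in matcher; the function names, the acceptance threshold, and the trajectory layout (one row per video frame, columns holding the x, y coordinates of the tracked lip points) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: plain multivariate DTW as a stand-in for the
# paper's TSC matcher. A trajectory has shape (T, D): T video frames,
# D = 2 * number of tracked lip points (x, y per point).

import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """DTW distance between two multivariate time series a (Ta, D) and b (Tb, D)."""
    ta, tb = len(a), len(b)
    cost = np.full((ta + 1, tb + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, ta + 1):
        for j in range(1, tb + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # Euclidean frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[ta, tb])

def authenticate(probe: np.ndarray, enrolled: list, threshold: float) -> bool:
    """Accept the probe utterance if it is close enough to any enrolled sample,
    reflecting the small-training-data setting described in the abstract."""
    return min(dtw_distance(probe, e) for e in enrolled) <= threshold
```

In use, a few enrollment utterances of the secret phrase would be stored per user, and a new utterance would be accepted when its minimum distance to the enrolled set falls below a threshold chosen on validation data; the threshold value here is purely hypothetical.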