Eye tracking to establish a hierarchy of attention with an online fashion video

2014 
Engaging customers online is fast becoming a focus for entrepreneurs, researchers and marketers, as it offers a platform with a lower barrier to entry and is heavily used by the tech-savvy millennial generation (aged 18-24) through social applications such as Facebook, Twitter, Instagram and Snapchat. Often used as an entertainment source in place of television shows, online videos have become a new and socially acceptable way for today's millennial generation to interact with peers as well as with various brands. This research explores how fashion videos have risen in prominence and how they are perceived by the millennial generation. It examines attitude formation and online image processing to generate a hierarchical list of traits that participants focused upon when viewing a fashion video. The result is an established order for engaging, directing and holding the attention of audiences for the longest time possible: the eyes of the people in the video, lips, motion, size, images, colour, text style and, lastly, position. Optometric (eye gaze tracking) technology was used to capture where participants directed their focus, enabling a comparison of what participants believed they valued with what they actually focused upon; 30 semi-structured interviews and pre/post questionnaires measured perceptions of the portrayed brand. This article articulates how the generated hierarchy reflects a social referencing scheme on the viewer's part, followed by an attribute information search of the situation and branded objects portrayed. Additionally, this study, unsurprisingly, found that participants did not fully remember the videos they were exposed to, nor the content upon which they had directly focused. However, it is important to note that participants could recall considerably more information when their pupils dilated, presenting an opportunity for additional research.

Shapeshifting Conference, Auckland University of Technology, 14-16 April 2014
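
The abstract does not describe how the attention hierarchy was computed from the gaze data. As an illustration only, a minimal sketch of one plausible approach is shown below: aggregating fixation durations per area of interest (AOI) and ranking the AOIs by total dwell time. The record layout, AOI labels and aggregation rule are assumptions, not the authors' method.

```python
# Illustrative sketch (not from the paper): ranking areas of interest (AOIs)
# by total fixation duration to derive an attention hierarchy from gaze data.
# The AOI labels mirror the traits listed in the abstract; the record format
# and the duration-based aggregation are assumptions for illustration.
from collections import defaultdict

# Each fixation record: (participant_id, aoi_label, duration_ms) -- hypothetical format.
fixations = [
    ("p01", "eyes", 420), ("p01", "lips", 310), ("p01", "text_style", 90),
    ("p02", "eyes", 380), ("p02", "motion", 260), ("p02", "colour", 120),
]

def attention_hierarchy(records):
    """Sum fixation duration per AOI and return labels ranked by total dwell time."""
    totals = defaultdict(int)
    for _, aoi, duration_ms in records:
        totals[aoi] += duration_ms
    return sorted(totals, key=totals.get, reverse=True)

print(attention_hierarchy(fixations))
# e.g. ['eyes', 'lips', 'motion', 'colour', 'text_style']
```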