Computer vision based recognition of behavior phenotypes of laying hens

2005 
Automated surveillance of individual animal behavior, by means of low-cost cameras and computer vision techniques, can generate continuous data providing an objective measure of behavior without disturbing the animals. The purpose of this study was to develop an automatic computer vision technique capable of continuously measuring the trajectory and rotation behavior of an individual laying hen and of recognizing six behavior phenotypes (standing, sitting, sleeping, grooming, scratching, and pecking). To this end, a model-based algorithm was developed, based on the observation that behavior can be described as a time series of subsequent postures. The quantification of the hen's posture consists of its position, its orientation, and a set of parameters describing its shape, obtained by fitting a point distribution model to the hen's outline. Applying this algorithm to subsequent images in a video sequence yields the successive values of the hen's posture parameterization, which represent the hen's behavior within that sequence. The time series of positions and orientations directly quantifies the hen's trajectory and rotation behavior. From the time series of shape parameters, the behavior is classified as one of the six behavior phenotypes for which the system is trained. Training was done by modeling a template for each behavior phenotype from training video sequences of that phenotype, labeled by a trained ethologist. Classification uses template matching: the behavior in a new video fragment is assigned to the phenotype whose template gives the best match. The system was tested on a set of over 14,000 video fragments of a single hen in a cage, each fragment containing one of the six behavior types. Classification rates ranged from 70% to 96%, except for pecking (21%), owing to unreliable tracking of the hen's head. The best results were obtained for sleeping (96%) and standing (90%).
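To make the posture parameterization concrete, the following is a minimal sketch of fitting a point distribution model to a hen outline, in the standard Cootes-style formulation. It assumes the mean shape and variation modes have already been obtained by PCA over corresponding training outlines; the function name fit_pdm, the parameter n_modes, and the similarity-alignment step are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_pdm(outline, mean_shape, modes, n_modes=8):
    """Project an outline onto a point distribution model (sketch).

    outline, mean_shape : (N, 2) arrays of corresponding contour points.
    modes               : (2N, K) matrix of PCA eigenvectors learned from
                          training outlines (assumed precomputed).
    Returns position (centroid), orientation, scale, and shape parameters b.
    """
    # Position: centroid of the observed outline.
    position = outline.mean(axis=0)
    x = outline - position
    m = mean_shape - mean_shape.mean(axis=0)

    # Orientation and scale from a closed-form 2-D similarity (Procrustes)
    # fit of the mean shape to the observed outline.
    a = (m * x).sum()                                  # sum of dot products
    b_ = (m[:, 0] * x[:, 1] - m[:, 1] * x[:, 0]).sum()  # sum of cross products
    orientation = np.arctan2(b_, a)
    scale = np.hypot(a, b_) / (m ** 2).sum()

    # Remove pose, then express the residual shape in the PCA basis.
    c, s = np.cos(-orientation), np.sin(-orientation)
    R = np.array([[c, -s], [s, c]])
    aligned = (x @ R.T) / scale
    residual = (aligned - m).reshape(-1)
    b = modes[:, :n_modes].T @ residual                # shape parameters

    return position, orientation, scale, b
```

Running fit_pdm on each frame of a video sequence produces the time series of positions and orientations (trajectory and rotation) and of shape parameters b that the classification stage operates on.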
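The template-matching classification could then look like the sketch below. The abstract does not specify the matching criterion or how fragments are time-normalized, so the mean-squared distance, the fixed-length series, and the mean-series templates here are assumptions for illustration only.

```python
import numpy as np

def build_templates(training_fragments):
    """Model one template per phenotype as the mean shape-parameter series
    over the ethologist-labeled training fragments (assumed equal length)."""
    return {phenotype: np.mean(np.stack(series_list), axis=0)
            for phenotype, series_list in training_fragments.items()}

def classify_fragment(shape_series, templates):
    """Assign a video fragment to the phenotype whose template matches best.

    shape_series : (T, K) time series of shape parameters b from one fragment.
    templates    : dict mapping phenotype name -> (T, K) template series.
    Returns the best-matching phenotype and all match scores.
    """
    scores = {}
    for phenotype, template in templates.items():
        # Hypothetical matching criterion: mean squared deviation from the
        # template (lower is a better match).
        scores[phenotype] = np.mean((shape_series - template) ** 2)
    best = min(scores, key=scores.get)
    return best, scores
```

Under this scheme, a fragment of, say, sleeping behavior should yield its lowest score against the sleeping template; the reported 21% rate for pecking suggests the head-dependent shape parameters were too noisy for its template to match reliably.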