A neural architecture for visual motion perception: Group and element apparent motion

1989 
A neural network model of motion segmentation by visual cortex is described. The model's properties are illustrated by computer simulations of data concerning group and element apparent motion, including the tendency for group motion to occur at longer ISIs and under conditions of short visual persistence. These phenomena challenge recent vision models because the switch between group and element motion is determined by changing the timing of image displays whose elements flash on and off but do not otherwise move through time. The model clarifies the dependence of short-range and long-range motion on spatial scale. Its design specifies how sustained response cells and transient response cells cooperate and compete in successive processing stages to generate motion signals that are sensitive to direction-of-motion, yet insensitive to direction-of-contrast. Properties of beta motion, phi motion, gamma motion, and Ternus motion are explained. A number of prior motion models are clarified, transformed, and unified, including the Reichardt model, the Marr-Ullman model, the Burt-Sperling model, the Nakayama-Loomis model, and the NADEL model. Apparent motion and real motion generate testably different model properties. The model clarifies how preprocessing of motion signals by a motion OC Filter is joined to long-range cooperative motion mechanisms in a motion CC Loop to control phenomena such as induced motion, motion capture, and motion aftereffects. The total model system is a motion Boundary Contour System (BCS) that is computed in parallel with the static BCS of Grossberg and Mingolla before both systems cooperate to generate a boundary representation for 3-D visual form perception.
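To make the mechanism concrete, the following is a minimal, illustrative Python sketch of a Reichardt-style correlation detector built from rectified transient channels. It is not the paper's actual model or equations; the function names and stimulus values are assumptions chosen for clarity. The point it demonstrates is the one stated above: correlating temporally differenced ("transient cell") responses across two nearby positions yields a signal whose sign depends on direction-of-motion, while full-wave rectification makes it insensitive to direction-of-contrast.

```python
# Illustrative sketch of a Reichardt-style motion detector (an assumed
# simplification, not the paper's model). Each input channel is a crude
# "transient cell": a full-wave rectified temporal difference, so a
# dark-to-light edge and a light-to-dark edge respond identically.

def transient(signal):
    """Full-wave rectified temporal derivative: |x(t) - x(t-1)|."""
    return [abs(b - a) for a, b in zip(signal, signal[1:])]

def reichardt(left, right, delay=1):
    """Opponent correlation: delayed left channel times right channel,
    minus the mirror-image term. Positive output -> rightward motion,
    negative -> leftward."""
    L, R = transient(left), transient(right)
    n = len(L) - delay
    rightward = sum(L[t] * R[t + delay] for t in range(n))
    leftward = sum(R[t] * L[t + delay] for t in range(n))
    return rightward - leftward

# A bright element flashed first at the left detector, then at the
# right one (apparent motion to the right): output is positive.
print(reichardt([0, 1, 0, 0, 0], [0, 0, 1, 0, 0]))

# The same displacement with reversed contrast (a dark element on a
# bright field) gives the same positive output, because the transient
# channels are rectified before correlation.
print(reichardt([1, 0, 1, 1, 1], [1, 1, 0, 1, 1]))
```

Reversing the temporal order of the two flashes flips the sign of the output, which is the direction-selectivity property; the contrast-reversed stimulus leaving the output unchanged is the direction-of-contrast insensitivity property discussed in the abstract.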