Machine Learning

First-Person Perceptual Guidance Behavior Decomposition using Active Constraint Classification


This topic contains 0 replies, has 1 voice, and was last updated by  arXiv 1 year, 9 months ago.


  • arXiv


    Humans exhibit a wide range of adaptive and robust dynamic motion behaviors that remain unmatched by autonomous control systems. These capabilities are essential for real-time behavior generation in cluttered environments. Recent work suggests that human capabilities rely on task structure learning and on embedded or ecological cognition in the form of perceptual guidance. This paper describes an experimental investigation of the functional elements of human motion guidance, focusing on the control and perceptual mechanisms. The motion, control, and perceptual data from first-person guidance experiments are decomposed into elemental segments based on invariants. These elements are then analyzed to determine their functional characteristics. The resulting model explains the structure of the agent-environment interaction and provides lawful descriptions of specific perceptual guidance and control mechanisms.
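    The paper's actual decomposition pipeline is not reproduced in the abstract, but the core idea — splitting a recorded signal into elemental segments wherever a candidate invariant stops holding — can be sketched minimally. The snippet below is an illustrative assumption, not the authors' method: it treats "invariant" as "the signal stays within a tolerance of the current segment's mean," and the function name and tolerance parameter are hypothetical.

    ```python
    import numpy as np

    def segment_by_invariant(signal, tol=0.05):
        """Split a 1-D signal into contiguous segments within which the
        value stays within `tol` of the segment's running mean -- a crude
        stand-in for 'an invariant holds over the segment'.

        Returns a list of (start, end) index pairs, end-exclusive."""
        segments = []
        start = 0
        for i in range(1, len(signal)):
            seg = signal[start:i]
            # Invariant broken: current sample deviates too far from the
            # running mean of the open segment, so close it and start anew.
            if abs(signal[i] - seg.mean()) > tol:
                segments.append((start, i))
                start = i
        segments.append((start, len(signal)))
        return segments

    # Example: a step change in a control signal yields two elements.
    sig = np.array([0.0] * 5 + [1.0] * 5)
    print(segment_by_invariant(sig, tol=0.5))  # → [(0, 5), (5, 10)]
    ```

    Real guidance data would call for richer invariants (e.g., constant tau-dot or constant curvature, as studied in the perceptual-guidance literature) and a noise-robust change-point detector, but the segment-at-invariant-violation structure is the same.
    
    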

    First-Person Perceptual Guidance Behavior Decomposition using Active Constraint Classification
    by Andrew Feit, Berenice Mettler
    https://arxiv.org/pdf/1710.06943v1.pdf
