Pursuit and perception both require accurate information
about the motion of objects. Recovering the motion of objects
by integrating the motion of their components is a difficult
visual task. Successful integration produces coherent global
object motion, while a failure to integrate leaves the
incoherent local motions of the components unlinked. We
compared the ability of perception and pursuit to perform
motion integration by measuring direction judgments and
the concomitant eye-movement responses to line-figure parallelograms
moving behind stationary rectangular apertures. The apertures
were constructed such that only the line segments corresponding
to the parallelogram's sides were visible; thus, recovering
global motion required the integration of the local segment
motion. We investigated several potential motion-integration
rules by using stimuli with different object, vector-average,
and line-segment terminator-motion directions. We used
an oculometric decision rule to directly compare direction
discrimination for pursuit and perception. For visible
apertures, the percept was of a coherently moving object, and both pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was of incoherently moving segments, and both pursuit and perceptual performance were close to the terminator-motion
prediction. Furthermore, both psychometric and oculometric
direction thresholds were much higher for invisible apertures
than for visible apertures. We constructed a model in which
both perception and pursuit are driven by a shared motion-processing
stage, with perception having an additional input from
an independent static-processing stage. Model simulations
were consistent with our perceptual and oculomotor data.
Based on these results, we propose the use of pursuit as
an objective and continuous measure of perceptual coherence.
Our results support the view that pursuit and perception
share a common motion-integration stage, perhaps within
areas MT or MST.
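
For readers unfamiliar with the oculometric approach, the following minimal Python sketch illustrates the general idea: each trial's pursuit response is reduced to a direction estimate and converted into a binary judgment, yielding an "oculometric function" that can be analyzed in the same way as a psychometric function; a vector-average prediction for the local segment motions is included for contrast. The function names, analysis window, and decision criterion are illustrative assumptions, not the procedures or code used in this study.

    import numpy as np

    # Illustrative sketch only: names, windows, and criteria are assumptions,
    # not the authors' analysis code.

    def trial_direction_estimate(eye_velocity_xy):
        """Estimate pursuit direction (deg) from the mean eye-velocity
        vector over one trial's analysis window (rows are [vx, vy] samples)."""
        vx, vy = np.mean(eye_velocity_xy, axis=0)
        return np.degrees(np.arctan2(vy, vx))

    def oculometric_function(eye_trials, stimulus_directions, reference_deg=0.0):
        """Proportion of trials judged counterclockwise of the reference
        direction, per stimulus direction -- the pursuit analogue of a
        psychometric function."""
        dirs = np.asarray(stimulus_directions, dtype=float)
        decisions = np.array([trial_direction_estimate(t) > reference_deg
                              for t in eye_trials])
        levels = np.unique(dirs)
        return levels, np.array([decisions[dirs == d].mean() for d in levels])

    def vector_average_direction(segment_velocities):
        """Vector-average prediction: direction of the summed local
        segment-velocity vectors (one candidate integration rule)."""
        vx, vy = np.sum(segment_velocities, axis=0)
        return np.degrees(np.arctan2(vy, vx))

Fitting the resulting oculometric and psychometric functions with the same decision model is what allows pursuit and perceptual direction thresholds to be compared directly.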