Relating Human Gaze and Manual Control Behavior in Preview Tracking Tasks with Spatial Occlusion

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-reviewed


Abstract

In manual tracking tasks with preview of the target trajectory, humans have been modeled as dual-mode “near” and “far” viewpoint controllers. This paper investigates the physical basis of these two control mechanisms and studies whether the estimated viewpoint positions represent those parts of the previewed trajectory that humans use for control. Combined human gaze and control data were obtained in an experiment that compared tracking with full preview (1.5 s), occluded preview, and no preview. System identification is applied to estimate the two look-ahead time parameters of a two-viewpoint preview model. Results show that humans often focus their gaze around the model’s near-viewpoint position, and seldom at the far viewpoint. Gaze measurements may thus augment control data for the online identification of preview control behavior, to improve personalized monitoring or shared-control systems in vehicles.
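To make the two-viewpoint idea concrete, the following is a minimal illustrative sketch of a dual-viewpoint preview controller: it responds to the previewed target at a far look-ahead time and closes the loop on the tracking error at a near look-ahead time. All parameter values, names, and the exact control law here are assumptions for illustration, not the identified model or gains from the paper.

```python
# Illustrative two-viewpoint preview controller sketch.
# All gains and look-ahead times below are hypothetical placeholders,
# not the parameters identified in the paper.
TAU_NEAR = 0.3   # near-viewpoint look-ahead time [s] (assumed)
TAU_FAR = 1.0    # far-viewpoint look-ahead time [s] (assumed)
K_NEAR = 1.5     # near-viewpoint error gain (assumed)
K_FAR = 0.8      # far-viewpoint feedforward gain (assumed)

def control_input(t, target, output):
    """Combine responses to the previewed target at two look-ahead times.

    t:      current time [s]
    target: callable f(t) giving the previewed target trajectory
    output: current controlled-element output y(t)
    """
    # Far viewpoint: open-loop response to the target previewed TAU_FAR ahead.
    far_response = K_FAR * target(t + TAU_FAR)
    # Near viewpoint: close the loop on the error at TAU_NEAR ahead.
    near_error = target(t + TAU_NEAR) - output
    return far_response + K_NEAR * near_error
```

In this sketch, system identification would amount to fitting `TAU_NEAR` and `TAU_FAR` (and the gains) to measured control data; the paper's finding is that measured gaze clusters near the position implied by the near look-ahead time.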
Original language: English
Title of host publication: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics
Subtitle of host publication: Miyazaki, Japan, 2018
Pages: 3430-3435
Publication status: Published - 2018
Event: SMC 2018: IEEE International Conference on Systems, Man, and Cybernetics - Miyazaki, Japan
Duration: 7 Oct 2018 - 10 Oct 2018
http://www.smc2018.org/

Conference

Conference: SMC 2018: IEEE International Conference on Systems, Man, and Cybernetics
Abbreviated title: SMC 2018
Country: Japan
City: Miyazaki
Period: 7/10/18 - 10/10/18
Internet address: http://www.smc2018.org/

