

Modeling Gaze-Following

Achievement/Results

NSF-funded researchers Joshua Lewis, Hector Jasso, Gedeon Deak, and Jochen Triesch at the University of California, San Diego have developed a real-time computer simulation of infant/caregiver interaction. In the first figure, the simulated caregiver is shown in its three-dimensional environment holding and looking at a toy. The camera is controlled by the simulated infant's learning mechanism.
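Conceptually, this setup amounts to a loop in which the caregiver is driven by recorded behavior while the infant agent repeatedly decides where to point its camera. The sketch below is a hypothetical outline of such a loop in Python; the class names, the render_view callback, and the frame format are illustrative assumptions, not the authors' code.

```python
import numpy as np

class Caregiver:
    """Replays recorded caregiver behavior (head direction, position of the held object)."""
    def __init__(self, recorded_frames):
        # recorded_frames: per-frame dicts such as {"gaze_dir": ..., "object_pos": ...}
        self.frames = recorded_frames

    def state_at(self, t):
        return self.frames[t % len(self.frames)]

class InfantAgent:
    """Chooses where to point the virtual camera based on the current view."""
    def choose_gaze(self, view):
        # Placeholder policy; a sketch of the saliency-driven learning idea
        # appears after the next paragraph.
        return np.array([0.0, 0.0, 1.0])

def run_simulation(caregiver, infant, render_view, n_steps=1000):
    """One interaction episode: replayed caregiver, learning infant camera."""
    gaze = np.array([0.0, 0.0, 1.0])        # initial camera direction
    for t in range(n_steps):
        scene = caregiver.state_at(t)       # caregiver pose/object from recorded data
        view = render_view(scene, gaze)     # what the infant camera currently sees
        gaze = infant.choose_gaze(view)     # infant picks the next place to look
```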

Researchers in Gedeon Deak’s lab have collected hours of footage of natural interactions between infants and caregivers in their homes, and the simulation uses these recordings to drive the caregiver’s behavior in a highly realistic way. The simulated infant uses the visual information it receives to decide where to look next, calculating the saliency of the objects at the center of its vision (see second figure). Over time, the simulated infant learns that looking in the direction the caregiver is looking is more interesting than looking in the opposite direction. In the developmental literature this behavior is called “gaze-following,” and it emerges in the first few months of life in normally developing infants. Infants with developmental disabilities, such as autism spectrum disorders and Williams syndrome, often do not exhibit gaze-following behavior, so the model can help us better understand how these deficits arise and how certain caregiver behaviors might mitigate or exacerbate them.
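One simple way to illustrate this kind of learning is as a reward-driven choice between following the caregiver’s head direction and looking the other way, with the saliency of whatever the infant then sees serving as the reward. The Python sketch below shows that idea under assumed reward statistics; the two-action set, the saliency_of_view stand-in, and the epsilon-greedy value update are illustrative assumptions, not the authors’ learning rule.

```python
import numpy as np

rng = np.random.default_rng(0)

ACTIONS = ["follow", "opposite"]
values = {a: 0.0 for a in ACTIONS}   # learned value of each gaze strategy
alpha, epsilon = 0.1, 0.1            # learning rate and exploration rate

def saliency_of_view(action):
    # Stand-in for the saliency computed over the center of the infant's view.
    # Assumed statistics: following the caregiver's gaze usually lands on the
    # interesting object (e.g. the toy), looking the other way usually does not.
    return rng.normal(1.0, 0.2) if action == "follow" else rng.normal(0.2, 0.2)

for step in range(5000):
    # epsilon-greedy choice between following the caregiver's gaze or not
    if rng.random() < epsilon:
        action = rng.choice(ACTIONS)
    else:
        action = max(values, key=values.get)
    reward = saliency_of_view(action)
    values[action] += alpha * (reward - values[action])   # incremental value update

print(values)   # "follow" ends up with the higher value, i.e. gaze-following emerges
```

Under these assumed statistics, the value of following the caregiver’s gaze grows larger than the value of looking away, which is the behavioral outcome the simulation is designed to explain.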

Joshua Lewis is an NSF IGERT (Integrative Graduate Education and Research Traineeship) fellow in the Vision and Learning in Humans and Machines traineeship program at UCSD, run by Professors Virginia de Sa and Garrison Cottrell. This work is an excellent example of how modeling human vision and human interactions (with machine learning) can help us better understand normal and abnormal courses of human development. It was presented at the Society for Research in Child Development’s 2009 meeting.

Address Goals

This work addresses the question of how infants learn to follow gaze. Understanding this question matters because infants with autism spectrum disorders and Williams syndrome often do not exhibit typical gaze-following. The 3D simulation environment is also an important experimental tool that other researchers can use.

More broadly, the project demonstrates how modeling human vision and human interaction with machine learning can help us better understand both normal and abnormal courses of human development.