FP-PaulIvanov

From CS 294-10 Visualization Sp10

Proposal

Group Members

  • Paul Ivanov

Description

Though static visualization of eyetracking data has been successfully integrated by commercial vendors into their turn-key market research and user-experience systems, it also remains the main avenue of academic research (see the Proceedings of ETRA 2010). By far the most frequent use of eyetracking in academia involves static stimuli [[Trend and Techniques in Visual Gaze Analysis - COGAIN 2009]]. Coupled with the constraints inherent in the primary modes of communicating results among scientists, namely journal articles and poster presentations, it is not surprising that static visualizations remain the mainstay of the field.

Single images can effectively communicate specific aspects of the data, but their construction remains a bottleneck for exploratory analysis: before the user has a "feel" for the data, they have not yet formulated interesting questions. During the exploration phase, the burden of transforming questions into the code that generates static representations falls on the user, delaying discovery by increasing both the time spent in each exploratory cycle and the user's frustration.

There exists one notable exception to static representation: the so-called "bee swarm" approach, which animates multiple cursors indicating the centers of gaze across subjects or trials. It restores a natural representation of time, a key dimension which is otherwise either collapsed or, with the exception of standard position-versus-time traces, poorly represented in static visualizations. However, "bee swarms" suffer from impoverished implementations, limiting the user to a rendered movie version of the data: one can seek around in time, but cannot change or query the data being displayed.
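
A minimal sketch of the bee-swarm technique, assuming gaze samples stored as a NumPy array of shape (subjects, timepoints, 2); the synthetic random-walk data here only stands in for real recordings:

 import numpy as np
 import matplotlib.pyplot as plt
 from matplotlib.animation import FuncAnimation
 
 # gaze: (n_subjects, n_timepoints, 2) array of (x, y) positions,
 # one trace per subject -- synthetic data for illustration only
 rng = np.random.default_rng(0)
 gaze = np.cumsum(rng.normal(size=(5, 300, 2)), axis=1)
 
 fig, ax = plt.subplots()
 ax.set_xlim(gaze[..., 0].min(), gaze[..., 0].max())
 ax.set_ylim(gaze[..., 1].min(), gaze[..., 1].max())
 # one cursor per subject, all updated in lockstep over time
 swarm = ax.scatter(gaze[:, 0, 0], gaze[:, 0, 1])
 
 def update(t):
     swarm.set_offsets(gaze[:, t, :])  # move every cursor to frame t
     return (swarm,)
 
 anim = FuncAnimation(fig, update, frames=gaze.shape[1], interval=16)
 plt.show()

An interactive version, rather than a rendered movie, would attach filtering and querying callbacks on top of exactly this kind of frame-update loop.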

Beyond bee swarms

My project will implement the standard approaches to displaying this data, including gaze traces, summary bar charts, scatter plots, bee swarms, etc., but with close adherence to Shneiderman's Visual Information-Seeking Mantra ("overview first, zoom and filter, then details-on-demand"): on-demand highlighting and filtering by velocity, position, fixation duration, saccade length, and pupil area. Ramloll et al. (IV '04) took a similar approach.
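
As a sketch of how velocity-based filtering might work (the velocity-threshold rule and the cutoff value are illustrative assumptions, not a committed design):

 import numpy as np
 
 def gaze_velocity(t, x, y):
     """Point-to-point gaze speed in screen units per second."""
     dt = np.diff(t)
     speed = np.hypot(np.diff(x), np.diff(y)) / dt
     return np.concatenate([[0.0], speed])  # pad to original length
 
 def fixation_mask(t, x, y, max_speed=30.0):
     """Boolean mask of samples slower than max_speed (crude
     velocity-threshold fixation detection)."""
     return gaze_velocity(t, x, y) < max_speed
 
 # Example: select only fixation samples, e.g. to dim saccades on screen.
 t = np.linspace(0, 1, 500)
 x, y = np.cumsum(np.random.randn(2, 500), axis=1)
 mask = fixation_mask(t, x, y)

Filtering by fixation duration, saccade length, or pupil area would follow the same pattern: derive a per-sample or per-event quantity, then threshold it into a display mask.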

Additionally, unique to my project are:

   - Domain-specific transfer functions (color and opacity maps) which are
     user-editable.
   - "Psychic" look-ahead event displays - indicating the future landing
     target of a saccade in flight.
   - "Path vignette" specification - allowing the filtering of specific
     trajectories, regardless of position on screen.
   - Data displays in fixation-centered coordinates - collapsing across
     absolute gaze position to expose relative patterns of movement within
     specified epochs (see the sketch after this list).
   - Event-aware direction analysis - for filtering of either incoming or
     outgoing saccades for a specific fixation location.

While many aspects of the proposed project exist in commercial software, these packages come bundled and tightly coupled with each vendor's specific eye trackers. Additionally, due to their proprietary nature, integrating the academic user's own code for the questions that interest them is limited by what the vendors expose in their SDKs, or may not be an option at all.

Additionally, using the web browser as the primary way of viewing and interacting with the data will dramatically lower the barrier to entry for other viewers, such as collaborators.
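
One plausible way to realize this, sketched here with only the Python standard library (the JSON layout and port are assumptions, not the project's actual design), is to expose the gaze data over HTTP so any browser-based front end can fetch and render it:

 import json
 from http.server import BaseHTTPRequestHandler, HTTPServer
 
 GAZE = [{"t": 0.0, "x": 512, "y": 384}]  # placeholder samples
 
 class GazeHandler(BaseHTTPRequestHandler):
     def do_GET(self):
         # serve the full gaze record as JSON for the browser to plot
         body = json.dumps(GAZE).encode()
         self.send_response(200)
         self.send_header("Content-Type", "application/json")
         self.end_headers()
         self.wfile.write(body)
 
 HTTPServer(("", 8000), GazeHandler).serve_forever()

A collaborator would then need nothing beyond a URL to inspect the data.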

I am unaware of any tool tailored for academic research which has these capabilities. I intend to release this tool under an open source license.

Initial Problem Presentation

Midpoint Design Discussion

  • Link to slides here

Final Deliverables

  • Link to source code and executable
  • Link to final paper in pdf form
  • Link to final slides or poster

