Use your eyes to control a video watching interface powered by your laptop webcam.
Web-based GUI to be used alongside GazePointer (non-commercial use).
Check out the demo video and slides made for the UIUC 2021 Virtual Undergraduate Research Symposium. Sponsor Zillow's Choice at UIUC WCS Explorations 2021.
- Download GazePointer (Windows only) onto your computer and calibrate it using its interface.
- Open `index.html` in any web browser. You may need to adjust your zoom level until everything fits on the screen properly.
We set up large buttons above and below the video for simple controls. A button changes color when the gaze hovers over it, and if the gaze dwells long enough the button's action is performed. We use mouseover and mouseout events to keep track of when the gaze enters and exits a button, as sketched below.
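A minimal sketch of that dwell pattern, assuming a button with id `playBtn`, a `<video>` element with id `player`, and a 1.5-second dwell threshold; these names and timings are illustrative, not the exact code in `main.js`.

```js
// Illustrative dwell-to-click sketch (not the exact code in main.js).
// Assumes <button id="playBtn"> and <video id="player"> exist in the page.
const DWELL_MS = 1500;        // how long the gaze must stay on a button
const playBtn = document.getElementById("playBtn");
const player = document.getElementById("player");

let dwellTimer = null;

playBtn.addEventListener("mouseover", () => {
  playBtn.classList.add("gazed");      // visual feedback: change color
  dwellTimer = setTimeout(() => {      // fire the action once the dwell completes
    player.paused ? player.play() : player.pause();
  }, DWELL_MS);
});

playBtn.addEventListener("mouseout", () => {
  playBtn.classList.remove("gazed");   // gaze left before the dwell completed
  clearTimeout(dwellTimer);            // cancel the pending action
});
```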
GazePointer has mouseless calibration and is fairly accurate compared to alternatives like webgazer.js, which is also worth looking into if you're considering your own webcam-based projects.
- webgazer.js official page and video intro
- GazeCloudAPI and an article comparing it with webgazer; the article also has a repo
- Visuabilty and its repo, a hackathon project using webgazer built with ReactJS.
- Focus point heatmap using webgazer, a repo for a hackathon project.
- Webgazer drawing and its repo, another hackathon project.
- The Truth About Webcam Eye Tracking - more on the marketing/user research side, but still useful to look into
- Stanford CSLI Eye-Tracking Workshop Summer 2013
- Eye Tracking for Everyone paper
- Mismatch: How Inclusion Shapes Design (book)
- Eye Tracking the User Experience: A Practical Guide to Research (book)
- The Privacy-Invading Potential of Eye Tracking Technology
- Eye Tracking Wikipedia
- `index.html` is the entry point page (sketched below).
- `main.css` has some barebones styling, kept simple on purpose to demonstrate the system.
- `main.js` has the dwell code and the video element actions.
- `sunset_trim.mp4` and `apollo.mp4` are sample videos for the demo, but they can be replaced with anything else supported by `<video>`.
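A rough sketch of the layout `index.html` implements, with large controls above and below the video; the ids, button set, and attributes are assumptions for illustration, not the repo's exact markup.

```html
<!-- Illustrative layout only; the real index.html may differ. -->
<!DOCTYPE html>
<html>
  <head>
    <link rel="stylesheet" href="main.css">
  </head>
  <body>
    <!-- Large controls above the video -->
    <button id="playBtn">Play / Pause</button>
    <button id="backBtn">Back 10s</button>

    <video id="player" src="sunset_trim.mp4" width="640"></video>

    <!-- Large controls below the video -->
    <button id="forwardBtn">Forward 10s</button>
    <button id="muteBtn">Mute</button>

    <script src="main.js"></script>
  </body>
</html>
```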
Most of our limitations came from using a laptop webcam; something in the ~$250 range, like the entry-level Tobii eye trackers, will give better results. Keeping this in mind, we were limited by:
- Low accuracy
- Low precision
- Low sampling rate (12-15 Hz)
- Limited library support
- Frequent recalibration (either after 10 minutes or lighting/position change)
- Low camera video resolution
- Differing head angles and positions (6 degrees of freedom)
- Add `webgazer` support (see the sketch below for consuming raw gaze coordinates)
- React support
- ClassTranscribe integration
- Cross-OS support
- https://github.com/szydej/GazeFlowAPI can give raw access to gaze coordinates that we can work with
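As a starting point for the webgazer item above, a minimal sketch of reading raw gaze coordinates with webgazer.js via its documented `setGazeListener`/`begin` API; routing the coordinates into the existing dwell buttons is an assumption about how the integration might work, not code from this repo.

```js
// Minimal webgazer.js sketch: log raw gaze coordinates.
// Assumes webgazer.js has already been loaded via a <script> tag.
window.onload = () => {
  webgazer
    .setGazeListener((data, elapsedTime) => {
      if (data == null) return;      // no prediction available for this frame
      const { x, y } = data;         // gaze estimate in page pixels
      // Possible integration: dispatch synthetic mouseover/mouseout events
      // to the dwell buttons based on which element contains (x, y).
      console.log(`gaze at ${x.toFixed(0)}, ${y.toFixed(0)} (${elapsedTime} ms)`);
    })
    .begin();                        // starts the webcam and calibration overlay
};
```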