[Let’s have fun with eyeCan!]
This is a brief history of eyeCan’s software. – Part 1
As you may know, the eyeCan project was inspired by the Eyewriter project (www.eyewriter.org). We were strongly motivated by the thought that, starting from Eyewriter's fantastic idea, we could help many people build their own eye-mouse. So we started building easy-to-make hardware and Windows-based software that uses the hardware as an eye-mouse. See the Eyewriter 1.0-based hardware below.
The Eyewriter team provides software (http://code.google.com/p/eyewriter/) that consists of three parts – an eye-tracker, a calibration system, and a simple box game that can be played with the eyes. After some camera setup and calibration steps, users can control the cursor on the screen and play the box game with their eyes. The Eyewriter team also provides a drawing app with which users can create fantastic graffiti using their eyes.
However, our immediate challenge was that the Eyewriter software does not let users control the actual mouse cursor, which means, technically, we could not use Eyewriter directly as an actual eye-mouse as we had expected. So we decided to build our own software, later called 'eyeCan', on top of Eyewriter's eye-tracker.
[eyeCan 0.1b – look at the ugly, purplish background color – a huge mistake]
The very first version of eyeCan consisted of Eyewriter's eye-tracker, the calibration system, and a newly introduced mouse system. In addition, a pre-loaded click-based keyboard application – Clickey – helped users type with their eyes. Mouse functions were implemented by translating eye actions into mouse actions.
The idea was very simple – detect a meaningful eye blink (a blink at least 0.5 seconds long) and map it to the mouse click action. Similarly, a double-click is performed by mapping a longer blink to the double-click action. With this piece of software, we can move the mouse cursor and perform clicks and double-clicks, so we can do much of what we do with a computer mouse – open an application by double-clicking or surf the web by clicking on hyperlinks.
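The blink-to-click mapping above can be sketched roughly as follows. This is only an illustrative reconstruction – the function names and the 1.0-second double-click threshold are assumptions, not taken from the actual eyeCan source.

```python
# Illustrative sketch of mapping blink duration to a mouse action.
# CLICK_BLINK matches the 0.5 s "meaningful blink" described in the post;
# DOUBLE_CLICK_BLINK is an assumed threshold for the "longer" blink.

CLICK_BLINK = 0.5         # seconds
DOUBLE_CLICK_BLINK = 1.0  # seconds (assumed value)

def classify_blink(duration_s):
    """Translate a measured eye-blink duration into a mouse action."""
    if duration_s >= DOUBLE_CLICK_BLINK:
        return "double_click"
    if duration_s >= CLICK_BLINK:
        return "click"
    return None  # too short: treat as a natural blink and ignore it

print(classify_blink(0.2))  # None  (natural blink, ignored)
print(classify_blink(0.6))  # click
print(classify_blink(1.2))  # double_click
```

Ignoring short blinks is the key design point: without a minimum duration, every natural blink would fire a click.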
Another great idea was to use 'out-of-screen' events to provide additional functionality – perform mouse/key actions when the user looks outside the PC screen.
– Out of top boundary: Scroll Up
– Out of bottom boundary: Scroll Down
– Out of left boundary + 0.5s eye blink: Alt+Tab
– Out of right boundary + 0.5s eye blink: Alt+F4
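The mapping above can be sketched as a simple dispatch on the gaze point. Again, this is an illustrative assumption of how such logic might look, not the actual eyeCan implementation; the screen dimensions and names are made up.

```python
# Illustrative sketch of the out-of-screen event mapping described above.
# Screen size and function/action names are assumptions for illustration.

SCREEN_W, SCREEN_H = 1920, 1080

def out_of_screen_action(gaze_x, gaze_y, blinked):
    """Map a gaze point outside the screen to a mouse/key action.

    `blinked` is True when a 0.5 s eye blink was detected, which the
    left/right boundary actions require.
    """
    if gaze_y < 0:
        return "scroll_up"      # looking above the top boundary
    if gaze_y > SCREEN_H:
        return "scroll_down"    # looking below the bottom boundary
    if gaze_x < 0 and blinked:
        return "alt_tab"        # left boundary + blink
    if gaze_x > SCREEN_W and blinked:
        return "alt_f4"         # right boundary + blink
    return None                 # gaze is on-screen, or no blink
```

Requiring a blink for the left/right actions guards against accidentally closing a window just by glancing off-screen, while the less destructive scroll actions trigger on gaze alone.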
With these, web surfing with eyeCan became much more convenient. We could easily navigate through websites and read content in a browser.
At this point, we decided to bring eyeCan to the users who really need our new device – people who have disabilities and can only move their eyes. We met with many people and finally found some candidates for a pilot test. However, after a few trials, we found that our device, software, and even the installation service needed huge improvements. So we decided to officially kick off the eyeCan project and started working on a new version – eyeCan 1.0b.
[eyeCan 1.0 beta – new buttons added]
The first official eyeCan version comes with a new mouse control scheme (we won't go deep into the software architecture behind it). The biggest change was the newly added buttons for mouse-action selection – the user can choose among click, double-click, right-click, drag, and scroll as the default click action. With this, the software supports full mouse functionality. Moreover, the buttons themselves show users what they are doing and what they can do with the software. The message box above the buttons also helps users easily understand the functions of the eyeCan software.
Other notable features are the Click-helper and the Error-compensator. The Click-helper helps users perform mouse clicks with better accuracy, even when they have dexterity difficulties. With this feature, users can do tasks that require greater accuracy than the eye-tracker itself provides. The Error-compensator is useful when a mismatch occurs between the eye-gaze point and the actual mouse cursor, which usually happens after a head movement or a displacement of the eyeCan device. In that situation, the Error-compensator lets users easily remove the mismatch without re-calibrating.
Another interesting change is the introduction of a score system in the calibration process. Calibration is required to compute the mapping between eye position and mouse cursor, so its accuracy is critical for precise mouse control. However, calibration is a very tedious task, and doing it over and over really exhausts users. That's why we introduced the score system – it turns a boring, painful task into an enjoyable game, and it also lets us numerically estimate the accuracy of the calibration results.
This is the end of Part 1.
In Part 2, we will cover the stories of eyeCan 1.1b ~ 1.2b.
+ There must be A LOT OF grammatical errors in this post. 🙂
[by Sang-won Leigh]