People with limited use of their hands can find it extremely difficult to interact with touchscreen devices. To address this, Zack Freedman created the Hypervisor.
The wearable device tracks the user's eye movements and pupil position to control a mouse cursor. Freedman entered the device in the Hackaday Prize contest to benefit the non-profit organization United Cerebral Palsy.
Hypervisor is built around a Raspberry Pi Compute Module 3+ mounted on a StereoPi baseboard. All of the device's sensors feed their readings to the Raspberry Pi, which processes the incoming data. Two infrared transmitter/receiver pairs capture gaze data, two cameras connected over the Camera Serial Interface (CSI) supply both environmental imagery and eye imagery for computer vision processing, and visible-spectrum LEDs tell the user the state of the device, such as whether it is working.
To determine where you are looking, one of the IR CSI cameras feeds a continuous video stream to the Raspberry Pi Compute Module 3+ running OpenCV, a library for real-time computer vision. Meanwhile, the second camera refines the estimate of where your gaze is directed and where your pupil is pointing.
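Freedman's actual pipeline isn't shown, but the core idea behind camera-based pupil tracking is simple: under IR illumination the pupil is the darkest region of the eye image, so thresholding the frame and taking the centroid of the dark blob gives its position. A minimal sketch of that principle in plain NumPy (the threshold value, function name, and synthetic test frame are illustrative assumptions, not the project's code):

```python
import numpy as np

def pupil_centroid(frame, threshold=50):
    """Estimate pupil position as the centroid of dark pixels.

    frame: 2-D grayscale image (0 = black, 255 = white).
    threshold: pixels darker than this count as pupil (assumed value).
    Returns (x, y) in pixel coordinates, or None if no dark pixels exist.
    """
    ys, xs = np.nonzero(frame < threshold)  # coordinates of dark pixels
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic frame: bright background with a dark circular "pupil" blob.
frame = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:120, 0:160]
frame[(xx - 100) ** 2 + (yy - 40) ** 2 < 10 ** 2] = 10  # pupil at (100, 40)

print(pupil_centroid(frame))  # centroid near (100.0, 40.0)
```

A real tracker would add smoothing across frames and reject outliers (blinks, glints), but the centroid-of-dark-pixels step is the heart of it.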
Once the calculations are complete, the device determines where you are looking and places the cursor there. You can blink to click, squint to drag and drop, and blink twice in rapid succession to scroll up and down. Target devices connect to a dedicated receiver that pulls positioning data from the headset, so each device knows where to place the cursor.
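The article doesn't say how a single blink (click) is distinguished from a rapid double blink (scroll), but a common approach is a small timing-based state machine: a second blink arriving within a short window upgrades the gesture to a scroll, while a lone blink whose window has expired becomes a click. A hypothetical sketch under that assumption (the class, window length, and gesture names are all illustrative, not from the project):

```python
from dataclasses import dataclass, field
from typing import List, Optional

DOUBLE_BLINK_WINDOW = 0.4  # seconds; assumed threshold, not from the project

@dataclass
class BlinkClassifier:
    """Turn a stream of timestamped blink events into cursor gestures."""
    pending: List[float] = field(default_factory=list)

    def feed(self, t: float) -> Optional[str]:
        """Register a blink at time t; return a gesture once decided."""
        if self.pending and t - self.pending[-1] <= DOUBLE_BLINK_WINDOW:
            self.pending.clear()
            return "scroll"  # two blinks in rapid succession
        self.pending = [t]   # might be the first of a double blink
        return None

    def flush(self, now: float) -> Optional[str]:
        """Call periodically; a lone blink older than the window is a click."""
        if self.pending and now - self.pending[-1] > DOUBLE_BLINK_WINDOW:
            self.pending.clear()
            return "click"
        return None

c = BlinkClassifier()
print(c.feed(0.0))   # None: could still become a double blink
print(c.feed(0.2))   # "scroll": second blink within the window
print(c.feed(1.0))   # None
print(c.flush(2.0))  # "click": lone blink, window expired
```

The cost of this design is a small click latency (one window length), which is the usual trade-off for disambiguating single from double events.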
You can find the project page here and the GitHub repository here.