If you need software for analysing eye and mouse movements, check out OGAMA. The new version works with the Haytham gaze tracker.
A small update to the Haytham_Oculus code: you can now also read the data from the sensors of the Oculus Rift.
source code: http://goo.gl/EMqdS0
Recently we have been playing around with the Oculus Rift and the Haytham gaze tracker. The idea was to somehow simulate a see-through HMD and combine it with the Haytham gaze tracker. The only thing we did was to display the scene image on the Rift's screen and attach an eye camera to the head gear. The user sees his front view through the Rift and we do the gaze estimation inside that image. We decided to share the code here so that other people who are interested in combining gaze tracking with the Rift can use it. In fact we didn't spend too much time on it; we just wanted to make something that works in C#.

We used OpenCV for doing the barrel distortion and showing the stereo images, but it was very slow. So we ended up using the OculusArDroneKinect project developed by Alessandro Colla, which uses a simple pixel shader in XNA for rendering the views. Check out this interesting project that uses an Oculus Rift and a Kinect to drive a Parrot AR.Drone! Anyway, we modified parts of his code so that we could use it in our prototype. Instead of making a client for the Rift that connects to the haytham_server, all the changes are done directly on the server side. There is no stereo camera; we only attached a webcam to the Rift that serves as both the left and right view.
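The pixel-shader approach boils down to radially scaling each fragment's coordinate away from the lens centre. Here is a minimal Python sketch of that barrel distortion; the coefficients `k` are illustrative placeholders, not the values used in the actual XNA shader:

```python
# Sketch of barrel distortion as done per fragment in a pixel shader.
# The k coefficients here are illustrative, not the shader's real values.

def barrel_distort(x, y, k=(1.0, 0.22, 0.24)):
    """Map an undistorted coordinate (x, y), centred at (0, 0),
    to its radially distorted position."""
    r2 = x * x + y * y                          # squared distance from centre
    scale = k[0] + k[1] * r2 + k[2] * r2 * r2   # polynomial radial scaling
    return x * scale, y * scale

# The centre is unchanged; points near the edge are pushed outward,
# which pre-compensates for the pincushion distortion of the Rift lenses.
print(barrel_distort(0.0, 0.0))   # (0.0, 0.0)
print(barrel_distort(0.5, 0.0))
```

Doing this once per pixel on the GPU is cheap, which is why the shader version was so much faster than our CPU-side OpenCV attempt.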
An infrared wireless camera (from http://www.3rdee.com) is used for capturing the eye. You might need to cut and remove a small piece of plastic from the headset to make some room for the camera. I use a thick aluminium wire for mounting the eye camera as below:
In the video below you can see the pupil tracking in the eye image captured by this camera. In fact the camera is fixed very close to the eye, but the image is still not bad. Glint detection was not active in the video, so gaze tracking was very sensitive to movements of the headset (the reason was that we didn't have time to attach a light source next to the camera and remove the built-in LEDs!).
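The dark-pupil tracking shown in the video can be reduced to a very simple sketch: threshold the IR eye image and take the centroid of the dark blob. The real Haytham tracker does considerably more (contour fitting, glint detection), so treat this only as the basic idea:

```python
import numpy as np

# Minimal dark-pupil detection sketch: in an IR-illuminated eye image the
# pupil appears as the darkest region, so threshold and take its centroid.

def find_pupil(gray, threshold=40):
    """Return the (x, y) centroid of pixels darker than `threshold`,
    or None if no dark pixels are found."""
    ys, xs = np.nonzero(gray < threshold)   # dark pixels = pupil candidates
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic eye image: bright background with a dark "pupil" disc at (60, 40).
img = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.ogrid[:120, :160]
img[(xx - 60) ** 2 + (yy - 40) ** 2 < 15 ** 2] = 10
print(find_pupil(img))  # ≈ (60.0, 40.0)
```

Without a glint (the corneal reflection of a fixed IR light source) as a reference point, the estimated pupil position shifts whenever the headset moves on the head, which is exactly the sensitivity mentioned above.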
After attaching the eye camera next to the eye and the front-view camera in front of the headset, you are ready to use the system. Download the source code from here (if you are interested in improving the code, make a branch of the Haytham project in svn!). You also need to install the XNA framework from <https://msxna.codeplex.com/releases>.

After running the program, start the eye and the scene cameras. If you want to try the system on your own eyes, you need someone else to help you adjust the eye-tracking sliders and calibrate the gaze tracker. Remember to connect the Oculus to your computer (since we are not going to use the sensors of the Rift, connecting only the VGA is sufficient) and make sure that the Oculus is detected as the second monitor of the system. You need to extend the main screen of your computer to the secondary monitor (the Oculus) inside Windows.

There is a button called Oculus at the top right of the main Haytham window for showing the eye or scene image (you can choose which one) on the Oculus. There is also a slider next to this button that lets you change the disparity. Below you can see the front-view image, with gaze tracking inside the image, shown on the Oculus Rift:
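Since there is only one webcam, the "stereo" pair is just the same frame shown to both eyes, shifted apart by the slider's disparity value. A rough NumPy sketch of that idea (the actual project renders this with an XNA pixel shader; the function name and shift direction here are my own):

```python
import numpy as np

# Sketch of turning a single webcam frame into a side-by-side stereo pair
# with an adjustable horizontal disparity, as the slider in Haytham does.

def side_by_side(frame, disparity=8):
    """Duplicate `frame` into left/right views shifted apart by `disparity` px."""
    h, w = frame.shape[:2]
    left = np.roll(frame, disparity, axis=1)    # shift right for the left eye
    right = np.roll(frame, -disparity, axis=1)  # shift left for the right eye
    return np.hstack([left, right])             # h x 2w side-by-side image

frame = np.arange(6 * 8, dtype=np.uint8).reshape(6, 8)
stereo = side_by_side(frame, disparity=2)
print(stereo.shape)  # (6, 16)
```

Adjusting the disparity changes the apparent depth at which the flat camera image is fused by the two eyes, so the slider lets each user find a comfortable setting.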