Hello. In this paper we present an analysis of eye and head movement patterns, recorded with Pupil Invisible, during interactions with Virtual Humans to detect symptoms of depression. I hope you find it interesting. https://ieeexplore.ieee.org/abstract/document/10388134
Hi @user-1b4c6e ! Thanks for sharing it! Looks really interesting. Since the article is not Open Access, could you share how you used the eye tracking, what you learned from it, or anything else you feel is relevant for the community?
Hi Miguel. You can see an overview of the project in this video: https://www.youtube.com/watch?v=Gu3KmhmF6WA&t=1s&ab_channel=LabLENI Basically, we analysed differences in eye and head behaviour between controls and depressive subjects using statistical testing. We analysed fixations, saccades and blinks (with the algorithms already implemented), accelerations, gyroscope data and orientation (extracting some statistics from the IMU signal), and other metrics related to Areas of Interest. For this we used the Reference Image Mapping enrichment, and we are very happy with the results. Congrats on this! We also post-processed some heatmaps using the mapped fixations.
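For anyone who wants to reproduce something similar, here is a rough sketch of how features like these can be extracted from the Pupil Cloud CSV exports. The file names, column names, and folder paths are assumptions based on a typical raw-data export (check your own download), and the group test shown is only an illustration, not necessarily the exact analysis from the paper:

```python
import numpy as np
import pandas as pd
from scipy.stats import mannwhitneyu

def recording_features(folder):
    """Per-recording eye and head statistics from an assumed Pupil Cloud export."""
    fixations = pd.read_csv(f"{folder}/fixations.csv")
    blinks = pd.read_csv(f"{folder}/blinks.csv")
    imu = pd.read_csv(f"{folder}/imu.csv")

    # Head-motion proxy: magnitude of the gyroscope signal (column names assumed).
    gyro = imu[["gyro x [deg/s]", "gyro y [deg/s]", "gyro z [deg/s]"]]
    gyro_mag = np.sqrt((gyro ** 2).sum(axis=1))

    return {
        "n_fixations": len(fixations),
        "mean_fixation_duration_ms": fixations["duration [ms]"].mean(),
        "n_blinks": len(blinks),
        "mean_blink_duration_ms": blinks["duration [ms]"].mean(),
        "gyro_magnitude_mean": gyro_mag.mean(),
        "gyro_magnitude_std": gyro_mag.std(),
    }

def fixation_heatmap(folder, bins=(48, 64)):
    """2D histogram of fixations mapped onto the reference image by the
    Reference Image Mapping enrichment (column names assumed)."""
    fixations = pd.read_csv(f"{folder}/fixations.csv")
    heat, _, _ = np.histogram2d(
        fixations["fixation y [px]"], fixations["fixation x [px]"], bins=bins
    )
    return heat

# Hypothetical export folders for the two groups.
control_folders = ["exports/control_01", "exports/control_02"]
patient_folders = ["exports/patient_01", "exports/patient_02"]

controls = pd.DataFrame([recording_features(f) for f in control_folders])
patients = pd.DataFrame([recording_features(f) for f in patient_folders])

# Non-parametric group comparison per feature (illustrative choice of test).
for feature in controls.columns:
    stat, p = mannwhitneyu(controls[feature], patients[feature])
    print(f"{feature}: U={stat:.1f}, p={p:.4f}")
```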
Hi Niel. I seem to have missed this. Yes, precisely: the idea was to use a smartphone camera to detect pupil dilation.
Note that our software is designed to be used with near-eye cameras, typically around 30 mm from the pupils. There wouldn't be a way to make it work with a phone camera, unfortunately.
In summary, we are very happy with the device, the Cloud, the enrichments... It was very easy to work with the outputs that Pupil Cloud exports.
thanks for sharing it!