🔬 research-publications


user-1b4c6e 22 January, 2024, 17:36:48

Hello. In this paper we present an analysis of eye and head movement patterns, recorded with Pupil Invisible during interactions with Virtual Humans, to detect symptoms of depression. I hope you find it interesting. https://ieeexplore.ieee.org/abstract/document/10388134

user-d407c1 23 January, 2024, 09:20:16

Hi @user-1b4c6e ! Thanks for sharing it! Looks really interesting. Since the article is not Open Access, could you share how you used the eye tracking, what you learned from it, or anything else you feel is relevant to the community?

user-1b4c6e 25 January, 2024, 13:37:59

Hi Miguel. You can see an overview of the project in this video: https://www.youtube.com/watch?v=Gu3KmhmF6WA&t=1s&ab_channel=LabLENI Basically, we analysed the eye-head differences between controls and depressive subjects using statistical testing. We analysed fixations, saccades, and blinks (with the algorithms already implemented), as well as accelerations, gyroscope readings, and orientation (extracting some statistics from the IMU signal), plus metrics related to Areas of Interest. For this we used the Reference Image Mapper enrichment, and we are very happy with the results! Congrats on this. We also postprocessed some heatmaps with the mapped fixations.
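For anyone wanting to attempt a similar group comparison, here is a minimal sketch (not the authors' actual pipeline). It assumes per-recording Pupil Cloud raw exports; the folder layout and the `"duration [ms]"` column name are assumptions:

```python
# Compare mean fixation duration between controls and depressive subjects
# using a non-parametric two-sample test. File layout and column names are
# assumptions about Pupil Cloud's fixations.csv export, not a reference.
from pathlib import Path

import pandas as pd
from scipy.stats import mannwhitneyu

def mean_fixation_duration(csv_path: Path) -> float:
    """Mean fixation duration (ms) for one recording's fixations export."""
    fixations = pd.read_csv(csv_path)
    return fixations["duration [ms]"].mean()

# Hypothetical layout: one exported recording per subfolder, per group.
controls = [mean_fixation_duration(p)
            for p in Path("exports/controls").glob("*/fixations.csv")]
patients = [mean_fixation_duration(p)
            for p in Path("exports/depressive").glob("*/fixations.csv")]

# Mann-Whitney U: no normality assumption, suitable for small groups.
stat, p_value = mannwhitneyu(controls, patients, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.4f}")
```

The same pattern extends to the other per-recording statistics mentioned above (saccade and blink counts, IMU summary statistics): compute one scalar per recording, then test across groups.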

user-51bca2 24 January, 2024, 19:49:50

Hi Niel. I seem to have missed this. Yes, precisely, to use a smartphone camera to detect pupil dilation.

nmt 25 January, 2024, 02:39:07

Note that our software is designed to be used with near-eye cameras, typically positioned around 30 mm from the pupils. There wouldn't be a way to make it work with a phone camera, unfortunately.

user-1b4c6e 25 January, 2024, 13:39:22

In summary, we are very happy with the device, Cloud, the enrichments... It was very easy to work with the outputs that Cloud exports.
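As one example of working with those exports, here is a sketch of the kind of heatmap postprocessing mentioned earlier: overlaying mapped fixations on the reference image. The file names and the `"fixation x [px]"` / `"fixation y [px]"` columns are assumptions about the Reference Image Mapper export layout:

```python
# Render a smoothed fixation heatmap over the reference image from a
# Reference Image Mapper export. Paths and column names are assumptions.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from scipy.ndimage import gaussian_filter

fixations = pd.read_csv("enrichment/fixations.csv")
image = plt.imread("enrichment/reference_image.jpeg")
h, w = image.shape[:2]

# Bin mapped fixation positions into a per-pixel 2D histogram,
# then blur it with a Gaussian to produce a smooth heatmap.
heat, _, _ = np.histogram2d(
    fixations["fixation y [px]"], fixations["fixation x [px]"],
    bins=[h, w], range=[[0, h], [0, w]],
)
heat = gaussian_filter(heat, sigma=25)

plt.imshow(image)
plt.imshow(heat, cmap="hot", alpha=0.5)  # semi-transparent overlay
plt.axis("off")
plt.savefig("heatmap_overlay.png", dpi=150)
```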

user-d407c1 25 January, 2024, 13:45:59

Thanks for sharing it!
