Dear @marc (Pupil Labs) I have a question regarding Neon accuracy and indoor/outdoor recording. Looking at the gaze video, I have the impression that when exposed to tube lights (which oscillate at 50 Hz) there is extra jitter/noise in the eye video. Question 1: Could this be possible? Also, I tried under different lighting conditions. Question 2: Could sun reflection add jitter? It is not easy to draw conclusions because eye movements under too much light could be different.
Hi @user-cc819b! Most tube lights emit only a very small amount of infrared light, so in theory they should not affect the imaging of the infrared eye cameras. Regarding the influence of sunlight, our evaluations are not completely final yet. So far we are expecting a similar level of invariance to sunlight as with Pupil Invisible (i.e. a very high level). Some effects, like very squinted eyes in strong sunlight, may affect performance in some cases.
If you'd be willing to share the problematic recordings we'd be curious to take a look!
Hello! Is there a way to calculate saccade amplitude and velocity with Neon?
Within Pupil Cloud there is a fixation detector. If your setup does not elicit smooth pursuit movements, you could consider everything that is not a fixation to be a saccade. Using the available data you could then calculate saccade amplitudes and velocities.
So the algorithms are definitely geared primarily towards fixation detection, but you can get saccade data as a byproduct.
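To make that concrete, here is a minimal sketch of deriving saccade metrics from consecutive fixations. It assumes you have already loaded fixations as `(start, end, x, y)` tuples in seconds and scene-camera pixels; the actual column names in the Pupil Cloud fixations export may differ, so check the export-formats documentation before adapting this.

```python
import math

def saccades_from_fixations(fixations):
    """Treat each gap between consecutive fixations as a saccade.

    fixations: list of (start_s, end_s, x_px, y_px), sorted by time.
    Returns a list of (amplitude_px, duration_s, mean_velocity_px_per_s).

    Note: amplitudes are in scene-camera pixels here; converting to
    visual degrees requires the camera's field of view and is omitted.
    """
    saccades = []
    for (s0, e0, x0, y0), (s1, e1, x1, y1) in zip(fixations, fixations[1:]):
        amplitude = math.hypot(x1 - x0, y1 - y0)  # straight-line distance
        duration = s1 - e0                        # gap between fixations
        if duration <= 0:
            continue  # overlapping or abutting events; skip
        saccades.append((amplitude, duration, amplitude / duration))
    return saccades
```

This only yields mean velocity per saccade; peak velocity would require the raw gaze samples between fixations rather than just the fixation endpoints.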
Thank you, Marc!
Hi There! Can you pls tell me if the Neons have blink detection to measure drowsiness?
Hi @user-8179ec! As of today the blink detection algorithm in Pupil Cloud has not yet been ported to Neon, but this will happen within the month. This algorithm reports blink events and their durations, which you could use as the basis for calculating drowsiness.
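For illustration, once you have blink events you could summarize them over a time window like this. The `(start, duration)` tuple format is an assumption for the sketch, not the exact export schema; increased blink rate and longer blink durations are commonly used drowsiness proxies.

```python
def blink_stats(blinks, window_start, window_end):
    """Summarize blinks within [window_start, window_end).

    blinks: list of (start_s, duration_s) tuples.
    Returns (blinks_per_minute, mean_blink_duration_s).
    """
    in_win = [(s, d) for s, d in blinks if window_start <= s < window_end]
    minutes = (window_end - window_start) / 60.0
    rate = len(in_win) / minutes if minutes > 0 else 0.0
    mean_dur = sum(d for _, d in in_win) / len(in_win) if in_win else 0.0
    return rate, mean_dur
```

You would typically slide this window across the recording and flag periods where both numbers rise together.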
Great work. Thanks @marc
Hi, I was wondering if anyone has ever used Neon & Pupil Cloud on a driving simulator, and whether the resolution is still good even if the simulator video isn't great. I also don't quite understand whether, in addition to the video with the red dot, it returns anything else. Let me explain better: is there a way, for example, to know how many times the user looked at the on-board computer while driving in a simulated environment? Thanks
Hi @user-f0f5e2 👋 I've personally used Neon in a driving simulator with good results in terms of resolution. Neon offers various measurements, including 2d gaze coordinates, blinks, and fixations, and there is a free update of the Neon Companion App coming in Q2/Q3 2023 that will include eye state and pupillometry. To map your gaze onto specific areas of interest in your driving simulator, you can utilize Marker Mapper (https://docs.pupil-labs.com/neon/enrichments/marker-mapper/) and Reference Image Mapper (https://docs.pupil-labs.com/neon/enrichments/reference-image-mapper/) enrichments in Pupil Cloud. If you're interested in learning more about Neon and its capabilities, feel free to contact info@pupil-labs.com to schedule a demo and Q&A video call.
Thank you very much @user-c2d375 for your quick response! I was wondering: can the Marker Mapper and Reference Image Mapper still be calibrated even if the objects encountered change while driving?
The Marker Mapper enrichment requires the presence of Apriltag markers (https://docs.pupil-labs.com/neon/enrichments/marker-mapper/#setup) to define the area of interest. As long as these markers are visible in the driving simulator scene and there isn't excessive motion blur caused by sudden movements, the area of interest should be accurately detected during driving. On the other hand, the Reference Image Mapper enrichment doesn't require any markers. Instead, it builds a model of the environment to map gaze onto a picture of that environment. This tool works particularly well with static scenarios where the features remain consistent. From my direct experience of using Reference Image Mapper while driving, even if the encountered objects on the road are continuously changing, the fixed features of the car dashboard and on-board computer should still be successfully tracked, allowing for gaze mapping on the car interiors.
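Once gaze has been mapped onto an area of interest, answering "how many times did the user look at the on-board computer" reduces to counting off-to-on transitions of a per-sample on-AOI flag. A minimal sketch (the boolean flag per gaze sample is an assumption about how you post-process the enrichment export, e.g. a "gaze detected on surface" column):

```python
def count_looks(on_aoi_flags):
    """Count distinct 'looks' at an AOI.

    on_aoi_flags: iterable of booleans, one per gaze sample, in time order.
    Each unbroken run of on-AOI samples counts as one look
    (i.e. we count off->on transitions).
    """
    looks = 0
    prev = False
    for on in on_aoi_flags:
        if on and not prev:
            looks += 1
        prev = on
    return looks
```

In practice you may want to debounce very short runs (a few samples) so that tracking noise at the AOI boundary does not inflate the count.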
How does Neon handle slight gaze deviations, such as esotropia? Does it lose too much accuracy?
We have never had the chance to properly measure the effect of such things for Pupil Invisible or Neon. The accuracy will for sure suffer a bit and we have heard customer reports confirming that, but I am afraid we can't give a much more qualified answer than that.
We do have a 30-day return policy, so you could try it out and see if the performance remains good enough.
Thank you, Marc!
Hi there! Our lab is considering purchasing the pupil neon but before we do, we wanted to get a sense of what the raw data looks like - particularly the raw IMU data for head tracking. Would it be possible to get access to a sample recording and all the data in csv format? Thank you in advance!
Hey @user-ebe453 👋 Thank you for your interest in Neon! You can find an overview of the IMU data stream in our online documentation: https://docs.pupil-labs.com/neon/basic-concepts/data-streams/#movement-imu-data. For a detailed description of the exported IMU data, please refer to this link: https://docs.pupil-labs.com/neon/reference/export-formats/#imu-csv
Hi. In a previous question, Pupil Labs members mentioned there would be a software update fixing a bug in the IMU data collection, and an update to the real-time API allowing online access to eye tracking and IMU data. Is there a date for this release? Thanks! Andrés
Hey @user-b55ba6 👋 Thanks for following up 🙂 You should be able to get your IMU data with Neon now by downloading it from Pupil Cloud. Once you have uploaded your recording to Pupil Cloud, you can download the IMU data as CSV files (right click on the recording and select Download > Timeseries Data + Scene Video). Please keep in mind that the IMU needs to calibrate at the beginning of every recording - this means that during this "calibration" phase you should rotate the module in various directions. After that, it converges to the correct orientation. Regarding your second question about the real-time API, real-time streaming of IMU data is on the roadmap with an expected release date in late Q2 (~end of May). Hope this helps!
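As a small sketch of working with the downloaded IMU CSV, the snippet below parses gyroscope columns and computes the angular speed per sample. The column names used here are assumptions for illustration; the authoritative names are in the export-formats documentation linked above.

```python
import csv
import io
import math

def parse_imu_csv(text):
    """Parse IMU CSV text and return (timestamp_s, angular_speed) pairs.

    Assumed columns (check the export-formats docs for the real names):
    'timestamp [ns]', 'gyro x [deg/s]', 'gyro y [deg/s]', 'gyro z [deg/s]'.
    angular_speed is the magnitude of the gyro vector in deg/s.
    """
    rows = []
    for r in csv.DictReader(io.StringIO(text)):
        ts = int(r["timestamp [ns]"]) / 1e9  # nanoseconds -> seconds
        speed = math.sqrt(
            float(r["gyro x [deg/s]"]) ** 2
            + float(r["gyro y [deg/s]"]) ** 2
            + float(r["gyro z [deg/s]"]) ** 2
        )
        rows.append((ts, speed))
    return rows
```

For real recordings you would read the file with `open(...)` instead of `io.StringIO`, but the parsing logic is the same.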
Hello, we are considering purchasing the pupil neon, however we would need a frame that can fit 5-6-7 year old children. Would that be feasible?
Hey @user-068eb9 👋 We are planning to release the "All fun and games" frame, which is specifically designed for kids ages 2-8.
Would you be able to give an estimated time of release?
Sure! The expected release date is in Q3 2023.
Thank you
Hi Pupil Labs Team - we just got our Neon device and could not get Pupil Capture (v3.5.1) to recognize the plugged-in device, even though https://pupil-labs.com/products/neon/ states that these two should be compatible. Capture just behaves as if nothing were plugged in.
Further, when trying to open a downloaded recording file from Pupil Cloud and then opening it with Pupil Player, the error message
player - [INFO] pupil_recording.update.invisible: Transform Pupil Invisible to new style recording...
player - [ERROR] launchables.player: InvalidRecordingException: This version of player is too old! Please upgrade.
showed up. So we have no means of using the tools we need from Pupil Player.
Could you give us a hint on what might be going wrong here?
Thank you!
Hi @user-e5a3fa! Sorry for the inconvenience there! We are a bit delayed on the update to Pupil Capture that makes it compatible with Neon. We ran into driver issues on Windows, but we are about to release a new version for Linux and MacOS that is compatible with Neon in the next few days. I hope that is workable for you! We will keep on working on the Windows issues to get that fixed too.
Recordings that were made with the Companion app are not compatible with Pupil Player, so that part of the behaviour is expected.