@user-599b15 I've been trying to reproduce the issue you reported with various combinations of streaming / recording without success. Do you happen to remember some details about the recording session that had issues, for example was the streaming started before starting recording, or after?
Hello, I have a question about sanitizing the Pupil Invisible. What products can be used to wipe down the Pupil Invisible, particularly the nose and earpieces, and what areas must be avoided? I need to be able to sanitize the glasses when testing different participants to adhere to the new guidelines set by my university.
Hi @user-5882af! You can wipe them down with towels wetted with an alcohol-based solution. See also this page of the docs: https://docs.pupil-labs.com/invisible/hardware/#disinfecting-frames
@marc Thank you!
@user-0f7b55 Sorry for the late reply. I added socket code to models.py so that I can use the eye data in real time in the third-party app. Our procedure is: 1. start the mobile app with OTG on, 2. open the monitor app, 3. start streaming to the third-party app, 4. start recording trials. Since we cannot use the IMU data in real time and the eye timestamps seem fine, I assume the socket code I added should not interfere with the mobile app. I did not send any data or triggers to the mobile app.
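The thread never shows the actual models.py hook or the wire format of the stream, so the following is a purely illustrative sketch of what a line-based socket receiver for streamed gaze data might look like; every name and the newline-delimited JSON message format are assumptions, not the user's real setup. The fake server exists only to make the sketch self-contained.

```python
import json
import socket
import threading

# Hypothetical setup: the real models.py hook and message format are not
# shown in the thread, so we assume newline-delimited JSON gaze samples.

def serve_fake_gaze(sock):
    """Accept one client and stream three fake gaze samples, then close."""
    conn, _ = sock.accept()
    with conn:
        for i in range(3):
            msg = {"ts": 1.0 + 0.04 * i, "x": 0.5, "y": 0.5}
            conn.sendall((json.dumps(msg) + "\n").encode())

def receive_gaze(host, port, n):
    """Connect and parse n newline-delimited JSON gaze messages."""
    out = []
    with socket.create_connection((host, port)) as s:
        buf = b""
        while len(out) < n:
            chunk = s.recv(1024)
            if not chunk:
                break
            buf += chunk
            # split complete lines off the buffer and parse each one
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                out.append(json.loads(line))
    return out

# Demo: fake sender on a background thread, receiver in the foreground.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_fake_gaze, args=(server,), daemon=True).start()
samples = receive_gaze("127.0.0.1", port, 3)
```

A receiver like this runs alongside the app without sending anything back, which matches the user's report that no data or triggers were sent to the mobile app.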
@user-599b15 Are you able to reproduce this issue with that setup after you reported it initially?
@papr I am having some difficulties with the Pupil Invisible again. It seems that the world view camera no longer connects to the glasses properly. During a recording the connection breaks easily, which is also visible in the app. Sometimes there is no connection at all even though the glasses and the camera are physically connected. How can this problem be solved? Thank you in advance!
@user-3315ca Please contact [email removed] in this regard.
Hey there, I just excitedly unboxed the Invisible glasses, and during the first run it looked like the gaze representation was slightly off. I tested it with a calibration image on a screen and it is indeed off. Any tips?
@user-c4ef35 You can use the offset correction feature, which allows you to manually correct for a constant offset that can occur in subjects. Once you have connected the glasses, and before a recording, go into preview mode (bottom right of app screen), hold your finger on the screen and move the red circle to the correct point of regard. Then click apply. For best results, perform the offset at the viewing distance you will be recording at.
@nmt Thanks! Will give that a try!
hello people please help
my Invisible stopped working; sometimes there is no scene video when recording, and other times the scene is recorded but there is no gaze circle
I factory reset the OnePlus 6 but it still does not work
@user-1ed5b6 Please contact [email removed] in this regard.
thanks for help
@user-0f7b55 we just solved the problem. We were trying to get IMU data by adding 'imu' to the sensor_types in models.py and forgot to remove it after learning that the mobile app doesn't send IMU data. Sorry for the inconvenience.
Hello! Anybody has experience in processing Pupil Invisible data in MATLAB? I have some questions I could ask 🙂
The Pupil Invisible has arrived. Does it have a serial number? It is important for our inventory. I didn't find anything in the papers or on the frame.
@user-d1dd9a The glasses should have a small serial plaque at the tip of the left temple! It is 5 letters and numbers.
I found that, but it was unclear if that was the serial number. I think it was 5 letters.
I see! Yes, that is the serial number. We use a relatively short sequence to make them easier to handle 🙂
thanks marc all o.k. now
will test the invisible soon
🙂
Hi @user-98789c would you be able to provide a concrete example of what you are trying to do?
@wrp I am very new to the device and have just started with it. I am going to run a behaviour (decision-making) study. My subjects will watch stimuli on a monitor and react to them using the keyboard. What I need is for my subjects to wear the Pupil Invisible glasses while their gaze data and pupil size/diameter are recorded. I also need to interact with the Pupil data from MATLAB, and send event flags about my stimulus onsets and trial numbers to be co-registered with the pupil time series data. I hope this was clear. Now I just need a jump start. I don't even know yet where to look for the data recorded by Invisible, what data format it has, how I can extract features from it, etc. It would be great to get some guidance on where even to start! Thanks a lot 😊
Hey @user-98789c! thank you for clarifying your use-case! I'll try to address all your points:
Recording gaze data: For this you simply need to start a recording using Pupil Invisible. Gaze will be computed and recorded right away. You can also get a live preview in the Companion app.
Accessing data: If you enabled cloud upload of your recordings, everything you have recorded will be automatically uploaded to your cloud account (assuming the companion phone has an internet connection), where you can inspect and download your data. Alternatively you can also download the data directly from the phone via USB. https://cloud.pupil-labs.com/
Pupil size/diameter: Pupil Invisible does not currently produce any pupil diameter measurements, so this will not be available. Pupillometry data is currently only available using the Pupil Core device.
Stimuli on a monitor: If you are going to need the gaze data in relation to the monitor, i.e. where on the monitor did the subject look, rather than where in relation to the recorded scene video, you will need to track the monitor in the scene video. You can do this using the surface tracker plugin in the Pupil Player software (see link). We will also publish a version of this within Pupil Cloud very soon (probably next week). https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker
Recording stimuli onset/offset: Pupil Invisible supports the recording of events, which are tagged points in time. Your MATLAB code could send a trigger to the Pupil Invisible device to save an event corresponding to stimuli onset and offset. Those events will be saved as part of the recording. https://docs.pupil-labs.com/developer/invisible/#events
I hope this is giving you a head start! Let us know if you have further questions!
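Since the recordings end up as plain files you can download from Cloud or the phone, the co-registration of events with the gaze time series can be done offline in any language. Below is a minimal, hedged sketch in Python: the CSV column names (`timestamp [ns]`, `gaze x [px]`, `gaze y [px]`) are assumptions modelled on typical Cloud exports, so check your own export's headers, and the inline data is made up for illustration.

```python
import bisect
import csv
import io

# Hypothetical excerpt of an exported gaze CSV; the column names are
# assumptions based on typical Pupil Cloud exports, not confirmed here.
GAZE_CSV = """timestamp [ns],gaze x [px],gaze y [px]
1000000000,512.0,384.0
1040000000,520.5,380.2
1080000000,518.3,379.9
"""

def load_gaze(csv_text):
    """Parse gaze samples into parallel lists: timestamps (ns) and (x, y)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    ts, pts = [], []
    for row in reader:
        ts.append(int(row["timestamp [ns]"]))
        pts.append((float(row["gaze x [px]"]), float(row["gaze y [px]"])))
    return ts, pts

def nearest_sample(timestamps, event_ts):
    """Index of the gaze sample closest in time to an event timestamp.
    Assumes `timestamps` is sorted ascending."""
    i = bisect.bisect_left(timestamps, event_ts)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # pick the closer of the two neighbouring samples
    return i if timestamps[i] - event_ts < event_ts - timestamps[i - 1] else i - 1

ts, pts = load_gaze(GAZE_CSV)
idx = nearest_sample(ts, 1050000000)  # a stimulus-onset event at t = 1.05 s
# idx == 1: the event is closest to the second gaze sample
```

The same nearest-neighbour matching is straightforward to reproduce in MATLAB once the CSVs are loaded, so nothing here locks you into Python.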
Hey @marc so if pupil diameter is definitely needed in my experiment, there is no other way than to purchase the Pupil Core device?
Thanks a lot @marc for your generous guidance! I'm going through it all, step by step. I'll let you know if I'm stuck 🙂
Hello all, we want to analyse Pupil Core or Invisible data for a group analysis, and I have two questions. What is the most efficient way to code the data from fixation to fixation, in order to create a heat map (and other metrics) on a template picture? This was called semantic gaze mapping by SMI and is also possible in Tobii. We also have a Tobii Pro licence, so a lame solution would be to export Pupil data to Tobii's input format. Is that possible?
@user-95fe83 For Pupil Core and Invisible, you can create a heatmap (and calculate and export other metrics) by defining the template picture as a surface using our surface tracker plugin in Pupil Player software (see link). We will also publish a version of this within Pupil Cloud very soon. https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker
@user-95fe83 Note that the usage of the surface tracker requires the placing of AprilTag markers in the environment. We do not currently have a solution for this without markers. You can import Pupil Core and Invisible recordings e.g. in the iMotions software, which offers marker-less tracking. I do not know of a tool that would allow you to open the recordings in Tobii Pro Lab. Our recordings have an open data format that would in principle allow you to convert them to anything, but I do not know whether Tobii's data format is public, which would be required to make the conversion possible.
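Once the surface tracker has mapped gaze onto the template, the heatmap itself is just binning. A minimal sketch, assuming gaze is exported in normalized surface coordinates (0 to 1 on each axis, which is how the surface tracker reports them); the sample points below are invented:

```python
# Minimal heatmap sketch: bin surface-mapped gaze into a coarse 2D grid.
# Assumes normalized surface coordinates in [0, 1]; out-of-surface points
# are skipped. Sample data is made up for illustration.

def gaze_heatmap(points, bins=4):
    """Count normalized (x, y) gaze points per cell of a bins x bins grid.
    Returns a list of rows; grid[row][col] is the hit count for that cell."""
    grid = [[0] * bins for _ in range(bins)]
    for x, y in points:
        if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
            col = min(int(x * bins), bins - 1)
            row = min(int(y * bins), bins - 1)
            grid[row][col] += 1
    return grid

samples = [(0.1, 0.1), (0.12, 0.15), (0.9, 0.9), (0.5, 0.5), (1.2, 0.5)]
heat = gaze_heatmap(samples, bins=2)
# heat == [[2, 0], [0, 2]]  (the last sample falls outside the surface)
```

For a publication-quality map you would typically use a finer grid and Gaussian smoothing, but the binning step above is the core of it.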
Thanks for the info. I already tried these markers and they work well. However, in this case it is impossible to use them. Any other tips from fellow researchers on methods for coding mobile eye tracking data from Pupil hardware? Toolboxes, scripts, coding software, ...
@user-98789c Yes, that is correct!
Thanks 🙂
Hello everyone. The automatic upload hasn't been working for a few days on my smartphone... has anyone had the same problem and managed to fix it? I tried turning the upload on and off and restarted the phone... but there is still no upload to the cloud...
@user-7a8fe7 What are you observing in the recordings view?
Does the progress stay at 0%?
it doesn't even show 0%, even if I try to start it manually
@user-0f7b55 the recordings view is where I checked, so that is ok
@user-7a8fe7 I meant the recordings list view UI, apologies if it was not clear.
@user-0f7b55 ok, when I click on "recordings" there is the list with all recordings. Some earlier recordings are checked with the upload symbol, one (big) recording is paused, the next recording is paused, and for all the following recordings there is no progress to see (is that what you mean?)
@user-7a8fe7 Yes, thank you. What is the version of the companion app you are using?
@user-0f7b55 it is 1.0.0-proud, last updated on 15 Nov 2020
@user-7a8fe7 Can you please try to log out (from settings menu) and log in again?
@user-0f7b55 it is uploading now, thank you very much. I'm very sorry that I did not try that before
@user-7a8fe7 No problem, please let me know if you experience similar issues again.
@user-0f7b55 ok i will do -thank you very much again!!
Hi all, does anyone have a checklist of all the steps to take care of before starting to record with Pupil Invisible? From plugging the glasses into the cellphone and so on... and maybe also after the recording?
@user-98789c To simply record gaze data, literally all you have to do after connecting the cable and starting the app is press the record button. Once you are done, hit the stop button and confirm saving the data, and that is it.
Other things that might be useful depending on your use case:
To conveniently access the data you might want to upload it to Pupil Cloud. This needs to be enabled in the settings first. Alternatively, you can download it directly from the phone using a USB connection.
If your subject has a constant error in the gaze prediction (you can check whether this is the case in the live preview), you might want to correct it using the offset correction feature. To do that, open the live preview, touch and hold the screen, and drag the gaze circle into its correct position. We expect this to happen for roughly 10% of the population.
You might want to record other data, e.g. demographic information about the subject, using a template. This essentially acts like a questionnaire filled out on the Companion phone.
Let me know if you have further questions!
Thank you Marc for the reply. Two more questions: 1- Do you know of any open-access scripts for processing gaze data? 2- Do you know any papers where they have recorded and analyzed gaze data in their experiment, using Pupil Invisible?