invisible

user-0f7b55 02 November, 2020, 13:59:24

@user-599b15 I've been trying to reproduce the issue you reported with various combinations of streaming / recording without success. Do you happen to remember some details about the recording session that had issues, for example was the streaming started before starting recording, or after?

user-5882af 05 November, 2020, 15:55:40

Hello, I have a question about sanitizing the Pupil Invisible. What products can be used to wipe down the Pupil Invisible, particularly the nose pads and earpieces, and what areas must be avoided? I need to be able to sanitize the glasses when testing different participants to adhere to the new guidelines set by my university.

marc 05 November, 2020, 15:58:47

Hi @user-5882af! You can wipe them with towels wetted with an alcohol-based solution. See also this page of the docs: https://docs.pupil-labs.com/invisible/hardware/#disinfecting-frames

user-5882af 05 November, 2020, 15:59:24

@marc Thank you!

user-599b15 06 November, 2020, 15:00:07

@user-0f7b55 Sorry for the late reply. I added some socket code to models.py so that I can use the eye data in real time in a third-party app. Our procedure is:

  1. Start the mobile app with OTG on
  2. Open the monitor app
  3. Start streaming to the third-party app
  4. Start recording trials

Since we cannot use the IMU data in real time and the eye timestamps seem fine, I assume the socket code I added should not mess up the mobile app. I did not send any data or triggers to the mobile app.

user-0f7b55 06 November, 2020, 15:03:55

@user-599b15 Have you been able to reproduce this issue with that setup since you initially reported it?

user-3315ca 07 November, 2020, 11:39:22

@papr I am having some difficulties with the Pupil Invisible again. It seems that the world view camera no longer connects properly to the glasses. During a recording the connection is easily broken. This is also seen in the app. Sometimes there is no connection at all even though the glasses and the camera are physically connected. How can this problem be solved? Thank you in advance!

papr 09 November, 2020, 09:12:25

@user-3315ca Please contact [email removed] in this regard.

user-c4ef35 11 November, 2020, 12:39:00

Hey there, I just excitedly unboxed the Invisible glasses, and during the first run it looked like the gaze representation was slightly off. I tested it with a calibration image on a screen and it is indeed off. Any tips?

nmt 11 November, 2020, 12:50:05

@user-c4ef35 You can use the offset correction feature, which allows you to manually correct for a constant offset that can occur in some subjects. Once you have connected the glasses, and before a recording, go into preview mode (bottom right of the app screen), hold your finger on the screen, and move the red circle to the correct point of regard. Then click apply. For best results, perform the offset correction at the viewing distance you will be recording at.

user-c4ef35 11 November, 2020, 12:50:55

@nmt Thanks! Will give that a try!

user-1ed5b6 12 November, 2020, 09:30:33

hello people please help

user-1ed5b6 12 November, 2020, 09:31:28

my Invisible stopped working; sometimes there is no scene video when recording, and other times the scene is recorded but there is no gaze circle

user-1ed5b6 12 November, 2020, 09:31:54

I factory reset the OnePlus 6, but it still does not work

papr 12 November, 2020, 09:32:19

@user-1ed5b6 Please contact [email removed] in this regard.

user-1ed5b6 12 November, 2020, 09:32:37

thanks for help

user-599b15 13 November, 2020, 15:30:16

@user-0f7b55 We just solved the problem. We were trying to get IMU data by adding 'imu' to the sensor_types in models.py and forgot to remove it after learning that the mobile app doesn't send IMU data. Sorry for the inconvenience.

user-98789c 14 November, 2020, 18:14:04

Hello! Does anybody have experience processing Pupil Invisible data in MATLAB? I have some questions I could ask 🙂

user-d1dd9a 16 November, 2020, 12:05:35

The Pupil Invisible has arrived. Does it have a serial number? It is important for our inventory. I didn't find anything in the papers or on the frame.

marc 16 November, 2020, 12:15:26

@user-d1dd9a The glasses should have a small serial plaque at the tip of the left temple! It is 5 characters, letters and numbers.

user-d1dd9a 16 November, 2020, 12:24:12

I found that, but it was unclear if that was the serial number. I think it was 5 letters.

marc 16 November, 2020, 12:27:41

I see! Yes, that is the serial number. We use a relatively short sequence to make them easier to handle 🙂

user-d1dd9a 16 November, 2020, 12:35:20

thanks Marc, all OK now

user-d1dd9a 16 November, 2020, 12:35:52

will test the invisible soon

user-d1dd9a 16 November, 2020, 12:35:54

🙂

wrp 17 November, 2020, 02:54:44

Hi @user-98789c would you be able to provide a concrete example of what you are trying to do?

user-98789c 18 November, 2020, 19:20:10

@wrp I am very new to the device and have just started with it. I am going to do a behavioural (decision making) study. My subjects will be watching stimuli on a monitor and will react to them using the keyboard. What I need is for my subjects to wear the Pupil Invisible glasses while their gaze data and pupil size/diameter are recorded. I also need to interact with the Pupil data from MATLAB and send event flags about my stimuli onsets and trial numbers to be co-registered with the pupil time series data. I hope this was clear. Now I just need a jump start. I don't even know yet where to look for the data recorded by Invisible, what data format it has, how I can extract features from it, etc. It would be great to get some guidance on where even to start! Thanks a lot 😊

marc 19 November, 2020, 08:57:40

Hey @user-98789c! Thank you for clarifying your use case! I'll try to address all your points:

  • Recording gaze data: For this you simply need to start a recording using Pupil Invisible. Gaze will be computed and recorded right away. You can also get a live preview in the Companion app.

  • Accessing data: If you enabled cloud upload of your recordings, everything you have recorded will be automatically uploaded to your cloud account (assuming the companion phone has an internet connection), where you can inspect and download your data. Alternatively you can also download the data directly from the phone via USB. https://cloud.pupil-labs.com/

  • Pupil size/diameter: Pupil Invisible does not currently produce any pupil diameter measurements, so this will not be available. Pupillometry data is currently only available using the Pupil Core device.

  • Stimuli on a monitor: If you need the gaze data in relation to the monitor, i.e. where on the monitor the subject looked rather than where they looked in the recorded scene video, you will need to track the monitor in the scene video. You can do this using the surface tracker plugin in the Pupil Player software (see link). We will also publish a version of this within Pupil Cloud very soon (probably next week). https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker

  • Recording stimuli onset/offset: Pupil Invisible supports the recording of events, which are tagged points in time. Your MATLAB code could send a trigger to the Pupil Invisible device to save an event corresponding to stimuli onset and offset. Those events will be saved as part of the recording. https://docs.pupil-labs.com/developer/invisible/#events

I hope this is giving you a head start! Let us know if you have further questions!
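On the events point above: regardless of whether the trigger is sent from MATLAB or anything else, it boils down to transmitting a small JSON payload that names the event and timestamps it. A minimal Python sketch of building such a payload; the field names ("name", "timestamp") and the idea of sending it via HTTP are assumptions here, so check the linked developer docs for the actual interface:

```python
import json
import time

def make_event(name, timestamp=None):
    """Build an event payload for a recording.

    The field names used here are an assumption; verify them
    against the Pupil Labs developer docs before relying on this.
    """
    return {
        "name": name,
        "timestamp": time.time() if timestamp is None else timestamp,
    }

def event_request_body(name, timestamp=None):
    """Serialize an event payload, e.g. for an HTTP POST to the
    Companion device (endpoint omitted here -- see the docs)."""
    return json.dumps(make_event(name, timestamp))

# Example: tag stimulus onset for trial 3 with an explicit timestamp
body = event_request_body("trial_03_onset", timestamp=1605800000.0)
```

The same payload can be assembled in MATLAB with `jsonencode` and sent with `webwrite`; only the structure matters, not the language.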

user-98789c 24 November, 2020, 19:46:47

Hey @marc so if pupil diameter is definitely needed in my experiment, there is no other way than to purchase the Pupil Core device?

user-98789c 19 November, 2020, 23:25:07

Thanks a lot @marc for your generous guidance! I'm going through it all, step by step. I'll let you know if I'm stuck 🙂

user-95fe83 20 November, 2020, 11:01:33

Hello all, we want to analyse Pupil Core or Invisible data for a group analysis, and I have two questions. What is the most efficient way to map the data, fixation by fixation, in order to create a heat map (and other metrics) on a template picture? This was called semantic gaze mapping by SMI and is also possible in Tobii. We also have a Tobii Pro license, so a last-resort solution would be to export Pupil data to Tobii's input format. Is that possible?

nmt 20 November, 2020, 11:56:05

@user-95fe83 For Pupil Core and Invisible, you can create a heatmap (and calculate and export other metrics) by defining the template picture as a surface using our surface tracker plugin in Pupil Player software (see link). We will also publish a version of this within Pupil Cloud very soon. https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker
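Once fixations are mapped onto the surface (the template picture), a heatmap is essentially an accumulation of fixation durations over a grid. A minimal Python sketch, where the `(x_norm, y_norm, duration)` tuple layout and the [0, 1] surface-normalized coordinate convention are assumptions; the actual surface-tracker export columns may differ:

```python
def fixation_heatmap(fixations, grid_w=4, grid_h=3):
    """Accumulate fixation durations into a coarse heatmap grid.

    `fixations` is a list of (x_norm, y_norm, duration_s) tuples with
    x_norm/y_norm in [0, 1] relative to the surface. Returns a
    grid_h x grid_w nested list of summed durations.
    """
    grid = [[0.0] * grid_w for _ in range(grid_h)]
    for x, y, dur in fixations:
        if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
            continue  # fixation landed off the surface
        # Clamp the edge case x == 1.0 / y == 1.0 into the last cell
        col = min(int(x * grid_w), grid_w - 1)
        row = min(int(y * grid_h), grid_h - 1)
        grid[row][col] += dur
    return grid

# Three fixations; the first two fall into the same grid cell
heat = fixation_heatmap([(0.1, 0.1, 0.2), (0.15, 0.05, 0.3), (0.9, 0.9, 0.5)])
```

For a publication-quality heatmap you would use a much finer grid and smooth it (e.g. with a Gaussian filter), but the aggregation step is the same.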

marc 20 November, 2020, 12:38:54

@user-95fe83 Note that the usage of the surface tracker requires placing AprilTag markers in the environment. We do not currently have a solution for this without markers. You can import Pupil Core and Invisible recordings e.g. into the iMotions software, which offers marker-less tracking. I do not know of a tool that would allow you to open the recordings in Tobii Pro Lab. Our recordings have an open data format that would in principle allow you to convert them to anything, but I do not know if Tobii's data format is public, which would make the conversion possible.
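On the open data format point: exported recordings include plain CSV files that can be loaded and reshaped with a few lines of code as the first step of any conversion. A minimal sketch; the column names (`gaze_timestamp`, `norm_pos_x`, `norm_pos_y`) are assumptions based on the Pupil Player export, so verify them against your own files:

```python
import csv
import io

def load_gaze_rows(csv_text):
    """Parse exported gaze CSV text into plain dicts of floats.

    Column names follow the Pupil Player export as assumed in the
    lead-in; adjust them to match your actual export headers.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = []
    for row in reader:
        rows.append({
            "t": float(row["gaze_timestamp"]),
            "x": float(row["norm_pos_x"]),
            "y": float(row["norm_pos_y"]),
        })
    return rows

sample = "gaze_timestamp,norm_pos_x,norm_pos_y\n12.5,0.4,0.6\n12.6,0.5,0.55\n"
gaze = load_gaze_rows(sample)
```

From such a neutral in-memory form you could write out whatever structure a target tool accepts, provided that tool's import format is documented.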

user-95fe83 20 November, 2020, 14:53:38

Thanks for the info. I already tried these markers and they work well. However, in this case it is impossible to use them. Any other tips on methods fellow researchers use to code mobile eye tracking data from Pupil hardware? Toolboxes, scripts, coding software, ...?

marc 24 November, 2020, 19:47:46

@user-98789c Yes, that is correct!

user-98789c 24 November, 2020, 21:20:58

Thanks 🙂

user-7a8fe7 30 November, 2020, 09:41:17

Hello everyone. The automatic upload hasn't been working for a few days on my smartphone... has anyone had the same problem and been able to fix it? I tried turning the upload off and on and restarted the phone... but there is still no upload to the cloud...

user-0f7b55 30 November, 2020, 09:44:23

@user-7a8fe7 What are you observing in the recordings view?

user-0f7b55 30 November, 2020, 09:45:09

Does the progress stay at 0%?

user-7a8fe7 30 November, 2020, 09:53:01

there is not even a 0%, even if I try to start it manually

user-7a8fe7 30 November, 2020, 09:53:38

@user-0f7b55 the recordings view shows what I have measured, so it is ok

user-0f7b55 30 November, 2020, 09:56:01

@user-7a8fe7 I meant the recordings list view UI, apologies if it was not clear.

user-7a8fe7 30 November, 2020, 09:58:42

@user-0f7b55 OK, when I click on "recordings" there is the list with all recordings. Some earlier recordings are checked with the upload symbol, one (big) recording is paused, the next recording is paused, and for all the following recordings there is no progress to see (is that what you mean?)

user-0f7b55 30 November, 2020, 09:59:50

@user-7a8fe7 Yes, thank you. What is the version of the companion app you are using?

user-7a8fe7 30 November, 2020, 10:01:35

@user-0f7b55 it is 1.0.0-proud, last updated on 15 Nov 2020

user-0f7b55 30 November, 2020, 10:02:43

@user-7a8fe7 Can you please try to log out (from settings menu) and log in again?

user-7a8fe7 30 November, 2020, 10:03:58

@user-0f7b55 it is uploading now, thank you very much. I'm very sorry that I did not try that before

user-0f7b55 30 November, 2020, 10:04:28

@user-7a8fe7 No problem, please let me know if you experience similar issues again.

user-7a8fe7 30 November, 2020, 10:05:23

@user-0f7b55 OK, I will. Thank you very much again!!

user-98789c 30 November, 2020, 10:44:31

Hi all, does anyone have a checklist of all the steps to take care of before starting to record with Pupil Invisible? From plugging the glasses into the cellphone and so on... and maybe also for after the recording?

marc 30 November, 2020, 11:14:26

@user-98789c To simply record gaze data, literally all you have to do after connecting the cable and starting the app is press the record button. Once you are done, hit the stop button, confirm saving the data, and that is it.

Other things that might be useful depending on your use case:

  • To conveniently access the data you might want to upload it to Pupil Cloud. This needs to be enabled in the settings first. Alternatively, you can download it directly from the phone using a USB connection.

  • If your subject has a constant error in the gaze prediction (you can check whether this is the case in the live preview), you might want to correct it using the offset correction feature. To do that, open the live preview, touch and hold the screen, and drag the gaze circle into its correct position. We expect this to be necessary for roughly 10% of the population.

  • You might want to record other data, e.g. demographic information on the subject, using a template. This acts essentially like a questionnaire filled out on the Companion phone.

Let me know if you have further questions!

user-98789c 01 December, 2020, 11:48:54

Thank you Marc for the reply. Two more questions: 1. Do you know of any open-access scripts for processing gaze data? 2. Do you know of any papers where gaze data was recorded and analyzed using Pupil Invisible?

End of November archive