invisible

user-a50a12 01 June, 2022, 12:45:39

A question with respect to GDPR compliance of Pupil Cloud. For a scientific study where human participants are visible and audio is enabled, we would need explicit consent of the participants for the usage of Pupil Cloud (our standard procedure is that identifiable data such as video is only used on local systems and servers of the institution). Do you have any documents, or a pre-formulated text, that could be used a) in the consent forms for the participants, b) to convince a local GDPR official of our institution and the ethics committee that Pupil Cloud would be safe to use and fully GDPR compliant?

marc 01 June, 2022, 13:56:06

Hi @user-a50a12! We do not have documents specifically for that purpose, but the following documents address the topic: 1. FAQ document regarding data privacy: https://docs.google.com/document/d/18yaGOFfIbCeIj-3_GSin3GoXhYwwgORu9_7Z-grZ-2U/export?format=pdf 2. Our privacy policy: https://pupil-labs.com/legal/invisible-companion-privacy/

user-a50a12 01 June, 2022, 15:00:00

Thanks a lot! If there were legal problems: is there any way to run all the calculations that are possible in the cloud (such as surface tracking, face tracking) locally as well, or are these exclusive to the cloud? I explored Pupil Player and was not successful so far in doing the same things there, but I may have to dig deeper.

marc 01 June, 2022, 15:16:35

Only partially. The surface tracking is possible in Pupil Player as well if you enable the surface tracker plugin. But e.g. face mapping, fixations and blinks are not available in Pupil Player.

user-ffc367 02 June, 2022, 08:10:34

Hi. The timestamps of my exported recordings are not in UTC. They seem to start at 0 at the beginning of the recording. What do I have to change so that the timestamps are stated in UTC? Well, "0" is wrong. Actually it's 2.454.803,00

papr 02 June, 2022, 08:20:41

Hi, yes, that is the case for Pupil Player exported recordings. Player transforms the recording in-place to a compatible format which includes shifting the start time to zero instead of the Unix epoch.

https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/pupil_recording/update/invisible.py#L147-L152

You basically need to revert the linked function above:

import json
import pathlib

import pandas as pd

# Read the recording's start time (nanoseconds since the Unix epoch) from
# the info file of the Player-converted recording
info_json_path = pathlib.Path("path to info.invisible.json")
info_json_content = json.loads(info_json_path.read_text())
start_time_ns = info_json_content["start_time"]
start_time_s = start_time_ns * 1e-9

# Shift the Player-exported timestamps (seconds, starting at 0) back to Unix
# epoch / UTC time (file/column names here are from the Player gaze export;
# adjust them to whatever you exported)
exported = pd.read_csv("path to exports/000/gaze_positions.csv")
utc_timestamps_s = exported["gaze_timestamp"] + start_time_s

user-ffc367 02 June, 2022, 08:55:48

Thank you. One more question. Is there a better/more direct way to get "readable" timestamps in UTC than opening the recording with Pupil Player, exporting to CSV, and converting them with a Python script?

marc 02 June, 2022, 08:59:24

Most platforms should be capable of converting this type of timestamp format to something human readable. In Python you can e.g. do

import pandas as pd

# Read the exported gaze data and convert the nanosecond Unix timestamps
# into human-readable UTC datetimes
df = pd.read_csv(gaze_path)
pd.to_datetime(df["timestamp [ns]"], unit="ns")

user-b811bd 05 June, 2022, 02:28:01

Hi, is there any possibility to change the color of the green circle/red dot when uploading the videos?!

nmt 05 June, 2022, 07:08:14

Hi @user-b811bd. Check out the Gaze Overlay enrichment: https://docs.pupil-labs.com/invisible/explainers/enrichments/#gaze-overlay You can change the visual parameters of the circle using that

user-057596 09 June, 2022, 13:32:16

How do you display the QR code on the Pupil Invisible Companion app in order to control and access the Monitor app, and is there a link to the latest Monitor app version?

marc 09 June, 2022, 13:35:15

Hi @user-057596! If your Companion app is up to date, you should have a "Streaming" entry in the menu. This will show you the QR code as well as the address of the monitor app. You need to open this link in your browser. Note that the Monitor app is no longer a desktop app that you have to install (so there is no link to the newest version), but it's a browser-based app hosted by the Companion app itself, so the Companion phone and the device you want to open the Monitor app on need to be connected to the same network.

See also here for details: https://docs.pupil-labs.com/invisible/how-tos/data-collection-with-the-companion-app/monitor-your-data-collection-in-real-time.html

user-057596 09 June, 2022, 13:38:00

That explains it as I have version 1.4.14

user-057596 09 June, 2022, 14:13:04

Thanks that worked perfectly.

papr 10 June, 2022, 13:40:58

I do not know this for sure. @marc will be able to answer this when he is back in the office next week.

user-e0a93f 10 June, 2022, 13:41:27

Ok thanks 🙂

marc 13 June, 2022, 07:31:37

Hi @user-e0a93f! The blink detection algorithm works off of movement patterns in the video and cannot estimate something like an eye closure percentage or similar.

The gaze algorithm itself does not have a confidence value. In our experience, estimating a general confidence score is also difficult. Confidence estimation for ML-based regression algorithms is an open research problem.

Detecting half-open eyes should be possible, but the question is to what degree this is a reasonable indicator of confidence. Did you have trouble with slightly closed eyes?

user-e0a93f 13 June, 2022, 13:19:00

Hi @marc, thanks for your answer! 1. What do you mean by "off of the movement patterns"? 2. I understand, I am also worried about the threshold I will have to set to determine when the eyes are "too closed to give a good prediction of the gaze orientation". But I don't really have a choice, because when I look at the gaze estimates, the output makes no sense when the eyes are not fully open. Therefore, I cannot trust my data. I have three options: i) finding a way to discard the "bad data", ii) disclosing that I have a problem, even though some of my metrics would be largely affected by the problematic data, or iii) not trusting it at all and throwing away part of my experiment, which I would really like to avoid! 3. You said that it is possible to detect half-open eyes; do you have any suggestions on how to do that?

user-46e93e 14 June, 2022, 19:20:06

Hi there! I have two Pupil Invisible devices. When I connect one pair of glasses to the OnePlus phone and open the app, the regular pop-ups to connect to eye view/camera view don't always appear. When looking at the recording screen, the eye view circle highlights and seems ready to go, but the camera one stays greyed out. I tried to troubleshoot and look for some help here https://docs.pupil-labs.com/invisible/troubleshooting/ and it may be an OTG issue? When I click the link for additional help I get a 404 error page. I tried the same glasses with the other OnePlus device and got the same issue

marc 15 June, 2022, 07:45:20

Hi @user-46e93e! If the eye cameras connect successfully but the scene camera does not, this sounds like a hardware issue to me. Please write to [email removed] referencing this conversation to facilitate a repair! The OTG issue would only happen on a OnePlus 6 device and it should affect both camera types if it occurs. Thanks for reporting the broken link though, we'll fix that!

marc 15 June, 2022, 07:41:04

1) It's a machine learning algorithm that uses optical flow features to detect blinks. So essentially it detects the movement patterns of the eyelid during a blink. 2) What causes your subjects to have their eyes half open? Is this during blinks or while looking downwards? Or are they actively squinting their eyes? 3) I am afraid I do not have a concrete suggestion. This would not be an easy thing to do. In theory this should be possible using machine learning if you had an appropriate data set, but it would be a lot of effort.

user-e0a93f 15 June, 2022, 13:01:19

Thanks for your answer! 2) I am conducting an experiment with trampolinists, so while they are rotating their eyes have a great movement amplitude (yes, sometimes they are looking really far downward, but it also happens when they are looking upward). I think they cannot afford to blink because it would be too time-consuming, so I think it is a compromise (half-closed eyes allow the eyes to get "washed and hydrated" while not too much information is lost behind the eyelid). 3) Ok thanks, I'll look into it. I think I've found a way to work around it by measuring the organization of the scan path.

user-6826a6 15 June, 2022, 10:46:03

Hi, I am new to Pupil Labs. I started working with it about a month ago and all seemed fine. I'm delving a bit deeper now and have come across a new issue: when I open the app with the Invisible glasses connected, there is a notification "calibration not synced! tap for instructions". The instructions just say to connect to the internet and it will download the calibration data... but the device is connected to the internet. Unsure if this is related, but it also will not upload recordings to the cloud; the issue seems to have arisen at the same time. The companion device is a OnePlus 6, Android version 8.1.0.

marc 15 June, 2022, 10:49:15

Hi @user-6826a6! This sounds like an authentication issue. Could you try logging out of the app and back in? Also, please make sure the app is updated!

user-6826a6 15 June, 2022, 10:52:57

Hi @marc Perfect That's solved it thank you!

user-648ceb 15 June, 2022, 11:29:41

I want to interface eye-tracking glasses with Matlab/Python to control a second device based on eye gaze AND motion data (IMUs). But as I understand it, this is not possible: Core allows for Matlab/Python interfacing but does not have an IMU, whereas Invisible has an IMU but no option for data to be recorded from Matlab/Python. Is this correctly understood?

papr 15 June, 2022, 11:31:25

Hi! You can receive realtime data from Invisible using Python but IMU data is not yet supported. See https://docs.pupil-labs.com/invisible/how-tos/integrate-with-the-real-time-api/introduction/ for an introduction.
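
For illustration, a minimal sketch using the simple mode of the Python client (assuming the pupil-labs-realtime-api package is installed and the phone and receiving PC share the same network):

from pupil_labs.realtime_api.simple import discover_one_device

# Find a Pupil Invisible Companion device on the local network
device = discover_one_device()

# Receive the newest available gaze sample (repeat in a loop for streaming)
gaze = device.receive_gaze_datum()
print(gaze.x, gaze.y, gaze.timestamp_unix_seconds)

device.close()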

user-648ceb 15 June, 2022, 11:42:01

Thanks, 'yet' as in it will come? soon 😉

papr 15 June, 2022, 11:46:11

It is on the road map but we do not have a release date yet.

user-3df8f2 17 June, 2022, 14:51:20

Hello! Quick clarification to the real-time gaze signal. What does a 120+ Hz sampling mean? This seems to fluctuate a lot and even goes up to 198Hz in some cases. How do we establish the sampling rate of the device? What factors influence the inter-individual differences in sampling rate?

user-27f0bc 17 June, 2022, 18:31:21

Hello there! We have a couple of Pupil Core eye trackers in our lab, but we are borrowing another lab's Invisible glasses just to compare and see if we should acquire the Invisible. I just have a couple of questions: 1. Does the Invisible Companion app only work with the specific phone that it comes with (OnePlus 6/8), or can we use a different Android phone? 2. Can we use tablets/laptops instead of the phone to use it? If so, which app could we use? Thank you!

user-03e279 18 June, 2022, 14:12:53

Hello there! I would like to conduct a visual search experiment using the API. I would like the participants to scan a room looking for a person with specific traits (e.g. a red hat) and have the software save the timestamp of the first fixation on that object, so that the participants do not have to press a button to indicate that they found the object in question. It would help me a lot if you could help me decide whether I should use an image-based solution (if the red hat is fixated, the object has been found; is such a solution even possible?) or a time-based solution (if any fixation is longer than x seconds, the object must have been found). Thank you!

marc 20 June, 2022, 07:00:13

This is a bit complicated, because it is largely due to internal Android processes. When starting a new recording, the phone provides ~200 Hz real-time gaze for about a minute. Then Android processes start to kick in that reduce the resource consumption, lowering the framerate over time down to ~120 Hz. We do not have control over that and it's not behaving exactly the same every time, but it should always start at ~200 Hz and then go down.

user-3df8f2 20 June, 2022, 11:42:48

A follow-up: would it be possible to fix it to a lower sampling rate, say 60 Hz, just so everything is recorded at the same rate?

user-3df8f2 20 June, 2022, 09:24:45

This was really helpful, thank you!

marc 20 June, 2022, 07:06:37

Hi @user-27f0bc!

1) Yes, it only works with those specific phone models: OnePlus 6, OnePlus 8 and OnePlus 8T. This is due to the very specific hardware and software requirements of the gaze pipeline. Those phones are available for purchase from most consumer electronics stores though, and you could use one purchased from anywhere. Note that only a specific range of Android versions is supported, see here for details: https://docs.pupil-labs.com/invisible/explainers/glasses-and-companion-device/#android-os

2) The Pupil Invisible glasses need to always be attached to a phone running the Pupil Invisible App. If you attach them to a computer directly you will not be able to run the gaze estimation pipeline or upload recordings to Pupil Cloud. You can however connect the glasses to a phone and then monitor and remote control the glasses from a laptop. See here for details: https://docs.pupil-labs.com/invisible/how-tos/data-collection-with-the-companion-app/monitor-your-data-collection-in-real-time.html

user-27f0bc 21 June, 2022, 00:03:10

Hi Marc, thank you so much for all the links and information you've provided!

marc 20 June, 2022, 07:21:55

Hi @user-03e279! I think a time-based solution might not be very robust. While having found the person in question might trigger a prolonged fixation, having a few longer fixations during the search is not unlikely, which would make it difficult to detect the final fixation.

Detecting the first time gaze falls onto the red hat would be possible, assuming you have an algorithm that is able to detect the red hat in the image. This should work robustly, but there is of course a chance that despite gaze falling onto the hat, the subject does not yet actually recognize the target as what they are looking for.

user-03e279 20 June, 2022, 13:59:40

Thank you very much!

marc 20 June, 2022, 11:46:19

There is no setting to make this happen automatically, but you could of course drop samples for a lower but more consistent frequency. If you use the simple mode of the real-time API this would be relatively easy. The data receiving methods like receive_gaze_datum report the newest available sample. You could call this method at 60 Hz.
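
A rough sketch of that approach using the simple mode (the loop length and sleep-based timing below are just placeholders):

import time

from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()

interval = 1 / 60  # target sampling interval in seconds
samples = []
next_tick = time.monotonic()
for _ in range(600):  # e.g. ~10 seconds at 60 Hz
    # receive_gaze_datum() reports the newest available sample, so polling
    # at 60 Hz effectively drops the extra samples of the ~120-200 Hz stream
    samples.append(device.receive_gaze_datum())
    next_tick += interval
    time.sleep(max(0.0, next_tick - time.monotonic()))

device.close()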

user-f408eb 22 June, 2022, 00:17:22

Hello dear PupilLabs team, I am currently working with Pupil Invisible on a study at our department. It would be useful if the AprilTags used to define the Areas of Interest could be recognised automatically. Is this possible?

wrp 22 June, 2022, 02:55:26

@user-f408eb hi 👋 Could you give us a summary of the study so that we have some context?

You can use Marker Mapper: https://docs.pupil-labs.com/invisible/explainers/enrichments/#marker-mapper - to define surfaces. The markers will be detected automatically (if they are visible/large enough to be visible) when you run the Marker Mapper enrichment in Pupil Cloud.

user-4bc389 23 June, 2022, 03:18:12

Hi! Is there any observation distance limit for Invisible? What is the appropriate distance? Thank you

marc 23 June, 2022, 06:07:16

Hi @user-4bc389! No there is no upper limit. For distances <1 m you can get parallax errors, but for long distances there is no problem.

user-011cbf 23 June, 2022, 14:01:39

We are looking into buying the Invisible glasses; we want to use them for eye tracking of wayfinding, e.g. when passengers find their way in airports. I was wondering: if you order the Invisible, do you get the Android device with the order, or do you need to order that separately? (I'm asking since you can also buy the Android device separately on the accessories page.) Also, if you order the Invisible, do you need to buy any other extras (cloud costs etc.)?

nmt 23 June, 2022, 14:38:42

Hi @user-011cbf 👋. The Android device is included – no need to order separately (some users require multiple devices, hence we have that option on the accessories page). Pupil Cloud is also included, so essentially that’s everything you need to start eye tracking and enriching your data!

user-a98526 24 June, 2022, 03:53:45

Hi @marc, is there a way to calculate the angular velocity for Pupil Invisible?

marc 24 June, 2022, 07:06:08

Hi @user-a98526! Do you mean the angular velocity of the Pupil Invisible glasses themselves, or of the gaze signal? For the glasses you get the rotation speed in degrees/s from the IMU. For the gaze signal you could calculate the velocity based on the azimuth and elevation values of the gaze points, which give you a measure in degrees.

Both are included in the raw data export, see here: https://docs.pupil-labs.com/invisible/reference/export-formats.html#imu-csv https://docs.pupil-labs.com/invisible/reference/export-formats.html#gaze-csv
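
As a rough sketch for the gaze case (column names as in the gaze.csv export linked above; treat this as an illustration rather than a polished implementation):

import numpy as np
import pandas as pd

gaze = pd.read_csv("gaze.csv")  # from the raw data export

t = gaze["timestamp [ns]"].to_numpy() * 1e-9  # seconds
az = np.deg2rad(gaze["azimuth [deg]"].to_numpy())
el = np.deg2rad(gaze["elevation [deg]"].to_numpy())

# Angular distance between consecutive gaze directions (spherical law of cosines)
d_sigma = np.arccos(
    np.clip(
        np.sin(el[:-1]) * np.sin(el[1:])
        + np.cos(el[:-1]) * np.cos(el[1:]) * np.cos(np.diff(az)),
        -1.0,
        1.0,
    )
)

# Angular velocity in degrees per second
angular_velocity = np.rad2deg(d_sigma) / np.diff(t)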

user-6e2996 26 June, 2022, 18:46:28

Hi, my device continuously reports recording failure and sensor failure during recording session. It is so unreliable. What should I do with it? Thanks in advance.

nmt 27 June, 2022, 06:50:52

Hi @user-6e2996 👋. Please reach out to [email removed] and if possible, include some screenshots of the error messages for debugging.

user-1cf7f3 27 June, 2022, 19:03:51

Hi! I have couple of questions about the blink detection. 1. Is there also any data in the export that indicates blink frequency, or is that something you would need to calculate based on the blink data? 2. Can you combine events and blink data in one export? So the aim would be to see how the blinking was within a certain timeframe

marc 27 June, 2022, 19:40:55

Hi @user-1cf7f3! The blink data is included in the raw data export, which also contains events. The blinks are essentially a list of blink events with timestamps, which you can use to calculate the blink frequency! See also here: https://docs.pupil-labs.com/invisible/reference/export-formats.html#blinks-csv
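
As an illustration, a minimal sketch of computing a blink rate within an event-defined window (the event names are placeholders; column names as in the export format docs linked above):

import pandas as pd

blinks = pd.read_csv("blinks.csv")  # from the raw data export
events = pd.read_csv("events.csv")

# Time window between two events of interest ("task.begin"/"task.end" are placeholders)
start_ns = events.loc[events["name"] == "task.begin", "timestamp [ns]"].iloc[0]
end_ns = events.loc[events["name"] == "task.end", "timestamp [ns]"].iloc[0]

# Blinks whose onset falls inside the window
in_window = blinks[blinks["start timestamp [ns]"].between(start_ns, end_ns)]

duration_min = (end_ns - start_ns) / 1e9 / 60
print(f"{len(in_window) / duration_min:.1f} blinks per minute")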

user-1cf7f3 28 June, 2022, 06:19:23

Got it. Thanks Marc!

user-6e2996 27 June, 2022, 20:54:45

Hi, I emailed pupil labs about debugging. When should I expect to hear back from you guys? The hardware issue has impeded our ability to conduct research. So, I'd like to get this solved soon. Thank you for your understanding.

nmt 28 June, 2022, 06:39:40

Hi @user-6e2996. We have received your email and will follow up with you asap!

user-ce3bd9 29 June, 2022, 15:07:51

Hi all, I need to create a heatmap but I didn't use AprilTags during my test. Is it possible?

user-ce3bd9 29 June, 2022, 15:08:05

Or do I need to use external software?

mpk 29 June, 2022, 15:14:47

@user-ce3bd9 maybe you can use the RIM feature? https://docs.pupil-labs.com/invisible/explainers/enrichments/#reference-image-mapper You need a scanning video of the area you want to make the heatmap for. Sometimes you can use a normal video from your experiment for this too.

user-ce3bd9 29 June, 2022, 15:15:22

thank you!

user-ce3bd9 29 June, 2022, 15:18:14

@mpk I did a test on a restaurant menu, so people moved it sometimes. I will try. Do you have other suggestions?

mpk 29 June, 2022, 15:20:50

It might just work if you do a scanning video with a blank background and the menu being the only thing in view.

user-ce3bd9 29 June, 2022, 15:31:12

ok i will try. thank you so much

user-ce3bd9 29 June, 2022, 15:32:14

Another question: I read on your website that the mobile app will no longer be supported... is it the app I use with Invisible?

marc 29 June, 2022, 15:45:40

What app are you referring to exactly? The Pupil Invisible Companion app will not be deprecated and is supported as always.

marc 29 June, 2022, 15:46:12

The old real-time API called NDSI was deprecated in favor of a new API. But NDSI is still supported in the Companion app as well.

user-7c6eb3 29 June, 2022, 19:06:12

is it possible to sync outputs from recording sessions with data gathered from other types of physiological sensors? Thinking in particular of this one: https://www.emotiv.com/epoc-x/

marc 29 June, 2022, 19:54:17

I am not sure what particular syncing options this device offers, but data from Pupil Invisible is saved with UTC timestamps for every sample, which in principle allows you to sync the data with anything.
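
As an illustration, a rough sketch of aligning the two streams post hoc in Python, assuming the other sensor's export also carries UTC timestamps (its file and column names below are hypothetical):

import pandas as pd

gaze = pd.read_csv("gaze.csv")  # Pupil Invisible raw data export
other = pd.read_csv("other_sensor.csv")  # hypothetical export of the second sensor

# Bring both streams onto a shared UTC time base
gaze["time"] = pd.to_datetime(gaze["timestamp [ns]"], unit="ns", utc=True)
other["time"] = pd.to_datetime(other["utc_timestamp_s"], unit="s", utc=True)  # column name assumed

# Match each gaze sample to the nearest sample of the other sensor within 10 ms
merged = pd.merge_asof(
    gaze.sort_values("time"),
    other.sort_values("time"),
    on="time",
    direction="nearest",
    tolerance=pd.Timedelta("10ms"),
)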

user-cc819b 30 June, 2022, 10:41:30

I'm having trouble connecting to the Invisible with the real-time API, both using automatic device discovery and with a specified IP address. The device and monitor are on the same local network and mDNS and UDP traffic are enabled. For our application we can't run the network from a phone hotspot.

Is there anything else we can troubleshoot here?

papr 30 June, 2022, 10:43:04

Please make sure that you are running the latest version of the Companion app 🙂 If you do already, could you please check if you can ping the device via the terminal?

user-28ac9f 20 July, 2022, 09:38:39

How did you check if mDNS and UDP are enabled?

user-cc819b 30 June, 2022, 10:46:58

yes you're right, it was the Companion app version, sorry! Thanks for the quick response

user-cc819b 30 June, 2022, 13:08:09

is it possible to stream the inertial and audio data with the real-time API, along with the gaze and scene data?

marc 30 June, 2022, 13:09:14

No, this is currently not possible. We plan on including this in the future, but there is no concrete release plan for it yet.

user-cc819b 30 June, 2022, 13:45:40

ah ok, thanks

user-cc819b 30 June, 2022, 13:46:21

final thing, is it possible to set up an LSL relay on the mobile device capturing data from the invisible?

papr 30 June, 2022, 13:47:19

Hey! The LSL relay does not work on mobile. You will need to run this on a desktop PC. Check out https://pupil-invisible-lsl-relay.readthedocs.io/en/stable/

user-cc819b 30 June, 2022, 13:46:51

some of our existing setup uses LSL streams from the Core

user-cc819b 30 June, 2022, 13:50:30

thanks this looks very useful, I will have a look

user-cc819b 30 June, 2022, 13:51:57

I assume in this case though that the timestamps streamed directly from the realtime API will be more accurate than those reported by the LSL relay

papr 30 June, 2022, 13:54:47

The relay will assume that the PC and phone are synced via NTP. When receiving data from the realtime API, the relay converts the timestamps to LSL time based on that assumption.

You are right that NTP sync is not perfect. That is why we have a way to post-hoc transform time between Pupil Invisible/Cloud and LSL time in a very accurate manner. See https://pupil-invisible-lsl-relay.readthedocs.io/en/stable/guides/time_alignment.html

user-cc819b 30 June, 2022, 13:52:30

since there will always be some jitter even after NTP

user-cc819b 30 June, 2022, 13:53:01

is there any advantage to using the LSL in this case that I am missing?

papr 30 June, 2022, 13:55:06

Recording to LSL only makes sense if you have other sensors that you want to record, too.

End of June archive