πŸ•Ά invisible


user-82e5bd 01 May, 2021, 14:51:30

Hi, I have some problems with some data. There are some videos where the gaze circle is stuck in one place, or where I don't have the gaze circle at all.

wrp 03 May, 2021, 08:26:01

@user-82e5bd Hi πŸ‘‹ please send an email to info@pupil-labs.com along with your serial number and/or order ID and we can provide support.

user-0defc6 03 May, 2021, 11:31:56

Hello. Just after the update, the glasses stopped being recognized by the mobile phone. However, they also do not work on the PC with Pupil Capture. I tried another (data transfer) cable and it doesn't work either. Any suggestions? Thanks

papr 03 May, 2021, 13:11:15

Please contact info@pupil-labs.com in this regard.

user-ef03be 04 May, 2021, 11:32:39

Hello, I'm trying to use the Network API for the Pupil Invisible. For this purpose I installed the ndsi module. But when I try to import ndsi, there is an error because a module named formatter can't access the frame module. Here is the repository with the files: https://github.com/pupil-labs/pyndsi/tree/master/ndsi. Could you help me with this? Thanks

papr 04 May, 2021, 11:34:46

Please see this conversation for reference

https://discord.com/channels/285728493612957698/633564003846717444/732964406509436969

user-ef03be 04 May, 2021, 11:38:28

Thanks, but I managed to install pyndsi (with ffmpeg and turbojpeg). It says that the module was installed successfully, so I don't understand why it can't find the frame file.

papr 04 May, 2021, 11:39:30

Could you please post the complete error message?

user-ef03be 04 May, 2021, 11:39:41

Sure

user-ef03be 04 May, 2021, 11:39:45

Chat image

user-ef03be 04 May, 2021, 11:42:43

First I thought this was because it couldn't import the frame.pyx file, so I tried: import pyximport; pyximport.install(), but it still doesn't work

papr 04 May, 2021, 11:43:53

There are build requirements and runtime requirements. The former are fulfilled. The latter are not. Please see https://discord.com/channels/285728493612957698/446977689690177536/796747481693028443

user-ef03be 04 May, 2021, 11:44:55

Ok thanks, I'll see that

user-ef03be 04 May, 2021, 11:56:43

So I noticed that I don't have a frame.pyd file. Maybe that is the problem, but I don't know how to get this file. Could you help me, please?

papr 04 May, 2021, 11:57:29

Were you able to find the ndsi folder in the site-packages? If so, could you please share the contents of that folder?

user-ef03be 04 May, 2021, 11:59:28

Sure

Chat image

papr 04 May, 2021, 12:00:24

frame.cp39-... is the file that you need to drag onto Dependencies_Gui

user-ef03be 04 May, 2021, 12:02:02

Ok, so it says the same as the discussion you sent; it must be the same problem

user-ef03be 04 May, 2021, 12:21:11

Thanks for your help

papr 04 May, 2021, 12:21:32

Were you able to solve it based on the previous conversations?

user-ef03be 04 May, 2021, 14:19:54

Yes thanks

user-9d72d0 04 May, 2021, 12:25:01

Hello all! I have a question regarding the power supply; I couldn't find an answer online yet. The question is: if I want to make long recordings using the Pupil Invisible, is it possible to attach an additional external power supply to the phone while the Pupil Invisible is recording? Something like a Y USB-C cable or similar? Has anybody tried this?

nmt 04 May, 2021, 12:47:49

Hi @user-9d72d0, it is not possible to charge the phone while using the USB connection. Based on the limited feedback we have, those who have tried using Y-adaptors have run into issues where the Android operating system switches to charging mode automatically, thus blocking our connection. We usually recommend hot-swapping to a freshly charged phone to extend battery life, or, with careful planning, you can leverage the Warp Charge charger's fast-charging capabilities in between recording sessions.

user-9d72d0 04 May, 2021, 20:48:06

@nmt thank you very much for your detailed answer! By hot swapping, do you mean it's possible to continue a running recording session on a second Android device if the first needs to be recharged? Does the second Android device need to be bought from Pupil Labs, or is any Android phone satisfying specific requirements fit to run the Pupil Invisible?

marc 05 May, 2021, 07:07:47

@user-9d72d0 It is not possible to continue the same recording on the second phone. You would have to start a new recording, but the process of switching phones is very quick. You can use any OnePlus 8 (or the no longer available OnePlus 6); those are the only models supported by the Companion App. You can purchase one through us or any other retailer.

user-9d72d0 05 May, 2021, 10:31:58

Thank you also very much for your answer!

user-42a6f7 05 May, 2021, 09:48:08

Hey, I am using the Pupil Labs wearable eye tracker on Windows, and when I use the Pupil Capture program, it tells me that it cannot connect to the device

wrp 05 May, 2021, 09:49:44

@user-42a6f7 looks like you double posted to both πŸ’» software-dev and here. I will remove your post in πŸ’» software-dev

Are you using Pupil Invisible or Pupil Core eye tracker?

user-42a6f7 05 May, 2021, 09:50:09

Pupil core

wrp 05 May, 2021, 09:50:31

ok, let's migrate to your post in the πŸ‘ core channel. In the future, please just post once, in a single channel.

user-42a6f7 05 May, 2021, 09:51:21

Okay awesome

user-df9629 05 May, 2021, 18:48:49

Hi, I have a few queries: 1] In the documentation, gaze ps1.raw fps is 65 Hz but its respective .time file is 55 Hz. [I opened both files and the .raw file had twice the number of datapoints as the .time file, which is (as expected and) correct.] My query here is: what is the sampling rate? 2] I also noticed gaze_200.raw and .time: are you just interpolating the data? If yes, any reason for choosing 200 Hz? Can we get more options for interpolation, like 30 Hz, 100 Hz, or 500 Hz, on the cloud? Thanks!

marc 06 May, 2021, 08:11:24

Hi @user-df9629! 1) The real-time framerate you get depends on the Companion device you use. On the OnePlus 8 it is ~65 Hz, on the OnePlus 6 it is ~55 Hz. We did update our documentation to 65 Hz once we started using the OnePlus 8, but I guess we missed a spot! Do you remember where you saw the 55 Hz? I'll also try to find it myself!

2) The eye cameras of Pupil Invisible are recording at 200 Hz. After a recording is uploaded to Pupil Cloud, the gaze data will automatically be recomputed at 200 Hz based on the full framerate eye videos. Thus, the data is not interpolated but is actual 200 Hz. The real-time signal computed on the phone is limited by the phone's computational power. This data could of course be further interpolated to 500 Hz (or downsampled), but we do not offer a feature that would do that for you automatically.
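For anyone who wants to verify this on their own recordings, here is a minimal sketch of estimating the effective sampling rate from the binary files discussed above. It assumes the layout described in the raw recording format document (little-endian float32 x/y pairs in *.raw, little-endian uint64 nanosecond UNIX timestamps in *.time); please verify against that document before relying on it.

```python
import numpy as np

def load_gaze(raw_path, time_path):
    """Load a raw gaze stream and estimate its effective sampling rate.

    Assumed binary layout (check the raw data format doc):
      *.raw  -> little-endian float32 (x, y) pairs, one per sample
      *.time -> little-endian uint64 UNIX timestamps in nanoseconds
    """
    xy = np.fromfile(raw_path, dtype="<f4").reshape(-1, 2)
    ts_ns = np.fromfile(time_path, dtype="<u8")
    # Effective rate = number of inter-sample intervals / total duration
    duration_s = (ts_ns[-1] - ts_ns[0]) / 1e9
    rate_hz = (len(ts_ns) - 1) / duration_s
    return xy, ts_ns, rate_hz
```

With a 200 Hz recording, `rate_hz` should come out close to 200; the real-time files recorded on the phone would instead show the ~55/~65 Hz figures discussed above.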

user-df9629 06 May, 2021, 16:27:40

Thank you @marc! 1) That clarification helps a LOT. In the documentation file, for the file name gaze ps1.time the ~fps mentioned is 55, while the ~fps for gaze ps1.raw is 65 Hz. 2) I am VERY pleased to know it is not interpolating! Thank you once again!

user-b14f98 06 May, 2021, 16:08:22

Hey guys, does the exported data in gaze_positions.csv have the offset applied?

user-b14f98 06 May, 2021, 16:11:01

...and question #2: Does Pupil Labs have access to the cloud data?

user-b14f98 06 May, 2021, 16:11:13

...beyond automated processing to calculate gaze? From the perspective of a researcher, this has implications for privacy of human subjects data that could get a bit sticky on our end.

user-e0b992 10 May, 2021, 13:44:25

Can you get 200Hz data without the cloud? Asking for someone in China.

marc 10 May, 2021, 14:33:16

Hi @user-e0b992! No, currently only recordings uploaded to Pupil Cloud receive the 200 Hz gaze data through post-hoc cloud processing.

user-e0b992 10 May, 2021, 16:12:25

Are there any workarounds for working from China?

user-fb5b59 11 May, 2021, 08:44:31

Hey! Will the IMU data be streamed in the future, or will it only be accessible via post-processing later?

papr 11 May, 2021, 08:44:57

It is already being streamed to my knowledge πŸ™‚

user-fb5b59 11 May, 2021, 08:47:14

Ah okay, because at first only the gaze and the image were streamed; the IMU was "only" already implemented, but no data was streamed

papr 11 May, 2021, 08:49:30

Are you referring to the example from the docs?

user-fb5b59 11 May, 2021, 08:52:13

I did an implementation to receive the gaze and image live from the PI. During this time I also integrated the IMU sensor streaming, but as far as I know the IMU data was not streamed at that time

papr 11 May, 2021, 09:30:37

Please give it a try again and let us know if you have trouble receiving the data.

user-fb5b59 11 May, 2021, 10:22:23

https://discord.com/channels/285728493612957698/633564003846717444/834345115304067092 Here I also only got a response regarding using recorded data, and my current implementation does not receive any IMU data

nmt 11 May, 2021, 10:42:12

Hi @user-fb5b59. IMU data is streamed live and you should be able to receive it by adding a few lines of code to the network API example: https://docs.pupil-labs.com/developer/invisible/#network-api (I have just tested this). What is your current implementation?

user-fb5b59 11 May, 2021, 13:48:07

Thanks! Actually, I had just missed that I set the IMU sensor to "not supported yet" in my code. Now it can connect to the sensor, but sometimes I get value errors, even in the smartphone app, if I activate the sensor. I will recheck this.

Edit: The error values correspond to the IMU sensor (e.g., SensorUuid 456d57f5-daef-4607-a593-7f4583b1974a for the IMU sensor, and on the smartphone "Value 456d57f5-daef-4607-a593-7f4583b1974a of t..." is displayed)

papr 11 May, 2021, 11:42:32

@nmt Thanks for confirming that it works!

user-b14f98 11 May, 2021, 12:56:01

Hey folks, I have to ask these questions again, because it looks like they were lost to the sands of time. 1) Does the exported data in gaze_positions.csv have the offset applied? 2) Does Pupil Labs have access to the cloud data? This is important with regard to the privacy of participants in our studies.

nmt 11 May, 2021, 14:27:30

I'm not sure that I understand how you are receiving the data stream. Please can you describe your implementation in more detail?

user-fb5b59 11 May, 2021, 14:51:13

https://docs.pupil-labs.com/developer/invisible/#network-api I am using this code, but converted to C++. It works fine for the gaze and video sensors. In this example, gaze and world video (see SENSOR_TYPES = ["video", "gaze"] and the sensor-name comparison with "PI world v1" and "Gaze") are enabled. Now I have added the IMU sensor (sensor-name comparison with "BMI160 IMU") and it is not working well for me, but this might be an implementation issue on my side. Of course, it would be helpful to get an example that additionally includes the IMU sensor

papr 11 May, 2021, 14:54:45

You should be able to extend the example by subscribing to the "imu" sensor type and processing the sensor data. Please note that there is a known bug in pyndsi which causes the IMU timestamps to not be converted to seconds. The v4 protocol publishes timestamps in nanoseconds; for backwards compatibility, pyndsi usually converts these to seconds (but not for IMU data at the moment).

Please provide full error messages and a clear description of the issues such that we can get more precise feedback.
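A small sketch of a client-side workaround for the timestamp bug mentioned above, assuming gaze/video timestamps arrive already converted to seconds while IMU timestamps are still in nanoseconds. The 1e12 cutoff is an illustrative heuristic, not part of the pyndsi API.

```python
def normalize_timestamp(ts):
    """Return a UNIX timestamp in seconds.

    pyndsi converts most v4 timestamps from nanoseconds to seconds,
    but (per the known bug above) not for IMU data. A seconds-based
    UNIX timestamp in 2021 is ~1.6e9, while a nanosecond one is
    ~1.6e18, so a magnitude check can distinguish the two. The 1e12
    threshold is an assumption chosen for illustration only.
    """
    return ts / 1e9 if ts > 1e12 else float(ts)
```

Applying this to every incoming timestamp yields a consistent seconds-based time axis regardless of which sensor the sample came from.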

user-fb5b59 12 May, 2021, 14:38:25

Actually, the error is fixed. The problem was just using the wrong flags in "set_control_value" and "refresh_controls" when communicating with the device. Thank you very much for your fast support πŸ™‚

user-d8879c 12 May, 2021, 00:32:34

How do I delete an annotation within Pupil Player? (I annotated a section of my video with the wrong annotation, so I need to delete and correct it.)

user-d8879c 12 May, 2021, 00:32:43

I have Invisible eye-tracking glasses. I had an individual wear them in an arena, and the signs that were 150 feet away are blurry; you cannot see what the signage says. Is there any way to make the video footage sharp?

papr 12 May, 2021, 07:29:47

Unfortunately, there is currently no option to delete annotations in Pupil Player. You would have to discard/delete the incorrect entries from the annotations.csv file after exporting.
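As an illustration of that post-export cleanup, here is a minimal sketch that filters the exported file with Python's csv module. The column name "label" is an assumption about the annotations.csv header; adjust it to match your actual export.

```python
import csv

def drop_annotations(src_csv, dst_csv, wrong_label):
    # Copy the export, skipping every row whose label matches the one
    # entered by mistake. The "label" column name is an assumption
    # about the annotations.csv header; check your file's first line.
    with open(src_csv, newline="") as fin, open(dst_csv, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["label"] != wrong_label:
                writer.writerow(row)
```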

user-d8879c 12 May, 2021, 17:50:43

That seems silly thank you!

user-9d72d0 12 May, 2021, 07:52:14

Hello all! Regarding the IMU sensor, I also have a question: what are the units for the gyro and accel data recorded to the .raw files? In the documentation I only found the graphic showing the IMU coordinate system orientation relative to the Pupil Invisible itself.

wrp 12 May, 2021, 07:53:21

Please see raw recording data format here: https://docs.google.com/spreadsheets/d/1e1Xc1FoQiyf_ZHkSUnVdkVjdIanOdzP0dgJdJgt0QZg/edit#gid=254480793

This doc is also linked via Pupil Invisible dev docs: https://docs.pupil-labs.com/developer/invisible/

user-9d72d0 12 May, 2021, 07:56:57

Ah yes, sorry and thank you very much!

wrp 12 May, 2021, 07:58:09

You can also get IMU data in convenient CSV format if you use cloud raw data exporter enrichment: https://docs.pupil-labs.com/cloud/enrichments/#raw-data-exporter (scroll down to IMU section)

user-9d72d0 12 May, 2021, 08:17:09

All right, thanks also for pointing this out!

marc 12 May, 2021, 08:24:12

Hi @user-b14f98! I am sorry if we have missed your previous post!

1) Yes, the gaze offset is applied everywhere where gaze data is accessible, including the CSV download in Player.

2) From a purely technical point of view Pupil Labs does have access to the data, since we control the server where it is stored. This access is however strictly regulated. Only members of our server reliability team have access and all activity is logged. We have a strict policy of not accessing user data unless on user request or to investigate bugs. Legally this is regulated in our terms of use and privacy policy, which you can find here: https://pupil-labs.com/legal/

user-b14f98 12 May, 2021, 18:35:41

Thanks, Marc!

wrp 12 May, 2021, 08:24:26

@user-9d72d0 are you using Pupil Cloud / Have you tried any of the enrichment features yet?

user-9d72d0 12 May, 2021, 08:27:32

Not yet, but I will surely try it out for testing purposes.

user-7192ac 12 May, 2021, 13:58:04

Hello, the lab I'm in recently moved, and we're trying to decide whether to buy a new Invisible or buy back the Invisible we purchased a year ago from our previous institution at a slightly discounted price. So we were wondering: have there been any significant hardware changes in the last year?

marc 12 May, 2021, 14:03:26

Hello @user-7192ac! No, there have been no significant hardware changes. New devices ship with a OnePlus 8 phone, which is more performant than the OnePlus 6, but you could also purchase that phone separately (from us or anywhere else).

user-7192ac 12 May, 2021, 18:17:11

one more question, do you expect any updates to the hardware in the near future?

user-7192ac 12 May, 2021, 14:04:43

@marc Thank you for your speedy answer!

user-057596 12 May, 2021, 15:26:42

I’m just about to update our laptops to the latest Pupil Core software and reconnect with our Pupil Invisible glasses. However, I’ve completely forgotten: where do you get the latest update for pupil_invisible_monitor.exe? Thanks, Gary

marc 12 May, 2021, 15:27:58

Hi @user-057596! You can find it on our Github page here: https://github.com/pupil-labs/pupil-invisible-monitor/releases/tag/v1.3#user-content-downloads

user-057596 12 May, 2021, 15:28:45

Thank you Marc, you can tell how long my mind has been on furlough πŸ˜‚

user-508ae0 12 May, 2021, 15:57:18

Hey! What is the maximum data rate at which IMU data can be streamed? Currently I can receive IMU data at only approx. 3 Hz.

user-d8879c 12 May, 2021, 17:24:48

I have Invisible eye-tracking glasses. I had an individual wear them in an arena, and the signs that were 150 feet away are blurry; you cannot see what the signage says. Is there any way to make the video footage sharp?

user-d8879c 12 May, 2021, 17:24:56

How do I delete an annotation within Pupil Player? (I annotated a section of my video with the wrong annotation, so I need to delete and correct it.)

user-d8879c 12 May, 2021, 17:25:11

Help please ^^

papr 12 May, 2021, 17:25:47

Please see my response above https://discord.com/channels/285728493612957698/633564003846717444/841940203861311488

user-d8879c 12 May, 2021, 17:51:04

Do you know anything about this? I have Invisible eye-tracking glasses. I had an individual wear them in an arena, and the signs that were 150 feet away are blurry; you cannot see what the signage says. Is there any way to make the video footage sharp?

wrp 13 May, 2021, 01:52:35

We do not have any solutions built into Pupil Cloud. This is indeed a general challenge, as the signs are very far away. It would seem that you want to "enhance" the images or increase the spatial resolution post hoc in order to have a "sharp" image of the far-away text/image. There are methods to do this that fall into the category of Super Resolution. There has been quite a bit of progress on this topic recently using machine learning. See: https://paperswithcode.com/task/image-super-resolution/latest

Image super-resolution (SR) techniques reconstruct a higher-resolution image or sequence from the observed lower-resolution images

I am not 100% sure what the SotA is for video, but it's an interesting problem 😸 Perhaps others here have some thoughts on the topic.

user-98789c 13 May, 2021, 12:21:24

Among all the lenses with different vision corrections, is there a way to know which is which? Or is it only their left/right shape and their thickness that differ?

marc 14 May, 2021, 09:49:14

Yes, the left/right shape differs, as does the thickness, although the latter is difficult to see. Thus, it is important not to mix them up in the box.

user-7192ac 13 May, 2021, 14:52:48

Just want to ask this question again: do you guys expect any updates to the invisible hardware in the near future? I'd really appreciate an answer, even if it's "We can't answer that question." Thanks!

marc 14 May, 2021, 08:11:41

We do not expect any changes within the next couple of months!

user-94f03a 14 May, 2021, 08:54:43

@papr @marc & team cheers for including the changes on how surface_markers are detected and reported in the new pupil_player release. Makes a big difference!

papr 14 May, 2021, 08:57:18

Thanks for your input on that end!

user-98789c 14 May, 2021, 10:33:14

I'm trying to run Pupil Invisible Monitor. Any idea how to resolve this error:

C:\Users\CVBE\Downloads\pupil-invisible-monitor-master>python setup.py install
Error calling git: "Command '['git', 'describe', '--tags', '--long']' returned non-zero exit status 128."
output: "b'fatal: not a git repository (or any of the parent directories): .git\n'"
Traceback (most recent call last):
  File "setup.py", line 29, in <module>
    version=str(get_version()),
  File "C:\Users\CVBE\Downloads\pupil-invisible-monitor-master\deployment\_packaging\utils.py", line 74, in get_version
    version = pupil_version()
  File "C:\Users\CVBE\Downloads\pupil-invisible-monitor-master\deployment\_packaging\utils.py", line 58, in pupil_version
    raise ValueError("Version Error")
ValueError: Version Error

papr 14 May, 2021, 10:34:22

You need to use git clone if you want to run from source, instead of downloading the source as a zip file. I highly recommend using the bundled application though: https://github.com/pupil-labs/pupil-invisible-monitor/releases/download/v1.3/pupil_invisible_monitor_windows_x64_v1.3-10-g1f171da.zip

user-98789c 14 May, 2021, 10:36:10

perfect, thanks πŸ‘

user-398a7a 14 May, 2021, 13:02:54

Hey guys, I am looking at your documentation at docs.pupil-labs.com right now to figure out the options to integrate a Pupil Invisible in our measurement environment. My current understanding is as follows: Using a Pupil Invisible & the Companion app I will be able to access all data (camera video, gaze + imu data) via the NDSIv4 API. Is this correct? I'd appreciate an answer - getting a little confused by the load of different documentations pages right now πŸ˜…

papr 14 May, 2021, 13:05:28

Generally correct, NDSI is for realtime access though. Alternatively, you can access the data post-hoc programmatically via our Pupil Cloud API.

user-398a7a 14 May, 2021, 13:11:34

Perfect. Realtime access would be the goal as it will enable us to sync the data with our other systems. Could you give me a ballpark estimate on the latency induced by the gaze detection? Another question: Is a purely offline usage a problem for any of the systems?

papr 14 May, 2021, 13:16:52

I do not have an estimate on the latency right now. You can check it by running this ndsi script https://docs.pupil-labs.com/developer/invisible/#time-synchronization

By pure offline usage, do you mean not using Pupil Cloud?

user-398a7a 14 May, 2021, 14:01:00

I already found the script, unfortunately we are still in the planning-phase and I didn't get my hands on one of your devices yet. Will check asap.

Not using Pupil Cloud would be the idea. Just Pupil Invisible + the Companion app and additionally some other code to retrieve data via the API while the experiment is running. For various reasons we will not be able to allow any data transfer to external servers.

user-98789c 14 May, 2021, 15:08:00

Is there a plugin available for real-time or post-hoc heat map extraction from Invisible recordings?

user-7192ac 15 May, 2021, 13:05:01

Thank you, Marc. One final question, directly from the PI of my lab: "If we want to buy some models in the next 6 months, should we wait?"

marc 17 May, 2021, 10:58:25

There will be no hardware update in the next 6 months. We are of course working on things, but nothing has a foreseeable release date yet.

marc 17 May, 2021, 08:17:03

Hi @user-b10192! Since the Pupil Invisible gaze estimation pipeline is not based on pupil detection, there is no corresponding confidence signal either. We do actually have a classifier in place that makes reasonable estimates about whether or not the glasses are being worn, but this data is currently not available in real-time. The classifier results are saved in the worn ps1.raw files and are immediately available after finishing a recording.
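A hedged sketch of reading that classifier output post hoc. It assumes worn ps1.raw holds one uint8 flag per sample with nonzero meaning "worn"; please confirm the exact layout against the raw recording format documentation before using this.

```python
import numpy as np

def fraction_worn(worn_path):
    """Estimate the fraction of a recording during which the glasses
    were worn, from the worn ps1.raw classifier output.

    Assumption: one uint8 flag per sample, nonzero meaning "worn".
    Verify against the raw recording format documentation.
    """
    flags = np.fromfile(worn_path, dtype=np.uint8)
    return float(np.count_nonzero(flags)) / len(flags)
```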

user-b10192 17 May, 2021, 02:09:24

Hi there, we are currently displaying the images from the two eye cameras via the NDSI network to check whether the participant is wearing the glasses correctly. However, we cannot rely on manual checking. If we could calculate/retrieve confidence from the data of the left and right eye cameras, that would be really helpful for us. Is there any way to calculate/retrieve the confidence of the pupils from the eye camera data? We are using "data.bgr" to display the eye and world cameras. Are there any other attributes of "data", such as confidence or something like that?

user-98789c 17 May, 2021, 09:22:05

any replies here? πŸ˜€

papr 17 May, 2021, 09:31:25

You can use the marker mapping feature in Pupil Cloud or the surface tracker in Pupil Player (both are post-hoc; real-time is currently not supported for Invisible)

user-98789c 17 May, 2021, 11:33:05

Are there server settings or code I can apply so that Pupil Capture and Pupil Invisible Monitor recognize my Invisible device on public networks as well?

papr 17 May, 2021, 12:19:41

No; usually public networks are set up to disallow peer-to-peer communication for security reasons.

nmt 17 May, 2021, 12:21:43

You can usually set up a WiFi network on the PC running Invisible Monitor and connect the Companion device to that, thus bypassing the public network.

user-98789c 18 May, 2021, 09:27:26

do you maybe have a link to instructions on how to do this in Windows?

user-7192ac 17 May, 2021, 13:21:22

Thank you so much for taking the time to answer these questions! We love the Invisible and look forward to seeing what other amazing eye trackers you release in the future! P.S. I have my hopes up for a kid-friendly version of the Invisible πŸ₯Ί

nmt 17 May, 2021, 13:24:54

Hi @user-7192ac, for reference, we have found that Pupil Invisible works well for ages 3 and up. The head strap: https://pupil-labs.com/cart/?pi_strap=1 can be used to ensure a secure fit.

marc 17 May, 2021, 13:37:04

@user-7192ac Further improving the support for children is definitely on our list for future iterations! Thanks for the feedback! πŸ‘

user-508ae0 17 May, 2021, 17:07:40

Hello Pupil Labs Team! I'm trying to stream data from Pupil Invisible to my computer in real time. Receiving the world video and the gaze data works fine. However, I can receive data from the IMU only every ~300 ms, and there are 60 messages in each "data batch". Is it possible to reduce the size of this data batch in order to increase the rate at which IMU data is sent? Or is there another method for real-time transmission of the IMU data? For my tests I used this example script: https://docs.pupil-labs.com/developer/invisible/#network-api

user-508ae0 18 May, 2021, 11:51:41

I'll duplicate my question in case it was overlooked

marc 18 May, 2021, 09:56:02

@user-98789c The keyword to look for is "mobile hotspot". Essentially, the laptop acts as a router to create its own WiFi, and other devices can connect to that WiFi. These should be the corresponding instructions: https://support.microsoft.com/en-us/windows/use-your-pc-as-a-mobile-hotspot-c89b0fad-72d5-41e8-f7ea-406ad9036b85

user-98789c 18 May, 2021, 10:16:13

Perfect, thanks. I was trying the same thing, creating a hotspot; I had set the public WiFi as trusted (private), and it didn't work. Now I have set it back to public, and it's working!

user-17da3a 18 May, 2021, 11:40:35

Hi guys! I have two questions regarding the IMU data I received via the Pupil Cloud enrichment. How can we get head rotation in degrees from the IMU, given that the recorded values are angular speeds rather than head rotation in degrees? Also, as mentioned before, relative head-pose measurements usually have a drift error; could you please let me know how big the drift is and what the possible ways to correct it are? Thanks!

marc 18 May, 2021, 11:49:00

Hi @user-17da3a! The IMU we use has an accelerometer but no magnetometer. Thus, two of the three dimensions of relative head pose (pitch and roll) can be determined without drift. We have an implementation of the Madgwick algorithm, which calculates those drift-free poses from the acceleration data, available as a Pupil Player plugin. This is not yet directly compatible with the CSV download of the IMU data from Pupil Cloud, but it does work with raw recordings downloaded from Cloud. https://gist.github.com/N-M-T/ec8071bd211db287f4879e0b48874505

The yaw cannot be determined drift-free without a magnetometer. However, we do not have an evaluation based on which I could quantify exactly how large this drift is.
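To illustrate the relationship the question asks about (angular speed vs. rotation in degrees), here is a naive single-axis integration sketch. It assumes gyro values in deg/s with matching timestamps in seconds; note that this kind of direct integration accumulates drift quickly, and the Madgwick-based plugin linked above is the better tool for pitch/roll.

```python
import numpy as np

def integrate_gyro(gyro_deg_s, ts_s):
    """Naively integrate angular speed (deg/s) into relative rotation (deg).

    gyro_deg_s: angular speeds for one axis, in deg/s
    ts_s:       matching timestamps in seconds

    Drift accumulates quickly with this approach; it only illustrates
    the relationship between recorded values and rotation angles.
    """
    dt = np.diff(ts_s)
    # Trapezoidal rule between consecutive samples
    increments = 0.5 * (gyro_deg_s[1:] + gyro_deg_s[:-1]) * dt
    # Rotation relative to the first sample, in degrees
    return np.concatenate([[0.0], np.cumsum(increments)])
```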

marc 18 May, 2021, 12:06:42

Sorry for the slightly delayed response, I had to consult with our engineering team on that topic. Using the current NDSI protocol there is no way to change the batch size or to increase the rate of sending the data. We are currently working on a new streaming protocol, which will make it easier to configure such parameters and might make it possible to increase the data rate. It will still take at least several weeks for the new protocol to be released though. The underlying issue here is a buffer within the IMU sensor that makes it difficult to pull the data in real-time.

user-508ae0 18 May, 2021, 12:14:54

Thank you for your detailed answer!

user-59adb5 18 May, 2021, 12:11:49

Hello guys! I'm trying to stream Pupil Invisible video with LSL, but as I'm totally new to LSL I couldn't find out how to do it. If you could link me a tutorial, it would be great!

marc 18 May, 2021, 13:09:45

Hi @user-59adb5! We do not have a tutorial I could point you to, but you can find an implementation of an LSL "relay" for Pupil Invisible here that should allow you to integrate with it: https://github.com/labstreaminglayer/App-PupilLabs/

marc 18 May, 2021, 12:15:11

You're welcome! πŸ‘

user-59adb5 18 May, 2021, 13:10:41

Thanks!

papr 18 May, 2021, 13:11:56

I also highly recommend reading their documentation at https://labstreaminglayer.readthedocs.io/info/intro.html It is important to get a rough idea of how it works behind the scenes in case you run into any issues.

user-1cf7f3 19 May, 2021, 09:16:25

Hi! My team encountered an issue with a few recordings. We did a total of 24 observations that day, but we can only see the first 10 in the cloud. It seems like after a point the recordings weren't being saved anymore, even though we recorded as usual. If we do trial recordings now, they work and are properly stored. Any ideas on what could have gone wrong?

marc 19 May, 2021, 09:36:14

Hi @user-1cf7f3! If you look into the list of recordings on the phone, are the recordings missing there as well?

user-1cf7f3 19 May, 2021, 11:00:06

Yes, they are also not there on the phone

marc 19 May, 2021, 11:03:04

@user-1cf7f3 Could you check what version of the app is installed on the phone? (to check press and hold the app icon -> app info)

marc 19 May, 2021, 11:06:11

Also, if you connect the phone to a computer and browse the internal storage of the phone (you need to switch the phone to file transfer mode to access the internal storage, let me know if you need instructions for that), what are the contents of the Pupil Invisible folder on the phone?

user-1cf7f3 19 May, 2021, 11:24:37

  • App version is 1.2.0
  • Inside that folder I also only see the videos that were properly saved, not the other ones. But I do see something odd. I'll put up the screenshot.
user-17da3a 19 May, 2021, 11:14:36

Hi @marc! I'll just repeat my first question, which might have been overlooked: how can we get head rotation in degrees from the IMU, given that the recorded values are angular speeds and not head rotation in degrees? I also want to know a bit more about the IMU timestamps, which are in nanoseconds. I need the time points in seconds, but when I convert them to seconds the timestamp is still a very small value like "1616688788".

marc 19 May, 2021, 13:02:39

Hi @user-17da3a! I tried to answer your question here: https://discord.com/channels/285728493612957698/633564003846717444/844179767326277652

The timestamps are given in nanoseconds since the UNIX epoch, so 1616688788 seconds would translate to 25 March 2021, 16:13:08 UTC. This is a fairly common format for timestamps, and libraries like Pandas can typically convert those values to more date-like formats easily: pd.to_datetime(timestamps)
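A quick stdlib sketch of that conversion, using the 1616688788 value from the question (the nanosecond timestamp is divided by 1e9 to obtain seconds before converting to a date):

```python
from datetime import datetime, timezone

# The .time values are nanoseconds since the UNIX epoch; divide by 1e9
# to obtain seconds before converting to a timezone-aware datetime.
ts_ns = 1616688788_000_000_000  # example value from the discussion
ts_s = ts_ns / 1e9
dt = datetime.fromtimestamp(ts_s, tz=timezone.utc)
print(dt.isoformat())  # 2021-03-25T16:13:08+00:00
```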

user-59adb5 19 May, 2021, 11:23:31

Hello guys, I get this error when I try to install Pupil Gaze Relay. Does someone know how to fix it, please?

Chat image

user-1cf7f3 19 May, 2021, 11:29:01

@marc this is the odd thing I see. And it is consistent with the time it stopped working (basically after 13 o'clock for the afternoon)

Chat image

user-1cf7f3 19 May, 2021, 12:48:35

On another note: what is the storage limit? I read here some time ago that there is no limit in the cloud, but in the app I now see, for example, that I still have 10 GB - 1 hour 47 mins. This brings me to a second question: what is the difference between the cloud and the storage the app refers to?

papr 19 May, 2021, 13:04:52

Hi, I saw that you also contacted [email removed] I will respond to all your questions via email πŸ™‚

user-59adb5 19 May, 2021, 13:31:13

Okay! Thanks a lot πŸ™‚

marc 19 May, 2021, 13:07:21

@user-1cf7f3 The storage on the phone is limited by the phone's hard drive, which is either 128 GB or 256 GB in size. Our recordings have a size of ~5 GB/hour, so this determines how much can be stored in total. Recordings have to be deleted manually on the phone to free up space. The storage capacity in the cloud is currently unlimited. Since the phone is not actually entirely full, this should not have caused the phone to stop recording. I will consult with our engineering team regarding your screenshot.
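
As a back-of-the-envelope check, remaining recording time follows directly from free space and the ~5 GB/hour figure quoted above (the actual rate varies with scene content and compression, which is presumably why the app's estimate of "10 GB - 1 hour 47 mins" is slightly below the nominal 2 hours):

```python
# Estimate remaining recording time from free storage, assuming a
# nominal recording rate of ~5 GB/hour (an approximation; the real
# rate depends on scene content and compression).
free_gb = 10.0
gb_per_hour = 5.0
hours_left = free_gb / gb_per_hour
h = int(hours_left)
m = round((hours_left - h) * 60)
print(f"~{h} h {m} min")  # ~2 h 0 min
```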

user-1cf7f3 19 May, 2021, 13:10:08

Thank you so much Marc!

marc 19 May, 2021, 14:09:30

@user-1cf7f3 The recording you posted the screenshot of indeed looks broken. This may have been caused by a connection error during the recording or a hardware error in the Pupil Invisible device. If you see such behavior again we should consider sending your device in for repairs.

The missing recordings are more difficult to explain. Whenever a recording is started, a folder is created on the phone and the only way of removing this folder is explicit deletion. If the recording folder is missing entirely, this strongly suggests that either the recordings have not actually been started or they have been removed. Missing recording folders due to a software error are very unlikely.

Reasons why a recording may not start are:
- Out of storage. This does not seem to be the case for you, as you reported >1 hour of storage left over. Also, a warning would have been displayed and the recording animation would not have started.
- Out of battery. If the battery is very low, the app does not start a recording but instead displays a warning to the user that the phone is about to power off.
- Hardware error. Again, this would display an error message.

In all those cases there should have been an error message and the recording would not have been stoppable and savable at the end, since it never started.

Reasons why a recording may have been deleted afterwards:
- On recording-end the app asks if the data should be saved or discarded. Discarding requires an extra confirmation.
- The recording may have been deleted via the UI or a file browser.

Did you 1) not see any error messages on the phone and 2) were you able to properly stop and save the recordings that are now missing?

user-b14f98 20 May, 2021, 16:24:51

Hey guys - feature request. Save both gaze offset and raw (not offset) data to the output gaze estimates.

user-b14f98 20 May, 2021, 16:25:05

I know that this can be recovered by reading the Invisible recording's JSON.

papr 20 May, 2021, 16:28:16

That is correct. Isn't it sufficiently easy to use that? Writing two files means double the storage used for the same data. It also means more IO usage of the phone during the recording. And the phone's IO is already pretty busy with writing 3 videos.
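
A minimal sketch of what "using that" could look like, assuming the recording's info JSON stores the applied offset (the field name "gaze_offset", the pixel units, and the sign convention below are assumptions for illustration; check your recording's actual schema):

```python
import json

# Hypothetical example: assume the applied manual offset correction is
# stored in the recording's info JSON under a "gaze_offset" field.
info = json.loads('{"gaze_offset": {"x": 12.0, "y": -5.0}}')
offset = info["gaze_offset"]

# Recover the raw (un-offset) gaze by subtracting the applied offset
# from each exported gaze point (made-up pixel coordinates).
gaze_px = [(640.0, 360.0), (700.0, 350.0)]
raw_px = [(x - offset["x"], y - offset["y"]) for x, y in gaze_px]
print(raw_px)  # [(628.0, 365.0), (688.0, 355.0)]
```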

user-b14f98 20 May, 2021, 16:29:50

I thought it was practically free to do, and it would be convenient and make it more explicit that the offset was already applied to the data stream. If IO bandwidth is an issue, then yes I understand why you aren't doing it.

user-b14f98 20 May, 2021, 16:32:53

FWIW, adding the offset to the data output, but not the raw output, seems a bit inconsistent with the ethos adopted with respect to the Core pipeline, which was to only output raw data.

user-19ed58 26 May, 2021, 14:42:01

Can I ask where I can download Pupil Player? Sorry, new in town.

marc 26 May, 2021, 14:58:54

Hi @user-19ed58! To download Pupil Player please visit https://pupil-labs.com/products/core/ and scroll down a bit.

user-19ed58 26 May, 2021, 14:44:37

I just got my Invisible, but its gaze seems a bit off. Is there any way I can test or calibrate it? Or whom should I talk to? Please advise, thanks.

marc 26 May, 2021, 15:03:24

There is no classical calibration for Pupil Invisible. The accuracy you can expect from Pupil Invisible is evaluated here: https://arxiv.org/pdf/2009.00508. For some subjects there is a constant bias in the predictions, and this may be what you see. There is a feature called offset correction that allows you to correct for this bias. To access it, open the live preview in the Companion app, then touch and hold the screen and move your finger to specify the correction.

user-1cf7f3 26 May, 2021, 15:37:36

Hi Marc. We didn't get any errors at that point, and we were able to save the files (or at least that was the impression; we could click on the save button πŸ™‚). At the moment, though, we are experiencing something else funny. We get quite a few error messages and broken recordings. Here's a list of what happens:

  1. Recording error ("we have detected an error during recording"). In this case it seems like the eye tracker keeps working.
  2. Recording failure ("the Pupil Invisible Companion app stopped working during the current recording. We attempted to recover all recorded data. Please check that the last recording is playable and complete. If this problem persists, please reach out to our support team for help!"). In this case the eye tracker stops. This happened about 4 out of 8 times. The recording stops, but then we can start a new one.
  3. Sensor failure ("The inertial measurement unit has stopped providing data. Please stop recording. Unplug the USB from your Pupil Invisible glasses and then plug it back in. If this behaviour persists, it may indicate a glasses or companion device issue. Reach out to our support team for help.").
  4. Vibration of the phone, but then once you turn the screen on and off it works.
  5. Red blinking light when the battery is actually doing OK.

What I can think of is overuse of the phone or the glasses. What makes me think that is that we are carrying out a study where we need to do a 30-minute recording - 15-minute break - 30-minute recording - 1-hour break - and then the same again. The phone gets really warm too. Ideas?

marc 27 May, 2021, 10:22:27

Usually this amount of usage is not a problem. The phone does get warm as it is working at maximum capacity, but this should not lead to any errors. This looks very much like a hardware error to me, where e.g. some sensor has a loose connection. We should organize a repair of your device. Could you please send an email to [email removed] referencing this conversation and mentioning the serial number of your device (see the tip of the left arm of the glasses) to ask for a repair?

user-19ed58 27 May, 2021, 15:26:51

Thanks, @marc

user-d1dd9a 29 May, 2021, 08:41:01

hi there... I installed the Pupil Invisible Companion app on the OnePlus 6 companion device. When I want to start the app, I get the message that the Companion app isn't compatible with the Android version.

user-d1dd9a 29 May, 2021, 09:02:19

I found out that the OnePlus 6 companion device came preinstalled with Android 10. Now I read that this version isn't compatible with the Companion app. But how is it possible that you deliver a OnePlus 6 companion device with an incompatible Android version?

user-997dee 31 May, 2021, 09:40:02

Hi! I've also been regularly getting the second error lately (i.e., the Companion app stopped working). Based on the contents of the error, it seems to me like it would sooner be a software error in the phone/app. I've checked the data from when it happened, and the gaze data seems to be recorded (red circle visible), but not the data from the world cam (only seeing a grey screen). The phone also has a system update available (OnePlus8_O2_BETA_10), but I'm not sure if it's wise to install it (I'm worried the Companion app may no longer be compatible then...)

marc 31 May, 2021, 11:43:03

Hi @user-997dee! Could you share a recording that features this error with [email removed] so that we can take a look at what has gone wrong?

Regarding the update: It looks like you are using a OnePlus8. For that device Android 11 is officially available and the USB bug is fixed again in that version. So if you update, just make sure to update all the way to Android 11!

marc 31 May, 2021, 11:32:19

@user-d1dd9a All OnePlus 6 devices ship with the OS version that is installed at the factory, which is Android 8. Android 10 is the highest version available for the OnePlus 6, so if that is what is installed, the phone must have been updated after shipping. Unfortunately, as you mentioned, Android 10 is not compatible with the Pupil Invisible Companion app, because this Android version contains a bug related to how USB devices can connect to the phone.

To get the phone working again you will have to downgrade the Android version to 8 or 9. Please follow these instructions for that:

Rollback process:
1. Download the latest ROM package from http://oxygenos.oneplus.net.s3.amazonaws.com/OnePlus6Oxygen_22_OTA_015_all_1808102118_downgrade_68586451f7334c4d.zip.
2. Copy ROM rollback package to the mobile phone storage root directory.
3. Settings -> system updates -> top right corner icon -> local upgrade -> click on the installation package -> immediately upgrade -> system upgrade to 100%.
4. Select Reboot system now.
5. The update finishes successfully.
Please note: this rollback will clear all contents of the phone, so please make sure to back everything up first.
After rolling back to the stable Android 8 (O) version, you can update to the Android 9 (P) version via OTA once it is rolled out globally.
user-d1dd9a 31 May, 2021, 11:56:21

Thx marc for that info. I tested the phone fresh out of the newly delivered pupil invisible box.

user-6a9ca1 31 May, 2021, 11:57:20

Hi, we've been having some trouble with the PI since last week. We are running experiments, and all of a sudden there are issues with the phone/recording side of the PI. We are getting the following error message: "Recording failure. The PI stopped working ... please reach out to our support team" (this seems like a generic segfault catch, btw). Note, we are only using the phone for this purpose and have not touched it otherwise. However, there is a software update for the phone available, but we are hesitant to update...

user-6a9ca1 31 May, 2021, 11:58:07

Ah, I see that @user-997dee, my colleague, has already started a thread...

marc 31 May, 2021, 12:19:36

That is peculiar! I am sorry if there was a mix-up on our side that led to this undesired update. Our usual protocol is to leave the Android version as is.

user-d1dd9a 31 May, 2021, 13:08:29

No problem. I did the downgrade as you posted. Everything works, but I figured out that I can't charge with the Dash charger anymore.

marc 31 May, 2021, 15:00:53

@user-d1dd9a I have not heard of an interaction like this with the Android version before. Are you using the USB-C cable that came with the charger?

user-d1dd9a 01 June, 2021, 11:50:03

Yes, I used the bundled USB-C cable. With the Android 10 version I didn't have this problem. Now I can only do normal charging through USB with this same cable. I think it's a software problem.

End of May archive