We made a 1h+ recording, and when reading/importing the raw data it looks like the IMU (timestamp) data ends earlier than the recorded gaze data. Are the two based on different clocks, since there appears to be some kind of drift? (The difference in the last timestamps is around 30 seconds for a 1 h 20 min recording.)
@user-1367ec which version of invisible companion did you use for this recording?
@user-0f7b55 "app_version":"0.8.20-prod"
@user-1367ec thanks, I will take a look and let you know if I have something concrete.
@user-1367ec is it ok for you to share this recording?
@user-0f7b55 I assume it's enough to share it without the video file?
@user-1367ec Correct. You can omit the video files.
@user-0f7b55 How do I share it with you easily?
@user-1367ec Usually, we recommend sharing them with [email removed]
@papr OK
@user-0f7b55 Shared files to data@pupil-labs.com
thank you!
Hey, I just read your white paper on Pupil Invisible. Great work! I really liked the statistical framework you developed. One question remains even after reading the paper: are you working on a calibration sequence that can be implemented in the Pupil Invisible gaze-estimation pipeline?
Hey @user-bf7a13! Thanks for your feedback! I am glad you enjoyed the paper. What we already have is the offset correction feature, which allows you to manually correct for a constant offset that might occur for some subjects. This could be considered a very basic calibration. We are looking into more complex calibrations for Pupil Invisible, to allow users to further increase the accuracy, but this is difficult to get right. The calibration of deep learning-based eye trackers is essentially an open research problem.
@marc Thank you very much for the quick reply. And offering the regression- and model-based eye-tracking algorithms from Pupil Core for Pupil Invisible is probably not an option either?
@user-bf7a13 You are welcome! You can apply the Pupil Core pipeline to Pupil Invisible images (it is not supported super well in Capture because we do not want to encourage it, but you can use Invisible glasses in Capture), but it works poorly. The reason is that the Core pipeline is based on pupil detection, and in the images recorded by the Invisible glasses the pupil is often barely visible or not visible at all. The unobtrusive placement of the Invisible's eye cameras, which avoids occluding the wearer's view, comes at the cost that pupil detection is not really possible anymore. Unless the application is restricted to a very small portion of the field of view, the Core pipeline is thus not applicable to Invisible glasses.
OK, I see. Maybe we have to look into Pupil Core for data with better accuracy. Is there a white paper on Pupil Core as well?
@user-bf7a13 Yes, you can find it here: https://arxiv.org/pdf/1405.0006 It is, however, somewhat outdated by now. The method described in the paper is our 2D mapping mode. The current default method, which uses a 3D model for slippage compensation, is not described in the paper. The 3D model is originally based on this work by Swirski et al.: https://www.researchgate.net/profile/Lech_Swirski/publication/264658852_A_fully-automatic_temporal_approach_to_single_camera_glint-free_3D_eye_model_fitting/links/53ea3dbf0cf28f342f418dfe/A-fully-automatic-temporal-approach-to-single-camera-glint-free-3D-eye-model-fitting.pdf The Pupil Invisible white paper references several recent third-party evaluations that also include Pupil Core in the related-work section. Pupil Core is indeed capable of reaching higher accuracies, assuming you are able to control your recording environment sufficiently. That means indoor lighting conditions, not-too-dynamic subject movement (e.g. no sports), and a calibration + camera setup. If you would like to have a chat with us regarding your specific application and which product we would recommend, feel free to reach out to [email removed]
Thank you very much! I will get in touch!
@user-0f7b55 Did you guys have a chance to look at the data regarding the possible timestamp issue?
@user-1367ec Yes. I will let you know once there is any update on this.
Just to clarify for me: in order to run the Companion app I have to sign up for a Pupil Cloud account, but I can turn off the automatic upload during setup or later... right?
@user-d1dd9a That is correct. When first opening up the app it will already ask you about enabling/disabling automatic uploads, so you can turn it off right away if you want to.
You can always turn it back on/off later in the settings as well.
For our tests it is important that the data does not leave our premises and is only processed locally. The workflow would look like this: recording/detection with the Companion app -> transfer of the data (maybe via SD card) to Pupil Player for further analysis and visualization.
Can I trust that this functionality will always be available alongside Pupil Cloud? Or is it likely that Pupil Cloud will become mandatory in the future?
@user-d1dd9a Yes, we will continue to offer this workflow without the Cloud.
@user-d1dd9a You could even turn on airplane mode on the Companion device while recording, and keep it on until you have transferred the data off of the phone, to be absolutely safe.
Thanks mpk and marc for the answers. Just another question: I know that Pupil Core offers offline (post-hoc) detection in Pupil Player. In addition, Pupil Invisible does not use cr-detection. Would it be possible to run post-hoc detection in Pupil Player as well, i.e. execute the neural network / ML algorithm locally in the program, so that we can benefit from the full 200 Hz?
@user-d1dd9a Hi 🙂 Pupil Player is not able to run the Pupil Invisible gaze estimation pipeline. Therefore, neither Player's post-hoc pupil detection nor post-hoc calibration will work for Pupil Invisible recordings.
@user-d1dd9a Could you clarify what you mean by cr-detection
?
@user-d1dd9a Maybe not primarily relevant for you but regarding 200Hz gaze estimation: We will introduce this feature to Pupil Cloud soon.
OK, but basically it should work if the Pupil Invisible gaze-estimation pipeline were transferred to a local PC, maybe as a standalone program or as a plugin in Pupil Player, and the Companion app were used only as a recorder. 🙂
cr-detection -> corneal reflection detection
Pupil Cloud is an interesting and innovative thing, but unfortunately it is a critical point for our tests and use case.
@user-d1dd9a The Companion device can save gaze at around 70 Hz when configured right. Would that be enough for your use case?
That would fit... better than our current solution. But what does "configured right" mean?
@user-d1dd9a We are soon going to support the OnePlus 8 as a Companion device, and we will also offer an option to compress eye videos. If you turn this compression off and use the new phone, you will get around 65 Hz gaze data.
Hi everybody, I'm having some issues with Pupil Cloud when uploading from Invisible Companion. The upload progress is still stuck at 0% after several minutes with a very small test recording. Any idea why? Thanks
Hi, I have a few questions regarding the Invisible Companion app: is it possible to run the app in the background so that the user can work in a different app while the gaze data is published on the local network? If so, would it also be possible to access the published data in real time on the same device (e.g. by the app the user is working with)? Or is there even a library available that could be included in an Android app and allows for direct access to the gaze data? The reason I'm asking is that we are working on a research project where we intend to use the gaze data from the Pupil Invisible to detect whether a user is mind-wandering while using a custom Android app, and if so, the app should change its contents. Could that be achieved without having to connect the Pupil Invisible to a secondary device (smartphone) which then sends the gaze data to the primary device (tablet) the user is working with? Ideally we would like to connect the glasses to the same device the user is working with. Thanks in advance!
Hi @user-6294f5! Yes, it is possible to run the Pupil Invisible app in the background while using another app, and yes, it is also possible to get the gaze data in real time from the phone. The gaze data is published through an API to the local network, so any device in the same network (including other apps on the same phone) can receive the data in real time. See the documentation on that here: https://docs.pupil-labs.com/developer/invisible/#network-api Note, however, that you cannot run the Pupil Invisible Companion app on a tablet. The only devices supported by that app are the OnePlus 6 and (very soon) the OnePlus 8.
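For anyone landing here later, here is a rough sketch of receiving gaze data over that network API with the `pyndsi` Python library, condensed from the example in the linked docs. Treat the exact call names as indicative and check them against that page before relying on them:
```python
import time
import ndsi  # pip install ndsi

SENSOR_TYPE = "gaze"
sensors = {}  # sensor_uuid -> sensor handle

def on_network_event(network, event):
    # A gaze sensor appeared on the local network: attach and enable streaming
    if event["subject"] == "attach" and str(event["sensor_type"]) == SENSOR_TYPE:
        sensor = network.sensor(event["sensor_uuid"])
        sensor.set_control_value("streaming", True)
        sensor.refresh_controls()
        sensors[event["sensor_uuid"]] = sensor
    # A sensor disappeared: unlink and forget it
    if event["subject"] == "detach" and event["sensor_uuid"] in sensors:
        sensors.pop(event["sensor_uuid"]).unlink()

network = ndsi.Network(formats={ndsi.DataFormat.V4}, callbacks=(on_network_event,))
network.start()
try:
    while network.running:
        if network.has_events:
            network.handle_event()
        for sensor in sensors.values():
            while sensor.has_notifications:
                sensor.handle_notification()
            for gaze in sensor.fetch_data():
                print(gaze)  # gaze datum (coordinates and timestamp)
        time.sleep(0.1)
except KeyboardInterrupt:
    pass
finally:
    network.stop()
```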
Hey @user-d6b597! Could you try the following to hopefully fix this:
1) Just to be safe: Ensure that the phone is connected to the internet by using e.g. the browser.
2) In the recordings overview, use the context menu of the recording that is stuck to pause the upload and then resume it.
3) If it still does not work, fully restart the app by holding the app icon on the home screen, then selecting "App Info" and then "Force stop". Then go back to the home screen, start the app, and check whether the upload starts now.
If that still does not fix it, please contact [email removed] so we can go into a bit more detail on what might go wrong here!
Hi @marc, I checked all these options but I'm still not able to upload to Pupil Cloud. I'm sending an email, thanks for the support.
Thanks for your quick response @marc, so if our use case requires that the user interacts with a tablet, then the only way would be to connect the Pupil Invisible to a OnePlus 6 / 8 which runs the Companion app and sends the data through the network API to the tablet?
@user-6294f5 Yes, that is correct!
ok thanks!
You're welcome!
@user-1367ec Can you please try Companion app version 0.8.26 from the play store? The IMU timestamp issue you've been experiencing should be resolved.
@user-0f7b55 Thank you - we will!
Hello! I am part of a lab interested in purchasing the Pupil Invisible. The website says the recording unit has 8 hours of storage and 100 minutes of run time. Does this mean that it can only run for 100 minutes at a time before needing to recharge, and can hold 8 hours of recordings in total? How do you record sessions much longer than 100 minutes? I see there is an additional recording unit to use while the other is charging. Do you have to use each one for 100 minutes and keep switching back and forth?
Also, one more question: how does the Pupil Invisible prevent IR interference from outdoor light, given that it uses IR illumination for its cameras? We currently have the Tobii 2 and it suffers from very strong IR interference.
@user-31ba3f Hi. Your understanding regarding runtime and storage is correct. Please be aware that Pupil Invisible only comes with one Companion device (recording unit) by default.
Regarding outdoor light: the neural network has been trained such that it is able to handle outdoor lighting situations. This is where Pupil Invisible shines in comparison to all other eye trackers.
@papr Ah, I see. Thank you for your quick response. How much is it to get an extra recording unit? When one has reached 100 minutes, do you have to physically remove the unit and plug it in to charge and then replace it with another recording unit? Or can both be simultaneously on the Invisible and switch between the two without interference?
@user-31ba3f The recording units are normal phones. The switching procedure would be: 1. Stop recording on phone A 2. Disconnect the glasses from phone A 3. Connect the glasses to phone B 4. Start a new recording on phone B
@papr Oh I understand now. Sorry for the confusion. Thank you very much!
@user-31ba3f No problem. Let us know if you have any other questions 🙂
@papr Thanks! Would this switching be necessary to record longer than 100 minutes, because phone A's battery would have to be recharged before it can record more? If so, would attaching a portable charger to phone A avoid the need for switching?
@user-31ba3f Yes, battery life is the limiting factor. Unfortunately, the glasses occupy the USB-C slot which you would need to charge the phone. 😕
@papr Ahhh I see. So the glasses have to be attached to the phone via a cable using the USB-C slot?
@user-31ba3f correct. The cable is required for power and data transfer between Pupil Invisible glasses and Pupil Invisible Companion phone.
@papr Got it. One last question: would these eye trackers work for deviated eyes? Is the gaze estimation of an eye dependent on the estimation of the other eye?
@user-31ba3f That is a question for @marc. He will probably be able to respond tomorrow.
By the way, what recording duration are you looking for in your experiment/setup?
@papr Thank you. We would like to record normal activity over the course of a full day. Would data automatically be uploaded from the mobile recording device to the Cloud, or is that manual?
Is the data streamed to the Cloud in real-time, or can it be?
@user-31ba3f Data is automatically uploaded as soon as the recording is saved and there is a wifi connection. The data is not being streamed during the recording.
@user-31ba3f A full day is a very long time for a mobile device. 🙂 But the companion app is so easy to use that subjects should be able to handle switching the recording units themselves.
@papr Thank you very much! You have been so helpful, I appreciate it. I'm sure I will have some more questions soon. Have a good rest of your day!
@user-31ba3f You too!
Hey @user-31ba3f! We have not yet tested the Pupil Invisible glasses on a subject with deviated eyes, so I cannot tell you with certainty how they would perform in that case. If that is a possibility for you, I would recommend trying it out yourself. You can return any products you purchased from us within 30 days and get a full refund, so you could trial them during that period.
Regarding the usage of an additional phone to extend the recording duration, please note that while we are using regular phones, we only support two specific models right now: the OnePlus 6, which is included with the glasses, and (very soon) the OnePlus 8. The glasses do not work with other models. On the OnePlus 8 the battery life is about 2.5 hours.
Let us know if you have further questions! 🙂
Is that 2.5 hours in relation to an ongoing recording session or to general use?
@user-d1dd9a We usually provide the battery life in a typical recording setting: Fully charged battery at the beginning and turned-off display after starting the recording.
Edit: The app automatically stops ongoing recordings once the phone reaches a very low battery level. This is when we stop our battery-lifetime measurement.
OK, thanks papr. Regarding support for the OnePlus 8: what does "very soon" mean exactly? And does this also include the Pro version of the OnePlus 8?
@user-d1dd9a Unfortunately, we cannot give an exact estimate in this regard. Although the OnePlus 8 Pro should work in theory, too, we are not including it in our internal tests. In other words, we will not be supporting it officially.
No problem, I didn't want to be too indiscreet. But can you explain your answer "we will not be supporting it officially"? Does that mean I can install and run the app, but if I run into problems I don't get support? How do you deal with unsupported devices, or rather, how is the restriction expressed? Won't I be able to install and run the app on such a device at all?
@user-d1dd9a It means that we do not include the device in our internal tests. Therefore, it is possible that there are small differences between the devices that can lead to unforeseen issues. E.g. the OnePlus 8 Pro has a slightly bigger screen. It is possible that some of the UI does not look as intended.
You should be able to install the app on a OnePlus 8 Pro. Feel free to report any issues that you encounter with this device. We will try to reproduce them on the supported devices. Should we successfully reproduce an issue, we will of course fix it. We won't be able to do so if the official phones do not exhibit the issue.
OK, but would it be possible to install the app, say, on my Samsung Galaxy S6 Edge?
The OnePlus 6 you ship as part of the Invisible bundle... is that the one with 8 GB RAM and 256 GB internal storage?
@user-d1dd9a I think Samsung Galaxy phones use a different chipset than the OnePlus phones, which means that the app will almost certainly not work correctly. Most likely the app will also not be available in the Play Store for that model, as we are filtering the availability based on the hardware constraints. On the OnePlus 8 Pro the hardware is similar enough for the app to be available, but as @papr mentioned, issues might still occur. We strongly recommend using the officially supported devices.
Thanks marc, now it is clear to me... your restrictions are based on hardware requirements, not on a certain device model.
Yes, at least regarding the visibility in the Play Store. The implementation of the gaze pipeline is tuned to the exact hardware available on the phone, and the USB connection to the glasses also works differently for different manufacturers. Thus we cannot support more than a handful of devices at a time.
OK, and is that also checked?
@user-d1dd9a Could you elaborate on what you mean by that?
Maybe as part of the installation routine...
...the hardware specs of the device would be checked before you can proceed.
@user-d1dd9a We will not be adding such tests. If you want the device to work as described, please use the official phones. We can't help you if you run into issues when using any other phone.
Maybe to provide a bit more reasoning here, I think @marc 's previous description was too simplified. With Pupil Invisible Companion we are pushing the phones to their limits. A lot of work is going into getting as much performance out of the device as possible. We cannot provide this type of optimization for every phone. Which is why we are restricting the number of supported devices to a minimum.
Thanks papr, it's clear to me now.
I'm wondering: the OnePlus devices don't have an SD card slot, so transferring the recorded data to the host device for further analysis has to go over either Bluetooth, WiFi, or a USB data cable. Just for the use case without the Cloud.
@user-d1dd9a It is possible to transfer the recordings via USB.
Hello, does Pupil Labs use a specific email address for requesting an offer?
@user-d1dd9a Please follow these steps to request an offer:
1. Visit https://pupil-labs.com/products/
2. Add the items of interest to the Cart
3. Proceed to the checkout page
4. Fill in the order form
5. Check the "Request quote" check box at the bottom
6. Click "Submit" to request an offer
Thanks for that.
The prices do not include VAT?
VAT is calculated based on the delivery country. Select the delivery country within the "Billing address" section and the VAT display will update automatically.
OK, so the prices on the page are without VAT... VAT is added on top based on the delivery country.
@user-d1dd9a Correct
Hello, is the audio sensor going to be available through the network API anytime soon?
@user-38c591 We are not planning on supporting audio streaming in the near future. Nonetheless, I will note down your question as feedback for our development team.
okay, thanks
@papr But is the audio actually sent in the stream, or is it just that the reception is not implemented in pyndsi?
@user-38c591 The video is streamed on a frame-by-frame basis and does not include audio.
okay I understand
thanks
@user-38c591 You are welcome 🙏
Hello @papr, I am having some difficulties with the Pupil Invisible. Sometimes I have recordings with no scene video but with gaze/fixation data. Other times I have recordings with a scene video but without gaze/fixation data. When I start and end the recording, the two circle bars are not interrupted. What causes this and how can I solve the problem? Thank you!
@user-3315ca Is it possible for you to share an example recording showing this behavior with [email removed] This would help us determine the problem!
@user-1367ec Can you please try Companion app version 0.8.26 from the play store? The IMU timestamp issue you've been experiencing should be resolved. @user-0f7b55 Hi kam, we have updated the app to version 0.8.26, but the IMU timestamps still do not match those of the eye camera. Also, the recording lengths are different: for a 160 s recording, the difference is around 0.7 s. I have tried to align both the start and the end timestamps, but there was no consistency between files. Do you have any suggestion for matching the data? Thanks.
@user-599b15 Hey, not all sensors start and stop recording at the exact same time. Also, the sensors record at different sampling rates. What do you mean by aligning start and end timestamps? The recommended way to correlate gaze and IMU data is to find gaze-IMU data pairs whose timestamps are as close together as possible. Alternatively, you will have to resample the data streams using interpolation.
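(A minimal sketch of that nearest-timestamp pairing with pandas. The file and column names are placeholders, not the actual export format; this assumes both exports carry nanosecond timestamps in a column called `timestamp`.)
```python
import pandas as pd

# Placeholder file/column names; adjust to your actual exports.
gaze = pd.read_csv("gaze.csv").sort_values("timestamp")
imu = pd.read_csv("imu.csv").sort_values("timestamp")

# For every gaze sample, attach the IMU sample whose timestamp is closest,
# but only if it lies within 20 ms (tolerance is in the key's unit, here ns).
merged = pd.merge_asof(
    gaze,
    imu,
    on="timestamp",
    direction="nearest",
    tolerance=20_000_000,
    suffixes=("_gaze", "_imu"),
)
```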
@user-599b15 Hey, not all sensors start and stop recording at the exact same time. Also, the sensors record at different sampling rates. What do you mean by aligning start and end timestamps? The recommended way to correlate gaze and IMU data is to find gaze-IMU data pairs whose timestamps are as close together as possible. Alternatively, you will have to resample the data streams using interpolation. @papr I understand that due to the different sampling frequencies I can use interpolation and pair samples with the nearest timestamp. However, the timestamps don't match at all. For example, I have a recording where the eye camera started at 1603213059322553000 and ended at 1603213220150532000, while the IMU ran from 1603212438930888000 to 1603212599036848000. In this case there is no way I can align them. All I can do is assume they start at the same time or end at the same time (with a few milliseconds of difference due to the sampling rates). With previous versions I was using interpolation and it worked fine. It seems that is not the case this time, so I was wondering if you have any suggestion. Thanks.
@user-599b15 Ah, thank you very much for the clarification. The starting timestamps seem to be off by 10 minutes. @user-0f7b55 Please have a look at that.
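(For reference, the gap between the two start timestamps quoted above works out to roughly 10.3 minutes:)
```python
gaze_start = 1603213059322553000  # ns, eye camera / gaze start from the example above
imu_start = 1603212438930888000   # ns, IMU start from the example above

offset_s = (gaze_start - imu_start) / 1e9
print(f"{offset_s:.1f} s = {offset_s / 60:.1f} min")  # 620.4 s = 10.3 min
```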
@user-599b15 Can you please provide this recording for analysis? You can omit the video files.
@user-599b15 Can you please provide this recording for analysis? You can omit the video files. @user-0f7b55 Where should I send it to?
@user-599b15 You can share it to data@pupil-labs.com
@user-599b15 You can share it to data@pupil-labs.com @user-0f7b55 I've sent it.
thanks
@user-599b15 Hey, we have not received your email yet. Could you please check if the correct address was used?
@papr Sorry, it gave me errors when sending it. I have shared the link to the email provided above.
@user-599b15 I have just received the email 👌
@user-599b15 which of the recordings in the archive do you refer to?
@user-0f7b55 The one I mentioned above is "AssistByVORbrace1Random2". Actually, except for "Homebrace0Random0", all the others have the same issue.
@user-599b15 Was Homebrace0Random0 by any chance the first recording made right after the PI glasses were connected?
yes
@user-599b15 I couldn't immediately reproduce this issue. Would it be possible to make a quick series of recordings and check if it is happening every time for you?
@user-0f7b55 Sure. I will send it to you in an hour.
@user-599b15 I just need to know if it is always happening.
@user-0f7b55 Sorry, I wasn't able to replicate the scenario right now because I don't have the PC with the Invisible Monitor app installed with me. The timestamps look fine without the Invisible Monitor app. I think the timestamp issue happens whenever I'm using the Invisible Monitor app at the same time.
@user-599b15 thank you for specifying that, it is important information!
Sorry for the confusion. I didn't realize that was the problem, because we usually use it together with the Invisible Monitor app.
@user-599b15 That might be the problem; I will let you know what we find out!