🕶 invisible


user-fb5b59 01 July, 2022, 08:24:53

Hey, I saw that you can receive a response from the mobile device for things like IP address, battery level, serial number of the glasses and so on (more or less your discover_one_device() functionality in your real-time API). Is it noted anywhere in the documentation which commands have to be sent to the phone and what messages will be received in response? We can currently start/stop local recordings on the phone remotely from our tooling, but I would like to receive the additional information as well. Thanks!

papr 01 July, 2022, 08:27:57

Hey 👋 Are you using the simple or async/advanced API in your tooling? (Are you planning on using the Python API at all?)

user-fb5b59 01 July, 2022, 08:28:52

No, it is a C++ integration which is not using any of your python scripts

papr 01 July, 2022, 08:32:26

Ok, got it! There are two ways to get this kind of device information: 1. via HTTP requests on the /api/status endpoint, 2. via websocket updates.

See the Under The Hood guide for details https://pupil-labs-realtime-api.readthedocs.io/en/latest/guides/under-the-hood.html#get-current-status
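For any HTTP-capable client (including a C++ integration), the first option boils down to a plain GET on the phone's port 8080. Below is a minimal Python sketch using only the standard library; the JSON field names (`result`, `model`, `data`, `battery_level`) are assumptions for illustration — the authoritative schema is in the Under The Hood guide linked above.

```python
import json
from urllib.request import urlopen

STATUS_URL = "http://pi.local:8080/api/status"  # pi.local = Companion phone

def summarize_status(payload: dict) -> dict:
    """Collect per-component status entries into a {component: data} dict.

    Assumes (hypothetically) that the endpoint wraps its data in a
    "result" list whose items carry a "model" name and a "data" dict.
    """
    summary = {}
    for item in payload.get("result", []):
        summary[item.get("model", "unknown")] = item.get("data", {})
    return summary

def fetch_status(url: str = STATUS_URL) -> dict:
    """Perform the HTTP GET (requires the phone on the same network)."""
    with urlopen(url, timeout=5) as resp:
        return summarize_status(json.load(resp))
```

For example, `fetch_status()["Phone"]` would then hold phone-level fields such as a battery level, assuming that schema.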

user-fb5b59 01 July, 2022, 08:39:59

The implementation with ndsi (and e.g. the zyre-related parts) is somewhat outdated and does not support such things, I guess?

papr 01 July, 2022, 08:41:13

ndsi is a whole different API and has been deprecated. The only reason to use it atm is to receive eye video and IMU data. But we will be adding that to the new API, too.

user-fb5b59 01 July, 2022, 08:49:11

Okay, thanks. Is there any date when the NDSI integration will no longer be working?

papr 01 July, 2022, 08:50:46

We don't have an end-of-life date yet.

papr 01 July, 2022, 08:52:55

@user-fb5b59 That said, and given that there is no NDSI C++ reference implementation, I would not recommend investing any effort into building a C++ integration for it.

user-fb5b59 01 July, 2022, 08:56:12

Already did this two years ago, and receiving the world video, eye gaze, single eye images, and IMU data is working fine 🙂 Just saw some updates on your page (e.g., regarding the battery status and so on) and was thinking about integrating them... but yes, won't do this now, and we might need a refactoring later on to support the new libs.

user-fb5b59 01 July, 2022, 10:56:11

I am able to connect to the ws://pi.local:8080/api/status and will receive the status message on connection. Is there any message I can send via WebSocket so I get a new status update in response?

papr 01 July, 2022, 11:02:53

No, I don't think so. But the connection will push updates as soon as something changes, e.g. a sensor disconnect. If you want to explicitly pull the current status, use the HTTP GET request.

papr 01 July, 2022, 11:28:39

Btw, did you open source the client somewhere?

user-fb5b59 01 July, 2022, 11:29:58

No

user-fb5b59 01 July, 2022, 13:35:39

Just one additional question regarding the IMU usage in the future with the new API: will it still be sent in chunks of size 60 or directly live-streamed?

mpk 01 July, 2022, 13:47:15

@user-fb5b59 We will keep sending in chunks of 60. This is actually the IMU doing the chunking when sending the data via I2C to the USB bridge, not the Companion app or the network API host.
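The point above — chunking happens at the IMU/I2C level, so clients always receive fixed-size batches — can be sketched with two small helpers. This is illustrative Python, not the actual API; the 60-sample chunk size is the only detail taken from the conversation.

```python
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

CHUNK_SIZE = 60  # the IMU delivers samples to the USB bridge in batches of 60

def chunked(samples: Iterable[T], size: int = CHUNK_SIZE) -> Iterator[List[T]]:
    """Group a continuous sample stream into fixed-size chunks,
    mimicking what the IMU does on the I2C side."""
    buf: List[T] = []
    for s in samples:
        buf.append(s)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:  # trailing partial chunk, e.g. when the stream stops
        yield buf

def iter_samples(packets: Iterable[List[T]]) -> Iterator[T]:
    """Flatten received chunks back into a per-sample stream, for consumers
    that want individual samples rather than 60-sample batches."""
    for packet in packets:
        yield from packet
```

A client can therefore treat the chunked transport as live data simply by iterating `iter_samples(...)` over incoming packets.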

user-fb5b59 01 July, 2022, 13:48:43

Okay, got it. Thank you very much for your support, have a nice weekend.

user-3b5a61 04 July, 2022, 01:08:18

Hi folks, I recently changed from Pupil Core to Pupil Invisible. In my studies using Pupil Core, I could easily use the Pupil Player software to calculate fixation positions. However, with the Invisible that does not seem possible. How can I calculate fixations using the Invisible and Pupil Player?

marc 04 July, 2022, 07:13:27

Hi @user-3b5a61! The fixation detector implemented in Pupil Player is not compatible with Pupil Invisible's gaze signal. The signal contains a bit more noise, which the algorithm cannot handle.

We have developed a new algorithm, which works great with Pupil Invisible and is also much better at compensating for head movements of the subject. This algorithm is, however, only available in Pupil Cloud. Fixation data will be calculated on upload, and the recording downloads will include the fixation data.

user-3b5a61 04 July, 2022, 01:08:44

Thanks in advance ๐Ÿ™‚

user-3b5a61 04 July, 2022, 07:27:11

Thank you so much, @marc :)

user-5c56d0 05 July, 2022, 02:08:55

Dear sir,

Thank you for your support. Excuse me, could you answer the following questions? Best regards.


My questions are as follows

Q1. What are the steps to follow when connecting the eyewear to a smartphone? Is there anything else besides "Enable OTG"?

Q2. Does the eyewear normally work as soon as it is connected to the smartphone? My eyewear has been difficult to recognise as connected since it was first purchased.

Q3. Is there a solution to my problem?


My situation is as follows.

・When the eyewear (Invisible) is connected to the smartphone (the smartphone enclosed with the purchase), it takes a very long time for the eyewear to be recognized (some days it cannot be recognized at all). Experiments cannot be carried out.

・The Scene Camera icon and Eye Camera icon in the attached image do not light up.

・The eyewear gets hot, so power is flowing to it.

・Yesterday morning, the eyewear worked after about 10 minutes of connection. In the afternoon it never worked. Today, it also didn't work.

・My use of the Invisible eyewear so far is about one hour or less (total recording time is less than one hour).

wrp 05 July, 2022, 02:38:01

Hi @user-5c56d0 thanks for getting in touch and for the detailed report.

A number of questions: 1. What version of Android is running? 2. Which device: OnePlus 6 or OnePlus 8/8T? If OnePlus 6, you will need OTG enabled; OnePlus 8/8T does not have this requirement. 3. Are you using the original cable we supplied to connect the glasses to the phone?

user-5c56d0 05 July, 2022, 02:20:15

Chat image

user-5c56d0 05 July, 2022, 02:43:10

Thank you for your answer. The following is my state.


  1. What version of Android is running? Futami: It is 8.1.0.

  2. Which device, OnePlus 6 or OnePlus 8/8T? Futami: It is a OnePlus 6A.

  3. Are you using the original cable we supplied to connect the glasses to the phone? Futami: Yes, it is.

Q4 Is the operation "Enable Application Lock" necessary?

wrp 05 July, 2022, 02:46:02

Thanks for the response @user-5c56d0 1. Android v8.1.0 is supported 🆗 2. OnePlus 6A: 2.1. OTG: required ✅ 2.2. Enable Application Lock: required to enable the app to run in the background even when the phone screen is off.

wrp 05 July, 2022, 02:47:58

@user-5c56d0 If you are not able to get up and running, I suggest that you contact us at info@pupil-labs.com to follow up so that our Hardware team can help debug and diagnose the issue.

user-5c56d0 05 July, 2022, 02:49:28

@wrp Thank you.

Could you tell me how to do the "Enable Application Lock" operation? I did it once last year, but I don't know how to do it now.

marc 05 July, 2022, 06:57:21

To enable the application lock please follow this instruction video:

user-5c56d0 05 July, 2022, 08:34:46

Thank you very much. That was a great help.

I'm sorry.

Could you answer the following?

In my attached image, how do I go from (1) to (2)?

marc 05 July, 2022, 08:36:23

The card at the bottom of image (2) only appears during the initial setup of the app (when you open it for the first time). This is only a tutorial though; you do not need to access this view. You can simply execute the steps for locking the app from the screen in image (1)!

user-5c56d0 05 July, 2022, 08:34:58

Chat image

user-5c56d0 05 July, 2022, 08:40:32

Thank you very much.

Is the following action correct? I have to change from "unlocked" to "locked" (i.e., from (3) to (4) in my attached image). My original state was "unlocked".

marc 05 July, 2022, 08:43:52

I can't see images 3 and 4, but yes, this is exactly what you need to do.

user-5c56d0 05 July, 2022, 08:40:37

Chat image

user-5c56d0 05 July, 2022, 08:44:06

Chat image

user-5c56d0 05 July, 2022, 08:44:22

I'm sorry. This is my intended image.

user-5c56d0 05 July, 2022, 08:47:53

Thank you

user-5c56d0 05 July, 2022, 08:47:55

In addition, the other application "Pupil Mobile" shows the following error.

"com.pupillabs.exceptions.HasNoSensorPermissionException" in my attached image.

user-5c56d0 05 July, 2022, 08:48:06

Chat image

wrp 05 July, 2022, 08:48:56

@user-5c56d0 Pupil Mobile is not compatible with Pupil Invisible and is deprecated as an application.

user-5c56d0 05 July, 2022, 09:33:22

Thank you for your reply. I understand.

user-5c56d0 05 July, 2022, 09:44:51

I'm sorry for the trouble. Could you answer the following?


Q1. In general, how long after connecting the eyewear to the smartphone is the eyewear recognized?

Q2. I have set the application lock to ON (i.e. locked), but 10 minutes have passed without the eyewear being recognized. Regarding the "Enable Application Lock" operation, is it correct to change the application from unlocked to locked? The reason I ask is that the eyewear worked yesterday morning with the application unlocked.

marc 05 July, 2022, 10:01:13

The application icon should always be locked. Otherwise there is a chance that the app shuts down while recording.

The hardware should connect to the app almost instantly. The behaviour you describe, where it sometimes connects, sometimes does not, and sometimes takes a while, indicates that there might be a connection issue in the hardware.

This might be an issue with the cable, or an issue within the Glasses themselves. Please contact [email removed] and ask them for a repair. You can reference this conversation. The hardware team will then diagnose the issue with you and provide a hardware replacement if necessary.

I hope we can get you up and running again quickly! Sorry for the inconvenience!

user-5c56d0 05 July, 2022, 10:05:14

Thank you.

The eyewear was not recognized when a Thunderbolt cable was used. Is a USB-C to USB-C cable the appropriate one? Such as the following products.

user-5c56d0 05 July, 2022, 10:20:15

Thank you. From the following, I assumed that the problem was with the eyewear. I was also surprised to hear that the eyewear should be recognized as soon as it is connected to the smartphone via USB, because the eyewear has not been recognized easily since I first purchased it.

・Your original cable could not be used for the eyewear, but my headphones could be charged with it. Therefore, I assume your original cable is not broken.

・I tried the following two types: (1) one Thunderbolt cable I have and (2) two Type-C cables that are not Thunderbolt cables. As a result, the eyewear was not recognized by the smartphone in all three cases.

nmt 05 July, 2022, 11:16:38

Hi @user-5c56d0. We are responding via email 👍

user-455cca 05 July, 2022, 14:11:12

Hi! I have questions about setting up an experiment with 3 Invisible eye-tracking systems in a conference room setting (4.1 m x 7.5 m). Three subjects sit in the positions shown in the figure. We want to track their head and eye movements. We have tried putting QR codes (AprilTags) on the wall and the table. However, when we loaded the data into Pupil Player and ran the camera localization, the camera pose was not fully developed. We want to know how big the markers need to be and how many of them we need in a room of this size so that we can get a solid 3D model and head pose. Is it possible to track three goggles at the same time in this big environment? What is your advice for working in this big environment with several goggles? Thanks in advance!

Chat image

papr 05 July, 2022, 14:18:10

Hi! To get a solid 3D model, I recommend making a dedicated "scanning recording" * which you can use to build the head pose tracker's model. Afterward, copy the file to the other recordings. This has the advantage that all three PI recordings will be localized within the same model.

* See the tutorial for reference https://youtu.be/9x9h98tywFI?t=15

user-455cca 07 July, 2022, 19:52:24

Thanks for the response!! How do I copy and paste the model to the next recording?

user-7e5889 06 July, 2022, 03:29:01

Hi! I'm using the enrichment named Reference Image Mapper to convert Pupil Invisible videos to a heat map, but it always fails and warns me to use another video. I tried several times but still failed. I wonder how to succeed?

nmt 06 July, 2022, 06:39:01

Hi @user-7e5889 👋. Would you be able to describe the environment in which you're using the reference image mapper?

user-7e5889 06 July, 2022, 17:29:26

I used both Windows 10 64-bit with Chrome (cable) and macOS Monterey with Safari (wifi).

marc 06 July, 2022, 10:19:23

@user-c23af2 You can visualize fixations (rather than raw gaze) as a polyline in Pupil Cloud. You need to open a recording in the player and then enable the Fixations toggle in the bottom right. (Note: we are only just releasing this feature and it might take until about next Monday to become available for all recordings.) You cannot export this visualization yet, however.

A visualization of a polyline of the raw gaze is not directly available, and you'd have to generate it yourself. You can download the gaze data in CSV format and then visualize it, e.g. using Python and matplotlib.
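The suggestion above can be sketched as follows. The CSV column names (`gaze x [px]`, `gaze y [px]`) and the output path are assumptions for illustration — check the header of your exported gaze CSV for the actual names.

```python
import csv

# Hypothetical column names -- verify against your exported CSV's header.
X_COL = "gaze x [px]"
Y_COL = "gaze y [px]"

def load_gaze_points(csv_path):
    """Read the gaze CSV into two parallel lists of pixel coordinates."""
    xs, ys = [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            xs.append(float(row[X_COL]))
            ys.append(float(row[Y_COL]))
    return xs, ys

def plot_polyline(xs, ys, out_path="gaze_polyline.png"):
    """Draw the gaze trace as one connected polyline and save it to disk."""
    import matplotlib
    matplotlib.use("Agg")  # render without a display
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    ax.plot(xs, ys, linewidth=0.5)  # the polyline through all gaze points
    ax.invert_yaxis()               # image coordinates: y grows downward
    fig.savefig(out_path, dpi=150)
```

Usage would be `plot_polyline(*load_gaze_points("gaze.csv"))`, with `gaze.csv` being your downloaded export.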

marc 07 July, 2022, 07:07:52

Hi @user-7e5889! @nmt's question was aiming at something different. The Reference Image Mapper does not work in every physical recording environment. It requires the environment to be sufficiently rich in visual features. So e.g. tracking an empty table top in an empty room would probably not work. Could you describe your application and recording environment a little bit? If possible, you could also share your scanning recording with [email removed] so we can take a direct look.

user-7e5889 07 July, 2022, 07:28:13

I tested 2 objects: a painting and a whiteboard in my lab.

Chat image Chat image

user-7e5889 07 July, 2022, 07:35:04

After I created an enrichment and pressed the 'start' button, it seemed to be loading (p1) but soon went back to the previous status (p2). Then I refreshed the webpage, and it seemed to be in progress again (p3). But after several minutes, I refreshed the webpage one more time, and it showed error 100% (p4).

Chat image Chat image Chat image Chat image

user-7e5889 07 July, 2022, 07:39:55

the detail is shown in the screenshot below and the enrichment ID is 4ff6da06-5525-4aca-8c41-29e4e69afb25

Chat image

user-c23af2 07 July, 2022, 09:23:41

Hi Pupil Labs, I am carrying out an experiment and I am using Pupil Invisible. I would like to know how to visualize the gaze positions with a polyline for each gaze position.

marc 07 July, 2022, 09:33:54

Hi @user-c23af2! Please see my response slightly above: https://discord.com/channels/285728493612957698/633564003846717444/994185790944985118

marc 07 July, 2022, 09:33:16

Thanks @user-7e5889! This makes it much clearer! If you get this error state, it means that either the environment you're in is unsuitable, or the scanning video of the environment is insufficient. I have not seen your scanning videos, but I think in your case the environment is the issue. With the plain whiteboard, the solid gray floor, and the rather textureless background objects and walls, the algorithm does not have enough visual features to hold on to. The image you are trying to track is also rather fuzzy/blurry, without too many hard features.

Possible fixes for this would be: 1) Make sure the object you are trying to track is more feature-rich. E.g., if possible, you could swap to a different painting that is less fuzzy. Depending on the research question, swapping out the object of interest is of course not always possible.

2) Make sure the object of interest is located in a feature-rich environment. If you could move the whiteboard in front of a wall that has more texture, that would help. Feature-rich objects next to the object of interest would also help.

Regarding paintings: we are about to release an example project next week demonstrating the reference image mapper in a gallery with paintings. Maybe that can serve as an example.

user-7e5889 08 July, 2022, 01:17:35

Thank you for your reply. I'm considering changing the environment and then conducting the test one more time. And I wonder whether the scanning video in the third step should capture the environment with or without the target object?

user-d1dd9a 07 July, 2022, 14:14:35

Had a recording with the invisible today. Unfortunately, I found out later that no scene video was recorded.

user-d1dd9a 07 July, 2022, 14:16:21

Could there be an error in the export?

user-d1dd9a 07 July, 2022, 14:17:48

I didn't get an error message from the companion app while recording.

papr 07 July, 2022, 14:26:25

Can you right-click the recording in Cloud, go to Download->Download Raw Data, and list the files in the recording? Does it include any files with the word "world" in them?
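The check described above can be automated over a downloaded recording folder. A small Python sketch; the only file-naming detail taken from this conversation is that the scene video files carry "PI world" in their names.

```python
from pathlib import Path

def has_scene_video(recording_dir) -> bool:
    """Return True if the raw recording contains any file with "world"
    in its name (i.e. the "PI world" scene video files are present)."""
    return any("world" in p.name.lower() for p in Path(recording_dir).iterdir())

def list_recording_files(recording_dir):
    """List all file names in the recording folder, for a manual check."""
    return sorted(p.name for p in Path(recording_dir).iterdir() if p.is_file())
```

Running `has_scene_video("/path/to/recording")` immediately tells you whether the scene camera produced any video during the session.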

user-d1dd9a 07 July, 2022, 14:18:06

that surprises me

user-d1dd9a 07 July, 2022, 14:29:29

I don't use the cloud service. But the two "PI world" files are missing in the exported directory.

papr 07 July, 2022, 14:31:58

In that case, this means the scene camera was simply not connected during the recording which is a supported use case by the app and therefore does not warn you about it.

user-d1dd9a 07 July, 2022, 14:35:39

But the camera was attached the whole time.

user-d1dd9a 07 July, 2022, 14:38:16

Or do you mean it was disabled in the app?

papr 07 July, 2022, 14:42:18

One cannot disable the scene camera on its own via software. When you open the recording in Player, does it include gaze data?

user-d1dd9a 07 July, 2022, 14:48:50

I can load the recording into the player but when I play the recording I see and hear nothing.

papr 07 July, 2022, 14:53:24

No green circles? That would most likely mean that the glasses were not connected at all or (in case of OnePlus 6) OTG was not enabled.

user-d1dd9a 07 July, 2022, 14:55:40

I think that was the problem. OTG was disabled.

user-d1dd9a 07 July, 2022, 14:57:41

I'm surprised I was able to record anyway.

papr 07 July, 2022, 15:00:05

The app is designed to work even if the device disconnects during the recording. That includes situations where the device is "not connected" during the start. Regarding OTG, the app will display a red usb icon in the main view if OTG is disabled.

But I understand your expectation here. I will forward your feedback to our Android and Design teams!

user-d1dd9a 07 July, 2022, 15:02:53

Thank you for that. If things have to be done quickly on site, then unfortunately mistakes happen. I'll pay attention to that in the future.

user-455cca 07 July, 2022, 19:55:01

Also, that means all three subjects do not need to do the scanning process, right? We just need to create it ahead of time, then anyone can wear the goggles and walk into the scene. Am I getting this right?

papr 07 July, 2022, 19:55:57

Correct!

papr 07 July, 2022, 19:57:01

I don't remember the name of the file. Please check the head pose tracker docs. If this information is missing, I will update the docs accordingly tomorrow.

user-455cca 07 July, 2022, 21:00:47

I got it. Thank you!!

marc 08 July, 2022, 06:55:14

Let us know if you run into trouble again with the next test! The object of interest must be included in the environment when recording the scanning video!

user-7e5889 08 July, 2022, 07:47:45

I have changed the environment and ran another test on this painting. It failed again. Then I tried a table full of objects. It has been in the "in progress, 100%" status for 5 minutes. Maybe it failed again.

Chat image Chat image Chat image

marc 08 July, 2022, 07:49:56

The painting suffers from the same problems again: the painting itself is fuzzy and the background is pretty monotone. The test with the table I would expect to succeed, though! The table is visually very busy and should work for the algorithm. The computation (if it is successful) can take quite a while, definitely longer than 5 min, so give it some time!

user-7e5889 08 July, 2022, 07:55:01

Thank you! I think the UI should be more user-friendly and show the real status instead of showing '100%' all the time. About the painting: actually, my experiment is about the appreciation of a painting on a white wall. Maybe this enrichment doesn't support my needs?

user-7e5889 08 July, 2022, 07:55:54

sad.

Chat image

marc 08 July, 2022, 08:01:54

Given that this failed as well, we should take a look at the scanning video and see if anything is going wrong there!

marc 08 July, 2022, 08:00:00

It depends on the painting! Again, we are releasing a demo dataset using the Reference Image Mapper in an art gallery, where it worked very well. I recommend you check it out once it is out and see what painting we have used. This should be the right tool for you.

user-7e5889 08 July, 2022, 08:02:15

I'll check the demo and choose the experiment paintings and then run another test. But could you tell me why the table with a lot of objects failed? I'll send the video to the email address soon.

marc 08 July, 2022, 08:02:11

Could you share the scanning video you made for the table with [email removed]?

papr 08 July, 2022, 08:02:37

We will need to have a look at the scanning recording to tell 🙂

user-7e5889 08 July, 2022, 08:13:35

I've sent the email named 'The scanning video of the failed reference image mapper reported by bubblepepper'. Please check the attachment.

papr 08 July, 2022, 08:49:37

Thanks! I reviewed it and I would recommend the following: In your recording, you hold the glasses in multiple static poses while moving very quickly between them. Please try moving more, but slowly and continuously. This allows the algorithm to leverage more points of view. Holding a single pose is not beneficial. Also, try to include top-down views and side-views as well. This will help the algorithm to build a more stable model.

user-7e5889 08 July, 2022, 08:51:30

Thank you for your advice. I'll test one more time soon.

marc 08 July, 2022, 08:51:02

A bit more variation in distance to the table could also help, i.e. moving a bit closer and further away during the video.

user-9429ba 08 July, 2022, 12:27:57

Hi @user-7e5889 👋 Check out the video in our documentation on how to make a scanning recording: https://docs.pupil-labs.com/invisible/explainers/enrichments/#setup-2 I would also recommend watching this explainer video https://www.youtube.com/watch?v=ygqzQEzUIS4 to get a better grasp of how the Reference Image Mapper algorithm works.

user-7e5889 08 July, 2022, 12:34:44

Thank you! I have read and watched it before, but I didn't notice that there are so many points to pay attention to. I'll read and watch them again. If some tips were shown on the user interface during the process, instead of on a separate webpage, it would be more noticeable and useful.

user-9429ba 11 July, 2022, 11:55:21

@user-7e5889 Hi again! You can now see some real examples from an art gallery in our new Demo Workspace: https://docs.pupil-labs.com/invisible/explainers/basic-concepts/#demo-workspace. Check out the Reference Image Mapper enrichments with paintings, and the accompanying scanning recordings there.

user-7e5889 11 July, 2022, 12:02:15

Thanks! But I'm afraid I can't find this #demo-workspace section on the page. Could you send a screenshot to me?

user-9429ba 11 July, 2022, 12:05:43

There's a link to Demo Workspace on the page. Let me know if you still have trouble accessing

Chat image

user-7e5889 11 July, 2022, 12:10:26

I think there is no area named 'demo workspace' on this page. Is there an issue with caching or the version release?

Chat image

marc 11 July, 2022, 12:13:21

Also, here is the direct link to the demo workspace for you to checkout: https://cloud.pupil-labs.com/workspace/78cddeee-772e-4e54-9963-1cc2f62825f9/recordings

papr 11 July, 2022, 12:11:17

Please try a "force reload" of the page. Your browser is caching an out-of-date version of the web page. You can force reload with Command+Shift+R.

user-7e5889 11 July, 2022, 12:38:11

Thanks! I have accessed the page, but there was no reaction after I clicked 'heatmap'. I recorded a video; I can send it to your email if needed.

Chat image

papr 11 July, 2022, 14:13:59

The issue with displaying heatmaps in Safari has been fixed

papr 11 July, 2022, 12:45:18

This works on Firefox if you want to use it in the meantime.

papr 11 July, 2022, 12:39:13

Which browser do you use? Safari?

user-7e5889 11 July, 2022, 12:39:19

yes

papr 11 July, 2022, 12:39:27

Let me try to reproduce the issue

papr 11 July, 2022, 12:42:02

I am able to reproduce the issue. I will forward it to our Cloud development team. Thanks for reporting the issue!

user-7e5889 11 July, 2022, 12:53:14

Thanks! I accessed it in Chrome and it worked. I'll refer to these successful examples and try again tomorrow.

user-d1dd9a 11 July, 2022, 13:37:32

I had a use case where I didn't have access to the internet. Is a constant online connection required to use the Companion app or for recording?

papr 11 July, 2022, 13:38:35

Hi, an active internet connection is only necessary for the initial setup and for uploading recordings to Pupil Cloud. Once set up, it can be used without internet.

user-d1dd9a 11 July, 2022, 13:38:25

Or is it only used for login?

user-d1dd9a 11 July, 2022, 13:39:53

OK. So if I'm logged in, can I use the app offline?

papr 11 July, 2022, 13:40:05

Yes, that is correct

user-d1dd9a 11 July, 2022, 13:42:47

thx.

user-d1dd9a 12 July, 2022, 14:04:59

To use Pupil Invisible Monitor, I tried making the OnePlus a hotspot and connecting the tablet to it. Unfortunately, after successfully connecting, I cannot call up the Pupil Invisible Monitor.

papr 12 July, 2022, 14:07:20

Hi 🙂 My apologies, but I am not quite sure I am able to follow the issue correctly. Can you clarify whether you are referring to the new web app or the deprecated desktop app? And what do you mean by "cannot call up"?

user-d1dd9a 12 July, 2022, 14:08:11

Yes, I mean the new web application.

user-d1dd9a 12 July, 2022, 14:10:14

There seems to be a problem with the DNS. Unfortunately, I also have no information about which IP the OnePlus has when it establishes a hotspot itself.

papr 12 July, 2022, 14:25:10

Have you checked the app's Streaming view? It displays URLs (incl. one with the phone's IP) which can be used to connect to the phone.

user-d1dd9a 12 July, 2022, 14:12:04

So that I can call IP-OnePlus:8080 in the browser.

user-95de6f 13 July, 2022, 06:55:51

Hi, I have 2 Invisible glasses. They seem to be intermittently failing to connect between the Android phone and the glasses after several sessions of data collection. Most of the time both the scene and eye camera icons are inactive (grey) when connected to the phone. Otherwise it takes a long time to connect. I've updated the phone and app; that didn't work. What can I do?

marc 13 July, 2022, 08:54:05

Hi @user-95de6f! I have a couple of follow-up questions:

Are the Android phones you are using OnePlus 6 or OnePlus 8 devices? For OnePlus 6 there would be a couple of things to watch out for, on OnePlus 8 connection should work out of the box.

Are you using the cable that was included in the box to connect the glasses and the phone, or another 3rd party cable?

Do both sets of glasses and phones behave the same? I.e. do both lose the connection sometimes?

When you say "it stops collecting data", what exactly does that mean? Does the sensor show up as disconnected in the app, and do the circles stop filling up on the home screen? Or did you observe something else to come to this conclusion?

user-95de6f 13 July, 2022, 09:31:26

Hi @marc, sure, I'll ping you separately

user-d1dd9a 13 July, 2022, 09:18:04

I tried that. Unfortunately, the browser returns the message that it cannot access the page.

papr 13 July, 2022, 09:32:08

Could you share a screenshot of the tablet's browser window attempting to connect to the phone's ip address and monitor port?

I am able to make the setup work (hotspot on phone, connect with external device).

Just to be sure: Have you tried force stopping and relaunching the app after setting up the hotspot on the phone?

papr 13 July, 2022, 09:18:38

Are you able to ping the device's ip address from the terminal?

user-d1dd9a 13 July, 2022, 09:23:43

Unfortunately, I have no way of doing this with the tablet or OnePlus.

papr 13 July, 2022, 09:24:04

What operating system is running on the tablet?

user-d1dd9a 13 July, 2022, 09:26:38

It's an android (android 11) device.

user-d1dd9a 13 July, 2022, 09:31:51

When I check the IP of the tablet, it says something like 192.168.... but the IP of the OnePlus is supposed to be 10.59.... etc. That is what the streaming option in the Companion app shows me.

papr 13 July, 2022, 09:32:45

The IP of the tablet does not matter. The IP that matters is the one displayed in the app's Streaming menu.

user-d1dd9a 13 July, 2022, 09:33:10

O.k.

user-d1dd9a 13 July, 2022, 09:37:59

Does your hotspot have internet access?

papr 13 July, 2022, 09:39:36

It does, but that should not matter for the local connection of the web monitor app. 🙂

marc 13 July, 2022, 09:38:51

Connection Problems with 2 devices

user-d1dd9a 13 July, 2022, 09:47:46

OK. I have now tried it, as you said. With the streaming option, it is currently looking for the DNS service. Without internet, this could be a problem.

papr 13 July, 2022, 10:00:40

I am able to reproduce the issue. Here is what's happening when the phone opens a hotspot: A. The phone has a public IP address that identifies it within whichever network it is currently connected to (might be none; in your case 10.59.x.y.z; this cannot be used to connect the monitor). B. The phone gets a "hotspot" IP address that identifies the phone to all devices connected to the hotspot (this is what we are looking for).

Unfortunately, the phone's UI does not tell us (B), the hotspot IP. We need to use the tablet's UI to find it.

  1. On the tablet, open the wifi settings (it should be connected to the hotspot)
  2. Find the tablet's IP address (we do not need it, but the UI for the next step is close by)
  3. Find the "router IP address" (it might be hidden behind an "advanced" dialogue or similar)

The "router" IP address corresponds to (B) and is the IP that you need to type into the browser (don't forget to add the port number! :8080)
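To verify the last step — that something is actually listening on the router IP and port — a simple TCP reachability probe helps before suspecting DNS. This is a generic Python sketch, not part of the Pupil Labs tooling; the host argument is whatever router IP the tablet's wifi settings report.

```python
import socket

def monitor_reachable(host: str, port: int = 8080, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds,
    i.e. the monitor's web server appears to be listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (hypothetical): monitor_reachable("<router ip from step 3>")
```

If this returns False while the Companion app is running, the problem is at the network or service level (e.g. the RTSP/monitor service not running) rather than in the browser.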

user-d1dd9a 13 July, 2022, 09:49:24

It is waiting for DNS service...

user-d1dd9a 13 July, 2022, 10:31:50

I have the IP. However, when I enter it in the browser, I get the message: Website not available. ERR_CONNECTION_REFUSED

papr 13 July, 2022, 10:35:39

You might have forgotten to specify the port

user-d1dd9a 13 July, 2022, 10:38:23

I try an HTTP connection to "router" IP:8080

papr 13 July, 2022, 10:39:39

Have you installed any other third-party apps, e.g. a screen-share application? Please try incrementing the port number by 1, e.g. 8081, 8082, etc.

papr 13 July, 2022, 10:40:15

And can you confirm that the app is running?

user-d1dd9a 13 July, 2022, 10:46:47

Yes, the app is running. I have now tried up to port 8085 without success.

user-d1dd9a 13 July, 2022, 11:52:03

Now it works! The error was that the RTSP service was not running.

papr 13 July, 2022, 12:03:55

Does the streaming menu continue to wait for dns?

user-d1dd9a 13 July, 2022, 11:58:41

Does this service start automatically?

papr 13 July, 2022, 12:06:08

How did you determine the service was not running? Did you do anything specific to enable it?

user-d1dd9a 13 July, 2022, 13:24:40

What I found is that the RTSP service only starts automatically when an internet connection is detected.

user-d1dd9a 13 July, 2022, 13:25:37

I get a notification that the service has started.

papr 13 July, 2022, 13:28:33

I spoke to our Android engineering team. The recommended setup is to have an external wifi of some sort, e.g. another phone providing the hotspot. Hosting the hotspot on the companion device technically works but is not officially supported. For me, the rtsp service is immediately available after setting up the hotspot (without wifi or internet connection) and fully restarting the app (force quit + launch).

user-d1dd9a 13 July, 2022, 13:32:52

If I do the same, then I only have the background service and the upload service. The upload service runs even though I turned off the automatic cloud upload.

papr 13 July, 2022, 13:41:46

But are you able to connect via the router IP and the corresponding port? It is very possible that the RTSP service is only launched once requested via the HTTP entrypoint API

user-d1dd9a 13 July, 2022, 14:41:52

Thanks for the extensive information and help.

user-d1dd9a 13 July, 2022, 14:39:44

I had to set up the environment because my tablet is not an LTE device, so I can't set up a hotspot with it. But I've now used my private smartphone for this, and it worked perfectly.

papr 13 July, 2022, 14:42:02

You are welcome!

user-d1dd9a 13 July, 2022, 14:45:00

This will help me during my next session to check that everything is working correctly while recording.

user-28ac9f 13 July, 2022, 15:28:27

Is anyone aware of how to disable the system update notifications? I tried this, but they still appear :-( https://www.nstec.com/how-to-stop-software-update-notifications-on-android/

marc 14 July, 2022, 07:13:23

I am not aware of a method to disable them, unfortunately 😦

user-28ac9f 15 July, 2022, 09:02:18

I managed to do it in a safer way with https://adbappcontrol.com/ 🥳 To use it, you need to enable dev mode and then USB debugging on the phone.

Afterwards I disabled the following package to get rid of the notifications and prevent any accidental system update:

  * com.oneplus.opbackup

I also uninstalled some bloatware, namely:

  * com.oneplus.membership (OnePlus Red Cable Club)
  * com.oneplus.gameinstaller (OnePlus game stuff)
  * com.oneplus.gamespace (OnePlus game stuff)
  * com.facebook.services (Facebook)
  * com.netflix.mediaclient (Netflix)
  * com.netflix.partner.activation (Netflix)

So far everything seems to work fine. If I run into any problems, I'll report them here FYI.

user-28ac9f 14 July, 2022, 14:44:55

Have you ever done any tests with LineageOS?

user-fb64e4 14 July, 2022, 08:01:37

Hello, I am trying to run code that uses pupil_apriltags on Windows 10, but I have a problem installing the package. First it was about "problem building wheel", which got solved by changing a line in setup.py. Now I'm getting another error: "python setup.py develop did not run successfully. note: This error originates from a subprocess, and is likely not a problem with pip." And my pip is up to date. Can anyone help me solve this?

papr 14 July, 2022, 14:46:25

The software is not tested on other OS than the specific OxygenOS versions for Android 8/9 (1+6) and 11 (1+8)

user-28ac9f 14 July, 2022, 14:51:38

I understand that this isn't officially supported; I just wanted to know whether you ever tried at all 😉 I assume I might try it and see if any of the issues from the troubleshooting guide come up

marc 15 July, 2022, 09:03:49

Cool! Thanks for sharing it here @user-28ac9f! 👍

user-2ecd13 15 July, 2022, 17:42:41

Hey, I don't know if this is the correct channel for my question, but I'd be happy to move it if need be.

We are using Pupil Invisible glasses and Pupil Player for analyses. Our lab needs to have the experimental room video footage in the background with the MET data overlaid on top. Is there any way to set this up?

Right now, if we drag the MET data folder into Pupil Player first, we can't resize the video. However, if we drag the background footage in first, we can't get the MET data overlaid on top; we can only see the MET video data without gaze data.

nmt 18 July, 2022, 07:01:33

Hi @user-2ecd13 👋. Am I correct in assuming that you downloaded the recording from Pupil Cloud? If so, did you click 'Download Recording' or 'Download Raw Data'? The former contains human-readable files, not meant to be opened in Player. The latter contains binary files that can be opened in Player (i.e. you'll see the video + gaze overlay). If you'd like to open the recording in Player, it's 'Download Raw Data' that you need (if in doubt, this is the larger download)

user-8d7781 18 July, 2022, 00:51:55

Hi guys, I am having some issues with the invisible companion app. When I start recording, it seems to be working until I try to stop the recording. It shows an error message saying "recording error" and then the app freezes. Is anyone able to help troubleshoot what is going on? I have tried clearing cache, resetting the app and turning the phone off and on again.

user-8d7781 18 July, 2022, 00:53:48

Attached is a photo of the error message.

Chat image

user-8d7781 18 July, 2022, 00:54:31

The recording seems to be fine for shorter sessions such as 2-3 minutes. However, when participants are spending 16-20 minutes recording, this is when we get the error message.

nmt 18 July, 2022, 07:03:55

Hi @user-8d7781 👋. Would you be able to share another photo of the error message? I can't seem to view the current one 🙂

user-8d7781 18 July, 2022, 11:17:43

Chat image

user-8d7781 18 July, 2022, 11:18:46

When I received this message, the recording had cut off halfway through the experiment. It also did not let me import the recording into Pupil Player, and said there was no available fixation data.

papr 18 July, 2022, 11:20:23

Hi, Please note that Pupil Player does not support detecting fixations from Pupil Invisible recordings.

marc 18 July, 2022, 11:22:50

Thanks for the screenshot @user-8d7781. Could you share the affected recording with [email removed] so we can take a look at it?

user-8d7781 18 July, 2022, 11:35:49

Hi Marc, sure thing. I will be at work tomorrow and will email the recording through then. Thanks for this.

user-d4d4bc 19 July, 2022, 09:14:59

Hi,

We are planning to use the Invisible Glasses for tracking where the driver was looking while driving a vehicle. We had a few questions regarding that:

  1. How long does a person need to look at a specific point, e.g. a traffic light or a pedestrian, for the eye tracker to consider this a gaze, stabilize around that point, and be sufficiently accurate?
  2. Is the precision of the identified point influenced by duration, i.e. how long we look at that point?
  3. Is there a relation between accuracy, time, and distance? I.e. if an object is further away, do we need to look at it a little bit longer?
  4. Should we look around our field of view after putting on the glasses, or just start recording with no adjustments necessary?

Thanks. 🙂

user-d4d4bc 19 July, 2022, 09:41:05

And is it possible to connect Pupil Invisible glasses with the Pupil Core Capture software? I would guess not, but just to confirm. 🙂

marc 19 July, 2022, 11:48:24

Hi @user-d4d4bc!

1) + 2) The raw gaze signal is 200 Hz and the estimates are made independently per sample. That means that no stabilization is necessary and looking at the same point for a longer time does not affect the quality of the estimate. For fixation detection the gaze signal needs to remain stable on a target for a minimum duration though.

3) Again, the duration does not influence the accuracy. However, the distance of the object of regard does to a degree. If the subject looks at a target that is < 1 meter away, the gaze estimate will suffer from a "parallax error", which gets worse the closer the target is. This error introduces a constant offset in the predictions. For distances > 1 meter this error does not have an effect.

4) No further setup is necessary, the device works at full quality as soon as you put it on.

5) You can open recordings made with Pupil Invisible in Pupil Player. A bunch of features like obtaining fixations, blinks and gaze mapping are however only available in Pupil Cloud. It is not possible to use Pupil Core's gaze estimation pipeline in Pupil Capture after connecting Pupil Invisible directly to a computer.

Let me know if you have further questions!
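To build intuition for the parallax point in (3), here is a toy calculation. The eye-to-scene-camera baseline value is an illustrative assumption, not a Pupil Labs spec:

```python
import math


def parallax_offset_deg(viewing_distance_m, baseline_m=0.035):
    """Angular offset between the eye's line of sight and a camera displaced
    sideways by `baseline_m`, for a target at `viewing_distance_m`.
    The 3.5 cm baseline is illustrative only."""
    return math.degrees(math.atan2(baseline_m, viewing_distance_m))


# The offset grows quickly below ~1 m and flattens out beyond it:
for d in (0.3, 0.5, 1.0, 2.0, 5.0):
    print(f"{d:4.1f} m -> {parallax_offset_deg(d):4.1f} deg")
```

This is why a fixed offset correction set at one close viewing distance stops being valid at another.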

user-25da8c 19 July, 2022, 12:54:20

Hello guys, does Invisible calculate pupil diameter?

papr 19 July, 2022, 12:56:11

Hey, Pupil Invisible's eye camera angles are not suited to do so. Your current option for calculating pupillometry is Pupil Core.

user-25da8c 19 July, 2022, 12:57:59

Hey @papr, turns out the university does have an Invisible. So Core with a tablet then, for my custom app?

papr 19 July, 2022, 13:01:25

If you are dependent on pupillometry, then yes. Note that uncontrolled light environments can make it difficult to get clean pupillometry data due to the pupillary light reflex. Maybe there are other non-pupillometry metrics that can help answer your research question (allowing the use of Invisible).

user-25da8c 19 July, 2022, 13:03:48

Well, the research is based on pupillometry, but I will talk to my advisor to see if there are other options. Thanks again!

user-f36bd4 19 July, 2022, 15:48:45

Hello Pupil Labs Team, our Pupil Invisible OnePlus 8 Companion app flickers and the screen stays bright. Is this issue known?

marc 19 July, 2022, 15:50:30

Hi @user-f36bd4! No, I don't think I have heard of such an issue before. Could you maybe share a video of this behavior?

user-2ecd13 19 July, 2022, 17:53:47

@nmt Hey Neil, we have been exporting the raw eye-tracking data from the phones to the computer. So far we have not used cloud services.

For the video footage inside our room, we are running the video data through a video conversion script that was provided by pupil.

nmt 20 July, 2022, 11:16:16

Hi @user-2ecd13. Are you referring to the script that enables third-party videos to be loaded in Pupil Player and then manually synchronised with the Invisible recording?

user-8d7781 19 July, 2022, 23:59:45

Hi Marc, I just shared the google drive folder with that email. The folder was too large to send via email.

papr 20 July, 2022, 07:13:52

Hi, I had a look at why Player was not able to open the recording. Turns out, the shared recording is a partially upgraded recording. Pupil Player crashed during the initial recording upgrade because PI world v1 ps2.mp4 is corrupted. To fix this issue, follow these steps:

  1. Delete PI world v1 ps2.mp4 and PI world v1 ps2.time
  2. Delete info.player.json
  3. Rename info.invisible.json to info.json
  4. Open the recording in Pupil Player

Player will re-attempt the recording upgrade and this time it should show you video from the first part that was recorded before the error appeared.
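The four manual steps above can also be scripted if you have several affected recordings; a sketch assuming the standard Pupil Invisible recording folder layout described in those steps:

```python
from pathlib import Path


def repair_partial_upgrade(rec_dir):
    """Apply the manual repair steps for a Pupil Invisible recording whose
    second scene-video part is corrupted, so Player re-runs the upgrade."""
    rec = Path(rec_dir)
    # 1. Delete the corrupted second world-video part and its timestamps.
    for name in ("PI world v1 ps2.mp4", "PI world v1 ps2.time"):
        (rec / name).unlink(missing_ok=True)
    # 2. Delete the half-written Player info file.
    (rec / "info.player.json").unlink(missing_ok=True)
    # 3. Rename the original Invisible info file to what Player expects.
    invisible_info = rec / "info.invisible.json"
    if invisible_info.exists():
        invisible_info.rename(rec / "info.json")
```

Step 4 stays manual: open the repaired folder in Pupil Player.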

user-28ac9f 20 July, 2022, 09:38:50

I am trying to set up the Monitor. I have to figure out how to do this within our IT security setup.

AFAIK creating the hotspot on the companion device will not work at all, as mDNS is not supported in Android 11 (which is mandatory for the connection to the glasses to work) [1][2].

Thus I am trying to use a hotspot from my company phone, but that seems to partially break sometimes as well (the streaming works, but none of the actions do; probably due to UDP blocking?)

[1] https://blog.esper.io/android-dessert-bites-26-mdns-local-47912385/ [2] https://issuetracker.google.com/issues/140786115

papr 20 July, 2022, 09:41:02

Could you clarify what you mean by "but all actions will not"?

papr 20 July, 2022, 09:40:01

Hey, creating the hotspot on the companion device is not recommended. Rather, use a third device to create the hotspot and connect the companion and monitor device to that hotspot.

user-cd3e5b 20 July, 2022, 12:19:32

@papr this process should also work if, instead of the cloud data, I use the data obtained directly from the device, correct? If I'm not mistaken, I am downgrading from 200 Hz (cloud) to ~66 Hz (local), and that's pretty much it, right?

papr 20 July, 2022, 12:21:17

correct!

marc 20 July, 2022, 12:25:15

Note that since a recent update the real-time gaze estimation rate is much higher than 66 Hz! It's >120 Hz now.

user-cd3e5b 20 July, 2022, 12:42:00

thank you @marc and @papr !

user-3b418f 20 July, 2022, 13:18:56

Hi, was just wondering what the cloud storage capacity was for the Pupil Invisible Glasses?

marc 20 July, 2022, 14:13:23

Hi @user-3b418f! The storage capacity in Pupil Cloud is unlimited!

user-3b418f 20 July, 2022, 14:51:26

Oh cool, do you think there would ever be any plans to change that, if Pupil Labs got bigger etc.? Trying to work out data management from the perspective of higher education 🙂

marc 20 July, 2022, 15:01:40

We might limit the capacity in the future, or set up something like a paid subscription for very large accounts. But nothing is currently planned, and we'd make sure to announce a change like this early!

user-3b418f 20 July, 2022, 15:03:29

Can I make a request in advance that, if you do make plans, you create pop-ups or disclaimers that appear on the cloud website itself? We never check the generic institutional email we used for the account creation 🤣

marc 21 July, 2022, 07:18:54

Thanks for the feedback 😄 Yes, we'll make sure to make it obvious within the app itself too!

user-b14f98 20 July, 2022, 17:45:17

Haven't used Invisible in a while. I'm delighted to see the new fixation detector 🙂

user-3b418f 20 July, 2022, 20:10:41

So basically the way to apply the various visual interpretations of the gaze data is by โ€˜enrichingโ€™ them. First create a project from the footage you want โ€˜enrichedโ€™, then within that project select โ€˜add new enrichmentโ€™ which will be the blue button at the top. Then follow the process of adding the enrichment you want. For the little circle that tracks the gaze data, select gaze overlay. Etc...

user-b14f98 20 July, 2022, 17:45:30

...at the same time, I see it in the cloud, but not when i download the recording?

user-b14f98 20 July, 2022, 17:45:35

What's going on there?

user-b14f98 20 July, 2022, 17:46:11

I right click, and select "download recording" (and not "download raw data")

user-b14f98 20 July, 2022, 17:46:23

I would have expected the gaze and fixation overlay on the raw video.

user-b14f98 20 July, 2022, 17:51:10

It is as if the option to download recording is just returning the raw data.

user-b14f98 20 July, 2022, 17:51:45

it returns a downloadable pupil labs folder "raw-data-export"

user-b14f98 20 July, 2022, 18:08:53

...and, why can I enable fixations when viewing one video, but not another?

user-3b418f 20 July, 2022, 20:11:08

Then once added, you can select the play button to process the new enrichment that should appear on the screen, and then download that.

user-b14f98 20 July, 2022, 20:12:24

I have created a project and have tried this. Something seems to be malfunctioning. And what about this issue that downloading only returns raw data, but never the processed video?

user-b14f98 20 July, 2022, 20:19:22

It looks like the application of the gaze overlay enrichment stalled for some reason, which may explain why one video lacked the overlay. One source of confusion is that another video had the overlay without being in a project, and without enrichments applied to it.

user-8d7781 21 July, 2022, 00:06:53

Hi Papr, thanks for this... I will try it shortly. On a side note, do we know why PI world v1 ps2.mp4 was corrupted? It keeps happening for every recording now: when we go to stop any recording, the app freezes and continues to record despite clicking stop. We aren't able to use this device because it keeps producing corrupted files, so I would love to figure out why it's happening 😦 Thanks

papr 21 July, 2022, 07:42:28

Could you please use a second phone and film the steps to reproduce the issue? Regarding the corrupted scene video: there is one known issue where a disconnect while the in-app preview is open can cause the app to crash.

user-8d7781 21 July, 2022, 01:19:09

I just tried deleting those files, and renaming the info file but when I go to put this into Pupil Player it says: 'Invalid Recording - There is no info file in the target directory'

marc 21 July, 2022, 07:37:38

@user-b14f98 Since the introduction of the fixation detector, fixation data is computed for every recording on upload to Pupil Cloud. Fixation data has also been calculated post hoc for all recordings that were already in Pupil Cloud before the release. If you download a recording in CSV format (i.e. "Download Recording" rather than "Download Raw Data"), the files should include a fixations.csv file that contains the fixation data. Let me know if that is not the case for any of your recordings!

The scene video included in this download is intentionally the unaltered original scene video, with no gaze overlay visualizations applied. The idea is that you can use it to create any custom visualization on top of it, without having to deal with visualizations that are already baked in. If you are interested in a video with a gaze overlay, using the "Gaze Overlay" enrichment as @user-3b418f said would be the recommended way. If the computation of this enrichment is not finishing, something might be wrong with it. Could you let me know its enrichment ID (visible in "View Details") so we can take a look?

Outside of using the gaze overlay enrichment, there is currently no other way of obtaining such a video, except maybe screen recording your browser. The scanpath visualization for fixations which is now available in the recording player can currently not be exported as video. We are aware that the options for exporting scene video with visualizations are still pretty limited. We are currently working on two things to improve on this:

marc 21 July, 2022, 07:37:39

1) we will soon publish a Python library together with guides that make the handling of video, timestamp matching and the manual creation of custom visualizations much easier. 2) A bit further down the road still, but we plan on adding a bigger video rendering tool, which will allow you to create all kinds of visualizations on top of the scene video, including gaze, fixations, eye video overlay, face/surface detections etc. Release will not be before 2023 though.
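Until that library is published, the timestamp-matching part of a custom visualization can be done with a few lines of standard Python. A sketch that pairs a scene-video frame with the nearest gaze sample (timestamps are assumed to be sorted ascending, as in the exported CSVs; units just need to match between the two streams):

```python
import bisect


def nearest_gaze_index(gaze_timestamps, frame_timestamp):
    """Return the index of the gaze sample closest in time to a video frame.
    `gaze_timestamps` must be sorted in ascending order."""
    i = bisect.bisect_left(gaze_timestamps, frame_timestamp)
    if i == 0:
        return 0
    if i == len(gaze_timestamps):
        return len(gaze_timestamps) - 1
    before, after = gaze_timestamps[i - 1], gaze_timestamps[i]
    # Pick whichever neighbor is closer in time to the frame.
    return i if after - frame_timestamp < frame_timestamp - before else i - 1
```

With the matched index you can look up the gaze x/y for that frame and draw your own overlay with any drawing library.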

papr 21 July, 2022, 07:40:11

Sounds like there might be a spelling mistake in there. The renamed file should be info.json. If this is already the case, could you please share it here?

user-8d7781 22 July, 2022, 04:59:00

Hi Papr, I went to upload the new folder and saw that I was calling it info.json.json (.json was added by default) - sorry, my mistake, I'm still very new to all of this. Thanks for your patience, it seems to be working fine now. Appreciate it!

user-fb64e4 21 July, 2022, 17:32:48

Hello, I am using Pupil Invisible glasses and I want to use the scene video in my code, but I have a problem connecting to the device. I can see the video at http://pi.local:8080/ but I get this in Python using the real-time API (I use Windows 10 and Python 3.10.4):

device = discover_one_device(max_search_duration_seconds=10)

frame, gaze = device.receive_matched_scene_video_frame_and_gaze()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'NoneType' object has no attribute 'receive_matched_scene_video_frame_and_gaze'

Can someone help me with this?
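The AttributeError above usually means discover_one_device() timed out and returned None. A small sketch that fails with a clearer message instead (the require_device helper name is ours; the API calls follow the pupil-labs-realtime-api simple interface):

```python
def require_device(device):
    # discover_one_device() returns None when nothing is found within the
    # search window; calling methods on None raises the AttributeError above.
    if device is None:
        raise RuntimeError(
            "No Pupil Invisible device found. Check that the phone and the PC "
            "are on the same Wi-Fi and that the network does not block mDNS."
        )
    return device


def main():
    # Imported here so the helper above works without the package installed.
    from pupil_labs.realtime_api.simple import discover_one_device

    device = require_device(discover_one_device(max_search_duration_seconds=10))
    try:
        frame, gaze = device.receive_matched_scene_video_frame_and_gaze()
        print(frame.timestamp_unix_seconds, gaze.x, gaze.y)
    finally:
        device.close()
```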

user-b14f98 21 July, 2022, 18:06:33

Thanks, Marc. To reiterate, the issue is that the top video in the image below gave me the "fixation" toggle without any additional work, before it was in a project. However, I cannot seem to activate the same "fixation" toggle in the bottom video, with the enrichment ID 22e8850b-e664-48c3-b173-d14f498117ca

After noticing the difference I did add both videos to the same project ("cycling") with the same "gaze overlay" enrichment applied from the start to the end of the video. The issue was not resolved.

Notice that the bottom video has an "!" for an icon. Something may be wrong with it.

Chat image

marc 22 July, 2022, 07:23:27

Thanks for the screenshot, that's helpful! Indeed this exclamation mark indicates that something went wrong during the processing of this recording. Given that the fixation toggle is not showing, the error seems to have happened during the fixation calculation. Via the enrichment ID you shared we should be able to identify the recording, see what went wrong, and fix it! I'll keep you up to date!

user-f408eb 21 July, 2022, 19:52:38

@marc Dear Ladies and Gentlemen, after the University of Passau bought the Pupil Labs Invisible, we are now conducting a study with it. The following error message was displayed. I hope you can help us with this:

nmt 22 July, 2022, 07:05:05

Hi @user-f408eb 👋. Thanks for sharing the screenshot. Please reach out to [email removed] and a member of the team will help you.

user-f408eb 21 July, 2022, 19:52:47

Chat image

papr 22 July, 2022, 07:25:58

If device is None, that means that no device was discovered. Could you share details about your network setup?

user-fb64e4 22 July, 2022, 08:36:50

I connected the companion device and my PC to the same wifi router. I thought that was okay since the local app worked. Is another setup needed for connecting?

marc 22 July, 2022, 08:08:45

We have restarted the processing of the failed recording, which has fixed the issue. The exclamation mark should now be gone and you should be able to toggle the fixation visualisation!

user-b14f98 22 July, 2022, 11:32:14

Thanks!

papr 22 July, 2022, 08:12:36

alternatively to

device = discover_one_device(max_search_duration_seconds=10)

you can also use

from pupil_labs.realtime_api.simple import Device
device = Device("pi.local", 8080)  # or use ip address instead of domain name

user-fb64e4 22 July, 2022, 08:37:32

Does this mean that the device is not connected? I get Device(ip=pi.local, port=8080, dns=None)

The device is still 'NoneType', I think it did not work

user-011cbf 22 July, 2022, 09:05:58

We want to use the Invisible for user testing on a digital screen. We want to track what the users see first and how they navigate through the prototype. Currently it seems that the glasses have a slight shift to the left if we look at the fixations. Is there a way to improve this? We have tested with several people. Another question: is a certain distance needed between the glasses and the screen?

user-c5f657 22 July, 2022, 09:20:58

Is it possible to get the realtime stream of the gaze data into Unity3D for pupil invisible?

nmt 22 July, 2022, 10:30:31

You can use the real-time network API (https://docs.pupil-labs.com/invisible/getting-started/understand-the-ecosystem/#real-time-api) to access Pupil Invisible's data streams. However, we have no Unity3D integration for Pupil Invisible!

nmt 22 July, 2022, 10:27:56

Hi @user-011cbf 👋. Thanks for your questions!

For users looking at a digital screen positioned less than 1 m away, there may be some parallax error. You can use the offset correction feature to compensate for this. To access this feature, click on the name of the currently selected wearer on the home screen and then click "Adjust". There are instructions in-app. The offset you set will be saved in the wearer profile and applied to future recordings of this wearer automatically. Important to note: the offset is valid at the viewing distance at which it was set, which should be okay for screen-based work.

Have you also checked out the Marker Mapper Enrichment? You can add April Tag markers to your screen and use the enrichment to obtain gaze positions relative to the screen: https://docs.pupil-labs.com/invisible/explainers/enrichments/#marker-mapper

user-011cbf 22 July, 2022, 14:02:42

Thanks so much this helped a lot to solve it.

user-c5f657 22 July, 2022, 11:08:53

Thank you for the response. The real-time stream works fine, and I wanted to get the marker IDs using AprilTags. However, I have some issues installing pupil-apriltags and could not fix them.

C:\Users\Admin>python
Python 3.9.13 (tags/v3.9.13:6de2ca5, May 17 2022, 16:36:42) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.

>>> from pupil_apriltags import Detector
>>> at_detector = Detector(families='tag36h11',
...                        nthreads=1,
...                        quad_decimate=1.0,
...                        quad_sigma=0.0,
...                        refine_edges=1,
...                        decode_sharpening=0.25,
...                        debug=0)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\Admin\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\pupil_apriltags\bindings.py", line 285, in __init__
    self.libc = ctypes.CDLL(str(hit))
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.9_3.9.3568.0_x64__qbz5n2kfra8p0\lib\ctypes\__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
FileNotFoundError: Could not find module 'C:\Users\Admin\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\pupil_apriltags\lib\apriltag.dll' (or one of its dependencies). Try using the full path with constructor syntax.

Any tips to fix the issue? I tried to update bindings.py as some people recommended, but it did not take effect.

nmt 22 July, 2022, 11:37:00

This I do not know. Do you have any insight, @papr?

papr 22 July, 2022, 11:38:19

Could you please share the file contents of C:\Users\Admin\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\pupil_apriltags\lib\?

user-c5f657 22 July, 2022, 11:48:51

Here it is please!

Chat image

user-b5a4f6 23 July, 2022, 15:58:13

I spoke to our Android engineering team

user-cc819b 25 July, 2022, 07:59:27

Is it possible to send annotation events over a network to the Pupil Invisible? If I remember correctly, this was possible with Pupil Core via 0MQ; does a similar solution exist for the Invisible Companion app?

marc 25 July, 2022, 08:04:36

Hi @user-cc819b! Yes, this is possible via the real-time API. You can find an introduction here: https://docs.pupil-labs.com/invisible/how-tos/integrate-with-the-real-time-api/introduction/

user-cc819b 25 July, 2022, 08:07:11

thanks!

user-d4d4bc 25 July, 2022, 11:47:08

Hi,

I had a question. We have an application where we would like to do both, head pose tracking, and surface tracking simultaneously in the same vehicle.

For surface tracking, we were planning to use different sized markers.

For head pose tracking, as I understand, we are required to use the same sized markers.

I was thinking what we could do is, we could use 36h11 for head pose tracking, and run surface tracking only with Circle21h7 markers.

As I understand, head pose tracking is currently compatible only with 36h11 markers.

Would this be okay? As in would there be a problem with the head pose tracking algorithm, owing to different sized Circle21h7 markers placed in the vehicle?

In the future, is it planned that the head pose tracker starts detecting Circle21h7, which could lead to problems with the functionality?

Thanks. 🙂

marc 25 July, 2022, 12:58:18

Hi @user-d4d4bc! This might work just fine, but some of the marker families can have false positive detections with markers of other families. As this is combining square and circular markers, I imagine this would not be the case, but I'd recommend validating this. If we change the compatibility of the head pose tracker to other families, we would most likely allow the user to select the family, so this should not be an issue.

An alternative to using different families would be to split the markers into two sets by their ID. I.e. use constant size markers for head pose tracking with a specific set of IDs, say 1-16. And varying size markers for surface tracking with IDs 17-32. For the surface tracking you can easily select which of the detected markers should be used for surface tracking. For the head pose tracker you'd have to generate the model without the other markers present (or covered).
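The ID-based split described above is a one-liner once you have detections; a sketch (the ID ranges follow the example above, and `tag_id` is the detection field exposed by pupil-apriltags):

```python
# ID ranges following the example above: IDs 1-16 for head pose
# (constant size), IDs 17-32 for surfaces (varying size).
HEAD_POSE_IDS = set(range(1, 17))
SURFACE_IDS = set(range(17, 33))


def split_detections(detections):
    """Partition marker detections (objects with a `tag_id` attribute, as
    returned by pupil-apriltags) into head-pose and surface sets."""
    head_pose = [d for d in detections if d.tag_id in HEAD_POSE_IDS]
    surface = [d for d in detections if d.tag_id in SURFACE_IDS]
    return head_pose, surface
```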

user-2c09ac 25 July, 2022, 13:00:44

Hi there,

I have some really annoying issues while using the Invisible glasses. I think there might be a hardware connection issue, as the glasses seem to randomly disconnect from the phone, with error messages and phone vibrations. This happens approximately 9 times out of 10, and my team and I have not been able to record any usable signal from our experiments.

Is there any way to fix this issue, or do we need a hardware replacement?

Thank you

marc 25 July, 2022, 13:02:04

Hi @user-2c09ac! This sounds like a hardware issue. Please reach out to [email removed] for a diagnosis and potential repair/replacement!

user-2c09ac 25 July, 2022, 13:02:29

Thanks Marc for the quick reply.

user-d4d4bc 25 July, 2022, 13:35:12

Thanks Marc. 🙂

user-413ab6 26 July, 2022, 05:29:18

Hi! Question about Pupil Cloud: I have created a project and added two enrichments: raw data export and gaze overlay. I have downloaded this data to my local storage. Now when I add a new recording to the project and compute the enrichments, is there any way to download only the enrichments for the newer recording? Is there a way to download enrichments for each recording separately?

wrp 26 July, 2022, 05:46:39

Hi @user-413ab6 👋 You will need to download the entire enrichment. For the Raw Data Exporter it's likely not that big of a deal, because the download size should remain small. However, for Gaze Overlay the download can get quite large, as it consists of videos, which can take a while.

This is good feedback and we will incorporate a way to make more granular downloads in future updates to Cloud.

user-413ab6 26 July, 2022, 05:54:44

Thanks for the prompt response.

user-4f3037 27 July, 2022, 06:06:09

Hello! Question about the blink detection: is there any chance to get the detected blinks as a .csv file for data collected before the introduction of this feature in the Cloud? I am specifically talking about data collected in January 2022. Thanks for your support! 🙂

marc 27 July, 2022, 08:12:37

Hi @user-4f3037! On release of this feature we initially did not retroactively calculate blink data for all recordings in Pupil Cloud. Recently, however, we reversed this decision and added blink and fixation data to all existing recordings. So in theory your recordings should already have this data, and the blinks.csv file should be included when selecting Download -> Download Recording or when using the Raw Data Export enrichment.

If this is not the case for your recordings, then please let me know the recording IDs in question so we can trigger the calculation. Alternatively, make a new project with all affected recordings and let me know the project ID, if that is easier!

user-8a4f8b 29 July, 2022, 11:00:00

Is it possible to use Invisible with Android phone other than OnePlus 8 or 6?

marc 29 July, 2022, 11:06:53

Hi @user-8a4f8b! No, only those two models are currently supported. We have very specific requirements for the hardware and operating system, and these things vary wildly between manufacturers and models, so we need to limit our support to just those devices. You can use any OnePlus 6 or 8 device though (assuming a compatible Android version); they do not need to be purchased from us.

user-8a4f8b 29 July, 2022, 11:31:42

Thanks. I live in Japan, and OnePlus phones do not have the Technical Conformity Mark (https://www.tele.soumu.go.jp/e/adm/monitoring/illegal/monitoring_qa/purchase/purchase.htm), which is necessary to legally use any smartphone in Japan. Do you know of any smartphone that is legal to use in Japan for Invisible?

user-8a4f8b 29 July, 2022, 15:02:33

@marc or would it be possible to transfer/stream recorded data from the OnePlus via USB to a PC (i.e. without using wifi or bluetooth)?

marc 01 August, 2022, 07:49:10

Update: I can confirm that if you purchase a OnePlus 8 device in Japan, it does have the required certification. In case you have not made your Pupil Invisible purchase yet, you could also buy it from us without the phone, for a discounted price.

marc 01 August, 2022, 07:33:31

Hi @user-8a4f8b! It is possible to transfer recordings via USB to a computer and to operate Pupil Invisible fully offline (except at the very beginning, when you need to create an account and log in to the app with internet access). However, this way you would not be able to use Pupil Cloud, which means you'd not have access to e.g. blink and fixation data, or various gaze mapping tools, so generally this is not recommended.

There is no other phone model that is currently compatible. If you purchase a OnePlus 8 device in Japan, might it have the according Japanese certification then? I am personally not an expert in how this certification works, but I know that we have a lot of Pupil Invisible users in Japan, who must have a way of dealing with this issue. 🤔 I'll consult with the team to see if anyone else has more insight into this!

user-8a4f8b 31 July, 2022, 13:33:05

I just found this, so it does seem possible: https://docs.pupil-labs.com/invisible/how-tos/data-collection-with-the-companion-app/transfer-recordings-via-usb.html

papr 01 August, 2022, 06:24:06

Please note that you cannot stream data in real time via USB. You can only transfer recordings once they have been exported via the app.

user-430fc1 31 July, 2022, 15:01:45

Hello, what's the end-to-end sample delay for real-time streaming of gaze data with invisible, and at what frequency can the gaze data be accessed?

papr 01 August, 2022, 06:22:28

The delay will depend on your connection. You can either use Wifi or a USB-C hub with ethernet to connect your streaming client to the phone. I can try to measure the delay on the wired connection later. The real-time API is able to receive data at the full frame rate that the phone produces.

End of July archive