πŸ•Ά invisible


user-a7b1b5 01 August, 2024, 07:49:32

Thanks a lot. One more thing. I added the reference picture and clicked the run button, but it is not processing.

Chat image

user-f6ea66 01 August, 2024, 08:14:02

Hi, I have an issue. I'm not sure if this should be a troubleshooting question. I have the Invisible Companion (IC) app and I had two functioning workspaces on it. After some app start issues I "deleted the cache" in the phone app settings for the IC. Now it only sees the default workspace, which isn't the one I need to work in.

user-f6ea66 01 August, 2024, 08:14:21

In the pupil cloud my workspace is still there and seemingly working fine.

user-f6ea66 01 August, 2024, 08:14:41

I tried to google solutions, but nothing specific came up.

user-f6ea66 01 August, 2024, 08:15:59

In the app it says I should create a new workspace in the pupil cloud, which I tried. But it doesn't seem to ask me to connect to a specific phone or similar. I don't remember how I did it when I first made the pupil cloud account.

user-f6ea66 01 August, 2024, 08:16:36

So the question is: how do I reconnect a workspace to a specific phone app? Or just making an additional workspace would be fine as well.

user-f6ea66 01 August, 2024, 08:16:38

Thanks!

user-480f4c 01 August, 2024, 08:16:51

@user-f6ea66 can you please create a ticket in the πŸ›Ÿ troubleshooting channel?

user-f6ea66 01 August, 2024, 08:17:01

Yes

user-f6ea66 01 August, 2024, 08:17:56

Did it, you can erase the above if you want.

user-529f78 02 August, 2024, 12:40:26

Hi, I have the following question: Is it possible for Invisible to calculate the gaze (direction vector) in world coordinates? In the IMU data, there is no yaw available, since no magnetometer is built into the glasses. However, is it somehow possible to recalculate the gaze direction in world coordinates using gaze and IMU data? Maybe using the video? Thank you in advance!

user-07e923 02 August, 2024, 12:46:45

Hey @user-529f78, thanks for getting in touch πŸ™‚ Regarding yaw computation, it's not possible if you want to compute absolute yaw. You can try to compute relative yaw, but then this will be affected by drift over time. See https://discord.com/channels/285728493612957698/633564003846717444/1030029919473913886

As for your other question, the issue here is that gaze is always provided with reference to the wearer. If you don't have a known position/coordinate in the environment (like a marker or something), then it's difficult to calculate this without using some kind of sophisticated object-detection/face-detection algorithm. So when that object moves out of the scene camera, then you don't know where it is, how far it is from the scene camera, etc.

To map gaze onto faces, you could try using the face mapper, but it won't work if the face moves out of the scene camera. It also won't tell you how far the faces are from the scene camera.
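
To make the "relative yaw" idea concrete, here is a minimal sketch that integrates gyroscope readings from the imu.csv of a Pupil Cloud Timeseries export. The column names ("timestamp [ns]", "gyro z [deg/s]") and the choice of the z axis as the yaw axis are assumptions based on a typical export and should be checked against your own files; as noted above, the result is only relative to the starting pose and will drift over time.

```python
# Minimal sketch: relative yaw by integrating gyroscope data.
# Column names and the yaw-axis choice are assumptions -- verify them
# against your imu.csv and the IMU orientation documentation.
import numpy as np
import pandas as pd

imu = pd.read_csv("imu.csv").sort_values("timestamp [ns]")

t = imu["timestamp [ns]"].to_numpy() * 1e-9       # seconds
yaw_rate = imu["gyro z [deg/s]"].to_numpy()       # deg/s

dt = np.diff(t, prepend=t[0])                     # sample intervals in s
relative_yaw = np.cumsum(yaw_rate * dt)           # deg, relative to start; drifts

imu["relative yaw [deg]"] = relative_yaw
print(imu[["timestamp [ns]", "relative yaw [deg]"]].head())
```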

user-529f78 02 August, 2024, 12:45:32

Actually, what I want to find in the end is the distance (in degrees) to an object (e.g. a face), even when the object is outside of the field of view. Maybe there is another, "easier" way to find it. I would be happy if you could help me with it.

user-529f78 02 August, 2024, 13:02:35

@user-07e923 thanks for the fast reply. Yes, I see. When the object is out of the scene camera (and it is also moving), then of course it is difficult to calculate the distance. Yes, I am going to use the Face Mapper to calculate the distance when it is in the scene camera. My thought was that maybe it is possible to interpolate the distance to the face when it is outside of the field of view, assuming that the object is still there. What do you think, how large is the drift?

user-529f78 02 August, 2024, 13:07:10

I also have another question, because the orientation of the axes confuses me. In the IMU data, are pitch and roll the rotations around the y and x axes respectively, as in the standard definition, or around x and z, since z points downwards? And how are they calculated from the accelerometer and gyroscope data?

user-07e923 02 August, 2024, 13:21:36

Regarding IMU, I think this figure that shows how the IMU is oriented in Invisible should help clarify the data.
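
For reference only, the textbook way of estimating pitch and roll from the accelerometer alone uses the direction of gravity, as in the sketch below. The axis conventions here (x forward, y left, z up) are an assumption and may not match the Invisible IMU's actual coordinate system, so the orientation figure and documentation remain the authority; in practice the device fuses gyroscope and accelerometer data rather than using the accelerometer alone.

```python
# Generic textbook tilt estimation from a single accelerometer sample.
# Axis conventions are assumed (x forward, y left, z up) and may differ
# from the Invisible IMU -- purely an illustration of how pitch/roll
# relate to the gravity vector.
import numpy as np

def tilt_from_accel(ax, ay, az):
    """Return (pitch, roll) in degrees from one accelerometer sample."""
    pitch = np.degrees(np.arctan2(-ax, np.hypot(ay, az)))  # rotation about y
    roll = np.degrees(np.arctan2(ay, az))                  # rotation about x
    return pitch, roll

# Example: device lying flat, gravity along +z
print(tilt_from_accel(0.0, 0.0, 1.0))  # -> (0.0, 0.0)
```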

user-529f78 02 August, 2024, 13:10:15

Regarding the message: does Pupil Player work with Invisible data?

user-07e923 02 August, 2024, 13:19:45

Yes, Pupil Player works with Invisible data. Please download the "Pupil Player format" or the "Native Recording data" from Pupil Cloud.

user-f6ea66 02 August, 2024, 14:55:17

Hi, I downloaded Pupil Player. My question is if I can somehow add the Marker Mapper enrichment in Pupil Player, or if that is only possible in Pupil Cloud? Thx

user-480f4c 02 August, 2024, 15:01:41

Hi again @user-f6ea66! Yes, Pupil Player has a similar functionality, called Surface Tracker. To use it, simply download the recording from Cloud (in Pupil Player Format) or transfer it directly from the phone to your laptop/PC, load the recording in Pupil Player, and select the Surface Tracker from the Plugin Manager. This plugin detects the markers and allows you to define surfaces, similar to what you can do on Cloud with Marker Mapper.

user-f6ea66 02 August, 2024, 15:02:33

Ok thanks!

user-f03094 06 August, 2024, 08:12:48

Hi, I have the following question: I am currently analyzing Timeseries Data exported from Pupil Cloud, and I've noticed that there are instances in the gaze.csv file where the fixation id and blink id overlap. In such cases, which one should be considered correct?

Chat image

user-480f4c 06 August, 2024, 09:00:22

Hi @user-f03094! It is possible for fixations and blinks to overlap. During certain phases of the blink sequence, such as when the eyelid begins to close or starts to reopen after a blink, the pupil may still be visible. This can result in overlapping classifications of fixation and blink.

In general, I recommend excluding gaze points that are close to blink events. A common practice is to add buffers of 100-150 milliseconds around blinks (e.g., 100 milliseconds before the start of a blink and 100 milliseconds after the end of the detected blink).
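
If it helps, here is a minimal pandas sketch of that buffering approach. It assumes the standard Pupil Cloud Timeseries export (gaze.csv and blinks.csv) with the column names "timestamp [ns]", "start timestamp [ns]", and "end timestamp [ns]", and uses the 100 ms buffer suggested above; please verify the column names against your own export.

```python
# Minimal sketch: drop gaze samples within +/- 100 ms of any detected blink.
# Column names are assumed to match the Pupil Cloud Timeseries export.
import pandas as pd

BUFFER_NS = 100 * 1_000_000  # 100 ms in nanoseconds

gaze = pd.read_csv("gaze.csv")
blinks = pd.read_csv("blinks.csv")

keep = pd.Series(True, index=gaze.index)
for _, blink in blinks.iterrows():
    start = blink["start timestamp [ns]"] - BUFFER_NS
    end = blink["end timestamp [ns]"] + BUFFER_NS
    keep &= ~gaze["timestamp [ns]"].between(start, end)

gaze_clean = gaze[keep]
print(f"Removed {len(gaze) - len(gaze_clean)} gaze samples around blinks")
```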

user-f03094 06 August, 2024, 09:33:45

@user-480f4c Thanks for the fast reply. You've clarified my question, and I appreciate it. On a different note, I have also emailed @user-07e923 regarding the reason for the fluctuations in the timestamp [ns] in the gaze.csv file within the Timeseries Data, which range approximately between 4000000 and 8000000 ns. If you have time, I would appreciate any insights you can provide on this matter as well. Thank you!

Chat image
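
For anyone who wants to reproduce that observation, a quick sketch of how to inspect the spacing between consecutive gaze timestamps in gaze.csv (assuming the "timestamp [ns]" column of the Timeseries export) is shown below; values around 4-8 million ns correspond to 4-8 ms between samples.

```python
# Quick diagnostic of inter-sample intervals in the gaze timestamps.
import numpy as np
import pandas as pd

gaze = pd.read_csv("gaze.csv").sort_values("timestamp [ns]")
intervals_ns = np.diff(gaze["timestamp [ns]"].to_numpy())

print("min:", intervals_ns.min(), "ns")
print("median:", np.median(intervals_ns), "ns")
print("max:", intervals_ns.max(), "ns")
```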

user-07e923 06 August, 2024, 09:56:10

Hi @user-f03094, as described in the email, I'm still collecting information regarding this, and I can only give you a response next week.

user-f03094 06 August, 2024, 10:01:31

@user-07e923 Thank you for your response. I'll wait for your reply next week. I appreciate your help!

user-bef103 13 August, 2024, 11:42:36

Hey guys

user-bef103 13 August, 2024, 11:42:53

can the pupil glasses record and be used without a phone

user-bef103 13 August, 2024, 11:43:08

does the online platform have the capability of conducting recordings?

user-07e923 13 August, 2024, 11:50:37

Hey @user-bef103, thanks for reaching out πŸ™‚ Pupil Invisible must be connected to the Invisible Companion Phone and app for power and for recording, respectively.

It's not possible to record on Pupil Cloud, because Pupil Cloud is for storage and analysis.

user-bef103 13 August, 2024, 11:51:01

Yea i thought as well

user-bef103 13 August, 2024, 11:51:29

so it's not even possible to use the Companion app without a OnePlus phone?

user-07e923 13 August, 2024, 11:55:01

For Pupil Invisible, the compatible devices are OnePlus 6, 8, and 8T. We don't recommend using other devices because we've only done rigorous testing on these models.

user-bef103 13 August, 2024, 11:55:23

okay interesting, but in theory other android devices do work?

user-07e923 13 August, 2024, 11:57:37

Like I said, we don't recommend using any other devices.

A colleague of mine is telling me that the app cannot be used on other devices.

user-bef103 13 August, 2024, 11:57:56

Yea i get that

user-bef103 13 August, 2024, 11:58:14

Thanks

user-bef103 13 August, 2024, 12:05:15

@user-07e923 sorry if I am bothering you a lot. Does this also apply to the Neon glasses?

user-bef103 13 August, 2024, 12:05:38

regarding the recommended OnePlus phones

nmt 13 August, 2024, 12:12:38

Hi @user-bef103, if I can interject, both Invisible and Neon companion apps can only be downloaded from Google Play Store on the supported phones and Android versions. They cannot be used on other devices or Android versions

user-bef103 13 August, 2024, 12:14:44

So it's also just the recommended OnePlus phones?

user-07e923 13 August, 2024, 12:15:55

Here are the recommended devices for using Neon. Please note that if you have questions concerning Neon, please post them in the πŸ‘“ neon channel. Thanks.

user-bef103 13 August, 2024, 12:16:38

Will do in the future, thanks a lot guys

user-5553ff 20 August, 2024, 15:31:15

Hi all! When I go into the pupil invisible companion app to create a new β€œwearer”, I keep getting a message saying β€œsyncing wearers.” This message has been on the phone for nearly 24 hours. I have tried the following steps (all of which have been unsuccessful): Disconnecting from wifi, restarting the app, clearing the app cache, and restarting the phone. Any troubleshooting tips?

user-480f4c 20 August, 2024, 15:34:49

Hi @user-5553ff! We received your email and my colleague has already replied. πŸ™‚ But since you reached out here as well, would you mind trying to log out and log in again in the app?

user-472e90 21 August, 2024, 10:54:31

Hello! Can you tell me where I can write a technical question? I am interested in the following question: is it possible to calibrate the Pupil Invisible glasses?

user-d407c1 21 August, 2024, 11:05:03

Hi @user-680341 πŸ‘‹ ! This is the right channel for Pupil πŸ•Ά invisible technical questions.

Pupil Invisible uses a neural network-based gaze estimation system. That means it has been trained with a large cohort of eyes and conditions and can generate gaze data for each eye camera image without the need for calibration.

Thus, while direct calibration isn't possible, what you can do is apply an offset correction to adjust the gaze estimate.

This can be done prior to making the recording (have a look here) or, if you have already made the recordings, you can apply it post-hoc in Pupil Cloud.
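
If you also want to shift already-exported gaze data yourself outside of Cloud, a minimal sketch could look like the one below. The column names ("gaze x [px]", "gaze y [px]") are assumed from a typical Pupil Cloud Timeseries export and the offset values are placeholders you would determine yourself; the built-in offset correction in the Companion app or Pupil Cloud remains the supported route.

```python
# Minimal sketch: apply a constant pixel offset to exported gaze data.
# Column names are assumed from the Pupil Cloud export; offset values
# are hypothetical placeholders.
import pandas as pd

OFFSET_X_PX = 15.0   # hypothetical horizontal correction
OFFSET_Y_PX = -10.0  # hypothetical vertical correction

gaze = pd.read_csv("gaze.csv")
gaze["gaze x [px]"] = gaze["gaze x [px]"] + OFFSET_X_PX
gaze["gaze y [px]"] = gaze["gaze y [px]"] + OFFSET_Y_PX
gaze.to_csv("gaze_offset_corrected.csv", index=False)
```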

user-680341 21 August, 2024, 11:21:41

Ok, I see. I will try this. Thanks)

End of August archive