invisible



marc 01 December, 2020, 12:35:03

@user-98789c What type of processing are you looking for? There is a lot of open source software available e.g. around the Pupil Core project. I am not aware of an academic publication that used Pupil Invisible yet.

user-98789c 01 December, 2020, 16:43:22

To extract gaze direction and duration, pupil diameter (with the Core device), heatmaps, etc. while the participant is looking at a book or a monitor.

papr 01 December, 2020, 16:44:29

@user-98789c You can open recordings in Pupil Player. It will export the raw data as csv files and generate heatmaps if you set up surface tracking (marker-based AOI tracking).

user-98789c 01 December, 2020, 20:43:53

Thanks @papr, I am familiar with the surface tracking plugin. And if you know of scripts to extract gaze duration and also pupil diameter from the CSV files recorded by Core, can you give me the links to them?

papr 01 December, 2020, 16:44:45

You can download it here: https://github.com/pupil-labs/pupil/releases/latest#user-content-downloads

marc 01 December, 2020, 20:48:48

@user-98789c The pupil diameter is reported within the CSV files. The CSV files further contain which gaze samples are on the surface and which are not. You simply need to aggregate that data to calculate gaze duration.
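As a minimal sketch of that aggregation, assuming a Pupil Player surface export named gaze_positions_on_surface_<name>.csv with on_surf and gaze_timestamp columns (your file and column names may differ depending on the export version):

```python
import pandas as pd

# Load the surface-mapped gaze export from Pupil Player
# (file and column names are assumptions; check your own export folder).
gaze = pd.read_csv("exports/000/surfaces/gaze_positions_on_surface_Surface1.csv")

# Keep only the samples that actually landed on the surface.
on_surface = gaze[gaze["on_surf"] == True]

# Approximate gaze duration by summing the time deltas between consecutive
# on-surface samples, ignoring gaps longer than 0.1 s (gaze left the surface).
deltas = on_surface["gaze_timestamp"].diff()
duration_s = deltas[deltas < 0.1].sum()
print(f"Total gaze duration on surface: {duration_s:.2f} s")
```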

user-16e6e3 01 December, 2020, 20:51:14

Hi! Does the output of Pupil Invisible also include head movement data?

user-17da3a 02 December, 2020, 16:59:43

Hey guys, I am new to Pupil Invisible and I am trying to get familiar with the functions of Pupil Player. I downloaded a sample recording from the web and dropped the recording directory onto the Pupil Player window. Once I do so, I see a message saying “uploading may take a while!” just for a second, and then the Pupil Player window suddenly closes automatically. Could that be a problem with the installation? Or maybe I should not have dropped the whole sample folder into the Player window?

marc 02 December, 2020, 17:24:00

@user-16e6e3 The output of Pupil Invisible contains data from an IMU. This would give you relative head pose. For absolute head pose we have a marker-based algorithm available in Pupil Player.

papr 02 December, 2020, 17:24:23

@user-17da3a hey, welcome to the channel. You are supposed to drop the whole folder, correct. It is normal that this window pops up and closes again, but afterwards you should see the main Player window. Could you specify where you downloaded the example recording from?

user-17da3a 02 December, 2020, 17:34:03

Thanks for your reply. I took the recording from the following link: https://docs.pupil-labs.com/core/software/pupil-player/#load-a-recording No, unfortunately both windows disappear, and I have to start by opening the Player again.

user-e6124a 02 December, 2020, 23:11:41

can i export my recording videos and data streams from my device without uploading them to the cloud?

nmt 03 December, 2020, 08:30:01

@user-e6124a Recorded data can be exported from the companion device to another storage medium via USB. For example, you can transfer your recordings to a PC and analyze them in Pupil Player.

user-e6124a 03 December, 2020, 15:04:12

thanks, i was able to find instructions for how to enable MTP on the OnePlus and download my recordings. it might be nice to document that somewhere on your site, as it's not exactly simple.

nmt 04 December, 2020, 15:17:31

Thanks for the feedback @user-e6124a

user-e6124a 04 December, 2020, 00:49:12

@nmt does the Pupil Player application work on macOS Big Sur? it doesn’t show any UI when I launch it, or if I drag a recording folder onto it.

papr 04 December, 2020, 08:27:54

@user-e6124a hi, no, at the moment we cannot support macOS Big Sur yet. We are waiting for an OpenGL-related CPython bug to be fixed and released. Once that has happened, we will be able to make a new macOS release for Pupil Core software.

user-e6124a 04 December, 2020, 16:06:55

what's the bug? did you file it at bugreport.apple.com?

papr 04 December, 2020, 16:28:04

@user-e6124a https://bugs.python.org/issue41100

papr 04 December, 2020, 16:29:14

@user-e6124a I think the actual bug has been fixed on CPython master but it has not been released yet.

user-e6124a 04 December, 2020, 17:11:59

ahhhh gotcha. if i built pupil player from source would it address this issue?

papr 04 December, 2020, 17:33:38

@user-e6124a No, unless you also install CPython from master. Not sure how easy that is to do. Check out pyenv. It is a great tool for installing specific Python versions.

user-98789c 04 December, 2020, 20:20:09

Hi everyone, I need to know what the settings are for the Time Sync plugin.

To start recording with Invisible, I get the error: WORLD: Pupil Mobile Stream is not in sync. Enable Time Sync plugin.

I have enabled it, but it says: group default time not found.

papr 04 December, 2020, 21:10:09

@user-98789c Hi 👋 Pupil Capture does not support recording the streamed Pupil Invisible video. The video streaming is only meant for monitoring purposes.

user-895483 05 December, 2020, 10:15:48

Hi everyone, thanks for the opportunity to join this platform. Please, how can I normalize pupil data?

user-895483 05 December, 2020, 11:05:12

it's a homework assignment

papr 05 December, 2020, 11:06:01

@user-895483 Welcome to the channel! Could you let us know what type of data (Pupil Core or Pupil Invisible) you use and what your goal regarding the normalization is?

user-895483 05 December, 2020, 11:36:50

The data is a CSV file with time in seconds and pupil diameter in pixels. The pupil data has already been preprocessed: a low-pass filter with a 3 Hz cutoff frequency was applied, and the signal was interpolated to fill in missing data due to blinks and to obtain a constant sampling rate of 40 Hz (this is why the pixel values are not integers). The first 0-1 s of the recording is taken as the calm state of the pupil.

user-895483 05 December, 2020, 13:49:58

the csv file

puppy_analysis_data.csv

user-895483 05 December, 2020, 13:55:38

The questions are: Create a set of scripts or tables, and using this data answer the following questions:

- Normalize the data, create a new vector of normalized sizes
- Create a plot/plots to visualize the normalized and non-normalized data - a scatter plot (time vs size) will work best.
- Use the normalized data to calculate and report the following from 5 second intervals after the baseline period (i.e. between 1-6 s, 6-11 s, 11-16 s...):
  - Average pupil size
  - Standard deviation of the pupil size
- Visualize the results with e.g. a bar chart

user-895483 06 December, 2020, 10:30:49

any possible solution to my homework

marc 06 December, 2020, 11:41:03

@user-895483 Since your homework is considering pupil size, it must be using Pupil Core Data rather than Pupil Invisible data, so we should move this discussion into the core channel. Do you have any specific questions regarding Pupil Core? There is a solution for your homework, but I don't think you're supposed to just get it from us 🙂

user-895483 06 December, 2020, 12:04:54

Thank you for your response. I would appreciate it if you could guide me on how to solve it, in case you would prefer not to give me the solution.

user-895483 06 December, 2020, 12:05:54

I am actually new to it

marc 06 December, 2020, 12:15:01

@user-895483 I'd be happy to answer any questions regarding our products, but although the data for your homework might have been recorded with a Pupil Core device, the task is just about plotting it and calculating statistics. This is not something we can guide you through in detail without a support contract. However, if you were to do this in Python, I can point you to a couple of libraries you should check out: Pandas for reading the CSV file and calculating statistics (check out pandas.read_csv and things like DataFrame.mean and DataFrame.std), and Matplotlib for plotting (matplotlib.pyplot.scatter and matplotlib.pyplot.bar).
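As a minimal sketch of how those calls fit together, assuming the CSV has columns named time and diameter (the actual column names in puppy_analysis_data.csv may differ):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Column names are assumptions; adjust them to the actual CSV header.
df = pd.read_csv("puppy_analysis_data.csv")  # columns: time [s], diameter [px]

# Scatter plot of the raw (non-normalized) data
plt.scatter(df["time"], df["diameter"], s=4)
plt.xlabel("Time [s]")
plt.ylabel("Pupil diameter [px]")
plt.show()

# Mean and standard deviation in 5-second intervals after the 1 s baseline
df["interval"] = ((df["time"] - 1.0) // 5).astype(int)
stats = df[df["time"] >= 1.0].groupby("interval")["diameter"].agg(["mean", "std"])
print(stats)

# Bar chart of the per-interval means with standard deviation as error bars
plt.bar(stats.index, stats["mean"], yerr=stats["std"])
plt.xlabel("5 s interval after baseline")
plt.ylabel("Mean pupil diameter [px]")
plt.show()
```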

user-895483 06 December, 2020, 12:23:19

I am grateful, but my main challenge is how to normalize the data in Python. Can you suggest how I can do that?

marc 06 December, 2020, 12:25:12

@user-895483 Data can be normalized in different ways. If it is just about subtracting the mean and dividing by the standard deviation, you can easily do that using the libraries and methods mentioned above.
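For illustration, two common approaches with Pandas, using the same hypothetical time/diameter column names as above (which normalization your homework actually asks for has to come from the task description):

```python
import pandas as pd

df = pd.read_csv("puppy_analysis_data.csv")  # assumed columns: time, diameter

# Option 1: z-score normalization over the whole recording
df["diameter_z"] = (df["diameter"] - df["diameter"].mean()) / df["diameter"].std()

# Option 2: normalize relative to the 0-1 s baseline ("calm state")
baseline = df.loc[df["time"] < 1.0, "diameter"].mean()
df["diameter_baseline"] = df["diameter"] - baseline  # subtractive baseline correction
df["diameter_relative"] = df["diameter"] / baseline  # divisive baseline correction
```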

papr 06 December, 2020, 12:25:38

This is the webpage of the mentioned pandas module: https://pandas.pydata.org/

user-17da3a 08 December, 2020, 11:51:28

Hi, how can I get the AprilTags in a bigger size? They are small when I open them. Is there any code to open them in a bigger size?

marc 08 December, 2020, 11:54:12

Hi @user-17da3a! Please check out https://docs.pupil-labs.com/core/software/pupil-capture/#markers There you can find pre-made high-res versions in a PDF as well as links to PNG files you could scale up to any size using nearest neighbor interpolation. Note: within Pupil Cloud we only support the tag36h11 family right now.
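If you want to scale the PNG markers yourself, here is a minimal sketch using OpenCV (the file name is a placeholder; nearest-neighbor interpolation keeps the tag edges crisp, which matters for detection):

```python
import cv2

# Load a downloaded tag36h11 marker PNG (placeholder file name)
marker = cv2.imread("tag36_11_00000.png", cv2.IMREAD_GRAYSCALE)

# Upscale by an integer factor with nearest-neighbor interpolation
# so the black/white squares stay perfectly sharp.
factor = 20
big = cv2.resize(marker, None, fx=factor, fy=factor, interpolation=cv2.INTER_NEAREST)

cv2.imwrite("tag36_11_00000_big.png", big)
```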

user-17da3a 08 December, 2020, 16:23:29

Thanks. We tested Pupil Invisible today and tried to work with the recording data in Pupil Player. These are the questions we’d like to clarify:
1. After only a few minutes of use, the Pupil Invisible eye tracker starts getting quite hot. Is that normal? May it hurt the person using it? What is the maximum recording time given the possible overheating?
2. When transferring the recording data to a PC via USB, we could not read the recording folder into Pupil Player (it simply would not let us). Does reading recordings into the Player only work with recording folders downloaded from Pupil Cloud?
3. Regarding the timing: Why do the timestamps in the CSV file exceed the actual recording time according to the video? They also seem to start later. Example: The video goes from 0:00 s to 26 s; the timestamps in the CSV files go from 1.6 s to 27.9 s.
4. Regarding any recording without surface tracking: Is the center of the coordinate system of the scene video calculated once at the beginning of the recording, or is it calculated for each world camera frame (so that gaze data is always relative to the center of that specific world camera frame)?
5. Is the center of the world camera always 0.5?
6. How are the coordinates defined for each surface? Are x and y gaze position data relative to the center (i.e. origin) of the surface?
7. When you define the width and size of a surface, what units should it be in (cm, m etc.), or does it matter at all?

user-ecbbea 08 December, 2020, 17:29:52

Hey everyone, I have a bit of a simple question (I hope). When recording data on the Pupil Labs Invisible, is there a way to access the individual XY pupil position streams when recording on the included cell phone? Or do I only have access to gaze?

Bonus question: can I connect the Invisible to my laptop and record using the PC software instead and have access to the XY pupil streams?

marc 09 December, 2020, 10:22:47

@user-ecbbea Pupil Invisible does not actually do explicit pupil detection, and thus no information on pupil positions is available; you only get gaze data. In theory you could connect Pupil Invisible to a laptop and record using Pupil Capture, although this is not well supported. However, you would run into the following problem: the camera angles of Pupil Invisible are so far off to the side that the pupil is barely visible, or not visible at all, in many of the images. A classic gaze estimation pipeline based on pupil detection, like the one in Pupil Capture, would thus fail because the pupil detection results are insufficient.

marc 09 December, 2020, 11:00:09

@user-17da3a First of all let me note that we have recently released a version of the surface tracker for Pupil Cloud, which makes it much easier to use especially for Pupil Invisible recordings. In cloud it is called the "Marker Mapper" and you can find some info on our blog here: https://pupil-labs.com/blog/pupil-cloud-projects-enrichments/

Now on to your questions:

  1. The glasses do get warm, but nothing hot should ever touch the user's skin and it should never be uncomfortable. The temperature reached after a few minutes is the maximum temperature and there is no chance of overheating. The maximum recording time is limited by battery life, which is about 100 minutes for the OnePlus 6 phone.

  2. This should work fine with recordings pulled off the phone via USB as well. If Pupil Player can't open a folder it should output an error message indicating the reason. Could you tell me that message? Also, please double-check that your Pupil Player software is up to date.

  3. This is a bit complicated and a bit of a limitation of how Pupil Player supports recordings from Pupil Invisible. When Pupil Invisible starts a recording, it starts all the involved sensors and it starts counting time. The sensors need a little bit of time to initialize though, so they do not start recording right away at timestamp 0. This is why the timestamps of the gaze signal in the CSV (which are based on the timestamps of the eye cameras) start at 1.6 s. The timeline that is visualized in Pupil Player, however, is the scene video time, so time=0 in the timeline is the time when the first scene video frame was recorded, which is a bit later than the recording start due to the initialization. So to conclude: the timeline in Player does not show the actual "recording time" that is used for ALL sensors in the CSVs, but rather shows time aligned with the scene video. In Pupil Cloud this problem is not present.
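If you want to relate the two timebases yourself, one option is to subtract the timestamp of the first scene video frame from the gaze timestamps; a minimal sketch assuming Pupil Player exports named world_timestamps.csv and gaze_positions.csv (file and column names may differ between software versions):

```python
import pandas as pd

# File and column names are assumptions based on a typical Pupil Player export;
# check your own export folder for the exact names.
world_ts = pd.read_csv("exports/000/world_timestamps.csv")
gaze = pd.read_csv("exports/000/gaze_positions.csv")

# The Player timeline starts at the first scene video frame,
# so shift the gaze timestamps by that offset.
first_world_ts = world_ts.iloc[:, 0].min()
gaze["time_in_video"] = gaze["gaze_timestamp"] - first_world_ts
print(gaze[["gaze_timestamp", "time_in_video"]].head())
```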

user-17da3a 09 December, 2020, 19:55:37

Thanks for your comprehensive reply!

marc 09 December, 2020, 11:00:20
  4. I am not 100% sure I understand the question, so I'll just try to clarify the coordinate systems of gaze with and without surface tracking: Without surface tracking, the gaze signal refers to the scene camera image, i.e. a gaze point at (0.5, 0.5) corresponds to the subject looking at what is in the exact center of the scene image at the corresponding time. If you are using surface tracking, you get the same gaze signal as before, but you additionally get a mapped gaze signal for every surface. This signal is gaze in surface coordinates, so (0.5, 0.5) would correspond to the subject looking at the center of your surface, independent of where this surface is located in the scene image. Let me know if this did not answer your question.

  5. Yes!

  6. In surface coordinates (0,0) corresponds to the bottom left corner of the surface and (1,1) to the top right. For regular gaze this is similar but in relation to the scene image: (0,0) means the subject looks at the bottom left corner of the scene image and (1,1) is the top right corner of the scene image.

  7. On top of the normalized gaze signal in surface coordinates, you also get a "scaled" gaze signal in surface coordinates. This signal is scaled according to the units and surface size you specify, so you can use the unit of your choice here. Further, the size is used to determine the aspect ratio of the generated heatmap (see the sketch below).

I hope this helps, let me know if you have further questions!
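For illustration, the relation between normalized and scaled surface coordinates is just a multiplication by the surface size you entered; a minimal sketch with placeholder surface name, size, and column names, assuming a Pupil Player surface export:

```python
import pandas as pd

# Surface size as entered in the surface tracker, in the unit of your choice (here cm)
SURFACE_WIDTH_CM = 21.0
SURFACE_HEIGHT_CM = 29.7

df = pd.read_csv("exports/000/surfaces/gaze_positions_on_surface_Surface1.csv")

# (0, 0) is the bottom-left corner of the surface, (1, 1) the top-right;
# scaled coordinates are simply the normalized ones times the surface size.
df["x_cm"] = df["x_norm"] * SURFACE_WIDTH_CM
df["y_cm"] = df["y_norm"] * SURFACE_HEIGHT_CM
```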

user-873514 10 December, 2020, 20:36:46

Hi all! Do any of you know if Pupil Invisible tracks head movements as well?

user-16e6e3 11 December, 2020, 09:16:43

Hi @user-873514 ! I'm currently trying this feature out as well. It works, but you need to use april tags. Scroll down to head tracking here: https://docs.pupil-labs.com/core/software/pupil-player/#analysis-plugins

marc 11 December, 2020, 09:20:29

@user-873514 The plugin linked above is the right tool for absolute head-pose tracking. It runs in the Pupil Player Software ☝️ Thanks for linking that @user-16e6e3! For relative headpose tracking the Pupil Invisible Glasses also contain an IMU that can be used.

user-16e6e3 11 December, 2020, 09:23:25

@marc I have a few more questions to that: Is the IMU used by default with each recording? How can relative headpose data be accessed after the recording?

user-873514 11 December, 2020, 09:51:29

Wonderful, thanks for the help all!

marc 11 December, 2020, 11:15:27

@user-16e6e3 Yes, the IMU data is recorded automatically with every recording. It is not super easy to access it yet. It is saved in the extimu_2b948f09 ps1.raw files in the recording folder. You could read this binary data directly based on the recording format available here: https://docs.google.com/spreadsheets/d/1e1Xc1FoQiyf_ZHkSUnVdkVjdIanOdzP0dgJdJgt0QZg/

We are currently in the process of adding the IMU data to what can be exported to CSV from Pupil Player, which is easy. This is still WIP and might be unstable here and there, but you could try out the current implementation by adding the following plugin to Pupil Player: https://gist.github.com/N-M-T/26da0586e75af180274d29f937250fec

To add a custom plugin you can simply copy the file into your pupil_player_settings/plugins folder.

user-16e6e3 11 December, 2020, 11:18:09

Thanks @marc! Will give it a try.

user-16e6e3 11 December, 2020, 13:07:19

I have one more question for today 🙂 With the new Project feature on Pupil Cloud, I was able to assign events and timestamps to a recording. But I couldn't find a way to export this recording with the included timestamps? What I'd like to get is a .csv output with gaze data for each event. In my experiment, participants will be presented with multiple images sequentially and I would set a timestamp at the onset of each image to get gaze data for each image.

user-a725c1 14 December, 2020, 09:16:33

@marc Hi. We just finished with an experiment of more than 50 participants x 1.5 hours. Unfortunately, we just figured out that each recording with the invisible tracked the data with approx. 40-45 Hz. It should be actually 200 Hz. Any idea?

marc 14 December, 2020, 09:19:22

@user-a725c1 Hi! On the phone itself the gaze data can only be recorded at 40-45 Hz due to the limited resources on the phone, but the eye cameras record at 200 Hz. If you upload the data to Pupil Cloud, the gaze data will automatically be "densified" to 200 Hz and be ready for download.
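A quick way to sanity-check the effective sampling rate of an exported gaze CSV is to look at the median timestamp difference; a minimal sketch assuming a gaze_timestamp column in seconds (adjust the file and column names to your export):

```python
import pandas as pd

gaze = pd.read_csv("exports/000/gaze_positions.csv")  # assumed file/column names
dt = gaze["gaze_timestamp"].diff().dropna()
print(f"Effective sampling rate: {1.0 / dt.median():.1f} Hz")
```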

user-a725c1 14 December, 2020, 09:22:13

@marc Thanks for the information. I have some follow-up questions. First, due to GDPR the cloud solution is not allowed at our university and probably, at least officially, not anywhere without an explicit statement from the participants. Because of this, is there any other way to get the data? For instance a script or desktop solution?

user-a725c1 14 December, 2020, 09:23:30

I am also not sure if I understood you correctly. Is the data upscaled to 200 Hz?

user-a725c1 14 December, 2020, 09:24:25

The recordings are no longer on the phone (they are stored locally on an HD). Is there still a way to get the 200 Hz data? Thanks for your responses.

marc 14 December, 2020, 09:25:13

@user-a725c1 No, unfortunately there is no other way at the moment. No, the data is not upscaled. On the phone the data is computed in real-time, which requires skipping some of the frames coming from the eye cameras, which record at 200 Hz. In the cloud the signal is recomputed without skipping anything.

marc 14 December, 2020, 09:25:58

Technically you could still upload them to cloud by copying them back onto the phone (I can give you more detailed instructions if needed) if you could make that work legally.

user-a725c1 14 December, 2020, 09:32:38

Thank you for all the help. May I ask whom I can contact for this issue? I cannot see anywhere in the available docs or on the site that cloud usage is mandatory in order to get the actual 200 Hz. A script should really do the job, so I do not understand the push for the cloud. This problem is strongly affecting our research, and the legal part behind uploading user data to a cloud is by far not trivial for a university. This would make the Invisible useless for us, but most importantly, the research data collected over the previous months will be useless for all my PhDs.

marc 14 December, 2020, 11:13:19

@user-757e4a I am sorry to hear that. I can see how that is frustrating. Since our gaze estimation pipeline requires very specific hardware, we do not distribute it as a desktop script, where we would need to be able to support many different platforms. For the same reason we only support two smartphone models as Companion devices. Thank you regarding the feedback on the website. I have opened a ticket internally to make this more clear in the future. You can contact [email removed] to discuss this further. I hope we can find a solution that works for you!

user-7a8fe7 15 December, 2020, 07:50:14

Hello all, I have a question. Is it possible (if so: how?) to cut the measurement data? E.g. I have 3 min of measurement and want to separate it into the different conditions and exclude the "intro" (the first seconds after pressing the start button). Thanks in advance!

marc 15 December, 2020, 12:08:55

@user-7a8fe7 All the data generated by Pupil Invisible is timeseries data, so you can in principle cut it wherever you like. You can convert a Pupil Invisible recording to CSV files using Pupil Player, which makes that easier. If you are using enrichments in Pupil Cloud you could also define the sections you would like to analyze.
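As an illustration of cutting the exported timeseries, a minimal sketch that keeps only one condition's time window from a gaze CSV (file and column names are assumptions; use the timestamps of your own condition boundaries):

```python
import pandas as pd

gaze = pd.read_csv("exports/000/gaze_positions.csv")  # assumed export file

# Example: drop the first 10 s ("intro") and keep a 60 s condition window.
t0 = gaze["gaze_timestamp"].min()
condition = gaze[
    (gaze["gaze_timestamp"] >= t0 + 10.0) & (gaze["gaze_timestamp"] < t0 + 70.0)
]
condition.to_csv("condition_1_gaze.csv", index=False)
```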

user-7a8fe7 16 December, 2020, 08:28:05

Thanks @marc for your answer. Yes, I would like to cut it in Pupil Player. So do I understand correctly that it is not possible to cut it in Pupil Player and save a part of a measurement as a separate measurement?

user-16e6e3 15 December, 2020, 16:19:33

Hi everyone! I'll try to post my question again, it seems like it was overlooked the last time. Using the Project Editor on Pupil Cloud I defined different events in my recording. Is it possible to export this data with the new timestamps as .csv for further analysis? What I need is a .csv with gaze and head positions for each event. I haven't found a way to export it so far, but it would help my project immensely.

marc 15 December, 2020, 21:39:53

@user-16e6e3 Sorry for missing your question before! Currently events can only be used to define sections for enrichment and there is no dedicated download for them. We are thinking about how to add something like that in the future. However, as a workaround you could create a Marker Mapper enrichment using your onset event as well as another artificial second event after the onset. If you export the enrichment data this also contains the timestamps of the section start, which would be your onset event.

marc 16 December, 2020, 09:15:41

@user-7a8fe7 In Pupil Player you can use the trim marks of the timeline to define what time range you would like to export. You can make multiple exports for different sections of interest.

user-7a8fe7 16 December, 2020, 09:28:29

thank you very much @marc I will try it

user-7a8fe7 17 December, 2020, 09:09:02

Hey @marc, sorry, I have just one more question. Is it also possible to set markers in the Player which will be exported, too? E.g. in my measurement there is an object I would like to mark which was not focused on, so that I have the coordinates of both the point of view and the object?

marc 17 December, 2020, 11:25:55

@user-7a8fe7 You could use the marker mapper enrichment for that. To use that you would need to place markers on the surface of interest and the enrichment would measure for you at what times and where gaze has been on the surface.

user-16e6e3 17 December, 2020, 11:29:03

Thanks @marc for your answer! No worries, messages just get lost in chats sometimes. It would be great if there will be a dedicated download for events soon. Please let us know once this becomes possible, it would be a great help.

marc 17 December, 2020, 11:32:20

@user-16e6e3 Thanks for the feedback! It is definitely on the list and I'll make sure to give it sufficient priority. We'll announce all updates on Discord!

user-17da3a 18 December, 2020, 10:42:34

Hello guys, I have some questions regarding the head-pose plugin:
- What does absolute head pose tracking mean? What's the difference to relative head pose tracking?
- What is the frequency of head position tracking? It does not seem constant in our recording and it is very low (head position data for 3-10 timestamps per second).
- Just to double check: For head pose data, rotation_x means the horizontal left-right movement, with positive values for rightward movement and negative values for leftward movement?

One more question about gaze data:
- After downloading recording data from Pupil Cloud, reading it into Player and exporting it, our gaze data is still at about 40-45 Hz. How can we get data at 200 Hz?

Thank you so much in advance,

user-0e7e72 23 December, 2020, 17:57:53

Hello everyone, what would be the easiest way to stream the pupil invisible video to my laptop in order to analyze it with opencv in realtime without passing through the cloud? Thanks a lot!

nmt 24 December, 2020, 11:47:32

Hi venoom. Have a look at Invisible Monitor. With this, you can stream Invisible scene video and gaze data to your laptop over the local network: https://github.com/pupil-labs/pupil-invisible-monitor. Also check out the network API: https://docs.pupil-labs.com/developer/invisible/#network-api

user-acc6b6 25 December, 2020, 23:35:45

Hello, I am a Japanese university student. I'm in trouble because sometimes I get an error when using Pupil Invisible. Sometimes I can't get gaze data. Also, the app may be forcibly terminated during recording. Is this due to the glasses or the Android phone? And how many minutes can this product last in a single experiment? I'm sorry, my English is not good. Thank you for your answer in advance.

marc 27 December, 2020, 11:59:01

Hey @user-17da3a! Apologies for the late reply. During Christmas holidays our office is a little thinned out. Let me try to answer your questions:

  • With absolute head pose tracking you get the head pose in relation to the environment. For example, your environment may be a room and the head pose could tell you which exact wall the head was facing at a given time. With relative head pose you get the movement of the head pose in relation to the previous frame, so you would know it is e.g. rotated 4 degrees to the left and translated forward, but unless you know where the head was at time zero in relation to the room you do not know where it is at any time. Relative head-pose measurements usually have a drift error because the measurements are not perfect.

  • The relative tracking with the IMU is running at 200 Hz; the marker-based absolute head tracking uses the scene camera, which records at 30 Hz. The scene camera's framerate may be slightly affected by dark lighting conditions that drive up the exposure time, but the framerate should still always be above 20 Hz.

  • The rotation_* values form a Rodrigues vector, which encodes the rotational portion of the camera pose. The coordinate system is aligned with the selected origin marker. The origin is the marker's bottom left corner, the x-axis points to the right, the y-axis points upwards, and the z-axis points upwards from the marker (instead of pointing "through" it).
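To turn those rotation_* values into something easier to interpret, the Rodrigues vector can be converted to a rotation matrix with OpenCV; a minimal sketch assuming a head_pose_tracker_poses.csv export with rotation_x/y/z columns (the exact file and column names may differ in your Pupil Player version):

```python
import cv2
import numpy as np
import pandas as pd

# Assumed export file/column names from the Pupil Player head pose tracker;
# check your export folder for the exact names.
poses = pd.read_csv("exports/000/head_pose_tracker_poses.csv")

row = poses.iloc[0]
rvec = np.array([row["rotation_x"], row["rotation_y"], row["rotation_z"]])

# cv2.Rodrigues converts the rotation vector into a 3x3 rotation matrix.
R, _ = cv2.Rodrigues(rvec)
print(R)
```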

marc 27 December, 2020, 11:59:02
  • Recordings made on the Companion phone initially only have gaze data at ~45 Hz, because this is the maximum that can be computed in real-time on the phone. The 200 Hz gaze data is added post-hoc in Pupil Cloud, so recordings you download from Cloud contain the 200 Hz data. However, Pupil Player does not yet support loading this 200 Hz data and will always load only the 45 Hz data. We are adding support for this in one of the next two releases that are planned for early and late January respectively. If you need this data sooner let me know. There is a workaround I could walk you through to have Pupil Player load the 200 Hz data already.

marc 27 December, 2020, 12:05:49

Hi @user-acc6b6! Could you please give me a little more information?

  • When you can not get gaze data, are the eye cameras connected? On the home screen of the app the circle with the eye icon should be green and not gray.
  • Do you get any error message?
  • Does the app just terminate itself?
  • Are you using the most recent version 1.0.3-prod of the app?

The maximum recording time using the OnePlus 6 phone is ~100 min if the phone is fully charged.

user-acc6b6 28 December, 2020, 06:31:09

@marc Thank you for your reply!

- Yes. The eye cameras are connected. However, sometimes the eye icon changes from green to gray during recording.
- I got error messages: "java.lang.IllegalStateException: Could not open camera, no…" and "shout Attached: sensor is not available: UVCMicSensor (video…" (I'm sorry, I couldn't see the sentence after "…")
- I got this message when the app terminated: "Invisible Companion has stopped. ・Restart the app. ・Send feedback."
- Yes. The version of the app is 1.0.3-prod.

I feel that errors are likely to occur when the Android charge is low.

user-98789c 28 December, 2020, 13:43:50

When I add a surface and record using Pupil Invisible, and then transfer the recorded data from the Cloud to Pupil Player, there is no indication of the surface that I have defined in the Surface Tracker plugin. I can see the markers and the surface in the video, but there is no folder for the defined surface in the data folder. Should I take extra steps to export the surface information?

marc 28 December, 2020, 19:07:56

@user-98789c The marker mapper enrichment in Pupil Cloud and the surface tracker plugin in Pupil Player do similar things, but they are not compatible. If you have already defined a marker mapper enrichment and it has been computed, you can already download gaze mapped to the surface and a corresponding heatmap from within Pupil Cloud.

user-94f03a 17 February, 2021, 06:04:16

Hi Marc, we have a follow up question to this.

  1. Is there a way to upload a clear image (e.g. a JPG) to the cloud to introduce the surface, (a) to be used in the processing and (b) to be used for the heatmaps etc.?

-> it would be great if we could batch-upload all the stimuli for a given project, and then let the cloud do the post-processing?

  2. Once we define an enrichment in Pupil Cloud, how do we apply it to all recordings? Does it automatically apply to all recordings in the folder?

Cheers, Panos

user-dad9ee 28 December, 2020, 23:53:25

Hi guys! I need to get more info about Pupil products. I represent the HellRaisers esports team and we want to purchase an eye tracker. Who can I contact?

marc 29 December, 2020, 15:01:58

@user-acc6b6 Thanks for the details. The disconnecting eye cameras are probably a hardware issue and might require repairs. Are you using the USB-C cable that was included with the glasses? Have you ever been streaming the data from the phone to another computer in real-time?

marc 29 December, 2020, 15:06:28

@user-dad9ee Please reach out to [email removed]. Due to the ongoing vacation of the team, a video meeting will probably not be possible before the 4th of January.

user-acc6b6 31 December, 2020, 09:27:06

@marc I'm using the USB-C cable that was included with the glasses. All I do on the phone is recording and uploading to Pupil Cloud. Does "streaming the data" mean uploading to Pupil Cloud? Also, does "hardware" mean the glasses or the Android phone?

End of December archive