👁 core


user-41f1bf 01 January, 2018, 18:15:25

@vish IR light may burn your retina. However, for people with normal vision, chronic exposure to a single IR LED can't do any harm under normal light conditions (let's say, > 200 lumens/m²).

user-41f1bf 01 January, 2018, 18:31:28

A single conventional IR LED will emit 10-25 mW/sr at 100 mA. The safe zone is below 10 W/sr. (I am writing from my phone; I would recommend confirming those values.)

user-8779ef 04 January, 2018, 13:06:39

Hey guys, is it me, or is Pupil Player searching for pupil calibration markers even when I'm in natural feature mode?

papr 04 January, 2018, 13:07:24

@user-8779ef Welcome to the channel! Thank you for your valuable feedback on Github!

user-8779ef 04 January, 2018, 13:07:25

Notice the blue circle indicating the incorrect identification of a circular calibration marker

Chat image

user-8779ef 04 January, 2018, 13:07:38

papr - happy to contribute! Thanks for the hard work and responsiveness.

user-8779ef 04 January, 2018, 13:07:58

So, check that image. Note the one quirk - it's stuck at 99% mapping.

papr 04 January, 2018, 13:08:29

You are right that the circular calibration marker detection is run even if the section is set to natural features. We run it once and cache all found markers.

user-8779ef 04 January, 2018, 13:08:39

Oh, no wonder I have no laptop battery left 😛

papr 04 January, 2018, 13:09:18

You should wait until it is finished and close player. After the next launch the markers will be loaded from the cache file instead of re-running the detection.

papr 04 January, 2018, 13:09:58

Could you try recalibrating? Does it keep being stuck at 99% every time?

user-8779ef 04 January, 2018, 13:10:09

I'll try now. I believe it is stuck every time, yes.

user-8779ef 04 January, 2018, 13:10:52

...but, let's see.

papr 04 January, 2018, 13:11:55

Would it be possible that you share this recording (or a similar one that shows the same issues) privately with me? This would help me a lot with reproducing your issues. 🙂

user-8779ef 04 January, 2018, 13:14:00

Yes, I am stuck at 99%.

user-8779ef 04 January, 2018, 13:14:18

Papr - I have to check. It's a many-gig file and, more importantly, may be under NDA.

user-8779ef 04 January, 2018, 13:14:28

Non disclosure agreement.

papr 04 January, 2018, 13:15:06

Yes, this is often the case for experiment recordings. Try to make a small test recording and see if it shows the same issues.

user-8779ef 04 January, 2018, 13:15:12

OK.

user-8779ef 04 January, 2018, 13:15:48

I'll also talk to Jeff - he's got more experience with your player on a similar machine (if that's the issue).

papr 04 January, 2018, 13:16:51

This might be an edge case that only appears on large recordings.

user-8779ef 04 January, 2018, 13:17:10

FYI, our preferred pipeline is this: during capture, use 2D methods and save out eye images. We are sure to include a calibration sequence with natural features (e.g. a custom grid). Later, in Player, we switch to 3D modes with natural feature detection.

user-8779ef 04 January, 2018, 13:17:26

This seems to provide the most flexibility and quality.

papr 04 January, 2018, 13:17:52

Yes, I agree that this is also the most future-proof way to record.

user-8779ef 04 January, 2018, 13:18:15

For large recordings, the search for calibration markers eats up a ton of time and CPU. I think I'll open a request on GitHub to remove this from the default behavior.

user-8779ef 04 January, 2018, 13:18:46

What's the term you guys use to refer to your circular markers?

papr 04 January, 2018, 13:18:49

Or better: A way to cancel it

user-8779ef 04 January, 2018, 13:18:52

yes.

papr 04 January, 2018, 13:19:00

calibration markers is fine

user-8779ef 04 January, 2018, 13:19:04

OK, thanks.

papr 04 January, 2018, 13:19:22

Btw, do you run from source or from bundle?

user-8779ef 04 January, 2018, 13:19:28

Source.

user-8779ef 04 January, 2018, 13:19:32

Sorry, bundle.

user-8779ef 04 January, 2018, 13:20:05

I think this is going to be the best approach for me - I'm testing this out for student use, too.

user-8779ef 04 January, 2018, 13:20:17

I need to know the issues they will run into ahead of time.

user-8779ef 04 January, 2018, 13:37:23

Am I correct to infer that a result of being stuck at 99% is that it won't save to file? I assume this is automatic at completion.

papr 04 January, 2018, 13:39:10

Mapped gaze is actually not cached since it is quite fast to map pupil data. The actual problem is that other plugins that depend on gaze data (e.g. fixation detector) do not get the data.

user-8779ef 04 January, 2018, 13:40:19

However, my natural feature locations were not saved.

user-8779ef 04 January, 2018, 13:40:43

Upon shutdown / reload, the entire calibration region is missing.

user-8779ef 04 January, 2018, 13:41:19

Ah, I have to switch away from gaze from file, to offline calibration. Shouldn't that be remembered?

papr 04 January, 2018, 13:41:24

Mmh. This might happen if the application was not shut down correctly/crashed.

user-8779ef 04 January, 2018, 13:41:35

It was a graceful shutdown, so that's not it.

user-8779ef 04 January, 2018, 13:41:45

...at least, I have no indication that anything went wrong.

papr 04 January, 2018, 13:44:03

This should be remembered. It is an indication of a crash during shutdown if not even the user settings were stored. I would suggest uploading the log file, but it is overwritten after restarting the application. @mpk We should add an option to keep the last three log files automatically.

user-8779ef 04 January, 2018, 13:44:28

Ok, thanks again.

papr 04 January, 2018, 13:44:53

Be aware that a crash does not necessarily show as such during shutdown. The crash log should be written to the log file.

user-8779ef 04 January, 2018, 13:45:24

Good news - now that I'm able to poke around a bit more, the track seems to be really nice. 😃

user-8779ef 04 January, 2018, 13:45:39

It seems something changed between shutdown / reload. Strange.

user-8779ef 04 January, 2018, 13:45:52

So, you're probably right.

papr 04 January, 2018, 13:46:35

Yes, if the saved user settings are broken they are automatically reset. My guess is that this was your case.

user-8779ef 04 January, 2018, 13:47:05

K. I'm currently playing with a custom fixation detection algorithm. It's slower than the one you have implemented, but if it works I'll see if I can share.

user-8779ef 04 January, 2018, 13:47:32

Output is in the same CSV format as you guys use, so my hope is to explore using your plugins.

user-8779ef 04 January, 2018, 13:47:42

wish me luck, and thanks again.

papr 04 January, 2018, 13:48:57

Cool! Do not hesitate to ask if there are any further questions. Especially if it comes to code details/behavior.

user-8779ef 04 January, 2018, 13:49:06

Ok, thanks. I won't.

user-8779ef 04 January, 2018, 16:25:42

So, it seems like one cannot import fixations, only export. Is that right?

user-8779ef 04 January, 2018, 16:26:10

...that is, into pupil player, for visualization of my own algorithm. I have formatted the algorithm output to match the output of the csv file found in the exports folder.

user-8779ef 04 January, 2018, 17:12:22

@papr In addition to the calibration hanging at 99%, I've since also found an inability to export the video from Player. However, resetting to defaults (in general settings), recalculating pupil positions, and recalibrating fixed both issues.

papr 04 January, 2018, 19:03:42

@performlabrit#1552 That is correct. There is no such thing as importing external data. But you could easily write a plugin that reads such a file via drag and drop and visualizes the data.
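For reference, a minimal sketch of such a plugin, assuming the CSV follows the column layout of Player's own fixation export (start_timestamp, duration, norm_pos_x, norm_pos_y) and using a hypothetical hardcoded file path instead of drag and drop:

```python
import csv

from plugin import Plugin
from pyglui.cygl.utils import draw_points_norm, RGBA

CSV_PATH = '/path/to/my_fixations.csv'  # hypothetical location of the external data

class External_Fixation_Viewer(Plugin):
    """Loads externally detected fixations from a CSV and draws the active ones."""

    def __init__(self, g_pool):
        super().__init__(g_pool)
        self.fixations = []
        with open(CSV_PATH) as f:
            for row in csv.DictReader(f):
                # duration is assumed to be in seconds; scale if your export uses ms
                self.fixations.append((float(row['start_timestamp']),
                                       float(row['duration']),
                                       float(row['norm_pos_x']),
                                       float(row['norm_pos_y'])))
        self.current = []

    def recent_events(self, events):
        frame = events.get('frame')
        if frame is None:
            return
        # keep fixations whose time span covers the current world frame
        t = frame.timestamp
        self.current = [f for f in self.fixations if f[0] <= t <= f[0] + f[1]]

    def gl_display(self):
        for _, _, x, y in self.current:
            draw_points_norm([(x, y)], size=35, color=RGBA(1., 1., 0., .6))
```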

user-ec208e 04 January, 2018, 19:48:56

Hi everyone! Is it normal that the headset gets very hot? I've been running some temperature tests and these are the results:

  • world camera 53.8°C
  • eye 0 61.6°C
  • eye 1 58.4°C

I have another question: does the device have a recommended usage time? Thanks! 😃

mpk 04 January, 2018, 20:28:32

@user-ec208e The cameras do get warm. This is because we do compression and high-speed capture with them. The headset can run continuously without problems.

mpk 04 January, 2018, 20:28:48

@user-ec208e The new 200 Hz cameras run much cooler, btw.

user-8779ef 04 January, 2018, 20:53:11

@papr "easy" is a very relative term 😃 . In fact, I've been looking at making a simple plugin, and find the lack of documentation a major impediment. Is there a location to make requests related to documentation?

user-8779ef 04 January, 2018, 20:53:50

Nevermind, I've found it.

user-2798d6 04 January, 2018, 21:09:03

Just checking myself - on the excel output from Player, the timestamps listed are in seconds, correct? And if I want to get the timestamp of a certain fixation, I can subtract the very first timestamp in the gaze positions excel file from my chosen fixation in the fixation file?

papr 05 January, 2018, 07:56:53

@user-2798d6 correct.

user-39ac51 05 January, 2018, 08:39:38

Hello everyone.

papr 05 January, 2018, 09:08:21

@user-39ac51 Hey, welcome to the channel 🙂

user-39ac51 05 January, 2018, 09:11:45

Guys, so I have like 6-7 short questions after reading the docs, should I just ask them here one after the other or would that be spam-y?

papr 05 January, 2018, 09:14:12

I would suggest opening an issue over at https://github.com/pupil-labs/pupil-docs/issues that includes all the questions. Please number them so that I can refer to them when answering. Afterwards we can create new issues for any resulting docs changes.

papr 05 January, 2018, 09:15:43

But it is also OK to simply ask your questions here. I just have the feeling that it might be easier on GitHub to reference single questions when answering.

user-39ac51 05 January, 2018, 09:17:20

It's not really a bug or issue with the code, but rather hardware-related questions.

papr 05 January, 2018, 09:17:36

ah, I see. These should be asked here.

user-39ac51 05 January, 2018, 09:17:41

OK

user-39ac51 05 January, 2018, 09:19:08

OK, so first thing: I have a UVC cam that can capture at 30, 60, and 120 fps depending on resolution. How do I specify which resolution/fps to use? Is it something the camera software controls or the UVC camera itself?

papr 05 January, 2018, 09:20:07

This is settable via software. See pyuvc for details.
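For reference, a minimal sketch of picking a mode with pyuvc (attribute names as they exist in the library; note that `avaliable_modes` really is spelled that way in pyuvc):

```python
import uvc

devices = uvc.device_list()              # all attached UVC cameras
cap = uvc.Capture(devices[0]['uid'])     # open the first one
print(cap.avaliable_modes)               # list of (width, height, fps) tuples
cap.frame_mode = (640, 480, 120)         # request 640x480 @ 120 fps
frame = cap.get_frame_robust()           # grab a frame to verify the mode works
print(frame.img.shape)
```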

user-39ac51 05 January, 2018, 09:20:19

OK, thank you.

user-39ac51 05 January, 2018, 09:21:14

Second question: if you use two UVC cameras, where do you take care of syncing them? (OK, this one is a software-related question, but not necessarily about the Pupil library.)

papr 05 January, 2018, 09:24:35

We do not sync the cameras themselves. But each frame has a timestamp which we use to correlate the frames.

user-39ac51 05 January, 2018, 09:25:24

Doesn't that introduce lag in realtime mode?

papr 05 January, 2018, 09:29:07

The pipeline is the following: each frame is processed in its own process. The result is a pupil position that has the same timestamp assigned as the frame. This pupil position is sent via the IPC backend to the world process, which correlates pupil position pairs and maps them to gaze positions. Therefore, yes, there is a small lag due to the processing, but the data includes the timestamp of frame creation.
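For reference, a minimal sketch of consuming these time-stamped pupil positions over the IPC backbone, assuming Pupil Capture runs locally with Pupil Remote on its default port 50020:

```python
import zmq
import msgpack

ctx = zmq.Context()

# ask Pupil Remote for the port of the subscription socket
req = ctx.socket(zmq.REQ)
req.connect('tcp://127.0.0.1:50020')
req.send_string('SUB_PORT')
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect('tcp://127.0.0.1:{}'.format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, 'pupil.')   # all pupil positions

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.unpackb(payload, raw=False)
    # 'timestamp' is the capture time of the eye frame the datum was computed from
    print(topic.decode(), datum['timestamp'], datum['confidence'])
```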

user-39ac51 05 January, 2018, 09:31:41

Understood. My only concern is that each eye's readings might always be a few ms apart, which may give slightly inaccurate data.

user-39ac51 05 January, 2018, 09:32:11

unless the UVC cameras (identical ones) could be initialized at the exact same time

user-39ac51 05 January, 2018, 09:34:13

Even provided the function calls for starting each come right after each other and the cabling length is the same, I don't know if they will start at the same time. But you probably have some info from your own tests. Maybe it is a non-issue.

papr 05 January, 2018, 09:37:33

We just do not expect them to have the same timing. Also, the mentioned frame rate is not fixed; there might be slight timing differences between frames. But we are talking about a time difference of 4 ms (worst case) between binocular frames (at 120 Hz). If you need to be even more accurate I would suggest buying the 200 Hz eye cameras.

user-39ac51 05 January, 2018, 09:38:39

I might. Although 4ms sounds better than I was expecting. Anyway, thanks for answering this question too.

papr 05 January, 2018, 09:40:50

FYI, the calculation: 120 Hz means a frame every 0.0083 seconds. Therefore the maximum offset between two 120 Hz cameras is about 0.004 seconds (half the frame interval). This means that you can reduce the maximum offset to 2.5 ms using the 200 Hz cameras.
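The arithmetic, as a quick sketch (worst case: the nearest frame from the other free-running camera is half a frame interval away):

```python
for hz in (120, 200):
    interval_ms = 1000.0 / hz
    print('{} Hz: frame every {:.1f} ms, max pairing offset ~{:.1f} ms'
          .format(hz, interval_ms, interval_ms / 2))
# 120 Hz: frame every 8.3 ms, max pairing offset ~4.2 ms
# 200 Hz: frame every 5.0 ms, max pairing offset ~2.5 ms
```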

user-39ac51 05 January, 2018, 09:41:17

OK, perfect. ty

user-39ac51 05 January, 2018, 09:44:25

The next question is regarding the IR LEDs. Do they just illuminate the eyes and the areas around the eyes, or are their reflections on the iris needed for the calculations? The reason I'm asking is: 1) if it is the latter, then their positioning is important, and using more will probably affect the calculations by having more randomly positioned white dots on the iris and confusing the algorithm; and 2) one probably shouldn't use a more diffuse, more uniformly illuminating but brighter IR LED because there won't be a point reflected on the iris.

papr 05 January, 2018, 09:46:34

Our pupil detection algorithm works glint-free, therefore we do not require the reflections. The eye cameras should be positioned such that the pupil is clearly visible. More important, though, is that the pupil is in focus.

user-39ac51 05 January, 2018, 09:47:08

Oh, great. Makes life easier for me. Thank you.

user-39ac51 05 January, 2018, 09:49:07

OK, so the next question is regarding what format Pupil expects from the UVC camera, because my camera has different fps speeds for MJPEG and YUY2. Maybe I could convert one to the other in realtime if needed, but I'm not sure if it could be processed that fast. Sorry if this is mentioned in the docs and I missed it.

papr 05 January, 2018, 09:52:37

Our current pyuvc implementation expects MJPEG data.

user-39ac51 05 January, 2018, 09:52:45

Oh, perfect.

user-39ac51 05 January, 2018, 09:54:00

Can the world camera be used to determine your head position relative to a starting center position?

user-39ac51 05 January, 2018, 09:54:27

That would require something like monocular SLAM, I think.

papr 05 January, 2018, 09:56:50

@marc has been working on something like that using markers. He published the current state of his work at https://github.com/pupil-labs/pupil/pull/872 Feel free to test it and to contribute. We would appreciate your feedback on this.

user-39ac51 05 January, 2018, 09:59:16

Thanks, I will. If you or anyone on the team has contact with him you can pass this to him if you want: ARToolKit has good marker-based position tracking which also supports stereo cameras. He might want to use that library in his plugin.

papr 05 January, 2018, 10:00:36

He is part of our team 🙂 I will tell him about it.

user-39ac51 05 January, 2018, 10:00:44

great

user-39ac51 05 January, 2018, 10:01:15

ARToolKit is under the LGPL license

user-39ac51 05 January, 2018, 10:01:27

(shouldn't matter for plugins though)

user-39ac51 05 January, 2018, 10:02:49

For markerless tracking, ORB-SLAM seems to be the only decent open source option left: http://webdiis.unizar.es/~raulmur/orbslam/

user-39ac51 05 January, 2018, 10:09:47

Final question: does Pupil use/need a GPU for faster processing? I want to test it on an ASUS Tinker Board, and while it's at least twice as fast as a Pi 3, I'd like to have a general idea of what to expect.

papr 05 January, 2018, 10:11:40

No, currently everything is processed on the CPU. Be aware that the software will drop frames if the processor is not fast enough.

user-39ac51 05 January, 2018, 10:11:52

Sure.

user-39ac51 05 January, 2018, 10:13:39

Actually I lied, there's another question left. With ordinary webcam software I notice lag/latency of about 200 ms. Is it because of the camera or the software displaying the camera content (not your viewer)? Can I expect fewer latency issues by using pyuvc?

papr 05 January, 2018, 10:24:11

I cannot tell you where this lag/latency comes from. Pupil Capture might show the frames with a slight delay as well, since the processing happens before displaying the frame. The lowest latency will be achieved by using pyuvc directly. Pupil Capture uses pyuvc under the hood as well.

papr 05 January, 2018, 10:25:35

You should be able to test this though. Simply download and start Pupil Capture 🙂

mpk 05 January, 2018, 10:26:39

@user-39ac51 Using our hardware and pyuvc we have a latency of 4-6 ms depending on the camera used. Other webcams may add considerable latency.

user-39ac51 05 January, 2018, 10:27:24

You also count the processing for calculating the pupil positions, right?

mpk 05 January, 2018, 10:27:32

Latency is from start of exposure until the frame is available to the user.

mpk 05 January, 2018, 10:27:51

Then you have to add 3-5 ms of processing to get the pupil position (this depends on your CPU).

user-39ac51 05 January, 2018, 10:31:46

I've read somewhere that it takes a while for our brain to process the visual information from a new gaze position the eyes have moved to, and that it is not instant. Is anyone familiar with what I'm talking about?

user-39ac51 05 January, 2018, 10:32:22

might make up for the latency

user-39ac51 05 January, 2018, 10:36:19

mpk, what's your camera fps and resolution, and how many cams? (And what PC specs, if it's okay to ask.)

mpk 05 January, 2018, 10:37:28

@user-39ac51 I'm using 200 Hz eye cameras with a 120 Hz world cam on an i7 MacBook Air with 8 GB RAM.

user-39ac51 05 January, 2018, 10:58:43

What I read a while ago not only referred to the time it takes for vergence and to refocus the eyes when the target changes, but also the pure time to make out what the new view shows. I don't know what it's caused by. Maybe exposure, or other things the brain processes such as familiar shapes, or linking what is in the new field of vision to the previous one(s), but I think the conclusion was that it takes a moment to actually "see". Is this familiar to anyone?

user-ec208e 05 January, 2018, 14:33:58

@mpk Thanks!!!! 😋

user-8779ef 05 January, 2018, 23:06:53

Hi folks, looking for some guidance. I have solid programming experience for scientific analysis, but limited experience developing applications. I want to have a try at developing a plugin for Player, but can only get as far as installing all the dependencies and running from source. I would now like to import the project into Visual Studio (on a Mac) and be able to debug the module I'm trying to create... I may be asking too much (some of this is background knowledge to a developer). Any input would be appreciated, even if it's a good webpage / resource.

user-41f1bf 06 January, 2018, 15:17:09

@user-8779ef For a simple Player plugin written in pure Python, you could try Sublime Text or Geany + the bundled Pupil Player. Yes, no sources at all, no dependency headaches.

user-41f1bf 06 January, 2018, 15:18:16

Everything bundled with pupil can be imported

user-41f1bf 06 January, 2018, 15:21:49

If your plugin has dependencies not bundled with Pupil, you still don't need the sources.

user-41f1bf 06 January, 2018, 15:24:40

You can install those dependencies and import them from your plugin normally.

user-41f1bf 06 January, 2018, 15:26:47

What will your plugin do?

user-3aea1d 06 January, 2018, 16:16:08

So uh, is there a proof of concept of foveated rendering with the Pupil Labs add-on for Vive or Rift? There are some articles about saccadic movements and they give some crazy possible rotation sums, like 800 degrees per second of saccadic eye rotations. I don't think a 90 Hz VR screen is enough to make foveated rendering possible, even setting perfect eye tracking aside, but maybe I misunderstand these articles.

papr 06 January, 2018, 16:28:34

@user-3aea1d FYI, the old eye camera model is able to provide up to 120 Hz, and the new ones up to 200 Hz.

user-3aea1d 06 January, 2018, 16:34:35

Not talking about eye tracking speed but the refresh rate of VR headsets.

papr 06 January, 2018, 16:40:44

Ah, OK, makes sense. But it might not be necessary to render during saccades. As far as I know people do not perceive anything visual during saccades. Therefore one would only have to render as soon as the saccade ends.

user-3aea1d 06 January, 2018, 16:52:18

I understand, but from what I read, and I hope I read it wrong, there can be more than 90 saccades each second.

user-3aea1d 06 January, 2018, 16:52:42

with the sum of rotation angles 800 degrees max

user-3aea1d 06 January, 2018, 16:55:41

My bad, " The smallest “microsaccades” move the eye through only a few minutes of arc (one minute of arc equals one-sixtieth of one degree). They last about 20 milliseconds and have maximum velocities of about 10 degrees per second. The largest saccades (excluding the contributions of head movements) can be up to 100 degrees, with a duration of up to 300 milliseconds and a maximum velocity of about 500–700 degrees per second."

user-3aea1d 06 January, 2018, 17:01:20

But anyway, is there a proof of concept of foveated rendering with the Pupil add-ons?

user-8779ef 06 January, 2018, 17:07:26

@user-41f1bf Yes, I've started along that path, but it's not a very useful environment for debugging. I would like to be able to halt the script and inspect the local variables... for example, g_pool, because there isn't any documentation on its contents.

user-8779ef 06 January, 2018, 17:08:13

@user-41f1bf Eventually, an improved fixation detector. Possibly with a visualization of a velocity / acceleration time series.

papr 06 January, 2018, 17:14:03

@user-3aea1d Not that I know of. But we should ask @user-e04f56 , he maintains most of the VR related projects.

user-8779ef 06 January, 2018, 17:17:45

@user-3aea1d The equipment doesn't have the precision to measure microsaccades. You should also maybe read a bit about post-saccadic inhibition. My intuition is that we have a few tens of milliseconds after a saccade ends during which we suppress change. This suggests there's some tolerance for eye tracker latency when implementing mid-saccadic manipulations.

user-8779ef 06 January, 2018, 17:18:21

@user-3aea1d As far as foveated rendering goes, make sure to have a look at work by David Luebke's group at Nvidia (including work by Jaewhoo Kim)

user-8779ef 06 January, 2018, 17:18:38

@user-3aea1d I haven't seen anything done with foveated rendering using Pupil.

user-3aea1d 06 January, 2018, 17:20:01

but do you think the hardware should handle it?

user-8779ef 06 January, 2018, 17:21:41

@user-3aea1d Yes. A student over here measured the average latency of the Pupil Labs mobile eye tracker running at 60 fps at 0.012 seconds.

user-8779ef 06 January, 2018, 17:22:56

12 milliseconds. So, you could expect the information to be available to the Unity pipeline within about one frame's time (at 90 Hz, that's about 11 ms).

user-8779ef 06 January, 2018, 17:23:38

How soon it influences the screen would depend upon the delay introduced by your shader (I assume you'll use a shader to implement your foveated rendering compression).

user-8779ef 06 January, 2018, 17:26:12

So, off the cuff, assuming your shader introduces <20 ms of latency, I'd guess you're introducing a minimum of ~30 ms of latency from eye movement to screen update. I would go to the literature on change detection and post-saccadic suppression to see if that's within the bounds of blindness due to post-saccadic inhibition. Sorry - I don't have names I can provide for that literature base (it's a tentative suggestion - you may have better luck elsewhere).

user-41f1bf 06 January, 2018, 20:10:15

@mpk, are you still using Sublime Text? Do you use any tool for inspecting stuff?

user-41f1bf 06 January, 2018, 20:11:02

Beyond the logger and Python's introspection functions

user-3aea1d 06 January, 2018, 23:28:25

Is IR LED light dangerous to your eyes? This is a very loose/vague question, I know. I guess both legally and according to the scientific evidence we have so far. I know for UV things are tricky and not 100% known at this point.

user-256b23 07 January, 2018, 00:41:01

Are there Linux commands to be able to run the Player in a GUI-less way?

user-256b23 07 January, 2018, 03:26:16

I am having errors with the GLFW window failing to create now, on Ubuntu 16. I went through the steps; it just seems to be hung up on this.

user-c23464 07 January, 2018, 04:07:15

Hi all, I'm having a bit of trouble running Pupil from source. The install seems to have gone fine, but when I try to run main.py, I get the following error: https://pastebin.com/B8iJxZ9K I've tried searching for similar cases online - in fact, I found a much older Pastebin that contained much the same error message - but I haven't had any luck. From what I can tell, ZMQ is experiencing some kind of error when looking for the appropriate socket. Any advice is greatly appreciated!

user-3aea1d 07 January, 2018, 17:19:33

Research is inconclusive, but cataracts sound scary. Please link to contradicting research if available. Thanks.

user-8779ef 07 January, 2018, 18:57:07

srsbdness: the amount of IR from mobile eye trackers is typically a fraction of what we get during exposure to outdoor light.

mpk 07 January, 2018, 20:20:37

@performlabrit#1552 For development with a Python debugger, check out PyCharm. The Community Edition is free.

user-e938ee 07 January, 2018, 22:38:53

Anybody successfully subscribing to Pupil from Qt C++ with ZeroMQ?

user-3aea1d 08 January, 2018, 00:49:23

anyone tried running on a Pi3?

user-2ac56b 08 January, 2018, 03:11:45

@user-3aea1d I actually did get it running on a Pi last year!

user-2ac56b 08 January, 2018, 03:12:46

https://www.youtube.com/watch?v=GPGllEY_QNM

user-2ac56b 08 January, 2018, 03:12:58

This was a Pi2, so it should work on a Pi3

wrp 08 January, 2018, 03:14:04

@user-3aea1d We take IR safety very seriously at Pupil Labs. Pupil Labs eye tracking hardware is tested and evaluated by professionals to ensure that they comply with photobiological safety standards.

user-3aea1d 08 January, 2018, 06:25:52

Nice. Is OpenCV optimized?

user-e04f56 08 January, 2018, 07:26:47

@user-3aea1d regarding "foveated rendering", I came across these papers last year, one of them co-authored by the [email removed] mentioned

user-e04f56 08 January, 2018, 07:27:59

https://1drv.ms/b/s!AizpW0Y1FGnBguJ7TL2wm9hymcyBcQ

user-e04f56 08 January, 2018, 07:31:19

I experimented a bit with the concept by adapting the toon shader from the 3D market demo scene, reducing the texture LOD based on the distance from the gaze position. But I could not see any performance improvements.

user-3aea1d 08 January, 2018, 07:33:24

You usually render the scene twice: once at full FOV and about 20% of target resolution, and once at a much lower FOV but pixel-perfect resolution, then merge these renders into one frame with a fragment shader.

user-e04f56 08 January, 2018, 07:33:43

I also asked around in the Unity forums for suggestions on how performance could be improved in any means by utilizing gaze information, but I did not get an answer

user-3aea1d 08 January, 2018, 07:34:42

There's a transition added between the two renders. The low-res full-FOV frame also has some fragment shaders applied, such as blur.

user-3aea1d 08 January, 2018, 07:35:36

not sure what you mean by texture LOD

user-e04f56 08 January, 2018, 07:39:42

@user-3aea1d texture2Dlod is a shader function to specify the "level of detail" at which you access the texture used for the current 3D model. So it corresponds to a reduction in resolution, as you describe it.

user-e04f56 08 January, 2018, 07:40:40

So based on the distance from the gaze point, I tried to reduce the resolution of the texture being used.

user-3aea1d 08 January, 2018, 07:40:49

I see, I don't use Unity myself. But that will make only a tiny difference in performance.

user-3aea1d 08 January, 2018, 07:47:59

I'll read the PDFs you linked to, thanks. Do any of them by any chance mention using Pupil for tracking?

user-3aea1d 08 January, 2018, 08:48:06

@wrp Where can I find those standards for using custom LEDs myself? I'm my own test subject for my experiments so I should be fine, but just to be on the safe side.

user-3aea1d 08 January, 2018, 08:59:54

@user-e04f56 I don't know your level of experience with 3D programming, but maybe you are confusing render resolution with the texture resolutions of 3D models.

user-3aea1d 08 January, 2018, 09:01:13

Unity should have a profiler; you can check that once the texture of a 3D model is loaded it takes up far fewer resources than the render resolution per frame.

user-e04f56 08 January, 2018, 09:10:55

@user-3aea1d I was not confusing them, just experimenting with different ways to utilize the gaze. But you are right, maybe I should have a look at composing scene renders of different resolutions.

user-f68ceb 08 January, 2018, 14:32:12

Hi, has anybody worked out a way to use the eye tracking on a mobile device? To test e.g. user interaction?

user-2798d6 08 January, 2018, 20:27:27

Hello, is there a way to get the number of the fixation (that shows up with the yellow fixation circle) to be present in the exported video? I'm getting the yellow circle, but no number with it.

user-c3650b 08 January, 2018, 20:28:27

Am I correct that there is no CLI for any of the eye recorders?

user-4e7774 08 January, 2018, 21:49:54

Hello everyone, quick question: which packages do you use to create tasks that will be linked to my Pupil eye tracking device? I was thinking about using PyGaze, but I read somewhere that it was not compatible. Thanks!

papr 08 January, 2018, 21:52:21

@user-c3650b you are correct.

user-c3650b 08 January, 2018, 21:52:43

@papr thanks!

papr 08 January, 2018, 21:56:35

@hellomarcoliver#8847 You can use our Android app Pupil Mobile to connect the Pupil hardware to your phone and make recordings on it or stream the video live to your computer.

user-c3650b 08 January, 2018, 21:57:41

@papr What about Pupil Remote? I could attach via Python...? Does that require building my own version of the app from source?

papr 08 January, 2018, 21:58:35

No, Pupil Remote is a zmq-based network interface. You can use the bundle as-is to connect to it.

user-c3650b 08 January, 2018, 21:59:01

So I could use it to start and stop recording just like a CLI?

papr 08 January, 2018, 22:00:56

You will have to run the app but you can surely control it remotely.
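For reference, a minimal sketch of such remote control over Pupil Remote, assuming Capture runs locally on the default port 50020 ('R' starts a recording, 'r' stops it):

```python
import time
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect('tcp://127.0.0.1:50020')

remote.send_string('R')        # start recording
print(remote.recv_string())    # Capture acknowledges every command

time.sleep(10)                 # record for ten seconds

remote.send_string('r')        # stop recording
print(remote.recv_string())
```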

user-c3650b 08 January, 2018, 22:01:10

perfect

user-c3650b 08 January, 2018, 22:01:34

Thanks again!

papr 08 January, 2018, 22:03:24

@user-2798d6 Mmh, I am pretty sure that the numbers are rendered in the same way as the circles. I will have a look at them tomorrow.

papr 08 January, 2018, 22:06:20

@user-4e7774 Incompatible is the wrong word. There is no built-in integration into PyGaze. But this is surely doable. The most important thing is that you synchronize the clocks between PyGaze and Pupil Capture so that you can correlate task-related events with the gaze data.
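One hedged way to do that clock synchronization: query Capture's clock via Pupil Remote ('t' returns the current Pupil time) and keep the offset to the experiment clock, so task events can be expressed in Pupil time. A sketch, assuming the default port 50020:

```python
import time
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect('tcp://127.0.0.1:50020')

t_before = time.monotonic()
remote.send_string('t')                   # ask Capture for its current time
pupil_time = float(remote.recv_string())
t_after = time.monotonic()

# assume the reply arrived halfway through the round trip
offset = pupil_time - (t_before + t_after) / 2.0

def to_pupil_time(local_t):
    """Convert an experiment (monotonic) timestamp to Pupil time."""
    return local_t + offset
```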

user-2798d6 09 January, 2018, 01:38:48

@papr - I see them now! They were just going by really fast and the background made them hard to see. Thanks!

user-537e9a 09 January, 2018, 03:52:01

Hello, is it possible to purchase the product including the 120 Hz binocular eye cameras? If that is not possible, I'd like to buy the 120 Hz eye camera as an additional product.

wrp 09 January, 2018, 03:54:14

Hi @user-537e9a we do still have some 120hz eye cameras available. Please send an email to sales@pupil-labs.com and we can go from there.

user-537e9a 09 January, 2018, 03:58:21

@wrp Thank you!

wrp 09 January, 2018, 04:16:44

@user-537e9a you're welcome!

user-8779ef 09 January, 2018, 12:21:37

@mpk Thanks - I've switched to PyCharm, which I actually have experience with. So, the issue now is that offline natural features calibration halts at 99%, and Player does not shut down gracefully.

mpk 09 January, 2018, 12:23:40

@user-8779ef any console/log output you can share?

user-8779ef 09 January, 2018, 12:23:53

I posted an issue to GitHub with the console output.

user-8779ef 09 January, 2018, 12:24:05

Let me find the link.

user-8779ef 09 January, 2018, 12:24:28

Sadly, it's not much to go by. The program does not shut down gracefully either. Is there a log file I can access to see messages stored during shutdown?

user-8779ef 09 January, 2018, 12:24:39

https://github.com/pupil-labs/pupil/issues/1008

mpk 09 January, 2018, 12:25:57

The output you show is not an actual error. We only save the marker data when the search is complete. In your case, the 99% stuck issue is what I think leads to the search not being saved.

mpk 09 January, 2018, 12:26:12

Let's try to get to the bottom of that first.

user-8779ef 09 January, 2018, 12:26:16

Yep.

mpk 09 January, 2018, 12:26:21

how long is your recording?

mpk 09 January, 2018, 12:26:34

@papr can you try to recreate this?

user-8779ef 09 January, 2018, 12:26:37

Let me see.

user-8779ef 09 January, 2018, 12:26:50

BTW, I'm now running from source

user-8779ef 09 January, 2018, 12:27:01

...persists in bundle, too.

mpk 09 January, 2018, 12:27:48

OK, that's good to know.

user-8779ef 09 January, 2018, 12:28:05

One video is 5:25. The other is 11:34.

user-8779ef 09 January, 2018, 12:28:42

Same issue with both. I had this issue previously and found that resetting to default settings resolved it... but that only worked once.

user-8779ef 09 January, 2018, 12:29:54

Just loaded the file and, during load, I see this error:

```
Ceres Solver Report: Iterations: 13, Initial cost: 1.943841e-01, Final cost: 7.563179e-03, Termination: CONVERGENCE
Traceback (most recent call last):
  File "/Users/gjdiaz/PycharmProjects/Pupil1/pupil_src/shared_modules/background_helper.py", line 46, in _wrapper
    raise EarlyCancellationError('Task was cancelled')
background_helper.EarlyCancellationError: Task was cancelled
```

user-8779ef 09 January, 2018, 12:31:00

...and, good news, this time it loaded the settings. So, it did shut down gracefully that last time. Calib. still stuck at 99%.

user-8779ef 09 January, 2018, 12:38:27

FYI, the pipeline I use: collect data using 2D pupil detection and calibration, but later I use offline 3D pupil detection and 3D calibration with natural features (a calibration grid).

papr 09 January, 2018, 12:51:38

We found the issue: because of caching reasons, the binocular gaze mapper does not return a gaze point at the end. This only happens in a few cases and there is no way to flush the cache. I will fix this this afternoon.

papr 09 January, 2018, 12:55:34

Actually, the mapping is successful but the displayed mapping state is simply incorrect.

user-8779ef 09 January, 2018, 13:01:24

ok, great. So the calibration issue is not linked to the ungraceful shutdowns.

user-8779ef 09 January, 2018, 13:02:09

Now, how can I help debug those? Is the log saved to file?

papr 09 January, 2018, 13:02:45

Yes, there is a log file at <pupil repo>/player_settings/player.log

user-8779ef 09 January, 2018, 13:04:20

Great. I'll create a new issue on github, and the next time I have a crash on shutdown, I'll post the log.

papr 09 January, 2018, 13:04:20

It would be great if you could use this branch for testing: https://github.com/papr/pupil/tree/offline_calib_improvements

It includes the fix for the mapping display issue

user-8779ef 09 January, 2018, 13:04:49

Ok, I can switch. My goal here is to develop the know-how to build plugins.

user-8779ef 09 January, 2018, 13:06:03

...so far, I've found Player's instability between runs to be the greatest obstacle to progress. It seems I have to recapture the pupil far too often due to crashes, and that costs time.

user-8779ef 09 January, 2018, 13:06:20

It crashes and reverts to pupil from file.

papr 09 January, 2018, 13:06:21

Plugins have a cleanup() function that is called on shutdown. If the program crashes beforehand the cleanup method is not called. This might cause an ungraceful shutdown for a lot of plugins.

papr 09 January, 2018, 13:07:43

And even if the detection was finished beforehand it does not read the data from the cache?

user-8779ef 09 January, 2018, 13:08:00

Say that another way for me? A bit unclear.

user-8779ef 09 January, 2018, 13:08:37

Oh, I getcha. Yeah, it often reverts to pupil from file.

user-8779ef 09 January, 2018, 13:09:12

So, another suggestion I was going to make was the option to save pupil detection to file explicitly. It is, after all, a computationally heavy process.

papr 09 January, 2018, 13:10:13

The Offline Pupil Detection runs the detection for both eyes. After finishing the detection, the eye windows should close and the resulting data is written to a cache file in <recording>/offline_data/. The plugin will try to load this file on start up.

So even if Player crashes and reverts the session settings, and Pupil from Recording is loaded: does selecting Offline Pupil Detection start a new detection or does it load the data from the cache?

user-8779ef 09 January, 2018, 13:10:48

New detection.

user-8779ef 09 January, 2018, 13:10:58

It does not load from cache.

papr 09 January, 2018, 13:11:43

I just saw that the data is only cached on cleanup(). I will add another explicit save after the detection finishes. I do not think that a UI element to do so is a good idea; it should work reliably and automatically.

user-8779ef 09 January, 2018, 13:12:05

Yes, that seems like a reasonable fix.

user-8779ef 09 January, 2018, 13:12:53

Pushing that change to the repo you've just shared?

papr 09 January, 2018, 13:13:09

yes

user-8779ef 09 January, 2018, 13:13:23

Great!

user-8779ef 09 January, 2018, 13:13:30

That should save me a lot of time.

papr 09 January, 2018, 13:29:28

@user-8779ef I implemented the "caching on finish" for both offline pupil detection and calibration. Both are pushed to the branch above. You should see an INFO-level log message saying Cached ... data to <path>

user-8779ef 09 January, 2018, 13:30:34

@papr Thanks very much. I'll continue to try and break things, and will report back!

user-0d187e 09 January, 2018, 15:56:26

Hi guys, I just ran Pupil Player 1.2-7 on my Mac and I noticed that the main menu is not responding when I click on the icons on the right side. It basically does nothing and does not show/hide the selected window.

user-0d187e 09 January, 2018, 15:57:55

Never mind. I deleted the settings folder and now it is working.

user-0d187e 09 January, 2018, 15:58:16

I guess the new version is not compatible with the old settings files.

papr 09 January, 2018, 15:58:19

@user-0d187e This is a known issue that appears in some edge cases. I fixed it this morning in https://github.com/pupil-labs/pupil/pull/1006 Yes, deleting the settings folder is a workaround.

user-0d187e 09 January, 2018, 15:58:38

Thanks

user-2798d6 09 January, 2018, 17:45:23

Hello - I have downloaded the new version of the software, but am still having issues with offline calibration with natural features on my MacBook with Retina display. I click a point and the marker shows up almost 4 inches away on the screen up and to the left.

user-2798d6 09 January, 2018, 17:45:39

Is there something I need to adjust on my computer or Pupil settings?

user-8779ef 09 January, 2018, 17:46:01

@user-2798d6 . This issue has been addressed in the newest version. Update 😃

user-2798d6 09 January, 2018, 17:46:06

I did

user-8779ef 09 January, 2018, 17:46:20

What version does it say in "about" ?

user-2798d6 09 January, 2018, 17:46:22

Unless the update came out this morning

user-2798d6 09 January, 2018, 17:46:39

1.1-2

user-8779ef 09 January, 2018, 17:47:12

K, lemme check something on this end

user-2798d6 09 January, 2018, 17:47:17

ok, thanks!

user-8779ef 09 January, 2018, 17:49:00

Ok, I'm running the same version on a MacBook Pro w/ Retina, and no issue. I had this issue previously, but it was resolved in the latest update. So, I'm not sure. It's a question for the team.

user-2798d6 09 January, 2018, 17:52:20

Thanks for your help 😃

user-2798d6 09 January, 2018, 17:52:32

I will check in with someone about it. Maybe it's my computer

user-8779ef 09 January, 2018, 17:52:39

Tried my best 😃 . I don't think it's your computer.

user-8779ef 09 January, 2018, 17:52:47

It seems to be a Mac issue - perhaps a Retina display issue.

user-2798d6 09 January, 2018, 17:52:49

I appreciate it!

user-8779ef 09 January, 2018, 17:52:51

Hold on...

user-8779ef 09 January, 2018, 17:53:56

https://github.com/pupil-labs/pupil/issues/994

user-8779ef 09 January, 2018, 17:54:16

Please add your issue there. See if you can reopen the issue (I think you can, but maybe I have to).

user-2798d6 09 January, 2018, 17:56:32

I added it - thank you!

user-8779ef 09 January, 2018, 17:57:30

No problem. You may also want to report your computer type (Mac Retina, etc.). They're usually quite responsive, but hold tight.

user-8779ef 09 January, 2018, 17:57:35

I've re-opened the issue.

user-2798d6 09 January, 2018, 18:02:01

Hold on - apparently I had two versions of the software on my computer and may have been running an older one?

user-8779ef 09 January, 2018, 18:02:15

That makes sense. Don't worry - I did the same thing 😃

user-8779ef 09 January, 2018, 18:02:24

Should I close out the issue?

user-2798d6 09 January, 2018, 18:02:49

It's freezing at the moment so I can't see if it works - give me one second and I'll let you know! 😃

user-2798d6 09 January, 2018, 18:07:34

IT WORKED! Thank you so much for working through that with me @user-8779ef !

user-8779ef 09 January, 2018, 18:16:29

No problem LKH!

user-93f13b 09 January, 2018, 23:40:44

Hi all, I know this is probably somewhere and I am just being lazy and not finding it. But I am trying to save my pupil diameter to the .csv file. I know I read that it automatically saves data like this, but frankly I am lost on how to access or enable this. Also, does the diameter have a timestamp option? Background: I am doing research into pupil dilation, so I need to be able to sync the pupil diameter with tasks people are doing; I'm hoping to do this by just using the timestamps and then comparing them to the outside time. Sorry if I'm being a dummy here, but any help would be greatly appreciated.

wrp 09 January, 2018, 23:47:33

Hi @user-93f13b please open your recording in Pupil Player and open the raw data exporter plugin from the plugin manager. Press e or the down arrow ⬇ in the Player GUI to export raw data. Diameter data is listed in each row of the csv, each row/datum is time-stamped.
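For reference, a minimal sketch of reading the exported file afterwards, assuming the raw data exporter's usual pupil_positions.csv column names:

```python
import csv

# path inside the recording folder; the export number may differ
with open('exports/000/pupil_positions.csv') as f:
    for row in csv.DictReader(f):
        ts = float(row['timestamp'])    # seconds, on Pupil's own clock
        dia = float(row['diameter'])    # pupil diameter in eye-image pixels
        print(ts, dia)
```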

user-93f13b 10 January, 2018, 02:40:01

You are an absolute life saver! Thank you for the quick response. One more question: I bought the DIY headset, and I was wondering if there is a way to adjust the eye mount? I read somewhere that it can be shaped if you heat it? Currently it is only pointing at the very bottom of my eye and I was wondering how I could move it up somehow so the eye is in the center of the picture. Thanks again for helping with possibly dumb questions.

wrp 10 January, 2018, 02:50:27

You can bend/twist the DIY eye camera mount arm a bit. You may also want to try repositioning the headset frame on your nose bridge.

user-41f1bf 10 January, 2018, 10:37:34

You can also tie the nose bridge, so the frame will sit higher. It also decreases slipping down the nose.

user-0d187e 10 January, 2018, 14:13:57

Just a quick question guys. Is the current Pupil Capture executable able to log keypress events while recording?

user-93f13b 10 January, 2018, 14:16:39

Thanks, I will try that. Another question: when installing the new exposed film for the IR sensors, after installing it I am not able to get the camera to focus. I have turned off the auto focus, and have tried redoing it a couple of times to see if I put it on incorrectly, but still the same problem. When I take it off the camera works fine and focuses. Any fix for this? I have been using normal exposed film from a camera; is there a different material I should use?

mpk 10 January, 2018, 14:58:12

@user-93f13b It should just work. Make sure that you insert the lens far enough to focus. I find that running the camera while inserting the lens gives you decent feedback.

user-93f13b 10 January, 2018, 15:33:48

Could I be doing something wrong when putting the film in? It gets close to being in focus but then I can't move the lens anymore.

user-93f13b 10 January, 2018, 15:34:34

as in it bottoms out.

mpk 10 January, 2018, 15:35:30

@user-93f13b No, I'm not sure what's wrong then.

user-6419ec 10 January, 2018, 15:46:36

Hi everyone, I was wondering if it is possible to calculate the brightness of the delivered world camera image. For that I have the question: how can I save every world camera image as a separate image during the recording? I thought about writing a plugin, but I don't know how I can get access to the pixel image data.

user-41f1bf 10 January, 2018, 15:51:51

@user-6419ec you should be able to access image pixels through the "recent_events" function in a plugin. Each world frame is accessible in there

user-41f1bf 10 January, 2018, 15:54:32

You could export them using cv2.imwrite() or just do your stuff right there. Frames are numpy arrays.
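Putting those two answers together, a minimal sketch of such a plugin (the output directory is hypothetical; frame attributes follow the standard Plugin API, where each world frame exposes .img, .gray, and .index):

```python
import os

import cv2
import numpy as np

from plugin import Plugin

class Frame_Brightness_Exporter(Plugin):
    """Saves every world frame to disk and logs a simple brightness measure."""

    def __init__(self, g_pool, out_dir='/tmp/world_frames'):  # hypothetical path
        super().__init__(g_pool)
        self.out_dir = out_dir
        os.makedirs(out_dir, exist_ok=True)

    def recent_events(self, events):
        frame = events.get('frame')
        if frame is None:
            return
        brightness = np.mean(frame.gray)   # mean gray value as a brightness proxy
        cv2.imwrite(os.path.join(self.out_dir,
                                 'frame_{:06d}.png'.format(frame.index)),
                    frame.img)
        print('frame {}: brightness {:.1f}'.format(frame.index, brightness))
```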

user-41f1bf 10 January, 2018, 15:55:31

@user-0d187e take a look at Pupil annotations in the docs

user-6419ec 10 January, 2018, 15:55:45

@user-41f1bf thanks for the fast reply 😃

user-93f13b 11 January, 2018, 15:05:40

I'm curious if this could be a good option for an eye cam? https://www.spinelelectronics.com/USB_2.0_Camera/2mp_usb_camera_modules/UC20MPD

user-93f13b 11 January, 2018, 15:06:22

Thinking about taking off most of the IR LEDs because they are overkill. But I'm not sure if Pupil Capture can handle the camera?

user-8779ef 11 January, 2018, 20:59:41

Hey, is self.g_pool.gaze_positions_by_frame[fIdx][0]['index'] an index into g_pool['gaze_positions']?

user-8779ef 11 January, 2018, 21:01:38

I am trying to get the sample of gaze data used to calculate the gaze position currently shown... and then want to search for the gaze data 0.5 secs ahead of it. (I need to grab the range from now through now+0.5 secs.)

user-8779ef 11 January, 2018, 21:02:31

So, although gaze_positions_by_frame is the most convenient data structure for finding the first sample used for the current frame, gaze_positions is more convenient for searching for the last sample less than N seconds from now.

user-8779ef 11 January, 2018, 21:07:06

https://www.youtube.com/watch?v=vqIsenRl_Wk&t=1m37s

user-8779ef 11 January, 2018, 21:07:37

I think the Germans are asleep.

user-8779ef 11 January, 2018, 21:12:47

Ignore the previous question, I used this solution: curGP = next(gp for gp in self.g_pool.gaze_positions if gp['timestamp'] > events.get('frame').timestamp)

user-8779ef 11 January, 2018, 21:13:44

...and interestingly enough, the index value is not the index into gazePositions. Oh well...

mpk 12 January, 2018, 07:40:11

@user-8779ef events.get("frame").timestamp is the current world frame timestamp.

mpk 12 January, 2018, 07:41:57

@papr does this kind of thing a lot. I think the bisect tool is useful.

mpk 12 January, 2018, 07:42:09

basically you make a list of all gaze timestamps.

mpk 12 January, 2018, 07:42:41

then you can find the value closest to target+.5 and you know the timestamp of the gp you want.

mpk 12 January, 2018, 07:43:07

then you can go through gaze_positions and find the gp you want.

mpk 12 January, 2018, 07:43:25

you can even make a dict that has all gp by timestamp value for faster access.
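For reference, a sketch of that suggestion with the standard-library bisect module (g_pool and current_frame_ts stand in for the plugin's actual context; gaze_positions is sorted by timestamp in Player):

```python
from bisect import bisect_left

timestamps = [gp['timestamp'] for gp in g_pool.gaze_positions]

def gaze_index_at(t):
    """Index of the first gaze datum at or after time t."""
    return bisect_left(timestamps, t)

start = gaze_index_at(current_frame_ts)         # first datum of the current frame
end = gaze_index_at(current_frame_ts + 0.5)     # first datum past +0.5 s
window = g_pool.gaze_positions[start:end]       # all gaze data in that 0.5 s window
```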

user-f68ceb 12 January, 2018, 10:13:24

Thanks @papr – so I understand correctly: the software knows which point I am looking at on the mobile screen?

user-8779ef 12 January, 2018, 12:08:28

@mpk, Yes, I did something similar: startFr = np.where(self.g_pool.timestamps > aTimeStamp)[0][0]

user-8779ef 12 January, 2018, 12:09:08

Unfortunately, I eventually found myself in nested for-loop territory. I know, I know, I'm not proud.

user-8779ef 12 January, 2018, 12:09:21

gazeNormE0_fr=[]

user-8779ef 12 January, 2018, 12:09:23

```python
for frame_gp in gpList_frame_gp:
    for gp in frame_gp:
        gazeNormE0_fr.append(gp['gaze_normals_3d'][0]
                             if 0 in gp['gaze_normals_3d'].keys()
                             else [np.nan] * 3)
```

user-8779ef 12 January, 2018, 12:10:15

I would like to consider more elegant ways to search for data present on current frames, but I was struggling with your indexing scheme.

user-8779ef 12 January, 2018, 12:11:06

...and I'm not familiar with an elegant way to search through the list of dicts for dict['timestamp'] within a range.

user-8779ef 12 January, 2018, 12:40:09

@mpk Thanks for pointing out bisect(). That one's new to me.

user-9b14a1 12 January, 2018, 13:39:21

(solved) ... What am I doing wrong? turbojpeg is corrupting. I downloaded the newest apps to OSX 10.12.6 and tried to solve it by installing the developer dependencies, but no change. Do I have to update to OSX 10.13? It's eating CPU quite a lot (I guess that's OK; CPU at 80 degrees). I also disconnected the USB drive. It's running, but with yellow messages on the screen. . . . Finally I could wipe out all turbojpeg problems by migrating into a clean user account. I guess some background processes were slowing down the performance of my default user account.

Chat image

mpk 12 January, 2018, 13:53:02

@user-9b14a1 this is a potentially faulty usb connection. Can you try a different USB port?

user-9b14a1 12 January, 2018, 13:55:03

@mpk It's a MacBook Pro, it has 2. OK, I'll go to WiFi and disconnect my hard drive. One moment.

user-8779ef 12 January, 2018, 13:56:53

Guys, I have a question related to pyglui and the Line_Graph class. How do I define the length of the data, and replace the data? I only see an add() function right now. The class has a field for data, but I see that in the init() it is defined as a Cython view array: self.data = view.array(shape=(data_points,), itemsize=sizeof(double), format="d")

user-8779ef 12 January, 2018, 13:57:28

I'm not quite sure if I can modify a view array inside a *.pyx from a separate Python file. Any thoughts on this? Is there an intended approach?

user-9b14a1 12 January, 2018, 14:01:54

@mpk (solved) ... The right side is worse. It's skipping even half images. . . . Finally I could wipe out all turbojpeg problems by migrating into a clean user account. I guess some background processes were slowing down the performance of my default user account.

Chat image

user-9b14a1 12 January, 2018, 14:02:53

Chat image

user-9b14a1 12 January, 2018, 14:04:22

Chat image

mpk 12 January, 2018, 14:05:37

OK. This looks like a faulty USB cable or hub. We can do a repair replacement. Please contact us via email for further diagnosis and coordination!

user-9b14a1 12 January, 2018, 14:11:14

@mpk ok that would be good.

user-8779ef 12 January, 2018, 14:13:26

Who would I talk to for guidance on making subtle modifications to pyglui?

mpk 12 January, 2018, 14:27:46

@user-8779ef Talk to @papr, he has worked on it most recently.

user-8779ef 12 January, 2018, 14:28:30

Great. Thanks. I'll keep an eye out for him.

user-9b14a1 12 January, 2018, 14:43:55

@mpk Finally I could wipe out all turbojpeg problems by migrating into a clean user account. I guess some background processes were slowing down the performance of my default user account (OSX malware).

mpk 12 January, 2018, 14:52:15

@user-9b14a1 great to hear!

user-8779ef 12 January, 2018, 15:51:09

@papr Lots of questions related to line_graphs in pyglui. Currently, I have a bunch of data I've added to my line_graph, but the thing crashes at the call to graph.draw(): File "pyglui/graph.pyx", line 292, in pyglui.graph.Line_Graph.draw TypeError: not all arguments converted during string formatting

user-8779ef 12 January, 2018, 15:51:38

Not the most transparent error. Perhaps I'm not adding the right type of data?

user-8779ef 12 January, 2018, 15:52:44

I'm using system_graphs as a template, but building an analysis plugin to plot gaze velocity data.

user-33d9bc 13 January, 2018, 05:34:54

Is there any age limit

user-33d9bc 13 January, 2018, 05:35:07

For when you can use the eye tracker technology

user-33d9bc 13 January, 2018, 05:35:09

?

mpk 13 January, 2018, 11:22:37

@malkawi#8305 Upward, no. For children we have custom frames. Babies need custom solutions.

mpk 13 January, 2018, 11:25:52

@performlabrit#1552 I would not use the graphs in pyglui but the timelines feature. Check out this: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/offline_surface_tracker.py#L117

mpk 13 January, 2018, 11:26:39

here you can draw a polyline of what you want to show.

user-33d9bc 13 January, 2018, 19:14:30

I meant regarding the IR and its effects.

user-2798d6 14 January, 2018, 00:58:10

Hello - I am trying to work with the audio file I recorded while running Capture. I have exported 4 different "clips" from a full recording, and the video and audio will play on VLC player, but the audio doesn't play on anything else and there is no audio file that got exported. Is there anything I can do about this? I'm trying to see how fixations line up with audio.

user-8779ef 14 January, 2018, 15:35:12

@mpk an example of output with timeline

Chat image

user-8779ef 14 January, 2018, 15:36:50

@mpk Not quite there yet. Need to find a creative solution... it may make more sense to use something similar to the system graphs for this one, so that I can have a variable range of n seconds, and show a dynamic figure that is always focused on t-n/2 through t+n/2 seconds.

user-8779ef 14 January, 2018, 15:46:19

I have been in touch with papr about it.

papr 14 January, 2018, 16:53:08

Yes, we would need a solution, that scales all timelines at the same time, though. I will keep you up-to-date on that matter.

user-8779ef 14 January, 2018, 21:31:34

That makes sense. Thanks, papr.

papr 15 January, 2018, 08:38:22

@user-2798d6 Unfortunately, I cannot tell you why that is. Did you try transcoding it?

user-93f13b 15 January, 2018, 15:30:40

I'm curious if this could be a good option for an eye cam? https://www.spinelelectronics.com/USB_2.0_Camera/2mp_usb_camera_modules/UC20MPD Thinking about taking off most of the IR LEDs because they are overkill. But I'm not sure if Pupil Capture can handle the camera?

papr 15 January, 2018, 15:32:28

@user-93f13b 30 FPS sounds a bit low for an eye camera. But this depends on your use case.

user-93f13b 15 January, 2018, 15:33:02

It is actually 60 fps; I tested it.

papr 15 January, 2018, 15:33:53

Ah, and `320x240 (QVGA) [email removed]`. I did not look at the specifications before.

user-93f13b 15 January, 2018, 15:34:37

I'm more wondering about the software side of it, if it is compatible with the program. The camera switches between IR and normal. Is Pupil Capture able to switch that itself, or would I need to do it another way?

user-93f13b 15 January, 2018, 15:35:04

I tried it, and the program crashes after about 5 minutes of use.

papr 15 January, 2018, 15:37:11

Mmh. So the eye process just uses the grey image of the video frames. As far as I know, our eye cameras have an IR light filter, there is only one mode. I cannot tell you anything about the crash though. Is there a crash log?

user-8779ef 15 January, 2018, 15:47:55

@papr Wouldn't you want to remove the IR light filter in this case? Your cameras operate in IR.

user-8779ef 15 January, 2018, 15:48:10

Ehr, your eye images are in the IR spectrum.

papr 15 January, 2018, 15:50:01

In this case it is a filter that lets through mostly IR light but nothing else.

papr 15 January, 2018, 15:50:57

I am sorry if I did not use the correct terminology

user-8779ef 15 January, 2018, 15:51:49

No prob - just making sure 😃. That's an issue when it comes to hardware - it really narrows camera selection for eye tracking, because most RGB cameras have built-in filters that filter OUT the IR spectrum.

user-8779ef 15 January, 2018, 15:52:10

...at a scale that is really too small to remove manually after production.

user-8779ef 15 January, 2018, 15:52:34

@user-93f13b Make sure you have a look at what I just said regarding IR filters.

user-93f13b 15 January, 2018, 15:59:15

But what would be causing the system to crash? Also, since that camera switches between the filters, is there a way to have it stay in IR mode?

user-93f13b 15 January, 2018, 16:00:56

How would I show the crash log?

papr 15 January, 2018, 16:02:59

There should be a capture.log file in your capture_settings folder. Upload it after the crash without restarting Capture. There might be a low-level libuvc issue specific to your camera, but this is out of reach for us to fix.

user-93f13b 15 January, 2018, 16:04:11

Alright, thanks all for the quick responses. Also, a quick question that I know has been addressed before: should I worry about the amount of IR this thing puts out?

user-8779ef 15 January, 2018, 16:06:17

I think yes. Measure it, and compare to the flux outside on a sunny day.

user-8779ef 15 January, 2018, 16:07:10

Is there a reason the pupil cam isn't sufficient for your needs?

user-8779ef 15 January, 2018, 16:07:21

...just cost saving?

user-93f13b 15 January, 2018, 16:08:12

I am using the DIY kit. And yes, I am using the headset for my senior thesis. The department gives out money for that type of research, and it was not enough for the full headset.

user-8779ef 15 January, 2018, 16:08:31

Ok, makes sense.

papr 15 January, 2018, 16:09:02

Do you know that you can buy the eye cameras separately?

user-93f13b 15 January, 2018, 16:09:44

Yes, I looked into that; the camera itself is still too expensive. It looked to be around 400, correct?

user-93f13b 15 January, 2018, 16:10:40

I am waiting for the HD-6000 to come in. The first one I frankly did not put together well, so it stopped working. I think it was something to do with soldering. But this was a cheap alternative I wanted to try because it looked like it was better than the HD-6000.

user-8779ef 15 January, 2018, 16:11:11

They've just upgraded to a new camera type. Maybe you can talk 'em down on a 120 Hz for student academic use 😛 . (No, I don't work for them.)

user-8779ef 15 January, 2018, 16:11:49

...and don't forget the educational discount.

user-8779ef 15 January, 2018, 16:12:51

( I probably should have said that in a private message :P)

user-8779ef 15 January, 2018, 16:13:03

always worth a shot, though.

user-93f13b 15 January, 2018, 16:13:38

No worries, I don't mind haha

user-93f13b 15 January, 2018, 16:14:15

E0115 11:13:55.623034 13320 trust_region_minimizer.cc:72] Terminating: Residual and Jacobian evaluation failed.

user-93f13b 15 January, 2018, 16:14:29

I am getting that when I run the program.

user-93f13b 15 January, 2018, 16:14:55

Sorry, pretty new to this, and have been having problems getting things working / finding resources to help.

papr 15 January, 2018, 16:16:35

This is not a problem related to the camera. This is output of the 3d model software. It is an indication of a low-level issue if there are no uvc-related errors in the logs.

user-93f13b 15 January, 2018, 16:25:26

It looks as if I keep getting flashes of the IR going on and off, almost. I know this sounds weird.

user-93f13b 15 January, 2018, 16:25:39

This is the line it shows when it crashes.

user-93f13b 15 January, 2018, 16:25:40

E0115 11:25:05.889878 7912 trust_region_minimizer.cc:72] Terminating: Residual and Jacobian evaluation failed.
OpenCV Error: Assertion failed (0 <= _rowRange.start && _rowRange.start <= _rowRange.end && _rowRange.end <= m.rows) in cv::Mat::Mat, file C:\build\master_winpack-build-win64-vc14\opencv\modules\core\src\matrix.cpp, line 483

user-3c2df0 15 January, 2018, 16:39:02

Damn, if we could get that kind of support at Tobii I never would need to use Python, I'm pretty sure lol

papr 15 January, 2018, 16:44:35

@user-93f13b This error message is new to me. It might be that this is a USB transmission issue and that OpenCV crashes when it receives a broken frame. But this is only a vague guess.

user-8779ef 15 January, 2018, 17:06:11

@user-93f13b a general approach to problems like this: make sure all your packages are up to date.

user-8779ef 15 January, 2018, 17:06:28

Especially OpenCV, which seems to be the thing crashing here.

user-d74bad 15 January, 2018, 21:37:36

Recording capture question: are the calibration started and ended events present in recording data? And perhaps is even more specific data present, like calibration position?

papr 15 January, 2018, 22:22:18

@user-d74bad Yes, all this data is present in the pupil_data file as part of the notifications, if you calibrated during the recording. You can also calibrate offline if the recorded video includes the calibration markers or similar things that the subject had to fixate during the recording.
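
If you want to pull those events out of a recording programmatically, a minimal sketch could look like the following (assuming msgpack-python; the 'notifications' key and the calibration subject names are from memory, so verify them against your own pupil_data file):

```python
import msgpack

# pupil_data is msgpack-encoded; use_list=False mirrors Pupil's own loader.
# On older msgpack versions, use encoding="utf-8" instead of raw=False.
with open("recording/pupil_data", "rb") as fh:
    data = msgpack.unpack(fh, raw=False, use_list=False)

# Calibration events are stored as notifications whose subject
# starts with "calibration".
for note in data.get("notifications", ()):
    if note.get("subject", "").startswith("calibration"):
        print(note["timestamp"], note["subject"])
```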

user-d74bad 15 January, 2018, 22:57:50

excellent, thanks

user-6e1816 16 January, 2018, 08:54:32

I have installed pyserial on Windows, but after moving my plugin source code (containing "import serial") into the plugins folder and running the Capture app, there is a warning: world - [WARNING] plugin: Failed to load 'real_time_picture'. Reason: 'No module named 'serial''. How do I solve this problem?

papr 16 January, 2018, 08:59:55

@user-6e1816 The bundle behaves differently than running from source. Please try copying the pyserial module into the plugin folder as well.

user-e938ee 16 January, 2018, 21:24:33

How do I subscribe to 3d pupil detector?

user-e938ee 16 January, 2018, 21:24:49

```cpp
zmq::context_t context(1);
zmq::socket_t subscriber(context, ZMQ_SUB);
zmq::message_t message;
subscriber.connect("tcp://127.0.0.1:50020");

while (1) {
    subscriber.recv(&message);
    cout << std::string(static_cast<char *>(message.data()), message.size()) << "\n";
}
```

user-921ec1 16 January, 2018, 21:28:34

Hey, just wondering if anybody has an E-Prime (or similar, we're not picky) script for inserting triggers/measurement points into the tracker data that they wouldn't mind letting me have a look at. We are building a paradigm involving image displays and need to be able to insert timestamps for picture onset, etc.

papr 17 January, 2018, 08:11:42

@user-e938ee Please be aware that 50020 is the designated Pupil Remote port. You will need a ZMQ_REQ socket to talk to it. You can connect and request the port for the actual subscription socket. See this Python script as reference https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
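
The core of that script translates to something like this (a minimal sketch, assuming pyzmq and msgpack-python and the default port):

```python
import zmq
import msgpack

ctx = zmq.Context()

# 1) Ask Pupil Remote (REQ/REP on port 50020) for the actual SUB port.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# 2) Subscribe to pupil data on the returned port.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:{}".format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, "pupil.")

while True:
    topic, payload = subscriber.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    # Datums produced by the 3d detector carry method == "3d c++".
    print(topic, datum.get("method"), datum.get("confidence"))
```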

papr 17 January, 2018, 08:14:22

@user-921ec1 you can use Pupil Remote as well in your case. Just use it to send custom notifications that contain information about your triggers and their timestamps and add the record key in order for the notification to be stored during a recording.

papr 17 January, 2018, 08:15:35

You should use it to sync clocks between E-Prime and Capture, too.
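
A rough sketch of that flow in Python (the "trigger" subject and "label" field are your own choice, not a fixed schema; only "record" and "timestamp" have special meaning, as far as I remember):

```python
import time
import zmq
import msgpack

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")

# Clock sync: ask Capture for its current time once and keep the offset.
remote.send_string("t")
offset = float(remote.recv_string()) - time.monotonic()

def send_trigger(label):
    note = {
        "subject": "trigger",                    # arrives as "notify.trigger"
        "label": label,                          # custom payload
        "timestamp": time.monotonic() + offset,  # expressed in Pupil time
        "record": True,                          # store it in the recording
    }
    remote.send_string("notify." + note["subject"], flags=zmq.SNDMORE)
    remote.send(msgpack.dumps(note, use_bin_type=True))
    return remote.recv_string()                  # Pupil Remote always replies

send_trigger("picture_onset")
```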

user-84047a 17 January, 2018, 11:20:37

Hi, looking back at offline calibration it seems that some of the plugins are missing? Besides visualisation and analysis, there doesn't seem to be the third section that includes the offline calibration plugin. Does anyone know why this is the case, and how we can find the plugin?

Chat image

papr 17 January, 2018, 11:24:55

@user-84047a I would recommend upgrading to the newest version. 😃

user-8779ef 17 January, 2018, 19:58:46

Hurrrm.... "Exception: pyndsi version is to old. Please upgrade"

user-8779ef 17 January, 2018, 19:59:08

I think I have screwed up my local git repo, so if this isn't a common error, forget it.

user-5d12b0 17 January, 2018, 20:12:51

Even though the application is for hmd-eyes, I'm pretty sure this question is for pupil_capture in general. Is it possible to launch the eye windows already minimized? When stopping play on a Unity scene the eye windows close. When starting play the eye windows open up but take over focus. This causes the Unity application to pause and go to a SteamVR loading screen. Things only start working again when the Unity app is given focus by clicking on it... and for performance reasons we want to minimize the eye windows first before we give Unity focus. It would be easiest if there was an option to open the eye windows minimized.

papr 17 January, 2018, 20:46:25

@user-8779ef pyndsi is one of the dependencies.

papr 17 January, 2018, 21:16:13

@user-5d12b0 I will look into it.

user-5d12b0 17 January, 2018, 21:43:09

@papr , today I finally got around to testing the labstreaminglayer plugin. It works, but there are a couple issues. 1) The metadata isn't in the same format as other streams I was collecting. That might be my fault. I'll look into the metadata standard a little more and make changes to the pupil plugin or my streams, as appropriate. 2) The signals were highly unstable. This might not have anything to do with the plugin and might just be a consequence of me finally looking at the continuous data on a graph for the first time, but it seems like 20% of the samples were low confidence and had incorrect estimates of pupil position. I'll see if I can generate a png really quickly...

user-5d12b0 17 January, 2018, 21:50:46

Recorded with Vive add-on and labstreaminglayer plugin. Values are z-scored.

Chat image

user-8779ef 17 January, 2018, 21:52:43

@papr Yeah, I see that. Trying to upgrade... is it not a pupil module?

user-8779ef 17 January, 2018, 21:54:39

....because pip isn't seeing it.

papr 17 January, 2018, 21:56:59

If pip does not upgrade, use the -U flag. It will force an update.

user-8779ef 17 January, 2018, 21:57:31

sudo pip install pyndsi --upgrade -U ?

papr 17 January, 2018, 21:58:16

No, we did not register the package yet. You will have to use the repository URL as it is described in the docs
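
(Something along the lines of `pip install --upgrade git+https://github.com/pupil-labs/pyndsi` should do it, but please check the docs for the exact URL and any extra flags.)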

user-8779ef 17 January, 2018, 21:58:26

Ahh, Ok, thanks.

papr 17 January, 2018, 21:58:35

--upgrade is the same as -U

user-8779ef 17 January, 2018, 21:58:38

I would have thought downloading the master would have fixed this.

user-8779ef 17 January, 2018, 21:58:53

...forking the master branch would include the dependencies.

user-8779ef 17 January, 2018, 21:59:08

However, this happens with a fresh fork.

papr 17 January, 2018, 22:00:02

No, pyndsi is not part of the Pupil repository. It needs to be installed independently.

papr 17 January, 2018, 22:00:29

Same goes for pyrealsense, pyav, pyglui, etc

user-8779ef 17 January, 2018, 22:00:33

Ok, thanks for helping me out. Sorry for being all amateur-hour over here - still somewhat new to app development.

user-8779ef 17 January, 2018, 22:00:47

Yeah, understood.

user-8779ef 17 January, 2018, 22:01:31

This should be sufficient info for a fix.

papr 17 January, 2018, 22:03:18

You are welcome. We know that the whole dependencies part of the docs is very painful (especially on Windows), but maintaining an automatic installation is not something we can realize right now. We have the bundles if you do not want to go through the manual installation.

user-8779ef 17 January, 2018, 22:04:15

I'm working on it in pycharm so

user-8779ef 17 January, 2018, 22:04:20

I'll deal. Thanks!

user-8779ef 17 January, 2018, 22:04:39

gotta run. Take it easy.

mpk 18 January, 2018, 07:36:16

@user-54a6a8 Looking at the graph I think the data looks explainable. You can ignore all data with low confidence. In the beginning eye0 is not detected, and then we are seeing a few blinks and a few frames with no detection. (Blinks can be identified by low confidence on both eyes simultaneously.)

user-5d12b0 18 January, 2018, 13:09:48

@mpk the x axis is seconds. The ‘beginning’ is actually 35 seconds into the recording. Segments like that were found throughout the data.

papr 18 January, 2018, 13:14:24

@user-5d12b0 Why is the timestamp graph constant?

papr 18 January, 2018, 13:15:53

@user-5d12b0 Could you make such a recording again, but make a Pupil Recording in parallel, to see if the data is transmitted correctly or if the issue is already in Pupil Capture?

user-5d12b0 18 January, 2018, 13:40:35

The time stamps aren’t constant but rising slowly. I z-scored everything so I could see them on the same axes. I’ll do pupil capture in a few hours.

papr 18 January, 2018, 13:41:12

Ah, right, you mentioned that. OK, let's wait for these results then.

user-5d12b0 18 January, 2018, 15:02:30

@papr , is there a way for me to record eye data without doing world capture? With the HMD addon there is no world camera.

papr 18 January, 2018, 15:02:56

Yes, you need to activate the Test Image/Fake Capture

user-5d12b0 18 January, 2018, 15:32:46

I was able to capture it. I loaded it up in pupil_player and it didn't look very good. The result of the hmd calibration was pretty bad too. I'll finish writing my Python script to plot my LSL data and the pupil_data then maybe re-record if necessary.

user-5d12b0 18 January, 2018, 15:40:09

@papr Any sample code on how to read pupil_data in Python? I see from the docs that it's Python-pickled pupil data, but pickle.load(filename) doesn't work. I'll investigate how pupil_player does it on GitHub.

papr 18 January, 2018, 15:41:17

Actually, it is msgpack encoded. You can use this function to read the data: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/file_methods.py#L51-L66
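
For example (assuming pupil_src/shared_modules is on your Python path; the 'pupil_positions' key is from memory, so double-check it):

```python
from file_methods import load_object

pupil_data = load_object("recording/pupil_data")
print(len(pupil_data["pupil_positions"]))  # per-frame pupil datums
```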

user-5d12b0 18 January, 2018, 15:43:43

It works, thanks.

user-5d12b0 18 January, 2018, 16:29:58

@papr : Good news! The data are effectively identical between LSL and pupil_capture.

user-5d12b0 18 January, 2018, 16:30:12

Chat image

user-5d12b0 18 January, 2018, 16:33:02

Those segments of data where confidence was high are segments where I was running the HMD calibration. The data look pretty good except for the one-eyed blinks in eye_1. In between calibrations I didn't take off the HMD. I didn't even stop playing on the unity project. I did, however, use SteamVR to open the desktop view so I could see my desktop through the HMD.

papr 18 January, 2018, 16:34:04

Did you record the eye videos as well by any chance? Could you share them with us?

user-5d12b0 18 January, 2018, 16:35:03

I did, but I'm not sure they are correct. eye0.mp4 is 580 MB and it won't play in any player I tried. Same with eye1.mp4.

user-5d12b0 18 January, 2018, 16:35:16

I couldn't get them to play in pupil_player either.

papr 18 January, 2018, 16:35:36

That is ok. VLC should work though.

user-5d12b0 18 January, 2018, 16:35:50

I tried VLC. Do I need a particular codec pack?

papr 18 January, 2018, 16:36:22

Mmh. Could you upload one of the videos to Google Drive, or something similar?

user-5d12b0 18 January, 2018, 16:36:55

I can. First I'll redo the recording with only ~ 20 seconds of data.

papr 18 January, 2018, 16:37:38

Which version of Capture do you use?

user-5d12b0 18 January, 2018, 16:37:43

1.2

user-5d12b0 18 January, 2018, 16:57:12

This time the videos loaded in VLC. I don't know what happened with the last one. I was also able to view them in pupil_player and do offline detection. Still problematic though. I'm hoping you'll see something obvious in the videos about how wrong my setup is.

user-2798d6 18 January, 2018, 16:59:53

Hello! Is there any data output that shows saccade length (distance and/or duration)?

user-8779ef 18 January, 2018, 17:07:46

@user-5d12b0 As someone patiently awaiting news that HMD calib works, I'm thankful for the work you're putting into this.

user-5d12b0 18 January, 2018, 17:08:50

@papr I just shared the videos with you via google drive, at your pupil_labs address.

user-8779ef 18 January, 2018, 17:08:58

@user-5d12b0 You're in a neuro lab?

papr 18 January, 2018, 17:10:07

@user-5d12b0 great, thank you. I will have a look at it later.

user-5d12b0 18 January, 2018, 17:12:12

@user-8779ef Yes. In Ottawa Canada. We do invasive brain computer interface stuff with applications in Parkinson's and assistive communication for tetraplegics.

user-5d12b0 18 January, 2018, 17:12:57

Are you at RIT? I did my PhD at SUNY Albany with Jon Wolpaw. The group is now called the National Center for Adaptive Neurotechnologies.

user-8779ef 18 January, 2018, 17:13:27

@user-5d12b0 Awesome. Yeah, I worked in Brett Fajen's lab @ RPI, and have interacted with those guys once or twice.

user-8779ef 18 January, 2018, 17:13:52

@user-5d12b0 I now run my own lab at RIT where we study the principles of visually guided action and oculomotor control.

user-8779ef 18 January, 2018, 17:14:13

Hey, one tip for offline pupil detection: with the mobile tracker, the results are MUCH better if you lower the threshold on new eye models.

user-8779ef 18 January, 2018, 17:14:29

(for 3d pupil detection). I lower it to like .992 or so.

user-8779ef 18 January, 2018, 17:14:48

By default, the system switches modes far too quickly, disrupting the track often.

user-5d12b0 18 January, 2018, 17:15:48

Thanks. I'm more interested in online detection, but that applies to online detection as well, I guess.

user-8779ef 18 January, 2018, 17:16:10

yeah, that makes sense. ...and, yes, it would.

user-5d12b0 18 January, 2018, 17:18:53

In your field you probably know Doug Crawford? I'm trying to setup an experiment for a collaboration with him, and for a student in the lab. For the student's experiment we need online detection to progress through the task and give feedback. For the experiment with Doug, we can do offline detection. I'm a bit worried about the file sizes though.

user-5d12b0 18 January, 2018, 17:19:12

Do you plan to record eye videos for all your experiments?

user-8779ef 18 January, 2018, 17:19:22

Yes, I know Doug. I have met him at several conferences, and one of my graduate students recently presented at a workshop he was involved in.

user-8779ef 18 January, 2018, 17:19:43

Data is cheap.

user-8779ef 18 January, 2018, 17:19:50

Hard drives are cheap.

user-8779ef 18 January, 2018, 17:20:08

Yes, this stuff is space hungry, but I'm sure doug can find the scratch for a few drives. 😃

user-8779ef 18 January, 2018, 17:20:21

Yes, I plan to record eye videos for all of them.

user-5d12b0 18 January, 2018, 17:20:53

It's probably a good idea for us too. I guess it's not the storage I'm worried about as much as all the simultaneous things we are doing.

user-8779ef 18 January, 2018, 17:21:02

Yes, I understand.

user-5d12b0 18 January, 2018, 17:21:46

Do you have everything running on one PC?

user-5d12b0 18 January, 2018, 17:22:34

The current plan is to do the VR experiment and stream out task-related data on one PC, then do the neural acquisition and file storage on a second PC.

user-8779ef 18 January, 2018, 17:22:44

Sounds like a good plan.

user-5d12b0 18 January, 2018, 17:22:54

Our VR PC has already overheated a couple times just doing the experiment and the eye tracker.

user-8779ef 18 January, 2018, 17:23:05

I can only speak vaguely, but I believe we had issues recording eye images and doing 3D pupil detection at the same time.

user-8779ef 18 January, 2018, 17:23:41

Not sure where the bottleneck was. Our approach is to use 2D pupil det. during collection, and then 3D post-hoc.

user-8779ef 18 January, 2018, 17:23:47

Offline 3D detection.

user-5d12b0 18 January, 2018, 17:24:31

I might have to plug the eye tracker into the second PC, and shuttle the data from the second PC back to the VR PC for task progression.

user-5d12b0 18 January, 2018, 17:24:53

I have yet to try that. I somehow doubt hmd-eyes supports that right now.

user-8779ef 18 January, 2018, 17:29:30

That's a pupil service task common to the mobile and VR tracker environments, so it might actually work.

user-8779ef 18 January, 2018, 17:29:50

as I understand it, the IPC backbone is well developed, and they have been really careful with timestamps

user-8779ef 18 January, 2018, 17:29:59

...but, ask papr!

user-8779ef 18 January, 2018, 17:30:32

ehr, maybe this isn't IPC, but their networking (which is something else).

user-8779ef 18 January, 2018, 17:30:41

in either case, I believe that aspect of the system is fairly far along.

user-5d12b0 18 January, 2018, 17:32:34

I know it SHOULD work, but using the .unitypackage from hmd-eyes has proven to be difficult. e.g., Whenever we were running pupil_service in a different directory than where it was specified in the Unity inspector, we could still connect to the already running pupil_service but other things wouldn't work. This was fixed by matching the inspector and the actual location of the running service. Sorry I can't remember the details of what wasn't working.

user-8779ef 18 January, 2018, 17:33:22

interesting.

user-5d12b0 18 January, 2018, 17:33:22

I'm letting the student toil away on those problems.

user-8779ef 18 January, 2018, 17:34:06

eeesh. Well sir, good luck. I'll keep an eye on your messages so that I stay abreast of your issues. You're farther along than I am, though.

user-8779ef 18 January, 2018, 17:34:16

Last I checked, the poor 3d calibration dissuaded me from continuing.

user-5d12b0 18 January, 2018, 17:34:25

We aren't doing 3d calibration at all.

user-5d12b0 18 January, 2018, 17:34:48

Only 2D, and then reconstructing the gaze vector from the head pose + the 2D gaze vector.

user-8779ef 18 January, 2018, 17:35:02

ah, OK, nice.

user-8779ef 18 January, 2018, 17:35:27

That's a good way to go. I started down that path, but couldn't find enough info to reconstruct the gaze vector (in a 3rd-party program).

user-8779ef 18 January, 2018, 17:35:49

Issues with Unity's weird frustum. Not an insurmountable obstacle, but it put me off for now.

user-5d12b0 18 January, 2018, 17:36:08

All of our selectable objects have collision spheres around them so we can trigger progression on object selection.

user-8779ef 18 January, 2018, 17:36:27

Great if all your objects are spherical in shape, or if precision isn't a big deal.

user-8779ef 18 January, 2018, 17:36:42

Hey, sorry, but I have to run

user-5d12b0 18 January, 2018, 17:36:48

Thanks for the chat.

user-8779ef 18 January, 2018, 17:36:56

Yes, lets keep in touch.

user-5d12b0 18 January, 2018, 17:36:56

And the helpful info.

user-8779ef 18 January, 2018, 17:37:17

I try, and I want you to succeed, so I'll do what I can.

user-8779ef 18 January, 2018, 17:37:32

FYI, trying to work with Krystel Huxlin @ UofR on a home system for visual rehab.

user-8779ef 18 January, 2018, 17:37:39

needs a tracker

user-8779ef 18 January, 2018, 17:37:55

Want to use the Pupil tracker, but it's not there yet. Using SMI for now. Not a good long-term solution (they were bought by Apple).

user-8779ef 18 January, 2018, 17:40:51

k, chat later, thanks

user-5d12b0 18 January, 2018, 17:40:57

There's Tobii. A bit expensive, and a bit too black box for me, but I might run out of time. Later.

user-8779ef 18 January, 2018, 17:41:03

yeah, forget tobii

user-8779ef 18 January, 2018, 17:41:11

they don't care about researchers

user-8779ef 18 January, 2018, 17:41:41

Black box is right. Their demos at scientific conferences are designed to hide the system's accuracy.

user-8779ef 18 January, 2018, 17:41:54

I always give them guff, they never address the issue.

user-8779ef 18 January, 2018, 17:42:11

I'm not spending 10k+ just to find out if the thing is useable.

user-8779ef 18 January, 2018, 17:42:18

...at least they finally started providing eye images

user-8779ef 18 January, 2018, 17:42:29

still, they lost my faith ages ago.

user-5d12b0 18 January, 2018, 17:42:46

Right. I haven't looked into their API to see what was available. If eye images aren't available then that's a problem. Thanks for your perspective.

user-8779ef 18 January, 2018, 18:37:54

@papr Getting some strange behavior for offline 3d detection in the latest source

user-8779ef 18 January, 2018, 18:38:41

Chat image

user-8779ef 18 January, 2018, 18:39:17

Notice that the eye video is blank, despite the presence of eye videos in the scene insets.

user-8779ef 18 January, 2018, 18:40:04

...and notice that the FPS is 0 in the eye video, and the progress bar is stuck at 0%. Strangely enough, the eye videos still close as if the process did a good job. However, no pupil data is cached.

user-8779ef 18 January, 2018, 18:40:54

Shall I post an issue?

user-921ec1 18 January, 2018, 20:13:17

@papr Thanks for the advice regarding using Pupil Service to send info through to E-Prime. Is there any documentation or a guide at all on how to do that?

mpk 19 January, 2018, 07:58:57

@user-8779ef this was a regression caused by commit https://github.com/pupil-labs/pupil/commit/44e719825ac6b8f615ae791fc6b496783919142c I just fixed it. Please pull master.

papr 19 January, 2018, 08:34:40

@user-8779ef Thanks for catching that 🙂

papr 19 January, 2018, 08:38:01

@user-921ec1 I would start by reading and understanding these pupil-helpers examples https://github.com/pupil-labs/pupil-helpers

papr 19 January, 2018, 13:46:18

@user-5d12b0 Your data looks different to me

Chat image

papr 19 January, 2018, 13:46:44

2d offline detection

papr 19 January, 2018, 13:49:13

This is the Jupyter notebook that I used to visualize the data.

vis_eye_data.ipynb

user-5d12b0 19 January, 2018, 15:05:25

@papr Does doing a 2d offline detection overwrite the pupil_data file?

user-5d12b0 19 January, 2018, 15:05:48

because if so then the zip I sent you has the offline-detected data. Not the online. I'm only interested in the online data.

mpk 19 January, 2018, 15:06:20

Online and offline 2d are the same. Was yours recorded in 3d?

user-5d12b0 19 January, 2018, 15:07:46

I don't even know what that means. How do you record in 3d?

mpk 19 January, 2018, 15:12:57

check detection and mapping mode in the world window general settings.

user-5d12b0 19 January, 2018, 15:17:01

2D

user-5d12b0 19 January, 2018, 15:19:18

The graph you showed is different than the previous graph I posted because I uploaded a new shorter recording than what I used to generate the graph I posted.

user-5d12b0 19 January, 2018, 15:20:28

What you plotted is similar to what I have for the new data. Aren't those extended sections of low-confidence data in eye1 problematic?

user-bb3137 19 January, 2018, 15:22:24

Hi, I want to know if it is possible to subscribe to multiple topics on a single socket. I see that in the source code for Connection.InitializeSubscriptionSocket() there is a subport that is used to initialize sockets, which are then paired with a topic. I believe this is causing Unity to crash when I make multiple calls to this function, as it tries to establish new sockets for topics, but using the same port. What I want to achieve is very simple; I want to be able to subscribe to multiple topics.

user-5d12b0 19 January, 2018, 15:27:05

@papr `100 * np.sum(np.asarray(pupil_data[1]['confidence']) < 0.25) / len(pupil_data[1]['confidence'])` returns 20.510238335011749, i.e. 20% of my eye1 data has low confidence. What am I doing wrong?

mpk 19 January, 2018, 15:31:24

@user-bb3137 You can subscribe to multiple topics on one socket in general. Unity3d-specific questions I recommend asking in the hmd-eyes channel.

mpk 19 January, 2018, 15:31:57

@user-5d12b0 If @papr has your eye data, he will have a look at it on Monday and let you know!

user-5d12b0 19 January, 2018, 15:36:08

He has the eye videos. OK thanks. Do you have any ballpark numbers for what % of the data can be expected to have low confidence? It's not even the %... I'd be fine with 20% if it was exactly every 5th frame. The problem is that I have blocks of low confidence ~ 1-second long. My next idea is that it is related to the strain I'm putting on the system. I will try putting the eye tracker on a different computer.

user-5d12b0 19 January, 2018, 15:36:38

but no time today. Maybe Monday. Thanks again.

mpk 19 January, 2018, 15:37:29

@user-5d12b0 detection requires a decent view of the pupil. Let me check the videos and give you a more specific answer on monday.

user-8779ef 19 January, 2018, 16:19:26

@papr No prob! Another issue: debug mode no longer works for me. It hangs after a folder is dropped on Player. Could this be a config issue?

user-c828f5 19 January, 2018, 16:44:04

Hello, for some reason, the side panel doesn't seem to be working in the pupil player (1.2.7) GUI.

Chat image

user-c828f5 19 January, 2018, 16:44:40

The buttons on the side panel don't bring out the necessary pane for each option. Any thoughts on that? Seems to work fine on Pupil capture.

user-c828f5 19 January, 2018, 16:45:26

Tried re-installing pupil player. Experienced same problem.

user-8779ef 19 January, 2018, 16:51:39

@user-c828f5 Could your pane simply be adjusted to minimum width? Try changing the width by dragging from the small lines halfway down its height, and just outside the black sidepane.

user-c828f5 19 January, 2018, 17:09:20

@user-8779ef I tried all the possible lines to move it around but no luck! Also, it seems I have lost control of the time slider line (the vertical cyan line) using my mouse.

user-8779ef 19 January, 2018, 17:37:51

@user-c828f5 Mac or PC?

user-8779ef 19 January, 2018, 17:38:31

Search for the pupil_player_settings folder and find the player.log. Open it up and dump it here, if it's not super long.

mpk 19 January, 2018, 19:01:00

@Rudra8#0474 just delete the settings file in ~/pupil_player_settings

user-2798d6 19 January, 2018, 21:21:28

Hello! Is there any data output that shows saccade length (distance and/or duration)?

user-8779ef 20 January, 2018, 16:11:13

I'm getting an error when I try and import a module (pandas) into a plugin that is in the pupil_player_settings/plugins folder.

user-8779ef 20 January, 2018, 16:11:46

Is this something to expect - that I can't add modules that pupil isn't compiled with?

papr 20 January, 2018, 16:12:23

If you run from bundle yes

user-8779ef 20 January, 2018, 16:12:37

Hurmn. Bummer.

user-8779ef 20 January, 2018, 16:12:46

I'll see if I can convert my code to run on numpy only.

user-8779ef 20 January, 2018, 16:12:54

wrote it in a notebook

papr 20 January, 2018, 16:13:12

Or run from source. That should work as well.

user-8779ef 20 January, 2018, 16:13:42

Yeah, but I want to give this to a non-CS undergrad or two to use for analysis. Best to hide that from them.

papr 20 January, 2018, 16:13:58

I see.

user-8779ef 20 January, 2018, 16:14:05

thanks for the input, though

user-8779ef 20 January, 2018, 16:39:38

Is scipy a part of the package?

user-8779ef 20 January, 2018, 16:40:40

yes, yes it is.

user-8779ef 20 January, 2018, 16:43:28

@papr Still having issues running in debug mode. It hangs on loading the folder. Have any ideas?

papr 20 January, 2018, 16:44:00

What debug mode do you mean?

user-8779ef 20 January, 2018, 16:44:04

this is using pycharm on a mac.

user-8779ef 20 January, 2018, 16:44:32

I realize that this may be a pycharm question. Perhaps I should erase a folder or settings file or something

user-8779ef 20 January, 2018, 17:00:21

it hangs on ...

user-8779ef 20 January, 2018, 17:00:22

player - [WARNING] camera_models: Loading dummy calibration
player - [INFO] gaze_producers: Calibrating "1" in 3d mode...

user-8779ef 20 January, 2018, 17:00:38

Strange, no?

papr 20 January, 2018, 17:16:28

Did you check if you have a breakpoint somewhere that you are not aware of?

user-8779ef 20 January, 2018, 17:18:30

yeah, no bp

user-2798d6 20 January, 2018, 17:34:35

Hello - The side pane with plugins will not expand. I've re-downloaded the software and I've deleted the settings, but no luck. Is there anything else I can try?

user-33d9bc 20 January, 2018, 17:36:31

May I ask if there is a research channel? If not, can we create one?

papr 20 January, 2018, 17:39:10

@user-2798d6 deleting the settings should help. The next release will have a fix for that issue

papr 20 January, 2018, 17:40:12

@wrp I think @user-33d9bc has a great point. What do you think?

user-2798d6 20 January, 2018, 18:02:12

ok, thanks! Also, is there any part of the raw data output that gives info about saccade length or duration?

papr 20 January, 2018, 18:04:12

Not yet. I have been working on a saccade detector but it is not finished yet

user-8779ef 20 January, 2018, 18:05:12

@user-2798d6 I may be able to provide one ... depending on issues related to an NDA. I'll text you if it works out, but don't hold your breath.

user-2798d6 20 January, 2018, 19:04:50

Ok, thanks everyone! 😃

user-8779ef 20 January, 2018, 19:25:55

@papr for inspiration on the timeline class, you might look at Bokeh.

user-8779ef 20 January, 2018, 19:26:31

@papr ...where you associate a ColumnDataSource and style with each glyph / timeline per figure.

user-921ec1 21 January, 2018, 21:17:40

@papr the link to your example client is broken, is there a working one I could check out? https://github.com/pupil-labs/hmd-eyes/tree/master/hmd_calibration

papr 21 January, 2018, 21:19:52

@user-921ec1 My link should point to a repository containing multiple examples. These are minimal examples, not full clients. These are independent of hmd-eyes but use the same network interface.

user-921ec1 21 January, 2018, 21:21:01

@papr but the link is 404'd?

papr 21 January, 2018, 21:22:22

Your link is not the link I posted. These are the examples: https://github.com/pupil-labs/pupil-helpers

user-921ec1 21 January, 2018, 21:23:37

Oh I see what you mean, cheers!

user-921ec1 21 January, 2018, 21:25:18

@papr Do you happen to have a Python program with the Pupil Labs tracker code in it? It would be incredibly helpful to see/reverse-engineer a functioning paradigm (I'm not too terribly programming-savvy).

papr 21 January, 2018, 21:30:53

Do you mean like a user program that fully integrates Pupil Remote?

user-921ec1 21 January, 2018, 21:44:49

Yeah 😃

user-8deb3b 22 January, 2018, 01:07:57

Hi everyone! Does anybody know how to integrate Pupil Labs with LSL?

user-8779ef 22 January, 2018, 03:00:15

@user-8deb3b @user-54a6a8 has been working on this lately.

wrp 22 January, 2018, 04:28:44

@user-8779ef @user-8deb3b - https://github.com/sccn/labstreaminglayer/tree/master/Apps/PupilLabs

wrp 22 January, 2018, 04:29:59

@user-33d9bc @papr I created 🔬 research-publications as a place for the community to share their research and publications

user-8779ef 22 January, 2018, 14:15:50

Thanks, wrp

user-8779ef 22 January, 2018, 14:16:20

So, is anyone here developing on a Mac? I'm having serious issues with debugging, regardless of the IDE.

user-8779ef 22 January, 2018, 14:17:19

https://github.com/pupil-labs/pupil/issues/1029

user-8779ef 22 January, 2018, 14:17:45

Sadly, this problem seems quite esoteric. Three IDEs, and not one will allow me to properly debug Pupil Player.

papr 22 January, 2018, 14:17:50

Is this issue specific to Player or does it appear in Capture as well?

user-8779ef 22 January, 2018, 14:17:58

I can try it with capture later today, papr.

user-8779ef 22 January, 2018, 14:18:41

Notice that Pycharm and VS Code fail in different ways, but both during calibration

user-8779ef 22 January, 2018, 14:18:51

and <sigh>, only in debug mode.

user-8779ef 22 January, 2018, 14:19:33

Wing actually doesn't crash, but it also ignores the breakpoints I'm placing in my submodule. Someone get me some scotch.

user-8779ef 22 January, 2018, 14:20:25

So, I'm stuck in the mud. Here's what I haven't explored yet: I've tried this only with one machine, and only one video file, although I have tried deleting the offline data.

user-8779ef 22 January, 2018, 14:20:38

I'm going to try a different machine and video file later today.

user-8779ef 22 January, 2018, 14:20:52

...it will also be a mac running high sierra, though.

papr 22 January, 2018, 14:30:07

@user-8779ef After a short test, I can only tell that PyCharm uses 100% of my cpu to run Pupil Player in debug mode. I think that this is mostly a speed issue.

papr 22 January, 2018, 14:31:59

I know, it is far from nice, but maybe you can get further by simply using Sublime Text + manual breakpoints using https://docs.python.org/3/library/pdb.html
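
I.e. something like:

```python
# Pause execution right here and drop into the pdb prompt.
import pdb; pdb.set_trace()
```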

user-8779ef 22 January, 2018, 14:42:50

@papr I have successfully debugged in PyCharm without issue for weeks, until the latest commit. Also, after 3 IDEs failing me, this tells me that the issue may lie elsewhere. Of course, I have no idea how you guys would go about addressing the possibility that something in Pupil is at fault.

user-8779ef 22 January, 2018, 14:43:57

I'm vocal because I'm wondering if it's just me, or if others have the same issue. This issue has eaten up a lot of my time, lately.

user-8779ef 22 January, 2018, 14:44:33

If nothing else provides more details, I'll try Sublime Text.

papr 22 January, 2018, 14:50:09

My colleague @marc uses PyCharm regularly and has not encountered any issues yet.

user-8779ef 22 January, 2018, 15:04:25

@papr Thanks for inquiring. Mac, or PC?

papr 22 January, 2018, 15:06:43

I used it on my Macbook Pro (Late 2015) with High Sierra

papr 22 January, 2018, 15:07:04

My colleague uses it on a Linux machine.

user-78dc8f 22 January, 2018, 15:35:58

Hi All. We have some data from a session where we did the calibration, stopped the recording (but did not move the trackers), and then started the data collection session. Is it possible to combine these recordings or somehow tell pupil player to apply the calibration from the first segment to the second video segment?

papr 22 January, 2018, 15:39:08

@user-78dc8f There is an open issue for this feature request: https://github.com/pupil-labs/pupil/issues/1003 Unfortunately, this is not a trivial problem to solve.

papr 22 January, 2018, 15:39:20

In other words, this is not possible yet.

user-78dc8f 22 January, 2018, 15:43:29

@papr thanks for the link. Any idea on the timescale for cracking this? And can we link into this thread to follow updates?

user-78dc8f 22 January, 2018, 15:44:35

@papr Sadly, we have about 90 participants worth of data that has this issue since our team in India didn't realize this was an issue until just recently...

papr 22 January, 2018, 15:52:47

@user-78dc8f I would classify this as a mid- to long-term issue. There are a lot of issues that have higher priority to us due to the number of people affected by them.

mpk 22 January, 2018, 15:54:44

@user-78dc8f I'm sorry to hear that!

mpk 22 January, 2018, 15:55:07

I'm not sure how we can best help, since this is not an easy problem to solve.

user-78dc8f 22 January, 2018, 15:55:11

@papr thanks for the info. If we can tip the balance toward 'mid' that would be brilliant. Here's the basic problem...we hook up a mom with the eye-tracker, then we have to set up the child. This can take a while, and we were having storage and over-heating issues. So we started stopping the videos in-between as a solution. And now we realize this was a problematic solution...

user-78dc8f 22 January, 2018, 15:56:15

We have lots of data from our UK sample to process, so we can wait a bit. So if 'mid' means a few months, that would be brilliant.

mpk 22 January, 2018, 15:56:50

@papr I think we could save the gaze mapper config in offline_data and could then move this to a different recording...

papr 22 January, 2018, 16:03:38

Alternatively, you could manually merge the recordings, most importantly the videos and their timestamp files. But I cannot tell you for sure that this would work.

user-78dc8f 22 January, 2018, 16:05:06

both ideas sound promising...any way we can help test these possibilities?

papr 22 January, 2018, 16:11:04

We are not able to provide the manual merging solution. You would need to test this on your own. You probably can concatenate the videos using https://trac.ffmpeg.org/wiki/Concatenate The timestamp files are just numpy arrays that can be concatenated as well.
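
For the timestamp half, a sketch along these lines should work (paths are illustrative; the video halves would be joined separately with ffmpeg's concat demuxer as linked above):

```python
import numpy as np

ts_a = np.load("recording_a/world_timestamps.npy")
ts_b = np.load("recording_b/world_timestamps.npy")

# Timestamps are monotonic within a session, so plain concatenation
# should be enough as long as both parts share the same time base.
np.save("merged/world_timestamps.npy", np.concatenate([ts_a, ts_b]))
```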

user-78dc8f 22 January, 2018, 16:13:19

@papr Alrighty. I'll give this a try in the next few weeks. Basic idea is to just force them together and then see what Pupil Player does, yes?

papr 22 January, 2018, 16:13:34

Exactly.

user-78dc8f 22 January, 2018, 16:14:47

@papr @mpk ok. I'll give the merge idea a try. If that doesn't work, perhaps you can keep mulling over the idea of saving the gaze mapper config and moving this to a different recording...

user-8779ef 22 January, 2018, 16:20:51

@user-78dc8f I've put my two cents in as well, that this would be a valuable feature. Especially to the developmental context. Getting multiple calib. sequences out of a child is practically impossible. Considering the number of develop. psychologists that would love to update their eye trackers...

user-78dc8f 22 January, 2018, 16:22:31

@user-8779ef Excellent point. I hadn't even gone there yet. Yes, multiple calibration runs with a child would be impossible...I can see the value of applying a previous calibration. Even if it isn't perfect, it's better than what we have currently (which is often nothing...)

user-8779ef 22 January, 2018, 16:22:54

Which tracker are you using now? Pos Sci?

user-78dc8f 22 January, 2018, 16:25:08

@user-8779ef eyelink II in the lab. pupil labs in the home...

papr 22 January, 2018, 16:26:42

Are you able to get kids to wear the EyeLink II? This is impressive. In the images it looks very heavy...

user-78dc8f 22 January, 2018, 16:27:16

@papr No. We use a remote system in the lab....

papr 22 January, 2018, 16:28:32

Ah, I see. Thanks for the clarification.

user-78dc8f 22 January, 2018, 16:42:51

@papr @user-8779ef @mpk Thanks for the input. We'll try the concatenate approach and report back soon...

user-8779ef 22 January, 2018, 16:43:20

@user-78dc8f Please do report back. Thanks.

papr 22 January, 2018, 16:44:39

@user-78dc8f Please do so! Preferably in the issue linked above, such that we have a persistent reference for other users.

user-78dc8f 22 January, 2018, 16:45:02

@papr yep. Got it...

user-02ae76 22 January, 2018, 17:27:01

Hey, wondering if anyone has had success using the off-screen calibration markers. I've had a lot of trouble using printed markers, and have found that the system continually tells me I haven't calibrated enough markers (I've done upwards of 20) - the calibrations I have managed to get with it were extremely subpar. Wondering if anyone has figured out the best method for presenting markers or whether this functionality is just not ideal.

papr 22 January, 2018, 17:32:11

@user-02ae76 You only need to print one marker. The calibration should look similar to what can be seen in the beginning of the Offline Calibration tutorial: https://www.youtube.com/watch?v=lPtwAkjNT2Q

papr 22 January, 2018, 17:35:42

@user-02ae76 Alternatively, you could share a small recording with us and we could have a look at it and maybe provide ideas for improved detection.

user-02ae76 22 January, 2018, 18:20:44

I have tried it this way; my issue is that when I try to calibrate online, it runs into issues as I am moving the marker. I may share a recording soon to get advice.

user-02ae76 22 January, 2018, 18:22:27

Does off-screen just work better when offline?

user-8779ef 22 January, 2018, 18:29:22

@papr That video is really helpful. Thanks!

user-8779ef 22 January, 2018, 18:29:51

@papr You'll get much better results if you keep the marker stationary, and have the person engage in VOR.

user-8779ef 22 January, 2018, 18:30:15

You'll have less gaze/object error from biology, independent of the Pupil tracker, because the movement of the target in the head frame is voluntary and predictable. Jeff Pelz does this and calls it the "head tic" method.

user-8779ef 22 January, 2018, 18:31:33

"Head tick" because he also had them move in small discrete increments.

user-02ae76 22 January, 2018, 18:45:38

@user-8779ef can you clarify the stationary method? Would you simply tack the marker somewhere and have the person keep their gaze on the marker while moving their head to different angles?

user-8779ef 22 January, 2018, 19:06:57

@user-02ae76 Yep. That's the only difference.

user-8779ef 22 January, 2018, 19:07:53

@user-ba183b ....and be sure to have them move their head somewhat slowly, and try not to blink 😃

user-02ae76 22 January, 2018, 19:53:14

@user-8779ef I appreciate the advice! I had seen someone mention that method on the Google a few years back and wasn't sure if updates had made it less ideal. Definitely will try doing it with a stationary marker!

user-02ae76 22 January, 2018, 19:54:08

Is there any more documentation that you know of on the "head tic" method?

user-8779ef 23 January, 2018, 00:16:09

@arispawgld#8014 No, but I'll try and remember to ask Jeff tomorrow, or the next time I see him (we're both at RIT)

user-8779ef 23 January, 2018, 00:18:15

@user-02ae76 One limitation of both approaches is that they assume you are always looking at the marker, and never blink. In truth, I would imagine folks might accidentally look away, or blink. Not sure how pupil handles these exceptions. Drops in pupil tracking confidence might deal with blinks. No way to tell when the person is not looking at the marker, but there are algorithms to detect outliers, like RANSAC.

papr 23 January, 2018, 08:36:46

We filter low confidence pupil positions before calibrating.
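
In effect something like this (the 0.6 threshold is just an illustration, not the exact value we use):

```python
def filter_low_confidence(pupil_positions, threshold=0.6):
    # Drop datums the detector itself was unsure about.
    return [p for p in pupil_positions if p["confidence"] > threshold]
```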

user-5d12b0 23 January, 2018, 14:40:32

@papr , did you get the chance to look at the eye videos I shared with you? Is there anything obvious I'm doing wrong? If it's just a settings thing, can you recommend some tips on "things to look for" while I tweak the settings? e.g., the pupil debug window which I have no idea how to interpret.

papr 23 January, 2018, 14:54:45

Mmh, generally it looked good. My guess is that these sections of low confidence are due to bad contrast. I would suggest either increasing the eye cameras' exposure times or increasing the gamma/gain/contrast values.

user-5d12b0 23 January, 2018, 15:00:22

Thank you, I will try. I think the exposure times are already at their upper limit for 90 fps. But I can reduce the fps if it will lead to overall better tracking.

papr 23 January, 2018, 15:02:42

Then I would recommend the second option. The tracking is already good most of the time. This is just about optimizing the last few percent.

mpk 23 January, 2018, 15:02:52

@user-5d12b0 one more option is to play with gain, gamma and contrast.

user-5d12b0 23 January, 2018, 15:54:38

@papr @mpk Sorry for the late reply, I'm working on something else at the moment... What is a good way to monitor pupil tracking performance, without doing a full recalibration? What should I look for in pupil_capture as a good indicator that things are working well?

papr 23 January, 2018, 15:56:08

There is a confidence graph for each eye in the Pupil Capture world window.

mpk 23 January, 2018, 15:56:22

also available in the pupil data stream on the IPC

user-02ae76 23 January, 2018, 16:37:15

Hey guys, running into a bug with the debugging window - it stays black even though previously it worked. Already tried reinstalling; should I raise an issue on GitHub?

Chat image

papr 23 January, 2018, 16:39:04

@user-02ae76 This is the 2d detector debug window. This is supposed to look like that. I guess you expected the debug window for the 3d mode?

papr 23 January, 2018, 16:40:05

@user-02ae76 You can change the detection mode in the general settings of the World window.

user-02ae76 23 January, 2018, 17:00:37

@papr I feel silly, thank you!

user-02ae76 23 January, 2018, 17:03:56

Might be a silly question, but in what settings would one use the 2D detection? I assume it would operate more like screen based ET, so I wonder what advantages there are within headset models.

papr 23 January, 2018, 17:06:49

The 2d calibration is more precise than 3d but way more prone to slippage. 2d/3d is just the pupil detection and mapping method. It does not matter if you are in a VR or a flat screen based setting.

user-02ae76 23 January, 2018, 17:26:11

Okay, I was misinterpreting what the dimension referred to. I'll experiment with both and see what works best for us, thanks again for the help!

papr 23 January, 2018, 17:26:31

You are welcome 🙂

user-8779ef 23 January, 2018, 21:34:51

@user-5d12b0 If exposure times are at their upper limit, then you are going to get motion blur.

user-8779ef 23 January, 2018, 21:35:46

@user-5d12b0 I mean, that's generally the case with cameras. I can't really say how bad it is with the pupil system because I haven't played with it. Increase the ambient light levels in the room to get better image quality with lower exposure times.

user-8779ef 23 January, 2018, 21:37:31

@user-5d12b0 Actually, I'm second guessing my suggestion to increase ambient light levels. We're dealing with the IR camera here, so the ambient light levels (which are mostly in the visible range) shouldn't play much of a role.

user-f1eba3 23 January, 2018, 21:39:55

Hi guys, I just found out about this platform and I'm thinking of integrating it in my Bachelor thesis. So my quick question is: can you integrate it with Unreal Engine? Has anyone done a project with it + Unreal?

user-8a8051 24 January, 2018, 00:25:19

Hi guys, I opened a new recording in Pupil Player this morning and the clickable sidebar buttons would not open their respective menus. I attempted to reinstall Player and Capture, and when attempting to open the recordings they are not valid.

user-8a8051 24 January, 2018, 00:25:53

Recordings were made using Pupil Mobile, and Capture and Player are installed on my Mac.

user-8a8051 24 January, 2018, 00:27:26

macOS Sierra version 10.12.6

user-8a8051 24 January, 2018, 00:28:12

Any ideas would be greatly appreciated, thanks.

wrp 24 January, 2018, 00:39:57

@user-8a8051 running latest version of Pupil software?

user-8a8051 24 January, 2018, 00:40:06

yes

wrp 24 January, 2018, 00:40:11

@user-8a8051 Could you restart with default settings?

wrp 24 January, 2018, 00:40:20

From the general menu

wrp 24 January, 2018, 00:42:14

@user-f1eba3 we maintain a Unity3d plugin. There is no official support for unreal. However you can subscribe to messages over the network and develop your own plugin for Unreal - this would be a solid contribution 😄

user-f1eba3 24 January, 2018, 00:44:05

I'm going to do a little bit of research on integrating third-party software in Unreal. My professor told me we would be able to work with Pupil from March, so if we decide on this I will definitely try to integrate it.

user-8a8051 24 January, 2018, 00:45:20

@wrp Initially, even the general menu would not open. After the reinstall, the initial grey screen (the 'drop a recording' screen) remains and says "oops, that was not a valid recording".

user-f1eba3 24 January, 2018, 00:46:01

Do you think it is feasible to develop such a plugin in a month @wrp ?

wrp 24 January, 2018, 00:46:23

@user-8a8051 Please delete the pupil_player_settings directory in your home directory.

wrp 24 January, 2018, 00:47:02

@user-8a8051 you can direct message me with a link to a small sample recording that is not working for you and I can take a look

wrp 24 January, 2018, 00:48:07

@user-f1eba3 it really depends on what you want to do and your familiarity with the dev environment/coding for unreal

wrp 24 January, 2018, 00:48:17

I don't have any insight into Unreal

user-f1eba3 24 January, 2018, 00:50:13

We want to integrate it in this project : http://robcog.org/. I'm still somewhat of a noob even though I have developed one or 2 things in Unreal. I'm betting some colleagues could help me on this if I can figure out exactly what is to be done. 😄

user-2798d6 24 January, 2018, 02:26:46

Hello! Is there a plugin or some way to add a time bar in Player rather than or in addition to the frame bar?

wrp 24 January, 2018, 03:24:48

[email removed] - I wanted to let you know that we created a new repo pupil-community that will serve as a place to share community contributed projects, forks, plugins, and scripts for Pupil. https://github.com/pupil-labs/pupil-community

wrp 24 January, 2018, 03:25:20

If you would like to add your work (or edit) please fork this repo and make a PR to the README file 😄

wrp 24 January, 2018, 03:26:37

I know there are a lot more plugins, custom scripts, and projects out there, and it would be great to have these all in one place so that the community can build on top of your work and contribute back.

user-e7102b 24 January, 2018, 04:45:48

Hello, I'm attempting to read pupil info into MATLAB using the scripts posted by @user-ed537d in https://github.com/matiarj/pupil-helpers/commit/8d25d75645cf53082a423943ed79418d1563472b . I'm able to get the Python server (pythonPupil.py) to connect to Pupil Capture (I see a stream of "Average Sent..." messages in the terminal), and I'm able to run PupilNetwork2.m in MATLAB and receive a steady stream of tracker data (e.g. Eye tracking data: [0.46, 0.10], 1.00) at the MATLAB command line. However, the gaze coordinates/confidence messages I'm seeing at the MATLAB command line are not changing in value at all when I wear the eye tracker and start the acquisition, suggesting something may be up with the connection. I'm not seeing any error messages, so I'm not sure what the solution is. Does anyone have any suggestions? I'm running Python 3, MacBook Pro (Sierra), MATLAB 2017b. Any advice would be much appreciated! Thanks!

user-3c2df0 24 January, 2018, 05:41:36

Can anybody show me how accurate this device is? :O

user-3c2df0 24 January, 2018, 05:42:39

Sorry, my bad. Software, hardware... Pupil as a whole, I mean.

wrp 24 January, 2018, 05:43:05

Hi @user-3c2df0 gaze accuracy with 2d mode (in ideal circumstances) is 0.6 degrees, with 3d mode you should be able to achieve 1.4 deg of accuracy

user-3c2df0 24 January, 2018, 05:43:28

Is there a video I can see?

user-3c2df0 24 January, 2018, 05:44:04

Or do you mind sharing your screen a minute so I can see, while you say: I'm looking at left, right, left, right?

user-3c2df0 24 January, 2018, 05:44:46

Or, if you have the information, rather in pixels than degrees?

wrp 24 January, 2018, 05:45:19

@user-3c2df0 you might like to check out some community contributed demos to get a qualitative idea of accuracy: https://www.youtube.com/watch?v=X_BalnBOcpk&list=PLi20Yl1k_57pr6zl9D6JHSrOWyLXxsTQN

wrp 24 January, 2018, 05:45:40

👆 is a playlist of community contributed demo videos

wrp 24 January, 2018, 05:46:54

We give gaze accuracy in degrees because this is the industry standard, but also because the resolution of the world/scene camera is variable (e.g. you can change the resolution of the world/scene camera).

wrp 24 January, 2018, 05:47:03

I hope this is helpful

user-3c2df0 24 January, 2018, 05:48:10

Woooah, is that person in the video interacting with multiple screens? Now I'm asking myself: BUT how does it work? Receivers on monitors and senders on the face?

user-3c2df0 24 January, 2018, 05:48:44

This is absolutely awesome x"D

user-3c2df0 24 January, 2018, 05:49:46

Or am I imagining things and... the device is simply an eye tracker, not even hooked up to the computer? x"D

user-3c2df0 24 January, 2018, 05:49:57

(not interacting*)

user-3c2df0 24 January, 2018, 05:53:59

Sorry, not understanding the concept; I don't live in the future yet 😮

user-3c2df0 24 January, 2018, 06:10:25

Ok, so this is basically open source... so I can basically hook up as many devices as I want ._.

user-3c2df0 24 January, 2018, 06:10:35

WTF ❤

wrp 24 January, 2018, 08:07:29

@user-3c2df0 You need to hook up Pupil hardware to a computing device (e.g. laptop, desktop, or Android device running Pupil Mobile) in order to power the Pupil Headset and get data (video frames) from the Pupil headset. Pupil Capture is the software we have developed for real-time pupil detection, gaze estimation, surface detection, and more. You can subscribe to Pupil's data stream over the network. So, yes, you can hook it up to other computing devices that are connected on the same network.

wrp 24 January, 2018, 08:08:01

For more information on the API/network communication please see: https://docs.pupil-labs.com/#interprocess-and-network-communication

user-8779ef 24 January, 2018, 13:52:04

@ÐarkMøðns#4964 It's good to be excited, but realize that the device only gives you a crosshair on an image after calibration. As the video demonstrates, we make many eye movements a second. If you want more information, like WHAT is under the crosshair, you will have to manually label it, or develop an algorithm to analyze the image data, or one that communicates between the eye tracker and your rendering software (e.g. Unity). In either case, getting accurate data takes time, expertise, and patience.

mpk 24 January, 2018, 14:16:05

@user-3c2df0 You can also use our fiducial marker tracker to auto-locate the screen in the video and map gaze onto that.
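
A rough sketch of the receiving side (assumes a surface has been defined in Capture and uses the same REQ/SUB pattern as the pupil-helpers examples; the "gaze_on_srf" and "norm_pos" keys are from memory, so verify them against the live stream):

```python
import zmq
import msgpack

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")
remote.send_string("SUB_PORT")
sub_port = remote.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, "surface")  # matches "surfaces.<name>"

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    for gaze in datum.get("gaze_on_srf", ()):
        x, y = gaze["norm_pos"]  # normalized coordinates within the surface
        print("gaze on {}: ({:.2f}, {:.2f})".format(datum.get("name"), x, y))
```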

user-02ae76 24 January, 2018, 15:02:34

Is there any way to use an online calibration as a basis for offline? i.e. start with the online calibration and adjust from there.

user-f1eba3 24 January, 2018, 15:19:53

So I'm thinking of making a plugin for Unreal Engine. Now I'm researching the approach.

user-f1eba3 24 January, 2018, 15:20:00

Should I Build

user-f1eba3 24 January, 2018, 15:21:13

Server publishing the eye tracking data over the network. Is there something working in this area, and I just need to subscribe, or do I have to implement the publisher as well, in the SDK?

user-f1eba3 24 January, 2018, 15:21:16

Or

user-f1eba3 24 January, 2018, 15:21:31

Client (Unreal Engine Plugin) subscribing to server and accessing the data.

papr 24 January, 2018, 15:22:05

@user-f1eba3 Pupil Capture publishes all data via a zmq.PUB socket. You just need to subscribe to this socket to receive the data that you need.

papr 24 January, 2018, 15:22:37

I would suggest having a look at the https://github.com/pupil-labs/hmd-eyes/ project.

user-2798d6 24 January, 2018, 17:02:50

Hello! Is there a plugin or some way to add a time bar in Player rather than or in addition to the frame bar?

papr 24 January, 2018, 17:47:56

@user-2798d6 what data should the timebar show?

user-2798d6 24 January, 2018, 19:10:38

@papr - minutes and seconds into the recording, and down to milliseconds would be awesome!

papr 24 January, 2018, 19:18:10

@user-2798d6 understood. There is an open issue for that. I cannot tell you if it will be solved with the next release, but within the next two for sure.

user-2798d6 24 January, 2018, 19:19:51

awesome! Thank you @papr!

user-02ae76 24 January, 2018, 19:20:27

@papr Wondered if you could help with an issue: we have been working on using manual marker calibration and ran into an issue where the calibration would start and then immediately end. We have tried restarting, etc. Running on a MacBook Pro.

user-c828f5 24 January, 2018, 20:58:38

Hey guys, so I recorded data post calibration and the 'gaze from recording' is accurate. However, when I calibrate the recording offline, the calibration is completely off.

The offline pupil detection worked very well.

user-c828f5 24 January, 2018, 20:58:47

Has anyone experienced this?

user-c828f5 24 January, 2018, 23:39:59

There also happens to be another bug wherein the calibration status remains stuck at 99%.

The last message from pupil is:

player - [INFO] gaze_producers: Calibrating "1" in 3d mode... Ceres Solver Report: Iterations: 23, Initial cost: 4.722104e-01, Final cost: 1.205650e-02, Termination: CONVERGENCE

Does this mean it has applied the calibration to all the gaze samples?

Chat image

user-8779ef 25 January, 2018, 00:45:15

@user-c828f5 They have fixed this one and committed the fix to the main branch. It will be incorporated into the next release. ...and yes, it is just an issue with the text update. It may also prevent writing to disk.

wrp 25 January, 2018, 02:04:11

@user-02ae76 Can you make a short sample recording that demonstrates this behavior where calibration is included in the recording (start recording first then start calibrating)? I see your issues now on github as well.

wrp 25 January, 2018, 02:09:39

@user-c828f5 Mapping is complete at 99% as @user-8779ef notes

user-2798d6 25 January, 2018, 03:19:08

Hello - Is there a way to remove one of the eye videos after the recording? I recorded with both eye cameras and discovered after the fact that the right eye did not get a good read, but the left eye did. I've tried messing with a few things in the offline pupil detection process, but it's not working. Is there a way to only use the left eye or does that compromise the integrity of the recording?

mpk 25 January, 2018, 05:37:16

@user-2798d6 have you tried just removing the eye video from the recording?

papr 25 January, 2018, 10:04:01

@user-2798d6 The issue for reference: https://github.com/pupil-labs/pupil/issues/949

user-8779ef 25 January, 2018, 14:05:38

@user-c828f5 @wrp The relevant (closed) issue. https://github.com/pupil-labs/pupil/issues/1008

papr 25 January, 2018, 18:15:09

@user-2798d6 Ready for you in the upcoming release: Time-based seek controls.

Chat image

user-2798d6 25 January, 2018, 18:20:25

@papr THANK YOU! That's perfect!

user-2798d6 25 January, 2018, 18:21:49

@mpk - will removing the eye video from the screen actually remove it from being considered for fixations and such? I'm not getting a read on fixations for a portion of my video because (I assume) one eye has a 0.99 confidence but the other is around 0.20 or lower.

papr 25 January, 2018, 18:22:49

@user-2798d6 He meant to delete the actual eye video file. Renaming it should do the trick as well.

user-2798d6 25 January, 2018, 18:25:22

Oooooh ok - that makes sense. I'll give it a try!

user-02ae76 25 January, 2018, 18:29:05

Is there any way to update pupil without entirely reinstalling?

papr 25 January, 2018, 18:31:36

@user-02ae76 If you run from source, you just need to pull the changes from the GitHub master branch. The bundled application needs to be downloaded anew for each release. Sometime in the future, the application will notify you if an update is available.

papr 25 January, 2018, 18:32:12

You usually do not need to reinstall, as I mentioned in your github issue. Deleting the correct settings will do the trick.

user-02ae76 25 January, 2018, 18:39:22

@papr thanks, I was actually wondering whether that would work!

user-8779ef 25 January, 2018, 19:16:49

@papr The addition of time is great. Any idea when the next release is coming?

papr 25 January, 2018, 19:17:15

Hopefully next week 🤞

user-8779ef 25 January, 2018, 19:17:20

There have been lots of good updates since the last one. 😃

user-8779ef 25 January, 2018, 19:20:08

BTW, positive results with a new tracker in 120 Hz mode

user-8779ef 25 January, 2018, 19:20:32

We are gearing up to record about 40 subjects' worth of data, 30 mins each.

papr 25 January, 2018, 19:21:26

That's a lot of data...

user-8779ef 25 January, 2018, 19:21:32

YEP.

user-8779ef 25 January, 2018, 19:21:54

I have my ISI detector working well, although it's not streamlined as a plugin due to my issues with the debugging mode.

user-8779ef 25 January, 2018, 19:22:09

I'm half convinced these issues are related to multiple Python environments, but that's a hard thing to debug.

papr 25 January, 2018, 19:23:04

Which algorithm did you implement for the ISI detector?

user-8779ef 25 January, 2018, 19:23:09

My own.

user-8779ef 25 January, 2018, 19:23:28

It applies binary filters on velocity, rising acceleration, and falling acceleration.

user-8779ef 25 January, 2018, 19:23:39

Only good for post-hoc use, really. Not for real time.
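
For illustration, a rough numpy sketch of the kind of filtering described (a reconstruction from the description above, not @user-8779ef's actual code; the thresholds are placeholders):

```python
import numpy as np

def detect_saccades(gaze_deg, timestamps, vel_thresh=30.0, acc_thresh=4000.0):
    """Binary filters on velocity and rising/falling acceleration.
    gaze_deg: (N, 2) gaze angles in degrees; timestamps: (N,) seconds."""
    dt = np.gradient(timestamps)
    speed = np.linalg.norm(np.gradient(gaze_deg, axis=0), axis=1) / dt  # deg/s
    accel = np.gradient(speed) / dt                                     # deg/s^2

    fast = speed > vel_thresh        # binary filter on velocity
    rising = accel > acc_thresh      # candidate saccade onsets
    falling = accel < -acc_thresh    # candidate saccade offsets

    # Mark samples that are fast and lie between an onset and its offset.
    return fast & (np.cumsum(rising) > np.cumsum(falling))
```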

papr 25 January, 2018, 19:25:00

The offline blink detector uses such filters as well. You might be able to use it as a template for your plugin.

user-8779ef 25 January, 2018, 19:25:15

I wrote the plugin, for the most part.

user-8779ef 25 January, 2018, 19:25:27

The skeleton is all there.

user-8779ef 25 January, 2018, 19:26:11

The one function I need to fill in is the one that applies my algorithm after a button click. Right now it just imports data that is processed in a notebook and then saved to disk (a temporary fix).

papr 25 January, 2018, 19:27:14

How fast is it? How many pupil positions per second are you able to process on your macbook?

user-8779ef 25 January, 2018, 19:27:33

I haven't really timed it.

user-8779ef 25 January, 2018, 19:27:47

Next time I meet with my sponsors, I'll ask if I can share it.

user-8779ef 25 January, 2018, 19:28:19

They may make me wait until the project is complete 😦

user-0d187e 25 January, 2018, 20:57:36

Q: The eye sphere model constantly updates itself during the use of the tracker. I wonder how useful the 3D data could be when the origin of the sphere changes constantly. I mean when the model changes, all the 3d data (phi and theta angles, circle_3d_norms and ...) will be changed accordingly.

papr 25 January, 2018, 20:58:47

Old data points do not lose their validity, if that is what you mean.

user-0d187e 25 January, 2018, 21:05:12

How? Take the phi angle, for example: it's measured in a polar coordinate system that is defined based on the eye model.

user-0d187e 25 January, 2018, 21:05:35

When the eye model changes, the old phi angles won't be valid anymore.

papr 25 January, 2018, 21:08:19

They are valid within the old model. New models are created due to e.g. slippage. New pupil data is based on the new model, old data based on the old one. That is why each pupil datum includes information about its model.

papr 25 January, 2018, 21:11:06

Do not forget that this is just pupil data. Pupil data lies within the eye camera coordinate system. This data is then mapped to "gaze data" within the world camera coordinate system. 3d gaze data includes a gaze vector that is independent of the current eye model.
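
To make the distinction concrete, here is roughly what the two datum types carry (keys abridged from Pupil's data format; exact names can vary between versions):

```python
# 3d pupil datum: eye camera coordinates, tied to a specific eye model.
pupil_datum = {
    'theta': ..., 'phi': ...,    # angles within the model's coordinate system
    'circle_3d': {'center': ..., 'normal': ..., 'radius': ...},
    'sphere': {'center': ..., 'radius': ...},
    'model_id': 2,               # which eye model produced this datum
    'confidence': 0.99,
}

# 3d gaze datum: world camera coordinates, independent of the eye model.
gaze_datum = {
    'gaze_normal_3d': ...,       # direction of gaze
    'gaze_point_3d': ...,        # estimated 3d point of regard
    'eye_center_3d': ...,
    'base_data': [pupil_datum],  # the pupil datum it was mapped from
}
```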

user-de97ec 25 January, 2018, 21:36:08

Anyone in here familiar with the blinking demo?

papr 25 January, 2018, 21:36:28

Could you be more specific?

user-de97ec 25 January, 2018, 21:37:43

Everything connects through Unity fine; however, the script is not detecting blinks and is not calling the 'CustomReceiveData' function at all.

user-de97ec 25 January, 2018, 21:38:08

I feel like I'm missing something obvious.

papr 25 January, 2018, 21:38:33

Ah, this blinking demo. Please refer to the 🥽 core-xr channel for the unity questions. 🙂

user-de97ec 25 January, 2018, 21:38:51

Thanks

user-de97ec 25 January, 2018, 21:39:03

😃

papr 25 January, 2018, 21:39:46

Just to be sure, did you turn on the blink detector in Pupil Capture/Service?

user-de97ec 25 January, 2018, 21:51:40

Yeah, still nothing...

user-0d187e 25 January, 2018, 22:16:29

Should we expect big jumps in the 3d pupil data every time the model gets updated?

papr 25 January, 2018, 22:17:43

@user-0d187e would you mind explaining your setup and what you want to do?

user-0d187e 25 January, 2018, 22:21:39

For a project I need to use the pupil 3d data, not the gaze data, and I need it measured relative to a fixed model so that the reference remains the same throughout an experiment with a few trials. But I wasn't sure how reliable that 3d data is when the model changes. I wish there were a way to prevent the software from updating the model after calibration.

papr 25 January, 2018, 22:23:38

This is definitely possible. But you will have to make manual adjustments to the source code.

user-0d187e 25 January, 2018, 22:24:03

Of course. Thanks

user-0d187e 25 January, 2018, 22:24:56

But please consider this for future versions. For some studies the model shouldn't change after calibration.

papr 25 January, 2018, 22:25:59

@user-0d187e We are actively working on improving our pipeline. Model stability is one of the aspects. Keep an eye on the release notes. 🙂

user-8779ef 26 January, 2018, 02:24:38

@user-0d187e I agree with your intuition, and I have found better performance when I lower the threshold to change models. @papr (who probably played a significant role in designing the system) is also correct to suggest that it may be more robust to slippage if you raise the threshold. I think we're all still learning the art of how to adjust the threshold based on the particulars of the session - no one threshold will be ideal for all situations.

user-8779ef 26 January, 2018, 02:24:54

I generally lower my threshold to about .991-.992

user-8779ef 26 January, 2018, 02:26:35

@user-0d187e And, just to be clear, the threshold is set during 3D calibration, in the individual eye windows. It's the bottom-most slider. Make sure to restart pupil detection following a change.

user-e7102b 26 January, 2018, 03:25:09

@papr Hi, I noticed you responded to @user-921ec1 's post last week regarding sending event markers via eprime and suggested using pupil_remote. Is there any chance you would be able to provide an example of the command used to send event markers via pupil_remote? Also, can you tell me where I would be able to view these event markers? I did a test recording and attempted to send several triggers using the following command: socket.send_string('TRIGGER') ...but when I exported the data I was unable to see these commands anywhere in the pupil_positions.csv file. I'm clearly doing something wrong. @user-921ec1 ...did you manage to figure this out? Thanks!

papr 26 January, 2018, 08:04:33

@user-8779ef The pupil detection and mapping pipeline were there before I started working at Pupil Labs. I do not have as much insight into them as into the other parts of the software.

papr 26 January, 2018, 08:11:26

@user-e7102b I will create a Pupil Helpers example that will illustrate how to do that.

user-e7102b 26 January, 2018, 19:15:39

@papr Great - thank you!

user-8779ef 26 January, 2018, 21:06:38

Can anyone tell me how to get to the exports folder from inside a plugin?

user-8779ef 26 January, 2018, 21:07:22

Surely the relevant folder info is passed in; I'm just not sure how to access it, and I can't debug right now (not sure why, it's an ongoing saga)

user-8779ef 26 January, 2018, 21:18:09

Found it:

export_range = g_pool.seek_control.trim_left, g_pool.seek_control.trim_right
export_dir = os.path.join(g_pool.rec_dir, 'exports', '{}-{}'.format(*export_range))

papr 27 January, 2018, 09:04:40

@user-8779ef There is a notification that indicates an export intent by the user. See this example: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/blink_detection.py#L166-L167

papr 27 January, 2018, 09:05:33

Plugins should listen to this notification instead of having their own export buttons.
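
A minimal sketch of that pattern, modeled on the linked blink detector lines (the Plugin base class lives in Pupil's shared modules; export_csv here is a hypothetical helper):

```python
import os
from plugin import Plugin  # Pupil's plugin base class (shared_modules/plugin.py)

class My_Exporter(Plugin):
    def on_notify(self, notification):
        # Player broadcasts this notification when the user triggers an export;
        # plugins react to it instead of offering their own export buttons.
        if notification['subject'] == 'should_export':
            out_path = os.path.join(notification['export_dir'], 'my_data.csv')
            self.export_csv(out_path, notification['range'])

    def export_csv(self, path, export_range):
        pass  # write the rows that fall inside export_range (hypothetical)
```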

user-8779ef 27 January, 2018, 14:27:26

@papr It's actually an import button. Due to my issues with debugging, I'm unable to complete my code. The workaround is to calibrate, export gaze positions, process them in an external program, write out saccades/ISIs from this program as a csv, and import the csv back into Pupil for visualization and validation against the video.

user-f68ceb 28 January, 2018, 06:27:24

Hi @user-8779ef – I followed your conversation and just wanted to ask you whether the initial setup of the Pupil Labs glasses/software was easy on your MacBook? I am about to order a set and just want to make sure tracking and recording work fine. I am not a real tech expert. Thanks for the help.

user-6db96e 28 January, 2018, 09:57:30

Trying to run Pupil Capture on an Asus ROG notebook and no UVC source is found, neither the built-in webcam nor other USB UVC cameras. What might be the issue? They work fine for me in OpenCV and Skype. https://image.ibb.co/c5da4G/43235345.jpg

papr 28 January, 2018, 10:14:46

@user-6db96e On Windows, you need to make sure that the correct drivers are installed for your cameras: https://docs.pupil-labs.com/#troubleshooting and https://github.com/pupil-labs/pupil/issues/1011#issuecomment-360345338

user-6db96e 28 January, 2018, 10:22:09

hello. The drivers are installed and work in other programs.

papr 28 January, 2018, 10:24:47

@user-6db96e In the UVC Backend plugin (click the ≡ icon), what cameras are listed in the Active Source selector?

user-6db96e 28 January, 2018, 10:28:58

Only one "unknown" when clicked says "WORLD: The selected camera is already in use or blocked". No other program using the camera is running.

papr 28 January, 2018, 10:33:43

This means that the drivers are not correctly installed. Be aware that Pupil Capture requires special drivers in order to recognize the cameras correctly. The cameras should be listed as Pupil Cam1 IDx and not as unknown. Please see the links above on how to make sure that the correct drivers are installed.

user-6db96e 28 January, 2018, 10:39:14

Is that the case with any UVC camera?

papr 28 January, 2018, 10:43:13

Yes, because the default Windows driver for imaging devices does not give us enough control over the device, e.g. it does not allow running the eye cameras at 120 Hz. Be aware that you will need to manually install the libusbK drivers if you want to use cameras that were not provided by Pupil Labs. See the instructions for such a manual installation: https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md

user-6db96e 28 January, 2018, 10:43:43

thx, I'll get back after I'm done

user-6db96e 28 January, 2018, 10:44:44

Is the situation different on Linux?

papr 28 January, 2018, 10:44:55

Yes

user-6db96e 28 January, 2018, 10:45:29

I'm going to switch to Ubuntu at some point. Are there docs for Linux as well, or is it much simpler?

papr 28 January, 2018, 10:49:56

@user-6db96e Do you run from source or do you simply use the bundled application? In the second case you just need to install the deb package and you should be good to go.

papr 28 January, 2018, 10:50:36

In the first case you need to copy/paste a list of dependency install instructions that are listed in the docs.

user-6db96e 28 January, 2018, 10:52:55

https://image.ibb.co/iNF8rw/5432525.jpg

papr 28 January, 2018, 10:55:08

Nice, that looks good. The next problem is that Pupil Capture was not able to set our default camera settings for your camera. You will need to manually adjust them in the Sensor Settings and Image Post Processing menus on the right. Unfortunately, I cannot tell you which exact settings you need, since I do not know your camera.

papr 28 January, 2018, 10:56:02

But since your image looks over-exposed, I would suggest starting by reducing the absolute exposure time setting.

user-6db96e 28 January, 2018, 10:56:54

Error: could not set value

papr 28 January, 2018, 10:58:47

Right, otherwise Pupil Capture would have been able to set it automatically as well... Are you able to change any of the Image Post Processing values?

user-6db96e 28 January, 2018, 11:01:33

no

papr 28 January, 2018, 11:02:27

Mmh. You are still running on Windows, correct?

user-6db96e 28 January, 2018, 11:02:49

yes

papr 28 January, 2018, 11:06:39

Then my recommendation would be to switch to Linux and hope that it is able to set these values. If this does not work either, my guess would be that your camera is not as UVC compliant as Pupil Capture expects. But this is just a guess. I am not deeply familiar with these low-level details. With a bit of luck @mpk has an idea what the reason for this behavior is.

user-6db96e 28 January, 2018, 13:51:10

Can a long 5-meter camera cable introduce lag? If yes, how much?

papr 28 January, 2018, 14:16:54

@user-6db96e Not sure, probably only a very small one. But there might be USB power issues. Make sure the cable is USB-compliant.

user-6db96e 28 January, 2018, 14:17:28

By small, are we assuming sub-frame levels?

papr 28 January, 2018, 14:22:04

Definitely. But you can test it yourself with pyuvc. Just compare the frame timestamp with the timestamp of first availability in pyuvc using the two different cables. Measure this for 1000 frames each and see if the two distributions overlap.
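
A sketch of that measurement with pyuvc (assuming the frame timestamp and uvc.get_time_monotonic() are taken from the same clock, as is the case for Pupil cameras; run it once per cable and compare the distributions):

```python
import numpy as np
import uvc  # pyuvc: https://github.com/pupil-labs/pyuvc

dev = uvc.device_list()[0]
cap = uvc.Capture(dev['uid'])
cap.frame_mode = cap.avaible_modes[0]  # attribute is spelled this way in pyuvc

delays = []
for _ in range(1000):
    frame = cap.get_frame_robust()
    # hardware capture time vs. time of first availability in pyuvc
    delays.append(uvc.get_time_monotonic() - frame.timestamp)

print('mean delay: {:.2f} ms, std: {:.2f} ms'.format(
    np.mean(delays) * 1e3, np.std(delays) * 1e3))
```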

user-6db96e 28 January, 2018, 14:22:27

gotcha, thanks

user-6db96e 28 January, 2018, 14:23:46

Is it possible for a UVC camera to be accessed by two programs at once? Sadly the other one is not in Python, so I can't modify its code for them to access the same frame data.

papr 28 January, 2018, 14:25:51

I am 80% sure that this is not possible. But you can use Pupil Capture's Frame Publisher plugin to stream the images using zmq.

user-6db96e 28 January, 2018, 14:28:07

Sorry, to clarify: I'm not talking about Pupil Capture, but about using the pupil library in code.

papr 28 January, 2018, 14:32:47

Given that only one program instance can receive the camera video, you either need a special program that duplicates the video into two virtual camera devices, or one of the two programs needs to relay the data to the other. The Frame Publisher plugin is able to relay the data. The only thing left is for the other program to be able to receive it.
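
A sketch of the receiving side (the multipart layout, topic 'frame.world', and the metadata keys follow the Frame Publisher's format; set the plugin's format to BGR for the reshape below to apply):

```python
import zmq
import msgpack
import numpy as np

ctx = zmq.Context()

# Get the SUB port from Pupil Remote (default port 50020).
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect('tcp://127.0.0.1:{}'.format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, 'frame.world')

while True:
    # Frame messages are multipart: topic, msgpack metadata, raw image bytes.
    topic, payload, raw = subscriber.recv_multipart()
    meta = msgpack.loads(payload, raw=False)
    if meta['format'] == 'bgr':
        img = np.frombuffer(raw, dtype=np.uint8).reshape(
            meta['height'], meta['width'], 3)
        # hand `img` to the other (C++) program, e.g. via shared memory
```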

user-6db96e 28 January, 2018, 14:34:48

Now that I think about it, the other program just asks for an image every nth of a second, not necessarily from a UVC camera. So as long as frame data can be read by Pupil in Python, I can somehow feed it to the other program, which is in C++.

papr 28 January, 2018, 14:35:14

correct

user-6db96e 28 January, 2018, 14:35:35

thank you

user-6db96e 28 January, 2018, 17:02:26

So the webcam whose drivers I changed to work with Pupil is now not recognized by other programs it used to be recognized by, such as Skype. Is this normal?

papr 28 January, 2018, 17:03:12

Yes, since the other programs do not look for libusbK devices.

user-6db96e 28 January, 2018, 17:33:05

but I can't get it back...

papr 28 January, 2018, 17:40:17

If you uninstall the drivers as in the instructions above, disconnect the device, and reconnect it, then Windows should install the correct Imaging Devices drivers.

user-6db96e 28 January, 2018, 17:43:56

It's a notebook webcam (embedded).

papr 28 January, 2018, 17:46:07

Then rebooting after uninstalling the drivers should have the same effect.

user-6db96e 28 January, 2018, 17:46:14

ok

user-6db96e 28 January, 2018, 18:07:43

Didn't work. https://image.ibb.co/kdfL6w/Untitled.png

papr 28 January, 2018, 18:36:39

It should be listed either as a libusbK device or as an imaging device. Please enable hidden devices in the View menu as well.

user-6db96e 28 January, 2018, 18:52:03

Well, I uninstalled its driver and rebooted Windows 10, and it's there (in the pic); under Imaging devices there's only my printer.

user-6db96e 28 January, 2018, 18:52:40

It says the driver is already installed, but it isn't.

user-8779ef 29 January, 2018, 03:21:12

@user-f68ceb Installation of the capture/player software was simple. However, getting reliable data from any eye tracker takes experience and patience. So, if this is your first foray into eye tracking, don't expect to plug in and get instant results.

papr 29 January, 2018, 09:16:21

@vrsauce#9955 I just realised that you wanted to use an integrated camera with Pupil Capture. This is typical for remote eye tracking. Pupil Capture does not support remote eye tracking.

user-c14158 29 January, 2018, 09:53:42

Hello - I'm having an issue with Pupil Player: it crashes when I try to load a recording that was made on a different laptop (both computers run Windows 10). Everything works on the laptop that was used to record.

papr 29 January, 2018, 09:54:54

@user-c14158 Do I understand correctly, that opening the recording on one computer works, and on the other it does not? Please make sure both laptops use the most recent version of Pupil Capture.

user-6db96e 29 January, 2018, 09:58:33

No, I don't want to use remote eye tracking; the camera is very close to the eye.

user-c14158 29 January, 2018, 09:59:58

Yes, opening the recording on the computer that was used to record works, and on another it does not. Both computers are using version 1.2.7.

papr 29 January, 2018, 10:01:22

@user-c14158 Could you try to open the recording on the laptop on which the app crashes, wait for it to happen and then upload the player.log file that lies within the pupil_player_settings folder?

user-c14158 29 January, 2018, 10:11:25

Looking at the log, I realize that I renamed some folders using accented letters (shame on me).

user-c14158 29 January, 2018, 10:11:37

It works now 😃 thank you

user-8779ef 29 January, 2018, 21:26:19

Just got a new error:

Traceback (most recent call last):
  File "/Users/gabe/Documents/Pycharm/pupil/pupil_src/launchables/player.py", line 409, in player
    handle_notifications(n)
  File "/Users/gabe/Documents/Pycharm/pupil/pupil_src/launchables/player.py", line 387, in handle_notifications
    g_pool.plugin_by_name[n['name']], args=n.get('args', {}))
  File "/Users/gabe/Documents/Pycharm/pupil/pupil_src/shared_modules/plugin.py", line 321, in add
    plugin_instance = new_plugin(self.g_pool, **args)
  File "/Users/gabe/Documents/Pycharm/pupil/pupil_src/shared_modules/pupil_producers.py", line 169, in init
    except Exception():
TypeError: catching classes that do not inherit from BaseException is not allowed

player - [INFO] launchables.player: Process shutting down.
MainProcess - [INFO] os_utils: Re-enabled idle sleep.

user-8779ef 29 January, 2018, 21:27:44

This is off of a fresh clone of the pupil git. The error happened when I tried to use offline pupil detection.

papr 29 January, 2018, 21:33:59

yeah, probably my bad. I am fixing this right now.

papr 29 January, 2018, 21:40:16

@user-8779ef Please git pull origin master in your repository. The error should be fixed now.

user-8779ef 29 January, 2018, 21:43:12

thanks! I'll do it now

user-8779ef 29 January, 2018, 21:45:12

@papr Yep, it works. Thanks for the quick response!

user-8779ef 29 January, 2018, 21:45:28

Also, loving those timelines. You're working on variable X range?

papr 29 January, 2018, 21:47:44

Thanks. Still in the planning phase.

user-8779ef 29 January, 2018, 21:48:10

Ok. Happy to weigh in, if it matters.

user-8779ef 29 January, 2018, 21:49:04

For example, it might be nice to be able to drop csv data (timestamp + value) into a folder (or otherwise import it) and have pupil provide the option to represent it as a time series.

user-8779ef 29 January, 2018, 21:49:26

helpful for prototyping algorithms 😃

user-8779ef 29 January, 2018, 21:50:23

You could imagine a plugin that includes an "add timeseries" button and a dropdown selection box

user-8779ef 29 January, 2018, 21:50:35

that looks in a folder or something like that

user-8779ef 29 January, 2018, 21:50:41

Just spitballing' here.

papr 29 January, 2018, 21:51:20

I understand. I like the idea. Please make an issue for that.

user-8779ef 29 January, 2018, 21:51:32

Ok, will do.

user-8779ef 29 January, 2018, 22:44:44

@papr Bad news - just tried to debug on another Mac with a fresh fork and fresh Python install. Same issue with the debugger in PyCharm. Stuck on: player - [INFO] gaze_producers: Calibrating "1" in 3d mode. Bummer. I've posted the same to the issues. I'm guessing this is a Mac / OS X High Sierra issue. This means that no Mac users can really contribute.

papr 29 January, 2018, 23:40:17

@user-8779ef I would disagree. I develop on my Mac all the time and I do it without PyCharm. Just plain old Sublime + terminal.

user-8779ef 30 January, 2018, 00:09:45

@papr That's encouraging. Sublime + terminal wasn't halting. I'm a bit drained today, but I may ask you for some specifics tomorrow.

user-8779ef 30 January, 2018, 00:10:15

Ehr, my tomorrow is your today.

user-8779ef 30 January, 2018, 00:10:22

(still 7pm here)

wrp 30 January, 2018, 01:49:49

@user-8779ef Sublime Text 3 + terminal are my go to for development as well. Everyone has their own preferences of course 😄

user-8779ef 30 January, 2018, 12:47:24

@wrp This is what I'm dealing with here: https://github.com/pupil-labs/pupil/issues/1029

user-8779ef 30 January, 2018, 12:47:34

Sadly, I don't get to choose 😦

user-8779ef 30 January, 2018, 12:51:26

@wrp @papr Your approach, I assume, is to create a breakpoint (e.g. with pdb.set_trace())

user-8779ef 30 January, 2018, 12:51:49

...and then to run "python3 -m pdb main.py player"?

user-8779ef 30 January, 2018, 12:52:10

Using this method, my debugger skips right over my breakpoint.

papr 30 January, 2018, 12:54:26

No, I do not use breakpoints at all. If anything, I use asserts.

user-8779ef 30 January, 2018, 12:56:00

ah, OK. I'll google that method. Thanks.

papr 30 January, 2018, 12:56:35

Asserts are just simple tests that raise an exception if the condition is not true.

user-8779ef 30 January, 2018, 12:57:33

But having no breakpoints is an issue for someone who isn't quite as familiar with the data structures as you are. There's not much documentation on g_pool etc.

user-8779ef 30 January, 2018, 12:58:28

I guess the halt on exception treats it as a breakpoint, eh?

mpk 30 January, 2018, 12:59:38

@user-8779ef I agree that using a debugger is what you want. We have devs that use debuggers with Pupil on a regular basis. Let me check and get back to you if I find anything helpful.

user-8779ef 30 January, 2018, 13:00:28

@mpk Thanks. If you look at that thread, you'll see that I've tried 4 IDEs now, without luck. The issues vary with each IDE, and persist across multiple data folders and 2 Macs.

user-8779ef 30 January, 2018, 13:01:22

In every case, I can run without issue when a debugger is not attached. I have not yet reinstalled Python, but that's a thought.

user-8779ef 30 January, 2018, 13:02:38

Part of me worries that my shared use of anaconda, brew, and pip means that I have a volatile Python environment.

user-8779ef 30 January, 2018, 13:03:27

...however, as I mentioned, this does persist across two machines.

user-8779ef 30 January, 2018, 13:14:45

I should also add that I used to use a debugger. It stopped working shortly before I created that issue.

user-62cec9 30 January, 2018, 14:59:54

Hi there! I was wondering if there are any good 3rd party papers evaluating the usefulness of the tracking glasses for research purposes?

user-62cec9 30 January, 2018, 15:00:26

Also: do you have any data on using the glasses in people with nystagmus? I saw two forum entries but I'm more interested in: can the glasses be calibrated with the standard software? Do you know of anyone who has made a custom calibration script for nystagmus individuals?

user-62cec9 30 January, 2018, 15:02:08

Finally: how well do they work with spectacles?

user-62cec9 30 January, 2018, 15:02:52

If some of the answers are at an obvious place on the website or if this would be better posted in the forum, I apologize and please let me know.

user-8779ef 30 January, 2018, 15:03:40

@user-62cec9 Re: spectacles: I played with this a bit yesterday. They tracked the pupils just fine - typically, trackers don't work through spectacles because the corneal reflections are smeared. The Pupil tracker does not track corneal reflections, so this isn't as big an issue.

user-8779ef 30 January, 2018, 15:03:58

However, be wary of bifocals! They change gaze behavior quite a bit.

user-8779ef 30 January, 2018, 15:04:30

Just be sure to take this at face value: an anecdote.

user-8779ef 30 January, 2018, 15:05:25

@user-62cec9 Pupil is too new to have much out there on its usefulness.

user-62cec9 30 January, 2018, 15:05:32

Ahh, I was expecting to try and angle the cameras below the spectacles, but I wondered if that angle would even work. I was hoping someone somewhere had tried many different spectacles. I suppose a success rate of 50% would be acceptable. May I ask the refractive error of the glasses you tried?

user-8779ef 30 January, 2018, 15:06:26

@user-62cec9 Sorry - don't have that info available.

user-8779ef 30 January, 2018, 15:08:12

@user-62cec9 This was with the newer 240 Hz cameras. You can't adjust their positioning so much, only the orientation.

user-8779ef 30 January, 2018, 15:08:34

There's sufficient play there to get a good track across a variety of conditions.

user-62cec9 30 January, 2018, 15:08:50

200 or 240Hz?

user-8779ef 30 January, 2018, 15:09:32

Sorry, 200 Hz

user-8779ef 30 January, 2018, 15:10:25

@user-62cec9 "can the glasses be calibrated with the standard software?" THeir software is fantastic once you learn how to use it.

user-8779ef 30 January, 2018, 15:10:51

Pupil Player. You can download it. I don't know if they provide a sample bit of data... you can probably get one if you ask around here (sorry, mine is under NDA).

user-62cec9 30 January, 2018, 15:11:17

the thing with nystagmus is this: https://entokey.com/wp-content/uploads/2016/07/DA1-DB2-DC1-C11-FF5.gif

user-8779ef 30 January, 2018, 15:11:36

The software has a learning curve. I would say these are tough glasses for someone new to eye tracking, but if you know how to use 'em, they work well.

user-62cec9 30 January, 2018, 15:12:00

their "intended fixation" is where the horizontal lines are drawn in that pic, so it can be quite tricky... the eyes aren't even stable enough to use the eyelink default calibration

user-8779ef 30 January, 2018, 15:12:02

Software is constantly in development, and they are VERY responsive to issues (if posted to the appropriate place on GitHub)

user-62cec9 30 January, 2018, 15:12:45

would you suggest cross-posting to the google groups? and would it make sense to inquire about nystagmus calibration at github?

user-8779ef 30 January, 2018, 15:12:52

No, this is the best place to ask.

user-62cec9 30 January, 2018, 15:13:07

I have heard the legends about the responsiveness (:

user-8779ef 30 January, 2018, 15:13:21

Just give them a few hours to see the post, and remember that they are on German time.

user-8779ef 30 January, 2018, 15:13:37

You might write it once more after our conversation ends, so it's near the bottom 😃

user-62cec9 30 January, 2018, 15:14:07

ahh, they are in Germany, I forgot!

user-8779ef 30 January, 2018, 15:14:36

@user-62cec9 So, there are two things that might contribute to poor tracking during nystagmus. The first is motion blur. This will be amplified in low lighting conditions, when the camera needs to increase exposure time to get a good image.

user-8779ef 30 January, 2018, 15:15:00

Oh, wait, wait - Not true.

user-8779ef 30 January, 2018, 15:15:19

Not true, because the eye camera works in the IR range, and the tracker provides its own lighting via IR illuminators.

user-8779ef 30 January, 2018, 15:15:37

So, that light level should be fairly constant.

user-8779ef 30 January, 2018, 15:15:49

That is true of the scene camera, which works in the visible range.

user-8779ef 30 January, 2018, 15:15:56

I keep on making that mistake.

user-62cec9 30 January, 2018, 15:16:25

hehe

user-8779ef 30 January, 2018, 15:16:26

Anyhow, motion blur could be an issue, but I don't think it will be with the 200 Hz eye cameras.

user-8779ef 30 January, 2018, 15:16:39

They have a high sampling rate and low exposure time.

user-62cec9 30 January, 2018, 15:16:51

so my current understanding is that the endpoints of nystagmus saccades are where people try to look and think they are looking, and also where they sample the most information

user-8779ef 30 January, 2018, 15:17:04

The other issue would be if the pupil tracking algorithm has any "memory."

user-8779ef 30 January, 2018, 15:17:17

in the way that a Kalman filter does

user-8779ef 30 January, 2018, 15:17:38

...or if it only operates on the instantaneous information present in a single eye image. I think that's the case, so I think you're good on both counts.

user-8779ef 30 January, 2018, 15:18:42

@user-62cec9 You're on your own there. Pupil, or any tracker, will only report a gaze location in pixel coordinates based on the angular orientation of the eyes. In this case, that location will have a good deal of jitter due to nystagmus.

user-8779ef 30 January, 2018, 15:19:18

The question is, should you treat that jitter as noise? If so, is it normally distributed around the location of interest, and suitable to be averaged out?

user-8779ef 30 January, 2018, 15:19:37

These are questions for the researcher more than for the eye tracking software.

user-62cec9 30 January, 2018, 15:27:36

absolutely, and I'll likely just do an offline calibration.

user-62cec9 30 January, 2018, 15:31:44

one question I still have is: usually calibration procedures are based on fixation detection, and nystagmus patients often have only very short fixations (which throw off the EyeLink calibration, as it just reports an unstable pupil). I assume that I could just tweak the software to allow short fixations, but suppose all attempts to use the Pupil calibration fail and no calibration takes place - can I still get data from the glasses and do my own calibration on that data later?

user-8779ef 30 January, 2018, 15:41:58

@user-62cec9 The Pupil system allows for manual post-hoc calibration using "natural features".

user-8779ef 30 January, 2018, 15:42:48

Basically, you can scrub to the frame where the subject reports fixation (we give them a button-activated LED that is placed in the scene camera's view) and click on the calibration point they are looking at.

user-8779ef 30 January, 2018, 15:43:07

Do this for all calibration points, and then click "recalibrate." Done!

user-8779ef 30 January, 2018, 15:44:08

No reliance on automated fixation detection.

user-8779ef 30 January, 2018, 15:44:20

In fact, the fixation detection in Pupil could use some work 😃

user-8779ef 30 January, 2018, 15:44:37

...I don't rely on it, and wrote my own detector, which I can't share yet due to NDA.

user-62cec9 30 January, 2018, 15:58:48

that won't work because patients aren't typically aware of the movements, and they are very fast, so it would have to be done by the experimenter after seeing the traces... if I see those I can easily say, okay, here and here and here are the points they wanted to fixate. Those things we have solved, my question is more: do you get any useful data if you skip the calibration?

wrp 30 January, 2018, 15:59:35

@user-62cec9 Responses to (some) of your points:

- Academic (third-party) papers: We maintain a collection of papers that cite Pupil/use Pupil via this spreadsheet: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?usp=sharing - while this may not be exactly what you are looking for, hopefully there are some papers in this list that can at least demonstrate the quality of data and use cases in research contexts.
- Nystagmus research with Pupil: I know that there are researchers in the community that are/were using Pupil to develop diagnostic tools for nystagmus; however, there are no published papers that I am aware of at this time.
- Pupil Player and sample dataset: you can download the latest software bundle from https://github.com/pupil-labs/pupil/releases/latest - we have a sample dataset available via https://pupil-labs.com/pupil - if you are looking for a short demo of a specific type of movement, DM me and I would be happy to try to make one for you or set up a time to demo via screen-sharing.
- Pupil with eye glasses: we will be shipping all new 200 Hz Pupil headsets with an extension arm for the eye camera; this will enable researchers to have greater control over the position of the eye cameras. Typically we recommend trying to capture the eye from below the frame of the eye glasses. @user-8779ef I am pleased to hear that you are able to capture the eye through the lens.
- Discussion: Yes, this is the best place for discussion. We are considering transitioning completely away from Google Groups in favor of just GitHub issues and chat, but of course we will ask the community first before making any decisions on this.
- 200 Hz eye cameras and motion blur: global shutter and high frame rate reduce motion blur artifacts.

wrp 30 January, 2018, 16:00:36

@user-8779ef it would be great if you would be able to share work on the fixation detector if it ever makes its way to a more permissive environment

user-8779ef 30 January, 2018, 16:02:38

@wrp Yes. We have a meeting coming up. I'll share it the second I'm able to get it in writing.

user-62cec9 30 January, 2018, 16:06:51

@wrp wonderful, thank you! Perhaps you could comment on the question of skipping the calibration procedure: if calibration isn't possible, will I still get data which I could run my own offline-calibration on later?

user-8779ef 30 January, 2018, 16:20:04

@user-62cec9 Yes. During data collection, you really just record the eye and scene imagery. Pupil detection etc. can be performed later.

user-62cec9 30 January, 2018, 16:20:35

Aha!

user-8779ef 30 January, 2018, 16:20:40

@user-62cec9 If you buy a tracker, come back here for details on the best methodological pipeline to use.

user-8779ef 30 January, 2018, 16:22:01

@user-62cec9 Just keep in mind, the software is still in alpha, and I don't know how reliable everything really is. I consider myself well versed in the software, but until you try to use it for real data, you never know. I'm about to collect 30-40 subjects' worth of data, though. Wish me luck.

user-8779ef 30 January, 2018, 16:22:09

about = this month.

user-62cec9 30 January, 2018, 16:25:02

ahh, gl!

wrp 30 January, 2018, 16:25:58

@user-62cec9 as @user-8779ef noted, you can always record eye and scene video data - without calibration - and can run pupil detection algorithms post-hoc in Pupil Player and calibrate post-hoc as well in Pupil Player

user-62cec9 30 January, 2018, 16:26:33

great!

wrp 30 January, 2018, 16:31:28

Pupil software will always be developed continuously so that we can continue to add new features, improve existing features, fix bugs, etc. There are currently Pupil users/community members both in industry and in academia that record with high volumes of participants/subjects per day and conduct long duration recordings.

wrp 30 January, 2018, 16:32:31

@user-8779ef I wish you the best in your data collection/experiments and look forward to feedback

wrp 30 January, 2018, 16:33:24

Ok - going AFK now (UTC +7 time zone)

user-62cec9 30 January, 2018, 16:58:13

thank you both (:

user-6952ca 30 January, 2018, 19:27:19

Hello, everyone! I am a member of a computational cognitive science lab, and I am very excited to utilize Pupil Labs hardware and software for some research. I am curious about how to utilize the glasses for eye tracking on mobile devices. More specifically, I would like to be able to track gaze position on a mobile device's screen (nexus phone or tablet). If I am not mistaken, I would need to define the device's surface using markers or some other method. Does anyone know if this has been done already or is in the works?

user-6db96e 30 January, 2018, 20:07:44

still can't fix my webcam...

wrp 31 January, 2018, 00:18:59

Hi @user-6952ca welcome to the Pupil channel/community. Yes! You can use Pupil for gaze tracking relative to a tablet/phone screen. Using fiducial markers is exactly the method I would recommend. This works and is the method used by other researchers in our community. If you are only interested in content on the screen of the tablet/phone, I would recommend that you consider using the high speed world camera with the narrow angle lens that ships with this system, so you can dedicate more pixel real estate to the on-screen content.
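
Once the screen is defined as a surface, gaze mapped onto it is published on its own topic; a sketch of consuming it (assuming a surface defined in Pupil Capture's Surface Tracker and Pupil Remote on its default port):

```python
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect('tcp://127.0.0.1:{}'.format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, 'surface')  # matches 'surfaces.<name>'

while True:
    topic, payload = subscriber.recv_multipart()
    msg = msgpack.loads(payload, raw=False)
    for gaze in msg.get('gaze_on_srf', []):
        # norm_pos is in surface coordinates: (0, 0) bottom-left, (1, 1) top-right
        if gaze['on_srf']:
            print('gaze on the screen at', gaze['norm_pos'])
```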

wrp 31 January, 2018, 00:20:13

Side note to all: when you swap the world camera lens out for the narrow FOV one, you'll need to recalibrate the world camera.

wrp 31 January, 2018, 00:23:11

@user-6db96e please remind me, you're trying to restore system drivers in Windows 10 for your integrated webcam, correct?

wrp 31 January, 2018, 00:23:56

You were not successful in installing libusbK drivers (based on what I see above in the chat)

wrp 31 January, 2018, 00:26:09

If you're looking to restore system drivers, you can either roll back (if the drivers are still on your system) or use a tool like this: https://www.drivethelife.com/free-drivers-download-utility.html

user-d74bad 31 January, 2018, 19:24:26

To clarify for the HoloLens: Pupil 2d gaze tracking assumes a standard distance of the cameras from the eyes, right? In order to do the raycast from 2d -> 3d. So this makes the 2d gaze tracking most accurate with objects closer to the camera, right?

user-d74bad 31 January, 2018, 19:26:05

gazePosition.z = PupilTools.CalibrationType.vectorDepthRadiusScale[0].x;

user-d74bad 31 January, 2018, 19:26:31

I'm assuming this is the code that supplies the standard distance

user-8779ef 31 January, 2018, 21:39:24

@user-6952ca I suggest that you do some back-of-the-napkin math before you get going. You should expect all eye trackers to have maybe about 1.5 degrees of error in the center, and maybe as much as 3 degrees in the moderate periphery. Please consider how close together your ROIs on the cellphone are during comfortable use (in units of degrees). Is that level of accuracy sufficient to distinguish between gaze locations for your particular use-case?
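
As a worked example of that napkin math: at a comfortable phone viewing distance of about d = 30 cm, an error of θ = 1.5° spans s = 2·d·tan(θ/2) = 2 · 30 cm · tan(0.75°) ≈ 0.8 cm on the screen, so on-screen gaze targets need to be at least that far apart to be reliably distinguished.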

user-8779ef 31 January, 2018, 21:39:57

FYI, Pupil can (and usually does) outperform those numbers, but not by much. That's still very good for a mobile tracker.

user-8779ef 31 January, 2018, 21:50:36

Guys, I'm curious - do you have any idea why a good pupil image would produce fluctuating confidence measures like this?

user-8779ef 31 January, 2018, 21:50:46

Chat image

user-8779ef 31 January, 2018, 21:51:16

See eye 1, which was taken while the subject was in fixation. The eye images are beautiful throughout, but the confidence measures are meh.

papr 31 January, 2018, 22:05:47

@user-8779ef The confidence signal looks very periodic. That's unexpected.

user-e7102b 31 January, 2018, 22:28:39

@papr Hi, I'd just like to follow up on my previous message re sending event codes via pupil remote. Can you please tell me if there is a simple way to do this? I can control all the other necessary functions e.g. start/stop recording, calibration etc. using the functions in pupil_remote_control.py...but I can't see an obvious way to send event codes/timestamps. This seems like something that should be fairly straightforward to do. Thanks!

papr 31 January, 2018, 22:35:08

@user-e7102b Sorry, I have been very busy the last few days. I will do so first thing when I get to the office tomorrow. Have a look at the annotation plugin and which notifications it generates and receives. You will need to send notifications in the same format over Pupil Remote. See the Pupil Remote example on how to send notifications.

papr 31 January, 2018, 22:35:26

I will create a complete example tomorrow
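
In the meantime, a sketch of what such an example might look like (sending a notification over Pupil Remote's REQ socket as described above; the annotation field names follow my reading of the annotation plugin and should be checked against its source):

```python
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')

# Use Pupil's own clock so the marker lines up with the recording.
pupil_remote.send_string('t')
pupil_time = float(pupil_remote.recv_string())

notification = {
    'subject': 'annotation',
    'label': 'TRIGGER',      # your event marker
    'timestamp': pupil_time,
    'duration': 0.0,
    'record': True,          # persist the annotation into the recording
}
pupil_remote.send_string('notify.annotation', flags=zmq.SNDMORE)
pupil_remote.send(msgpack.dumps(notification, use_bin_type=True))
print(pupil_remote.recv_string())  # Pupil Remote acknowledges every request
```

If this matches the plugin's format, the recorded annotations surface in Player's annotation plugin and in their own export file, not in pupil_positions.csv.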

user-e7102b 31 January, 2018, 22:37:10

@papr No problem - thank you! I'll take a look at that annotation plugin now.

End of January archive