@user-ed537d `sudo pip3 install [email removed] This will install commit 5306a0e, which is the commit before merging my pull request. This should be a temporary solution to your problem. I will try to find a long-term solution tomorrow (it is 2 a.m. over here 🙃)
thank you so much
sorry for keeping you up 😄
can you confirm that it solved your problem?
testing now
couldn't find a tag or branch
Could not find a tag or branch '5306a0e', assuming commit.
no worries. I was awake anyways -- just not in the state of making meaningful changes to the official repository 😉
but it installed 1.2
lemme see if it worked
this is expected
boooom
you're awesome!
it works!!!
nah, not that awesome if you keep in mind that I am the one who broke it 😄
great to hear that it works for you
well you fixed it, that's what I care about lol
thank you!!!
😃 ok great. have a productive day. I am going to bed. And tomorrow we will find a solution to the actual problem 😃
looking forward to it 😬
😁 *
Hi guys! Any news for the new osx release?
It is ready today 😄
woow! Great!
All files have been uploaded. We just need to hit release
and it is ready to go.
Great!
@user-99e72e - we have a bit more testing to do and then the release will be made public. In approximately 1 hour.
We have released Pupil v0.9.4
- https://github.com/pupil-labs/pupil/releases/tag/v0.9.4
Note - bundles for Windows 10 should be uploaded early next week
gzgzg
[email removed] I was wondering if there's a way for me to pull up the screen marker calibration on a different computer on the same network. The reason I ask is that the machine the Pupil is connected to is different from the machine we use to run our experiments, and it would be ideal if we didn't have to switch inputs every time we want to start an experiment
Have a look at Pupil Remote and the calibration.should_start notification
^definitely works
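For reference, a minimal sketch of the Pupil Remote approach mentioned above. The HOST value is a placeholder, 50020 is the plugin's default port, and 'C'/'c' are Pupil Remote's single-character commands to start/stop calibration (which Capture maps to the calibration.should_start/should_stop notifications) — treat these details as assumptions and double-check the docs:

```python
# Sketch: trigger the screen-marker calibration on a remote machine via
# the Pupil Remote plugin. HOST/PORT below are assumptions.

HOST = "192.168.1.2"  # hypothetical address of the machine running Capture
PORT = 50020          # Pupil Remote's default port


def calibration_command(start):
    """Pupil Remote single-character commands: 'C' starts, 'c' stops calibration."""
    return "C" if start else "c"


def send_command(cmd, host=HOST, port=PORT):
    """Send one command over a REQ socket and return Capture's acknowledgement."""
    import zmq  # lazy import: only needed for the live connection

    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.connect("tcp://{}:{}".format(host, port))
    sock.send_string(cmd)
    return sock.recv_string()


# Usage (requires a running Pupil Capture with Pupil Remote enabled):
#     send_command(calibration_command(start=True))
```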
An alternative to Pupil Remote would be MQTT... which is what I was using for issuing commands, due to an already existing infrastructure using MQTT
but pupil remote calibration.should_start would display the calibration routine on the machine that is running pupil and not connected to the tv
right?
i guess what i'm asking is it possible to pull up the screen marker calibration on a different computer and have the pupil calibrate that way?
Does anybody have experience with eye tracking and Pupil? If so, what accuracy can be expected? Would it make sense to combine different technologies, e.g. eye tracking first, then webcam-based head tracking, to move the cursor pixel-perfectly?
A friend of mine has RSI, which is why I'm interested
Hey guys! I want to ask if I can extract blink rates from a recorded video? I also want to know if I can get the data in real time, rather than analysing it after recording, as I want to use the pupil diameter as input to something else in real time
@user-9c17f2 - you can classify blinks in real-time with the Blink Detector
plugin. Blinks are also recorded, but at current not yet visualized or exported with Pupil Player. Pupil diameter is also accessible in real-time. I would encourage you to look at the IPC and network communication docs here: https://docs.pupil-labs.com/#interprocess-and-network-communication and also pupil-helper scripts that demonstrate how to subscribe/work with pupil data via network communication: here's an example: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
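A hedged sketch following the filter_messages.py pattern linked above. Port 50020 (Pupil Remote's default) and the 'diameter' field of the pupil datum are assumptions; verify against the docs:

```python
# Sketch: read pupil diameter in real time from the IPC backbone.


def extract_diameter(pupil_datum):
    """Pull the diameter (pixels in the eye image) out of a pupil datum dict."""
    return pupil_datum["diameter"]


def stream_diameters(host="127.0.0.1", port=50020):
    """Subscribe to all pupil data and print the diameter of each datum."""
    import zmq
    import msgpack  # lazy imports: only needed for the live connection

    ctx = zmq.Context.instance()

    # Ask Pupil Remote for the port of the data SUB socket.
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, port))
    req.send_string("SUB_PORT")
    sub_port = req.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://{}:{}".format(host, sub_port))
    sub.setsockopt_string(zmq.SUBSCRIBE, "pupil.")  # all pupil topics

    while True:
        topic, payload = sub.recv_multipart()
        datum = msgpack.unpackb(payload, raw=False)
        print(topic, extract_diameter(datum))


# Usage (requires a running Pupil Capture with Pupil Remote enabled):
#     stream_diameters()
```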
@user-1e086f - I believe we spoke via email already. Accuracy can be expected within physiological limitations if set up correctly with proper calibration. Difficult cases like bright direct sunlight can still be a challenge for robust pupil detection. If you are interested in typing by using gaze, you may be interested in this project: https://github.com/hookdump/asistiva
☝ the above linked project is not yet complete (AFAIK), but looks very promising 😄
@wrp is it known how your accuracy differs from tobii.com? I'd like to replace the mouse. Assuming a typical distance from the screen such as 50 cm, what approximate gaze accuracy in cm on screen could be reached? Has anyone tried combining such mouse control with head tracking so that pixel accuracy could be reached? I mean blocking gaze detection if the pupil angle doesn't change and using head tracking instead?
Does anybody have existing code which is close, so that this could be tried?
I found many head tracking software packages but nothing of quality..
@user-1e086f - I am not sure I understand why you need "head tracking" - what you would need for screen based work is to establish where the screen is located within your FOV, for this you can use surface tracking (or other methods to detect the screen). This way you can move your head as much as you like and gaze is always relative to the surface you define. Existing code for mouse control with screen defined as a surface can be found here: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/mouse_control.py
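The core of that mouse-control helper boils down to mapping surface-normalized gaze to screen pixels. A small sketch of that mapping (the bottom-left origin for surface coordinates is an assumption based on Pupil's normalized coordinate convention):

```python
def surface_to_screen(norm_x, norm_y, screen_w, screen_h):
    """Map surface-normalized gaze (0..1, origin bottom-left) to screen
    pixels (origin top-left), clamping gaze that falls off the surface."""
    norm_x = min(max(norm_x, 0.0), 1.0)
    norm_y = min(max(norm_y, 0.0), 1.0)
    x = norm_x * screen_w
    y = (1.0 - norm_y) * screen_h  # flip: surface y grows up, screen y grows down
    return int(x), int(y)
```

The actual helper script then only needs to feed these pixel coordinates to a mouse-moving library on each gaze-on-surface datum.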
@user-1e086f - regarding accuracy and performance comparisons, we do not have any comparisons to other products available
If you're doing CAD and have RSI, you're only able to use the mouse for 5 min till your hands start hurting too much
Eye tracking won't move the cursor to an exact pixel, but it gets nearby.
however, researchers like: http://thume.ca/2016/03/24/eye-tracker-reviews-pupil-labs-tobii-eyex-eye-tribe-tobii-x2-30/ have some reviews (from about 1 year ago). This author was also implementing a gaze driven interface and might be worth contacting for your project
Then you could lock eye control and move your head ..
@user-1e086f - I would suggest contacting Tristan Hume (unfortunately he does not seem to be here in Discord - but maybe you can share a link with him or contact him via email) for more information on his implementation
It seems his repo is here: https://github.com/trishume/PolyMouse
@user-1e086f - this seems to be kind of what you are looking for
Hi Guys, I'm having an issue with Surface marker tracking, got them all setup in Capture, tracking working fine in Capture, but when opening in Player and starting Offline surface tracker it doesn't find any markers at all.. I noticed this was a known bug earlier (#583), so updated to latest version v0.9.4. but it's still not working 😦 Any ideas?
Hi – Has anyone from the community tried the Pupil Glasses with the EEG device from EMOTIV (EPOC)?
@user-fb1b80 delete marker cache
Thanks @wrp, maybe a stupid question, but where can I find the Cache file on Ubuntu for surface trackers? New recordings I made in latest build actually worked, just wanted to get my old ones to work as well.
have a look in the recording folder itself. There should be a file called square_marker_cache. Delete that and reopen the recording.
make sure not to delete the file when the recording is already open.
@mpk that didn't fix it unfortunately still no markers detected at all for earlier recordings..
@user-fb1b80 did you try to reduce the minimal marker perimeter? it is a setting in the offline surface detector menu
@user-fb1b80 can you email us one of your recordings? I think you might need dropbox for that. We will have a look.
@everyone I see that the Windows 10 bundles are greatly improved in the latest releases, and the latest release will be up this week. I was wondering if anyone had tried running the software on Windows 10 and faced any problems? I couldn't run it smoothly on Win10 previously and so had to switch to Linux
@user-d00e4f we will try to get Windows bundles out this week for v0.9.4
@user-d00e4f can you be specific about issues with the bundle on Windows 10 and/or open issues on the GitHub repo so that we can investigate.
I just ran the previous version I had and I get this, and none of the capture windows open. It's an old version (I don't remember the version number)
you are running from source for v0.9.0-17
?
I just checked, yes it is v0.9.0-17 @wrp
@user-d00e4f - you are running from source - correct?
yes
If so, it would appear that the pupil detectors are not being built
I would assume that your environment is not properly set up
in the setup.py files for the pupil detector and calibration you need to modify the paths: https://github.com/pupil-labs/pupil/blob/master/pupil_src/capture/pupil_detectors/setup.py#L49-L55
I have installed dependencies again just now. I see all wheels are linked in the new docs page.. that's super helpful
Yes, I did edit the paths
try just building the pupil detector
and see what errors/warnings are shown
Okay
Setting up dependencies on Windows (unfortunately) is much more difficult than on Linux or macOS
Yeah. That's the reason I switched to Linux :p
But I got a little excited with the new releases
I think we should automate at least the wheel installations
Pack everything in a directory and run a single script
it might be more elegant to run a script that installs all wheels from the source - but the dependency versions are fast-changing, so such a script may quickly become outdated
hmm.. that makes sense
When I build only pupil detector
can you link to your fork?
Okay. I'll do it first thing in the morning tomorrow
From the error msg I posted, it appears like some compiler issue. Have you faced that before? Or should it work fine with the VS compilers?
@user-ed537d - have you released or pushed your Matlab + Pupil helper script? It would be great to have this as a PR for pupil-helpers repo if you have time.
I'll try this week; I have to change some of the code up. Been swamped at school
@user-ed537d - great - I look forward to seeing the code 😄
@here - this was shared on the Pupil google group, but wanted to re-post here in case it could be of use for someone's research/work: https://github.com/ignacioxd/pupil-remote-php
Hi! @user-fb1b80 and @mpk , did you figure out what the problem was? I also have issues with the surface tracking - it appears that markers are tracked just fine (they are highlighted blue in the capture and also player windows), but e.g. the surface tracking debug window is completely blank. Has anybody experienced this before? Thx!
Hi all, I recently had the problem of not being able to do calibration with my own eye tracking hardware and Pupil's software v0.9.2 on Windows 8.1. The suggestion I got here then was to use v0.9.3 on Windows 10. In the previous version both my cameras were detected, but in v0.9.3 the software detects my eye camera but gives me frame rate errors and doesn't show me any eye feed, and nothing changes. I even tried to adjust the resolution or the fps to something else but it didn't help. Does anybody know what mistake I'm making here? Or how I should proceed from here?
Thanks, I was able to fix it : )
@user-006924 would you mind sharing your solution? :)
@user-006924
What did you do to solve your problem?
I had only installed the driver for interface 0 which I'm guessing is only the video feed of the camera and I realized I had to install the driver for composite parent of the camera.
So you installed both drivers and now it works?
Yes but I think if only the composite parent is installed it would work as well.
@wrp [email removed] -- Here's the link to the fork.
@user-fa6396 try decreasing the minimum marker perimeter; the sliders are located under the Surface Tracker plugin
@everyone I get this error when I try to run the screen calibration
why is speech dispatcher having a problem?
And I have installed dependencies for v0.9.5, and I'm unable to update ndsi
It keeps throwing the error "ImportError: cannot import name 'H264Writer'"
What is the problem that you are having with updating ndsi? The import error is a result of an unsuccessful ndsi update.
The speech dispatcher is used for phrases like e.g. "calibration started". We did not change anything related to it as far as I know. This is a problem specific to your setup.
My system is rolling back to previous ndsi version as I try to install the new one from pupil labs page
How exactly are you installing it? Is there no error message? If it rolls back without an error, then it might be a sign that your installer is installing an old version without you noticing
Now that I'm away from the system, I don't recall the error off the top of my head. But it's something related to g++ if I remember right
But my system has the compilers installed
Oh OK. Would be great to get more information on that as soon as you have time for it. :)
First thing in the morning tomorrow
:)
Hi all - our lab just got a new pupil headset. What software are folks using for data analysis?
@user-af9414 Great to hear that! 😃 Pupil Player is able to do some basic analysis (e.g. fixation detection) and exports all its results to CSV files. These files can be easily read/parsed by any other software. So it depends mostly on your personal preference/experience which software you want to use.
Cool - suppose we want to calculate the relative duration of fixating on multiple stimuli. Our last eye tracker was stationary (mounted under the monitor), so it was easy to define the locations of the stimuli as coordinates. With this setup, however, I'm unsure how we can define locations for static stimuli, given the frame will move as the participant's head moves. Is there a way to control for this? (I'm a grad student with very little experience - the prof I am working for has a ton of experience with eye tracking research, but is a little less tech-savvy). Any pointers?
Sure, that is a very common problem and is easily solved by using our Surface Tracker plugin. Using specific markers, you can define surfaces into which the gaze will be mapped. This is built-in functionality. You could add these markers around your screen for example. Calculating fixations relative to this defined surface should be a small step.
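As an illustration of that last step, the relative fixation duration per surface can be computed from the exported CSV in a few lines. The column names 'surface' and 'duration' below are assumptions; check the header of your actual export:

```python
# Sketch: fraction of total fixation time spent on each defined surface,
# from a hypothetical Pupil Player fixations-on-surface export.
import csv
from collections import defaultdict


def relative_fixation_durations(rows):
    """rows: dicts with 'surface' and 'duration' (seconds) keys.
    Returns {surface: fraction of total fixation time spent on it}."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["surface"]] += float(row["duration"])
    grand_total = sum(totals.values())
    return {surf: dur / grand_total for surf, dur in totals.items()}


# Usage with a hypothetical export file:
#     with open("fixations_on_surface.csv", newline="") as f:
#         print(relative_fixation_durations(csv.DictReader(f)))
```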
awesome! will read up on those
@user-af9414 link to docs here: https://docs.pupil-labs.com/#marker-tracking
Thanks!
Hi
May I ask if the "show 2d calibration" functionality will be (re)implemented anytime soon?
It was really useful. In the meanwhile, is there an accuracy measurement that I could read at the calibration end? To make sure that all calibrations have minimal confidence.
@user-41f1bf it is on our todo list but we are prioritizing the missing parts to round up Pupil Mobile at the moment, e.g. offline pupil detection and offline gaze mapping.
what level of lighting works best - we are in the early stages of calibration and having difficulty!
@user-af9414 we need more info to debug. Can you turn on eye recording and record while calibrating and a bit after and send us the recording? This way we can debug your problem.
@mpk Can you tell me what the coordinate frame of the pinhole eye camera is? How do I interpret the eye sphere x,y,z coordinates, and similarly the gaze normals?
hmm any plans which release will see a prebuilt windows installer? 😄
how do I open the Pupil apps?? I click the .exe and got this
@user-5f84b3 this is an error I have not seen before. @wrp any ideas?
@user-d00e4f the coordinate system should be centered on the optical center of the camera with y up x right and z into the scene.
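As a small illustration of that frame (x right, y up, z into the scene), a gaze normal can be converted into horizontal/vertical viewing angles. The sign conventions below are my own assumptions, not the official API:

```python
import math


def gaze_angles(x, y, z):
    """Horizontal/vertical angles (degrees) of a gaze normal in a camera
    frame with x right, y up, z into the scene.
    Positive yaw = looking right, positive pitch = looking up."""
    yaw = math.degrees(math.atan2(x, z))
    pitch = math.degrees(math.atan2(y, z))
    return yaw, pitch
```

For example, a gaze normal of (0, 0, 1) points straight into the scene, giving zero yaw and pitch.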
@mpk I tried opening it on 3 computers and got the same result. Can you send a download link to me?
The releases are available on the GitHub repository page. A quick Google search suggests that you should install this https://www.microsoft.com/en-US/download/details.aspx?id=48145 maybe it will help. But @wrp will know more...
@mpk Thanks!! It works!!
@mpk thanks.
Also, what is the diagonal FOV of the cameras we use?
@wrp i'm going to be heading out of town for the next week, but I was going to share a raw form of what I currently have working. I was wondering how I could send it to you so you can test whether it works on your machine
thats for the matlab stuff btw
@user-ed537d - would be great if you could put up a repo on or a fork of pupil-helpers with your work
We would be happy to see it. Also a short readme that explains how to use it within matlab if you have time 😄
@wrp i'm sort of new to github but I'll definitley try my best lol
@user-ed537d just click fork
for https://github.com/pupil-labs/pupil-helpers and then it will make a fork of the pupil-helpers repo. Then you can clone this locally, add your changes, commit changes locally, push to your fork, and make a Pull Request via github web interface to pupil-labs/pupil-helpers master branch
wow... that looks like a lot of steps... but it's much easier than it seems 🙂
just added
its very bare bones....really tired
going to head home
great! will take a look - thanks
are you able to see it?
i called it matlabadd
i do not yet see the PR
how about now? lol
it shows up on my browser now
I see the fork - https://github.com/matiarj/pupil-helpers
and the commit - https://github.com/matiarj/pupil-helpers/commit/8d25d75645cf53082a423943ed79418d1563472b - thanks! I'm not a matlab user, but it would be great if someone else in the community could test this and give feedback
cannot connect to pupils_service. pls help
Hi @user-fae857 please check the pupil_remote plugin in the world GUI and check the port number. This should be the same as in your script.
Hey, I was directed here to ask a question. do any of you guys use Pupil with OBS? I'd love to know 🙂
@user-fe3296 the streaming software?
Yup! Like how devices like the Tobii can overlay where you are looking
For example, you can watch a couple minutes of this video. https://youtu.be/YVfhFeFZGzM?t=287
🙂
I am pretty sure you could easily build your own little plugin in the Pupil framework that overlays gaze on your screen
While I currently have no experience with proper coding (Just basic HTML/CSS/JS) getting a pupil is a rather long-ish term goal, so maybe I'll work towards learning to be able to code it .... 🤔
Well that's a start 😃 You could consider adapting the current gaze displaying plugin by spawning a new window (which is transparent) and applying the already existing gaze plugin's drawings on it
That is a start indeed, but you would see it yourself. What you actually want is an OBS plugin that composites the gaze on top of what you are recording, such that you do not have to see it yourself but only the people watching your stream
Of course, but this would make things a tad more complicated as you would need to consume the pupil data from a different environment 😃
It is probably easier to write a standalone obs plugin that uses pupil remote and a subscription to the backend IPC than writing a full-featured Capture plugin
😓
Biggest problem here (for me): OBS API documentation only exists in form of source code. A general overview guide would be great to have.
There's this, but I doubt it's anywhere near enough info https://obsproject.com/forum/threads/how-to-create-obs-studio-plugins.53470/
Just seemingly an overview
oh, actually, what you could do as an initial easy solution: just stream the Pupil Capture content. As far as I know, you can select specific applications to stream
<:eveaThinking:304457812271038464> Well, in my case, I am an Overwatch streamer. Tl;Dr #1 Public player of a certain character (out of 24 characters), and top 50 players in north america over-all.
So I'm not exactly leaving the game anytime soon, and I feel that this is a very powerful educational tool if used properly, and teaching the fundamentals of this topic is important imho.
I totally agree on the educational topic! I would like to see it extended to other parts in life like sports (e.g. table tennis, snooker) or something different, like driving.
You have probably seen this: https://www.reddit.com/r/dataisbeautiful/comments/66pjv2/i_wore_an_eye_tracker_while_playing_overwatch/
The setup for this is not difficult. You open Pupil Capture and select it as the application that should be streamed using OBS. This should work out-of-the-box. The problem with this setup is that one does not see the game fullscreen but as part of your field of view
Yeah, I did see that (which put me onto pupil) but I have been looking into eye tracking for like 7 months now, since every path I look at there are some downsides.
what were the biggest downsides you have encountered so far?
Support for the devices dropped, refresh rate of the cameras low, software used being clunky, some devices requiring the full USB bandwidth (that I don't have on my current PC)
Hmmm, some other small things
But realistically it will be a couple years before I get a pupil, am doing what research I can now so I know what path to take.
Is it just low priority as an investment or is it due to the specific costs of the pupil headsets?
Yes, Cost. ~~mainly~~. I'll be getting a Tobii for $110 within a week probably, seeing if it will be "good enough" for now, it's "new-ish" and has pretty good support.
Eye tracking as a whole is a very high priority for me right now. I think the value it brings to the topic of learning is immense, and as a channel, educating my audience is my goal.
I understand. Would be great to hear about your experiences with up- and downsides of this product. We can probably learn from it as well 😃
For now, I'll probably set it up on a keybind, so when a match starts, I can enable it, then disable when the match ends.
But long term I plan to record it onto a separate "green-screened" color video, so when I go to edit a video of a match, if I think my eye movement was important at, say, the last 40 seconds of the match, I can simply put it in the video when I see fit.
Though that assumes chroma-keying a video like that is viable <:Thinkong:293458884805525504>
But yeah, thanks for the guidance. Hopefully I can come back with some useful info/content.
Hi .. I've got a quick question. Is there already an option to publish the frames of the eye cameras in addition to those of the world camera? If not, could you give me a hint what the easiest way would be to do so?
Hi Tobi, this is already happening. Just subscribe to frame.eye0/1 when the Frame Publisher plugin is running.
That was fast .. great, thanks a lot 😃
Works like a charm .. it's frame.eye.0 and frame.eye.1 (rather than frame.eye0 / frame.eye1 though)
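For anyone following along, a hedged sketch of subscribing to those eye-frame topics. Port 50020 (Pupil Remote's default) and the payload layout (a msgpack header followed by the raw image buffer) are assumptions; the image format depends on the Frame Publisher settings:

```python
# Sketch: receive eye-camera frames published by the Frame Publisher plugin.


def eye_frame_topics(eye_ids=(0, 1)):
    """Build the subscription topics for the given eye cameras."""
    return ["frame.eye.{}".format(i) for i in eye_ids]


def stream_eye_frames(host="127.0.0.1", port=50020):
    import zmq
    import msgpack  # lazy imports: only needed for the live connection

    ctx = zmq.Context.instance()

    # Ask Pupil Remote for the port of the data SUB socket.
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, port))
    req.send_string("SUB_PORT")
    sub_port = req.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://{}:{}".format(host, sub_port))
    for topic in eye_frame_topics():
        sub.setsockopt_string(zmq.SUBSCRIBE, topic)

    while True:
        parts = sub.recv_multipart()
        header = msgpack.unpackb(parts[1], raw=False)
        # parts[2:] carry the raw image buffer(s)
        print(parts[0], header.get("format"), sum(len(p) for p in parts[2:]))


# Usage (requires Capture with Frame Publisher + Pupil Remote enabled):
#     stream_eye_frames()
```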
great!
Hi all, I am using the latest version of the Pupil app. I was wondering, would I be able to access the pupil coordinates in real time if I run the app from the source code and change parts of the code?
Hi @user-006924, you can also access the pupil coordinates and gaze coordinates (and much more) in real time from the app. No need to run from source, unless you want to of course 😄
You can use the Pupil Remote plugin
and a helper script like what is shown in pupil-helpers here: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
You will need to install zmq
and msgpack
to run this script. But that is much less overhead compared to setting up dependencies and running from source
Does this answer your question?
Yes thanks a lot ^^
great!