hi, in the fixations.csv file, is the fixation duration in seconds?
@mattiaparentela#1655 yes. time units are seconds.
@mpk thanks. Do you recommend leaving the min duration of fixation at 0.15 s?
@user-d811b9 I would recommend it. However you may want to change it depending on what literature you follow.
Thanks!
@mpk I found in the literature that fixation duration is typically 200-400 ms, so I have to reduce this value. The question is, how can I do this? (Or rather, can I even reach this precision?)
there should be a settable field in the detector.
200 ms == 0.2 s, and the slider works with 0.05 s precision. Set it to 0.2 and filter out all fixations with durations longer than 0.4 s in your post-processing step. This way you keep only the fixations you are interested in.
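That post-processing step could look roughly like this; it is only a sketch, and the `duration` column name is an assumption — check the header of your exported fixations.csv.

```python
# Hedged sketch: keep only fixations whose duration falls in the
# 0.2-0.4 s window from the literature. The "duration" key is an
# assumption about the exported CSV header.
import csv

def filter_fixations(rows, min_dur=0.2, max_dur=0.4):
    """Keep fixation rows with min_dur <= duration <= max_dur (seconds)."""
    return [r for r in rows if min_dur <= float(r["duration"]) <= max_dur]

# With a real export you would do something like:
#   with open("fixations.csv") as f:
#       kept = filter_fixations(csv.DictReader(f))
rows = [{"duration": "0.15"}, {"duration": "0.25"}, {"duration": "0.55"}]
print(filter_fixations(rows))  # only the 0.25 s fixation survives
```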
Thanks
Hey guys. I'm using pupil for Oculus Rift, For some reason since yesterday it seems like there is no infrared lighting in one of the eyes
I can see an image in the eye window, but when it's covered, like when putting on the device, one of the eyes goes completely black
Is there some way towards finding the exact problem?
hi! Sounds like the IR LEDs are not active. Can you check if the cable from the ring to the camera is ok?
Hey guys! I'm wondering if the pupil headset has a serial number, and where can I find it
@user-a633bf hey, it does not. We suggest using the order ID for identification.
ok, thanks (:
@mpk, yeah that was the problem, must have gotten disconnected somehow. reattached and works fine 😃 ty
Great to hear!
Hi guys, when I open Pupil Player on Ubuntu the CPU always runs at 100%. Is this normal? But the main problem is that only one of the four cores ever runs at 100%
@user-d811b9 Yes, it is. Currently a lot of plugins do their calculation in the main thread (e.g. the fixation detector). We will introduce a new way to do computation in the background. If all plugins adopt this new interface the processing load will be spread evenly across all cores.
@papr thanks!
Hello, I tried to run Capture and Service on Windows 7 SP1 64-bit (i5-6600T, 8 GB RAM), and something (either the world view GUI, one of the eye views, or all of them) crashes almost every time after a minute or more. It also uses 100% of the CPU all the time. Is this only related to the fact that I am running it on Windows 7, or is it abnormal? Any solution? I can move my setup to Windows 10, but it would save me a lot of time if I could run it on Windows 7. Thanks!
@user-8c0069 I am sorry but we do not support Windows 7. You will need to upgrade to Windows 10.
My guess is that you will save more time by upgrading and asking again if the issue persists than searching for the issue on Windows 7.
Btw, it is always helpful to save the log files and the terminal output, independent of which platform you are on.
Save and share
🙃
@papr Yes, I know you only support Windows 10, it is just that I have read some comments here and on the Google Group saying that it could work on Windows 7, so I was thinking some of you may have met the same problem...
Yes, I should have started with a log, sorry.
So this is the most frequent issue: after a couple of minutes, one of the views crashes. For this one, after about a minute, I added surface markers in the field of view of the world camera, and it crashed shortly after:
Every time: "Capture failed to provide frames. Attempting to reinit."
This message also appears always when I try to launch screen markers calibration (for world, eye0 and eye1). It crashes instantly after pressing the "C" button. However, no problem with manual markers or natural markers...
Thanks if you can help me...or tell me that this is only due to Windows 7 incompatibility.
Hi
I have a (hopefully) small problem with Pupil Capture. When I plug my Pupil headset into the USB port of my computer, Capture is not able to detect the cameras. It displays two unknown USB sources, but both sources throw the error "The selected camera is already in use or blocked". I am running Ubuntu 16.04 64-bit and both cameras can be accessed without problems using guvcview. Any ideas what the problem could be? Thanks!
@user-2ae698 please make sure your user is part of the group 'plugdev'. Alternatively you can run Capture as sudo, but we highly recommend the first option.
@mpk Thanks a lot, that did the trick.
Hello all, I have a question, how does the player handle a world video that is longer than the recorded data? Specifically I'm using the HMD trackers, so data is recorded in "fake capture" mode. I tried it with a one minute video on 20 second gaze data and it seemed to visualize gaze points throughout the whole video.
The reason I ask is I wanted to try and record a screen capture of something in VR while recording eye tracking data, and then use that as the world video in the Pupil Player. I realize the FOV of the screen capture and display won't be a 1:1 to match up but would like to visualize the results of this. Naturally the screen capture video and gaze data would have to be aligned as well, which I could do with trimming once I know the time that gaze recording started in the world video.
Hi, in the Player application I found the Manual Gaze Correction plugin. The x and y correction values modify the height and width position of my gaze, but what unit do they refer to?
@user-d811b9 It modifies the gaze positions' norm positions
But this plugin is not a great tool to correct your recorded gaze data. It will be replaced with the natural feature based offline calibration.
The unit is video width, height.
https://github.com/pupil-labs/pupil/blob/master/pupil_src/player/manual_gaze_correction.py#L42-L44
@mpk It is not
@papr I have the bundle version. Do I have to copy this plugin?
@papr I think your quoted code and statement do not agree :-).
@user-d811b9 No, the bundle includes all plugins
@mpk I am confused.
Since we are modifying the norm pos, the units are video width and height.
@papr these are my plugins
In the player application
0 ,0 is bottom left of the video frame, 1,1 is top right. That makes the unit screen size.
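To make the convention concrete, here is a small illustrative sketch (not official code) that converts norm coordinates to pixel coordinates; since image libraries usually put the origin top-left, a vertical flip is needed:

```python
# norm_pos convention: (0, 0) is the bottom-left of the video frame,
# (1, 1) the top-right. Pixel coordinates have their origin top-left,
# so the y axis must be flipped.

def norm_to_pixel(norm_pos, frame_size):
    """norm_pos = (x, y) in [0, 1]; frame_size = (width, height) in pixels."""
    x, y = norm_pos
    w, h = frame_size
    return (x * w, (1.0 - y) * h)

print(norm_to_pixel((0.5, 0.5), (1280, 720)))  # (640.0, 360.0), frame centre
print(norm_to_pixel((0.0, 0.0), (1280, 720)))  # (0.0, 720.0), bottom-left pixel
```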
@user-d811b9 please use gaze correction for now. @papr is referring to new code in the master branch.
Ah, now I understand my confusion. I mistook your statement for pixels.
@user-d811b9 original question was only about the units of manual gaze correction. Thus my answer.
@user-d811b9 And yes, the new plugin is not available in the bundle yet. I thought you were referring to the manual gaze correction
@papr @mpk thanks
I hope we were able to help!
When will this plugin, which seems to be very good, be available in the bundle version?
We hope with the next release. There are some issues we need to solve first.
The idea is that you can recalibrate sections in Player to correct the gaze mapping.
@mpk GREAT!!! Is it possible before my degree in October? 😬
Yes. I think we should be able to do this next week.
very very good!! thanks for all @mpk
Hi all, I'm running a Unity application with the HTC Vive add-on and I'd like to capture what the user is looking at, but the capture doesn't record the rendered HTC Vive image. Any tips?
nothing out-of-the-box that taps into the rendered display of the vive?
@user-ef7690 nothing out-of-the-box for that. We recommend using the Unity plugin and its screen recorder function.
Hi all, may I ask you a question. How can I get gaze position from pupil remote?
@user-7c81c5 you will need to run this script: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
I ran that script, but I didn't get gaze positions from it
@user-7c81c5 did you un-comment Line 24: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py#L24
Yes
Is the port correctly specified and matching what you see in Pupil Capture's Pupil Remote plugin?
Yes port is correct
I got message from pupil remote but there are no gaze position in that
I got only timestamp, topic , norm_pos, confidence , base_data
That is the gaze data object.
Norm pos is the normalized 2d gaze position
Oh I see
thank you 😄
@papr - could you make an issue in pupil-docs to add a sample gaze message to message format in the docs: https://docs.pupil-labs.com/#message-format
Ok ! That fine
We already have an example for both, pupil and gaze positions: https://docs.pupil-labs.com/master/#pupil-datum-format
Another question, gaze position is based on two eye tracking or one eye tracking ?
This is decided dynamically by the gaze mapper depending on how much data is available.
The gaze datum contains the key base_data
which includes the pupil positions on which the datum is based on
@user-7c81c5 if you use a binocular calibration the data from both eyes is merged into one gaze position.
should only data from one eye be available the gaze mapper will use only one eye as @papr suggested.
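Inspecting `base_data` to see whether a gaze datum was mapped binocularly or monocularly can be sketched like this; field names (`base_data`, `id`) follow the documented pupil/gaze datum format, but treat this as illustrative rather than official code:

```python
# Sketch: a gaze datum carries the pupil datums it was mapped from in
# "base_data"; each pupil datum has an "id" field (0 or 1 = eye id).

def gaze_mode(gaze_datum):
    """Return 'binocular' if the datum is based on both eyes, else 'monocular'."""
    eye_ids = {p["id"] for p in gaze_datum["base_data"]}
    return "binocular" if eye_ids == {0, 1} else "monocular"

binoc = {"norm_pos": (0.4, 0.6), "base_data": [{"id": 0}, {"id": 1}]}
monoc = {"norm_pos": (0.4, 0.6), "base_data": [{"id": 1}]}
print(gaze_mode(binoc))  # binocular
print(gaze_mode(monoc))  # monocular
```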
Yes - thanks @papr for the link (forgot that we merged that already 😄 )
Thank you 😄
hi all! I have a version of the pupil eye-tracker with the stereo world camera, however one of the world images is inverted vertically. Is there any easy way to fix it? Sure, I can invert it in my own application, where I use the eye-tracker, but I'd rather prefer a global solution
@user-c77dda since we cannot make a custom camera for just this one use case, we do indeed have a flipped camera. We always just flip the image display or the actual image. I hope that is ok!
ok I see. Do you happen to have any samples of using the camera and/or rectifying the stereo?
@user-c77dda since we don't do any stereo vision in Pupil, I would recommend checking out how the OpenCV project does this.
is there any way to track head position with the pupil headset?
Yes, if you have a fixed surface that you track. You can use reverse homography to calculate the head movement.
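The reverse-homography idea: the surface tracker yields a 3x3 homography between the fixed surface and the camera image, and tracking how that matrix changes over time reflects the head movement. A pure-Python sketch of applying a homography to a point (in practice you would use OpenCV's `cv2.perspectiveTransform`):

```python
def apply_homography(H, pt):
    """Apply a 3x3 homography H (nested lists, row-major) to a 2D point pt."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# The identity homography leaves points untouched; a translation component
# shifts them, much as head movement shifts the tracked surface in the image.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
T = [[1, 0, 5], [0, 1, -3], [0, 0, 1]]
print(apply_homography(I, (2.0, 4.0)))  # (2.0, 4.0)
print(apply_homography(T, (2.0, 4.0)))  # (7.0, 1.0)
```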
eye0 - [WARNING] video_capture.uvc_backend: Capture failed to provide frames. Attempting to reinit.
Does anyone know how to fix this? It repeats again and again.
hey ~ i took off the eyetracker the other day and the nosepad broke off :( i'd like to try and glue it - are there any recommendations regarding the glue / material of the device?
Hi @user-f0de5d I would suggest that you send us an email at sales@pupil-labs.com and reference the order ID related to this headset and we can set up a repair. Repairing with glue could work as a very short term fix, but long term fix would be for us to make a repair.
I am having problems installing the drivers on my Windows 10 PC. I do it the way the Getting Started section says, but the drivers don't appear in Windows devices; when I look for them in System > Applications they seem to be installed. Any idea why this happens?
It wouldn't be a problem if the program recognised the glasses, but it doesn't
@user-8e3dd6 could you please let us know what hardware you are using?
Do you have a DIY setup or hardware provided by Pupil Labs?
I have the Pupil Headset
@user-8e3dd6 could you let me know the configuration. E.g. world camera high speed + 120hz eye camera binocular?
If you are using a Pupil headset, after installing drivers you should see the devices under Device Manager>libusbK
@wrp The problem I got is that i follow these steps
And when I go to Device Manager, libusbK doesn't appear
But when I look for them in System > Programs they seem to be installed as Pupil Cam0, Cam1, Cam2
Any idea on how to make it work?
I've tried uninstalling them, restarting and trying again, and the same thing happens
Hey! Currently I am using HMD Calibration (using Pupil with Unity and the HTC Vive), but I was wondering how HMD 3D calibration would differ from the normal HMD Calibration. Would it be better to use HMD 3D Calibration? If so, do I need to change the way my calibration markers are oriented, or does HMD 3D calibration not require me to change the markers?
Also another question, I am using Dual monocular calibration, is this different from binocular calibration? Our application in Unity is pretty far already in development and we have already done many great things with Pupil, but we think we can try to optimize our calibrations / the way we process data from Pupil. I cant seem to find the binocular calibration plugin anywhere
For people who are interested in what we have done already, see this reference image. We made a supermarket in 3D and collect eye data to map this in heatmaps within Unity
I downloaded the source code and pupil_v9012_windows_x64.zip and extracted them. When I execute pupil_capture, this error shows up and there is no image
@user-7f5dd1 I can do TeamViewer remote setup to try and help you install drivers
please email info@pupil-labs.com to arrange a time
Okey
Thank you so much!
@user-7f5dd1 - welcome, let's see if we can resolve this quickly 😄
Hi, I have the problem of randomly losing the connection to the world and eye cameras (2 of them) on my glasses. Suddenly the connection breaks and no device is detected (it happens in the middle of a recording). I am on Windows using the drivers provided. The setup worked for 3 weeks with daily use and no problems. I did not change anything (it is a laboratory computer; neither hardware nor software was changed). Do you have any hint what may cause that?
all three cameras at once?
yes, all three at once
My guess is the USB cable. Can you try a different one?
we also thought about a loose connection
I'll try, even though it is only 3 weeks old
An alternative is that you send the headset back to us. We have a look and replace all components required (for free of course). This is quite fast. But nothing beats replacing a faulty cable in terms of speed :-).
Thanks for the offer. We'll have a look; it should be no problem as we have two of your headsets --> two cables
If it does not help I'll send you an email with further details
@user-7f5dd1 - I am pleased that we were able to quickly resolve the driver installation. A note for other users on Windows 10. Windows often likes to install drivers for devices automatically when the device is connected. You can either stop Windows from doing this for Pupil hardware when the devices are connected and then install drivers. OR if Windows does install drivers for cameras automatically, you will need to uninstall drivers that Windows installs before installing libusbK drivers.
I will update the docs with this information about driver installation for Windows users
@mpk - it seems the cable really broke... Kind of embarrassed that I did not think of the easy faults
hey guys!
@user-f79453 sup
i was gonna ask what the cheapest way to build one myself would be, and if there was a file I could 3d print myself, but then I saw the docs on the pupil website and how the shapeways cost supports the program and decided against it
Rip 😄
hahahah
but any other suggestions would be great
the diy BOM has a HD-6000 as an eye cam
but as far as I can tell that only goes up to 30 fps
i'm planning on using it to eyetrack people watching movies/films/etc
Hi @user-72b0ef I see your question and will get to it a bit later - apologies for the delay
is that enough for my application
?
@user-f79453 you can try the 30hz route
@user-f79453 please see https://docs.pupil-labs.com/master/#diy
@wrp hey! is this all current? do i need more than this? sorry I found you guys through an old hackaday article
@user-f79453 we have not updated the DIY BOM in a while, but if you find all parts it will work. There are many [email removed] that are using DIY setups. Many uvc compliant cameras will work with the software
oh okay cool
is there a subforum somewhere specifically catering to the DIY builds?
No, I do not think so
@user-f79453 there is no official sub-forum for people to discuss DIY builds, but you are welcome to discuss DIY builds here as well
If there is enough demand for this kind of channel, we can certainly make one - but a critical mass of at least something like 10 people would be needed to demonstrate the need for a separate channel IMO
@user-72b0ef Have you looked at and/or tried the new unity3d plugin - this incorporates the 3d calibration procedure - https://github.com/pupil-labs/hmd-eyes/tree/master/unity_pupil_plugin
okay cool!
any slim chance someone here happen to be from Pakistan too?
Hi guys, if I installed the bundle version of Pupil on Ubuntu 16.04 LTS, is it necessary to install the Linux dependencies?
No, it should run out of the box. I assume something is not working, if you had to ask?
@user-d811b9 as @papr noted, if you are running a bundle (on any OS) all you need to do is run the executable. Are you encountering any difficulties?
@wrp no difficulties. Sometimes when I close or open the camera from the application, I see yellow messages related to corrupt JPEG data
But they seem to be warnings
@user-d811b9 yellow messages are warnings and not errors. You may see warnings from time to time, this is normal
Good! Thanks @wrp
@user-d811b9 welcome 😸
@wrp no worries and thanks for your reply anyway! I know we have tried the plugin at least once (also to look at code and calibration), but I see there are some updated files which I will definitely check out! It could be very possible that we already implemented 3D calibration within our application, but we just didn't know it :)))) Anyway, thanks again, I will take a look at it today and will come back to you.
@user-176502 I'm glad to hear that it was a trivially solved problem!
Has anyone tried determining, based off the pupil values that are streamed, the degree to which someone is paying attention? I'm sort of confused by the pupil diameter literature etc.
Hi all, I'm new here. So please bear with me if this had been asked and answered. Anyone else has problem with Pupil glasses causing EMI interferences to an EEG headset (Emotiv Epoc+)? I'm looking at EMI shielding the wires. Not sure if I need to shield the eye and world cameras too. Any help will be much appreciated in advance!
Also, anyone else has issues with UV from the sun? How do you get around that (other than staying indoors)?
@user-ed537d I think that you want the fixations of a certain user? A fixation usually happens when a user looks for at least roughly 0.2-0.8 s (don't know the exact value) at a certain object/point
@user-ed537d Pupil already has a fixation plugin which also lets you change the fixation distance and the fixation time. I haven't yet read the stream of data this plugin sends, but it shouldn't be too hard to figure out if you try some things yourself ^^
A question regarding compatibility: has anyone thought about adding third-party eye tracking data import? E.g. using Player to analyze SMI data
@user-d7b89d is there an open format specification for the SMI data format?
I did not find anything. Nevertheless, raw eye tracking data as well as videos can be exported, which gives you (in my understanding) everything you need?
Could you send us an example export such that we can have a look at it?
of course, I can. Which mail address would you like?
thank you, you'll get the mail soon
Thanks
@user-d7b89d I received your example recording. May I ask what type of analysis you want to do with Pupil Player that you cannot do with the SMI software?
On the one hand marker detection, on the other hand adjusting fixation parameters
So, do I understand it correctly, that you use an SMI eye tracker but our surface tracking system?
we bought two of your systems but had to do two experiments with the old smi system because we had to implement your system first. We included your markers in the old smi setup to be able to use marker detection.
at this point we are looking for a way to include the marker detection, the easiest would be to be able to import the data in player
in the current study we already use the Pupil glasses including markers
ok, I understand
I think the best thing would be to write a conversion script and to convert the videos manually
Do you have documentation on what the fields in the ...Samples.txt file mean?
yes I do
Could you send me this as well such that I can look into how to convert the fields?
should be in your inbox
Yes, thanks. The biggest challenge will be to extract the correct video frame timestamps and to convert them into our format. Although, I cannot promise you anything since this is a free time project for now.
thank you very much, if you need anything let me know.
👍
Quick question: with the binocular eye setup (using 2 cameras), the camera images in Pupil seem to be flipped (on the X and Y axes). Is there a way to fix this? I pressed the button for flipping the image, but it only flips on the Y axis and not on the X axis
All of the data seem to be sent correctly to unity
@2SQRS#4078 the button flips x and y but only for display. Calibration takes care of the transformation for us.
@user-72b0ef yeah the issue isn't so much fixation as much as it is that the subject starts to fall asleep and wasn't actually performing the requested task.
I just used confidence levels and saw that at the times the video showed the subject's eyes closing, the confidence levels dropped; when I plotted each trial over time, you could in fact see confidence falling below 80/90% on more than 50% of the trials
i'll definitely take a look at the fixation detector tho!
Hi, I recently saw that during a recording session, the capture has a good fix on the pupil, but it switches between 3d models without any noticeable reason. I was wondering, is there a way to lock somehow the model that I think fits the video? would it help with the heuristics of the eye tracking?
@user-1486c3 Try reducing the model sensitivity in the Pupil Detector 3D menu in the eye windows.
@user-d7b89d are you using SMI? Can I talk to you in private messages? I have a few questions I would like to ask 😄
@papr should I do it after a good model was found? I mean, does it reduce the threshold for changing the model, or does it constrain the model to certain parameter values?
@user-fe23df I used it for a few studies but changed to Pupil now. Of course you can, hope I can help
What are the 'theta' and 'phi' fields received when streaming gaze data over the network?
@user-b9cbde these are the polar coordinates of the detected pupil ellipse center on the 3d eye model. You can use them alternatively to the Pupil normal.
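A rough sketch of turning such polar angles into a unit direction vector follows. Note the axis convention shown here is standard spherical-to-Cartesian math, not necessarily the 3d detector's exact convention — verify against the `circle_3d` normal in the pupil datum before relying on it:

```python
import math

# Illustrative spherical-to-Cartesian conversion; the choice of which
# Cartesian axis corresponds to theta=0 is an assumption.
def angles_to_vector(theta, phi):
    """Return a unit direction vector for polar angles (theta, phi)."""
    return (math.sin(theta) * math.cos(phi),
            math.cos(theta),
            math.sin(theta) * math.sin(phi))

v = angles_to_vector(math.pi / 2, 0.0)
print(v)  # approximately (1, 0, 0): along +x for theta=90deg, phi=0
```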
Hello. We just received the HTC Vive eye tracker add-on. We've been trying unsuccessfully to use it on either a Windows 10 machine or Ubuntu 16.04.2 (Release: 4.11.5-041105-lowlatency) machine.
On Windows, the cameras originally show up in device manager under "Imaging Devices", but pupil-capture does not recognize them ("unknown device"). We replaced the native drivers with libusbK drivers using zadig. Now pupil-capture recognizes the cameras (PupilCam1 ID0/1) but no images are shown. We get
[INFO] video_capture.uvc_backend: Found device. Pupil Cam1 ID1.
On Ubuntu, pupil_capture gives [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
Missed one of the Windows warning messages in the previous post: [WARNING] video_capture.uvc_backend: Capture failed to provide frames. Attempting to reinit.
Just tried on MacOS Sierra and Pupil capture worked first try, but it kept pausing and losing frames about once every 3 seconds.
On Linux, I can get a videostream using guvcview, but pupil_capture still gives the ghost mode error. If I select one of the 2 'unknown' devices in the "Activate source" list, I get [ERROR] video_capture.uvc_backend: The selected camera is already in use or blocked.
@user-5d12b0 You need to add your user to the plugdev group and restart your computer
Please make sure your user is a member of the group plugdev.
Jinx!
On windows you will need to make sure our driver is used on the device
Thanks @papr and @mpk. I'll be right back, because I'm typing from the linux box I need to restart. While I wait, how do I get "your driver"?
For the error on Mac we would need more info.
Check the release notes on GitHub; there is a link to the relevant section on docs.pupil-labs.com
Now working in Linux. Now chastising grad student for not finding the Windows drivers.
:-)
Thank you for your help. We are up and running. We'll try to get the HMD calibration going on our own.
@user-5d12b0 thanks for the update - happy to hear that you got set up
Hi all: considering buying into the Pupil labs system. Are there plugins / buit-in functionality that identify blinks, saccades, pursuit movements, fixations, etc.?
Blinks, and fixations, yes. Pursuit movement : not yet
@papr Thanks for the quick reply!
No problem
Quick question: my Python script simply stops and waits forever on topic,msg = self.sub_socket.recv_multipart()
Pupillabs capture is running and the service is on at tcp://127.0.0.1
Any ideas/comments/suggestions/advice?
Can norm_pos be out of the [0, 1] range?
Yes
This happens e.g. for gaze data when it is not mappable to the world frame, i.e. it is not within the frame
When this happen. What should I do ? Should I do the calibration again ??
If this happens often, this is a symptom of a bad Pupil detection
Generally it is OK for these out-of-bounds values to be there. They are meaningless for analysis though, i.e. you will need to filter them after exporting. It is generally recommended to filter by confidence as well, since this is the first indicator of a datum's quality
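That post-export filtering might be sketched as follows; the 0.6 confidence threshold is an arbitrary example, not an official recommendation:

```python
# Sketch: drop gaze samples that fall outside the [0, 1] frame bounds
# or below a confidence threshold (0.6 here is illustrative only).

def filter_gaze(samples, min_confidence=0.6):
    kept = []
    for s in samples:
        x, y = s["norm_pos"]
        in_frame = 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0
        if in_frame and s["confidence"] >= min_confidence:
            kept.append(s)
    return kept

samples = [
    {"norm_pos": (0.5, 0.5), "confidence": 0.9},   # kept
    {"norm_pos": (1.3, 0.5), "confidence": 0.9},   # out of frame
    {"norm_pos": (0.5, 0.5), "confidence": 0.1},   # low confidence
]
print(len(filter_gaze(samples)))  # 1
```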
@papr Thank you
Sure thing :)
Hello 😃 I am wondering if you guys could help me figure out why the "composite parent" device is not visible.
Hey @user-05da2f which operating system are you using?
@papr Windows 10 x64
The funny thing is that this is a system where the composite device was visible in the past and now we can't get it to appear...
Are you referring to the cameras not being listed in Pupil capture or the cameras not bring listed in the windows device manager?
@papr the cameras are listed in the windows device manager
If I am not mistaken, there should be a new category libusbk
there is, and the cameras are listed
@papr but we only get input from the left eye. I have a colleague here who told me that this is a driver issue and that in the past when he had this issue the problem was that the driver was not installed on the composite parent or something...
Yeah, the driver situation is tricky on windows. Please have a look at the driver section over at https://docs.pupil-labs.com/master/#windows-driver-setup
@papr I have already followed the instructions to the letter but unfortunately I only get left eye in Unity. My colleague has an older driver version in windows 10 and he gets both eyes
With old drivers (composite parent visible)
Using new driver installer (composite parent no longer listed there...dunno if it's relevant) 🤔
ignore the arrows and notice the libusbK devices tree. With the old drivers it said (Composite Parent) and with the new ones it does not.
I'm not sure if this is the correct place to get support for this problem. If it is not please tell me whom I should contact for support. @papr
It is the correct place, but I am not an expert in this field. @wrp is the person for such detailed questions.
@papr Any idea when @wrp will be available?
Hi @user-05da2f I am online now
composite parent is not relevant
Please show hidden devices and uninstall any instances of auto-installed drivers within the Imaging devices
category
for Pupil Cameras
You can send us an email at info@pupil-labs.com if you are still having trouble and I can help you via remote session if necessary
BTW - a note to the community using Windows 10 - updates for Windows 10 will remove libusbK drivers. If you perform a Windows update you will need to re-install drivers. Please install drivers before plugging in the Pupil headset.
I get gaze data, but base_data has only one pupil datum, even though I am tracking both eyes. Why doesn't it give me pupil data for both?
@user-1ada9f can you check if the pupil datum's eyeid field is always the same or if it alternates between 0 and 1?
sometimes it is 1, sometimes it is 0
ok, this means that the binocular gaze mapper is not able to correlate the pupil data
could you re-run the calibration procedure?
it happened in the process of getting pupil data for HMD calibration (Python version)
when I do calibration, the pupil data should be obtained via the gaze subscription, shouldn't it?
The pupil datum stream always works as long as there is at least one eye window open
I open two eye windows for each eye.
The pupil data in the gaze datums is only the pupil data the gaze is based on.
Can you confirm that you never get gaze data with two pupil base datums?
Yes
when I do calibration, the pupil data should be obtained via the gaze subscription, shouldn't it?
Not necessarily. If you want pupil data, always subscribe to the pupil topic. If you want gaze data, use the gaze topic.
it happened in the process of getting pupil data for HMD calibration (Python version)
There is no valid gaze data before calibration. The gaze mapper needs to be calibrated first before it sends sensible data. Subscribe to the pupil topic to get all pupil positions. These can be used for the calibration.
@papr Thank you, now I get it!
👍
Is there any way to get pupil data at a specific timestamp?
Subscribe to all pupil data and check the timestamp field of each.
This is difficult to do anyway since the timestamp precision is very high. There is no guarantee that there is a pupil datum for the timestamp 1234.567891234
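A common workaround is nearest-neighbour matching on the timestamp instead of expecting an exact hit; a minimal sketch:

```python
import bisect

# Sketch: given timestamps sorted ascending and a parallel list of datums,
# return the datum whose timestamp is closest to the query time t.
def closest_datum(timestamps, data, t):
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(data)]
    return data[min(candidates, key=lambda j: abs(timestamps[j] - t))]

ts = [1.0, 2.0, 3.0]
vals = ["a", "b", "c"]
print(closest_datum(ts, vals, 2.4))  # b
print(closest_datum(ts, vals, 2.6))  # c
```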
What is your use case?
Is anyone else aware that the download link to the pupil labs apps is being flagged as malware?
@user-05da2f I was not aware of this
let me try to re-create this issue
Yes, I also see this page
this is new...
now I'm afraid to open the files 😦
^_^
I can reproduce it as well
This is not so nice... nothing changed with the bundles - I can assure you that there is no malware in there
perhaps github changed their servers for these files
A quick search showed that other people are having the same issue with their own repositories
@wrp I just read your reply for the Windows drivers. So every single time windows does an update I have to reinstall the libusbk drivers?
@user-05da2f unfortunately, yes
we will work to improve the driver installation process - but for now this is the state of driver installation on Windows
And I have to remove the imaging devices?
@wrp
Not all of them
You need to enable the hidden devices and remove those that are Pupil cams
aah! ok I see. thanks.
Thanks for responding @papr -
There were indeed two pupil labs devices under imaging, I removed them.
Wow - that's a lot of extra devices
@user-05da2f I would recommend uninstalling all Pupil Cam1 devices
@wrp I assume windows creates one for every different usb port you plug it in?
and then re-install
@user-05da2f - unplug headset, uninstall all drivers for Pupil Cam1, restart the computer, reinstall drivers, plug headset back in
@user-05da2f All these extra zombie drivers may be from multiple installations
will do thank you very much for the support 😅
Please be careful to install drivers prior to connecting the Pupil headset on Windows 10
Windows likes to try to auto-install drivers for the devices
So install drivers before connecting Pupil to your machine if possible
@wrp since you're here any chance you can help me get the unity plugin running?
I followed your instructions to the letter. Then I load the unity plugin, configure the location of pupil_service 0.9.12 , select autorun
@user-05da2f can you migrate this to vr-ar and @ mention @user-5ca684
@wrp hmd-eyes aside, despite this "clean" driver installation procedure that we just did, my app still does not receive input from the right eye. Whereas in the past, using Zadig and some other driver installation hacks, it worked...
@user-05da2f - have you tested the headset on another system (macOS or Linux for example?)
@wrp I am testing on MacOS right now.
and do you see both camera feeds on macos?
@wrp I see both camera feeds on Windows too using pupil_capture
@wrp it's just that through unity I cannot talk to the service and get both eyes.
@wrp but that could be anything actually, not the drivers problem
@wrp which is why I was trying to run hmd_eyes to see if it's my software problem or something else
OK - thanks for the feedback. This seems like an hmd-eyes issue then
yes, most likely.
@user-05da2f @papr I contacted github regarding the releases issue and received this reply:
Thanks for the report!
We've received a few reports of users encountering this warning, and have opened an internal issue for our team to investigate.
I can't offer an ETA on a fix, but will let you know as soon as we have any news to share.
@user-05da2f The above issue has been fixed.
@papr thanks!
Hello, me again! So, I just had my product owner coming by and tested our whole setup in front of him. Now it appears that my eyes are being tracked extremely well, so calibration and data were very useful and accurate. Then we tried the setup on different people, but apparently we have some pupil tracking issues when it comes down to other people. I have tried selecting a different ROI and pupil intensity ranges, but still at some angles the pupil cameras don't detect some people's eyes that well. There are so many variables I can change within pupil_capture that I do not know what the best settings are. Is there some sort of "best practices" for the settings as far as the pupil cams go? Like, what is the best resolution for the pupil cameras to detect pupils, 1920 x 1080 or a lower resolution? What pupil intensity range is the best one? I am using a HTC Vive setup with binocular cams and our data must be very accurate, because our research is based on different people (over 50). The pupil cams seem to detect my eyes better than other people's pupils; I have dark brown eyes, while people with lighter eye colors seem to be an issue for the cams to detect.
Hey @user-72b0ef We recommend lower resolutions for the eye cameras since it allows you to capture eye images at 120 Hz and requires less computation
@papr Yes, I figured that one out myself also 😄 good thing you can confirm it 😃
It is important that the cameras are focused well. Black blobs, e.g. from make-up, are known to create false-positive pupil detections.
In the eye window you can change the mode to algorithm.
In the bottom left you see three circles. The green one needs to be between the two red ones
the two red circles represent the pupil min and max parameters in the Pupil Detector 2d menu
@papr Yes, thank you, but I already know all about this hehe. It is just that some people's pupils dont get detected at an angle...
or sometimes when they look directly into the camera, they won't get detected at all
very strange, if you'd like, I can maybe send some pictures?
Yes, please do so for both: normal mode and algorithm mode
you want me to send them via an email address or....?
You can send them in a PM on discord
@wrp committed the mouse_control.py fixes in pupil helpers
sorry it took so long
just updated my pupil folder and now get this
~/pupil/pupil_src$ python3 main.py
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
Process world:
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/home/nps/pupil/pupil_src/launchables/world.py", line 76, in world
    assert pyglui_version >= '1.3'
AssertionError
It means that you need to update your pyglui installation
sweet it worked
wait, pyre is too old
how do i update all packages
do i just still use the install function?
Not sure. Just update them one by one. This way you know what changed if something breaks
yeah
Also make sure you use the HTC Vive depth adjuster. It allows you to aim the cameras.
@mpk it looks like your response is aimed at @user-72b0ef (just to help clarify for others who are reading)
Yes. I'm on my phone and the app did not recognize the user handle
Thanks for adding that!
You're welcome 😃
@mpk Alright, I'll make sure to check that out, however I think this is not the issue, since my eyes are tracked properly, but it seemed to be different for each person
@wrp Thanks btw for your consultancy earlier yesterday, I have tweaked the contrast / brightness / gamma and stuff, going to run some tests today \o
@wrp if driver installation makes the devices show up under imaging devices how can I remedy this?
@user-05da2f I am not sure I completely understand your issue.
If you use our driver installer, then you should see drivers under libusbK
If you just plug in the cameras to a Windows 10 machine, and let Windows 10 do its "magic", then drivers will be installed automatically and will likely list our cameras as "Imaging Devices"
You will need to uninstall drivers for Pupil Cam
under the Imaging Devices
tab in Device Manager
and re-install drivers
did I understand your question correctly?
(also make sure to show hidden devices to make sure you see and can uninstall everything accordingly)
@wrp it works now but driver installation is a mess. I bet next time I restart or connect to a different port things will be different :/
@user-05da2f Welcome to Windows.
@user-05da2f - we will work to improve the driver installer. Restarting or "connecting to a different port" will not affect your driver installation
Windows updates, however, will wipe out the drivers unfortunately
@papr to be fair. I have other devices that don't get this when an update is installed.
@user-05da2f yes, this is true and is a fair assessment
Our Windows support is fairly recent, so we are working on improving it at the moment
Driver issues are a "non issue" on macOS and Linux - only issue on Windows 😿
Anyway. Hang in there guys 😃
@user-05da2f a bit more detail here: We are using USB cameras that support the generic UVC standard. This is nice because you can use the cameras with any software. However, when running them at high speed through one USB bus, we use special drivers.
Windows will install generic drivers for all compatible devices and wipe our drivers on update.
The solution would be to lose the generic interface, but we would like to keep it for its benefits.
There are most likely ways of telling Windows not to overwrite our drivers. But we are relatively new to Windows land.
For now it means making sure that the drivers are installed, and reinstalling when the cameras show up as unknown devices
Hello, I am wondering how to track objects in Pupil Player using QR codes. Thank you! Sara
@user-4c03e2 our surface tracker does not recognize qr codes. It uses specialized markers. More on https://docs.pupil-labs.com/#surface-tracking
Yes thank you, that helps! Mainly I have 2 more specific questions: 1: It seems as though pupil player is only picking up on the marker that I have placed on a small moving object some of the time. Is there a way to get it more accurate? 2: I am wondering if I can get some kind of quantitative output from the surface tracker plugin that tells me when each object is being fixated on and where it is located throughout the recording? Thanks again!
@user-4c03e2 can you share a video of that? I think that will result in better feedback!
quantitative output is given in form of heatmaps and exported reports in .csv format.
In theory, sure! But I don't know that I can upload a video because of the file size. I'm attaching screenshots of 2 consecutive frames from the video. In one of the frames, the program is detecting the surface. In the next frame, it has lost it. This pattern of detecting and then losing the marker is pretty consistent throughout the playback. Thanks!
great! Ok the issue is that you are missing a white border around the marker. Please add a border of 1/5 of the marker size and you should be all set.
Ok, thanks.
That helped a lot, thanks! Now, I'm just wondering about the output in the Exports file. Is there a way to get where each object is located, frame by frame? Thanks again!
Each entry has a timestamp. These are the frames' timestamps that they belong to
world_timestamps.npy
contains timestamps for each frame
my computer will not open this file. What application do I need to open it?
It is a numpy file. You can load it within a python script
Usually it is not required to know the frame indices since all data (but pupil positions) have a world frame timestamp. Therefore, you can use the timestamps directly to correlate your data
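As a sketch of what correlating exported data with world frames via these timestamps could look like (assuming numpy; `frame_index_for_timestamps` and the paths in the comment are my own illustrative names, not part of the Pupil API):

```python
import numpy as np

def frame_index_for_timestamps(world_ts, data_ts):
    """Map each data timestamp to the index of the nearest world frame.

    world_ts: sorted 1-D array, as loaded from world_timestamps.npy
    data_ts:  timestamps of exported samples (e.g. gaze rows from a csv)
    """
    idx = np.searchsorted(world_ts, data_ts)   # insertion points
    idx = np.clip(idx, 1, len(world_ts) - 1)
    # pick whichever neighbour is actually closer in time
    left = world_ts[idx - 1]
    right = world_ts[idx]
    idx -= data_ts - left < right - data_ts
    return idx

# typical usage (paths are examples):
# world_ts = np.load("recording/world_timestamps.npy")
# frames = frame_index_for_timestamps(world_ts, gaze_timestamps)
```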
May I ask, how you plan to analyze your data? Or are you just exploring how everything works?
Well, there are things I'll need. One is to know where each object was located throughout an interaction between 2 people, regardless of where each person is looking. The other is to know which object the person wearing the eye tracker is focused on during the session.
Using timestamps seems more fitting than frame indices then. The fixation detector will yield fixations for each surface btw
You can give surfaces names and define them with two or more markers. You will get less noise if the markers are spread out as far as possible. Motion blur is a problem though since it decreases detection rates for surface markers as well as calibration markers
Recording with higher framerates is therefore recommended
Ok thanks! How do I give surfaces names? I looked for this in Pupil Player but couldn't find it.
In the surface tracker menu, each surface has a submenu. It has a text field to change the surface's name
Thanks @papr. I'll pick up working on this tomorrow!
i don't know if anyone has had a chance to try out the matlab pupil code i wrote @wrp @mpk were you able to test it out?
someone messaged me from the google group and asked for a link
@user-ed537d I did not test yet. I personally am not a Matlab user. I would love to merge your code into pupil helpers so that other people can use it, but would like to have someone test as well. Can you make a PR for your code so that it is more visible in the pupil-helpers repo?
Hi everyone, has anybody successfully used pupil mobile with any phone other than nexus?
I was able to use a monocular headset successfully on a One Plus 3
Unfortunately it does not work to run 3 cameras in parallel
@user-006924 I think people are also using the HUAWEI P9
Thanks ,I have a monocular setup myself minus the nexus phones : )
the 5x is pretty affordable these days: https://www.amazon.de/LG-Smartphone-Display-Android-Marshmallow/dp/B015Y5398A/ref=sr_1_1?ie=UTF8&qid=1501245336&sr=8-1&keywords=nexus%2B5x&th=1
Can using a fish eye lens camera instead of either world or eye cameras mess with the mapping happening between the pupil and scene polynomials?
In 2d mode that should be fine. 3d mode can get confused. We are adding new distortion models soon
@wrp sent
Thanks @user-ed537d - I see the PR
Hi all. I just downloaded the new release of pupil capture and I can't open it on my mac. I saw someone else posted about this issue on the download site. Have there been any fixes?
Not yet but we are working on it
Will a guide be added explaining how the new feature works?
is there a previous version you would recommend going with for now?
0.9.10 should work for linux
we're running mac os
0.9.12
0.9.13 should work for linux as well.
0.9.12 should work for mac
Got 0.9.12 installed. Trying to connect to android phones running pupil mobile. I've confirmed the connection to my router and mobile appears to be working correctly. However, pupil capture won't show the input from the phones. Can anyone help me debug?
I have pupil mobile set up under 'capture selection'
but it says no hosts found
Make sure that your network allows UDP connections. This is used to discover the devices in the network.
are you running a small wifi or part of a bigger institutional network?
it's a dedicated router not connected to the internet
how do I check if it allows UDP connections?
ok.
what os are you running?
mac os 10.9.5
ok. same as me.
any thoughts on how to detect pupil mobile?
They should just show up.
make sure the app is running on the phone.
restarting to see if that helps
phones should be on 'transcode to h.264', correct?
for the world camera yes. the eye I leave with mjpeg.
Are you setting this from the android app or Pupil Capture?
android app
damn. restarted. still nothing in pupil capture. all my networks are connected, but not seeing any remote hosts
ok.
i'm seeing 'pupil mobile commspec v3' greyed out
but it says 'no remote hosts found'
version v3 is the current one. This is correct. I have the same and the latest version of Pupil Mobile on my Nexus 5x here.
Both are connected to the same wifi that is created by my router.
All devices have an IP from the router's DHCP server.
yes. I can see an ip address on each android, so they appear to be connected.
also seeing an ip address on my mac
all ip addresses match up
just did a ping from the terminal window. received data back, so I know the devices are talking
ok.
then it must be UDP.
I'm not sure what router you are using. Maybe your Mac is also set to block UDP?
we (the ZRE discovery protocol) use port 5670
it's an asus rtn66u router
the Mac was working before. Not sure what happened. Any idea how to check if the Mac is blocking UDP?
you had pupil mobile connections working before?
yes.
ah ok.
What version of Pupil mobile are you running?
just updated today
it's been glitchy, so i thought the upgrade might help
app version 0.13.0
I run the same version .
if this ran before on your machine with the same router, but no longer does, I'm a bit puzzled.
me too 😃
huzzah! Turned the firewall settings off. Now it works
so i think you were right--it was blocking the connection
I'm glad it got resolved. You can set your firewall to allow port 5670 UDP; then you should be able to turn it back on.
I also recommend using the eye cameras at QVGA when running Pupil Mobile. It helps with bandwidth.
actually, can you confirm a few things with our setup?
sure.
so we have two trackers running. one just has the head cam. The other has both eye and head cam.
we want to sync them
Ah ok.
make sure that time_sync is turned on.
This will make sure that the timestamps are all using the same clock.
Do you use two macs or one mac running two instances of Pupil capture?
one mac
Then you can run two instances of Pupil Capture. One for each headset.
turn on Pupil sync and Pupil groups.
Now you have a few options.
stream and record locally.
or record on the android device.
we have been recording on the android device as our backup system. but we intended to stream data to the laptop. How should we optimize that?
weird. i'm just seeing the pupil labs with the head cam.
there we go. fixed
once you have time sync active you can record on the android device locally and transfer the recordings afterwards.
we use AIRDROID app to transfer files. But you can use anything you like.
can we also record locally on the mac?
or should we not do that?
The video files have a .time file that contains timestamps for each frame. You can use these to align the video streams.
We do this for eye and world in player already.
If you want to align more streams you will have to write a small script that does that.
I have to go now but lets continue this discussion. I think we should make a small script that shows how to do that. Or even a player plugin that merges two video streams.
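A small alignment script along these lines could work once the per-frame timestamps are loaded as arrays on the same synced clock (`pair_frames`, the 20 ms tolerance, and the brute-force nearest-neighbour approach are my own illustrative choices; long recordings would want a sorted search instead):

```python
import numpy as np

def pair_frames(ts_a, ts_b, max_diff=0.02):
    """For each frame in stream A, find the closest-in-time frame in stream B.

    ts_a, ts_b: per-frame timestamp arrays in seconds, same synced clock.
    Returns (index_a, index_b) pairs; frames with no partner within
    max_diff seconds are dropped.
    """
    # full outer difference: fine for short recordings,
    # O(len(a) * len(b)) memory, so not for hour-long videos
    nearest = np.abs(ts_a[:, None] - ts_b[None, :]).argmin(axis=1)
    return [(a, int(b)) for a, b in enumerate(nearest)
            if abs(ts_a[a] - ts_b[b]) <= max_diff]
```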
ok. One quick question: if I wanted to launch another capture on my mac, how do I do that?
open terminal and type /Applications/Pupil\ Capture.app/Contents/MacOS/pupil_capture
open a new terminal window and do the same again for the second instance.
thx. I'll reach out again tomorrow morning after I arrive. Maybe we can optimize a bit more. cheers!
@everyone I just made two demo videos for offline calibration and tracking: Check them out and make corrections in the CC if you want 😃
@mpk I have a question about offline calibration. In my analysis I have to record how gaze changes during a driving simulation. In some cases the movement of the test driver, or rather of the headset, distorts the gaze mapping. I think I can solve this problem using offline calibration, but the question is: how?
can you share a sample recording?
Can I record a video with my phone, or do I have to send the files created by Player?
best to share a full recording via google drive or the like.
Ok, I can share the full recording. Do you have an email?
It is a big file, 1.9 GB
@mpk I sent you a video from my phone, because transferring the folder via Google Drive is too time-consuming
Before the start of the simulation I asked the participant to look at three elements on the road, but once the simulation started the gaze position shifted toward the top of the road; probably the headset moved. So, back to my previous question: how can I solve this with offline calibration?