pip uninstall pylsl  # remove latest version
pip install pylsl==1.16.2
Thank you! I have the Pupil LSL Recorder and Pupil LSL Relay plugins in the menu. That should be all for now on the Pupil Capture side, in terms of installations. Do you think an HP EliteBook 640 G11 (16 GB RAM, 512 GB SSD, Intel Core Ultra 5) can record, with LSL (LabRecorder), all data from Pupil Capture (Core), 32-channel EEG (Biosemi), plus WAV playback and WAV recording (Matlab)?
I can't say with confidence one way or the other - this is the type of thing you'll want to evaluate through pilot testing.
By the way, using both the LSL Recorder plugin and LabRecorder seems redundant, no?
digital surface tags
Connecting pupil core through active USB hub
Hi folks! Not sure if I am in the right place here, but I want to try to build a basic eye tracking project for retail-market behaviour research. I got two cheap cameras: one on my forehead looking in the same direction as my gaze, and one filming my eyes close up. Can anyone recommend resources or repositories for building a basic calibration between these two? Basically, setting up something like GazeRecorder with AI, but for real life.
Hi @user-a24e17 , if you are looking for a code reference about eye tracking calibration, then you could check out the Pupil Capture source code.
Hi! Is it possible to use the Pupil software without the hardware and just with a Mac webcam?
Hi @user-a5a8c4 , Pupil Capture needs a UVC compatible camera. I am not sure if the Mac webcams are compatible in this way, but another expectation of Pupil Capture is that you have at least two cameras, with at least one mounted on the head in front of the eyes and the other mounted on the head looking forward at the world.
Are you trying to use your webcam as if it were a remote, desktop-based eye-tracking system?
I tried building the libraries from source but unfortunately ran into several package management issues, so now I'm using the bundled distributions. However, I can't find my webcam in the dropdown in Pupil Capture.
Hi, is it possible for the Pupil software to track from an MP4 or MOV video? I saw in the repository documentation that I need to use Player, but it's not working. If it's possible, where can I read the documentation specifically for Player? Thank you!
Hi @user-02d0ed , I'm not sure I fully understand the question. Could you re-phrase it? Do you mean that you already recorded eyetracking data to a custom MP4 video and you want to load that data into Pupil Player? Can you share a screenshot from a video, as an example?
Thank you for your reply. Here is what my videos will look like. The videos will be in MP4 or MOV format, taken from a VestibularFirst headset.
I see.
Pupil Player is not compatible with such a format. It expects recordings made with the Pupil Capture software that is designed for the Pupil Core headset.
Pupil Player expects at least 1 MP4 file for just one eye in order to extract eye movement data. In this case, you could have one video for each eye, but it also needs other metadata to fully load and work with them. You could potentially copy the metadata from a working Pupil Capture recording and then split your Vestibular First eye video into 2 separate MP4s, called eye0.mp4 and eye1.mp4, but I cannot guarantee that this will work as expected, as you will also need to generate Pupil Player compatible timestamps for each video.
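For reference, Pupil recordings store per-frame timestamps as float64 seconds in files such as eye0_timestamps.npy. Here is a minimal sketch of generating such a file for a converted video, assuming a constant frame rate; fps, n_frames, and start are placeholder values you would read from your source video (e.g. with ffprobe or OpenCV):

```python
# Sketch: generate a Pupil Player style timestamp file for a converted eye video.
# Assumes a constant frame rate; all numbers below are placeholders.
import numpy as np

fps = 120.0        # assumed eye-camera frame rate
n_frames = 12000   # assumed frame count of eye0.mp4
start = 0.0        # assumed recording start in "pupil time" (seconds)

# One float64 timestamp per video frame, evenly spaced at 1/fps.
timestamps = start + np.arange(n_frames, dtype=np.float64) / fps
np.save("eye0_timestamps.npy", timestamps)
```

Real recordings have timestamps from the camera clock rather than an idealized constant rate, so treat this only as a starting point for experimentation.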
alright thank you for your response!
Ah ok, is there anywhere I can look at the requirements for Capture? Yeah, I want to use it for screen-based eye tracking (so mapping pupils/eyes to coordinates on the screen).
That is the main requirement for the cameras.
If you want to do screen-based eye tracking, then I recommend looking into the Surface Tracker plugin, but it does require having a camera mounted on your head, similar to Pupil Core's world camera.
Also, please note that Pupil Core uses different principles from remote, desktop-based eye trackers (e.g., a fixed webcam mounted on the computer monitor). It is unlikely to work as expected with such a camera. If the camera is wired and can instead be positioned on your head, then that could work.
If you are interested, we also have a DIY guide for building your own Pupil Core.
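To illustrate the screen-mapping step: the Surface Tracker reports gaze in normalized surface coordinates with the origin at the bottom-left, while screen pixel coordinates usually have the origin at the top-left. A minimal sketch of the conversion, assuming the surface is defined to cover the whole monitor (the resolution values are placeholders):

```python
# Sketch: map normalized surface coordinates (origin bottom-left) to screen
# pixels (origin top-left). Assumes the tracked surface spans the full monitor.
def surface_to_pixels(norm_x, norm_y, width=1920, height=1080):
    # Flip the y axis because screens count pixels from the top.
    return norm_x * width, (1.0 - norm_y) * height

print(surface_to_pixels(0.5, 0.5))  # centre of a 1920x1080 screen -> (960.0, 540.0)
```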
Hi, I was working in Python with my Pupil Core eye tracker when the world camera stopped showing up. When I run uvc.device_list(), I only get my two eye cameras. Is there something I can do?
Ok, it seems like it is a cable connection problem
Hi @user-c68c98 , glad you got it sorted, but if your Pupil Core shows this problem consistently, then be sure to open a Support Ticket in the troubleshooting channel.
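For future reference, a quick sanity check along these lines can help diagnose this. uvc.device_list() from pyuvc returns a list of dicts with a "name" key; on a Pupil Core the world camera is typically labelled ID2 and the eye cameras ID0/ID1, but treat the exact labels as assumptions for your headset:

```python
def missing_cameras(devices, expected=("id0", "id1", "id2")):
    """Return the expected camera labels absent from a UVC device listing."""
    names = " ".join(d["name"].lower() for d in devices)
    return [label for label in expected if label not in names]

# In practice you would pass uvc.device_list() (pyuvc). Here, a hypothetical
# listing reproducing the symptom above: only the two eye cameras enumerate.
devices = [{"name": "Pupil Cam2 ID0"}, {"name": "Pupil Cam2 ID1"}]
print(missing_cameras(devices))  # -> ['id2']: the world camera is missing
```

If the world camera is missing, re-seating the cable (as here) is a good first step before opening a ticket.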
Hi,
Do you have the source code available for the neon companion app? I'd like to get it to work on an unsupported device and to add my own streaming interface, if possible. (I saw your device on VSS, and despite the price tag, my PI said provisionally yes, as long as I can customise it.)
Hi @user-83d349! Thanks for following up after VSS (I hope you had a great conference). The source code for the Neon Companion app is not open. But all of Neon's data streams are in open formats and can be streamed via the real-time API. What exactly do you need to customise?
If need be, I can sign an NDA
I figured. From what I gathered from your documentation, you limit Android device availability because you are not sure other devices have enough 5 V power in USB host mode, and probably not enough processing power. I want to make a stripped-down version that has no online access and does nothing but forward processed data to my environment.
Due to my location, I have access to devices that have enough processing power and support, but they are not aimed at Western markets. So it would be unfair to ask you to certify them, especially since I can't even send you one.
Making a stripped-down version of the app is not going to be possible. That said, the phone does not need to have online access permanently. You need to create an account and log in to the app once. After that, you can disable the phone's internet connectivity. To stream data, you just need a local network, and this can even be over an ethernet cable. If you purchase a Neon, we can ship a Companion phone, or you can source one yourself. The list of supported phones and Android versions (in case you haven't seen it) is here: https://docs.pupil-labs.com/neon/hardware/compatible-devices/#android-os
This is what I was afraid of.
This statement: "the phone does not need to have online access" is in contradiction with this statement: "You need to create an account and log in to the app".
I am afraid this might well be a deal breaker for us. I forwarded this information to the decision makers in charge.
As stated, you can disable the phone's internet connectivity. If required, all data can be recorded and kept offline. We also have an API for working with recordings programmatically: https://github.com/pupil-labs/pl-neon-recording/
We would, however, recommend updating the Companion app from Google Play store when an update is available, as we push stability and performance improvements from time-to-time.
If creating an account once really is an issue for you, you might be interested in using the Core pipeline with Neon: https://docs.pupil-labs.com/alpha-lab/neon-with-capture/#use-neon-with-pupil-capture But note that you won't be able to avail of Neon's native data streams. That is, gaze estimation will require calibration, and so on. More in the Alpha Lab guide...
Yes, this might well be an option too, but some documentation is missing (broken links, such as https://docs.pupil-labs.com/developer/core/plugin-api/), which is why I reached out. I don't mind doing the calibration, and I can set up video AGC so your feature tracker stays happy, but I really need to control what's on the local network. We use robots and linear stages, so latency and network utilisation are critical and a safety issue. I am not sure about your open-source code's compatibility with new kernels and our latency requirements. I know it's mostly just UVC, plus some extra code for the sensors and perhaps some GPIO in the Neon glasses, but you haven't updated the open-source code in two years. I think we both know that there have been major changes not only in the Linux and Android kernels, but also in many of the dependencies since then.
Anyway, thanks for the info. If the PI (and others) decide to spend β¬6000 + VAT + duties + transport despite this, then our lovely procurement department will reach out.
For reference, the Core developer docs are now here: https://docs.pupil-labs.com/core/developer/#overview
Thanks. I have also forwarded this information. It would be really helpful if you archived the old code that you no longer actively develop on GitHub.
To which code do you refer?
This is the old one that hasn't been updated in years and has broken documentation links: https://github.com/pupil-labs/pupil It seems that you still use it, but the latest release was in 2021, and the latest commit that wasn't somebody else's pull request was in 2023. I think it's safe to say that you are no longer actively developing it. The broken documentation links are in this section: https://github.com/pupil-labs/pupil?tab=readme-ov-file#developers
Thanks for your feedback and for catching the broken documentation links! I'll take a look at those.
Please note that this code is still used and maintained for bug fixes. At this time, we do not plan to archive the repository. Archiving would make the code read-only and prevent new issues, pull requests, or changes, which we certainly want to avoid.
but I haven't gone through the code in proper detail yet