πŸ’» software-dev


user-2a0b36 05 March, 2023, 22:47:04

Hi, I'm a software developer who has been looking into integrating Unity3D + Pupil Capture/Service into my product. I'm looking to install an Android build of a Unity project with Pupil onto an eye tracking headset to create an untethered experience. My question is whether Pupil has a Unity SDK, and whether Pupil would be compatible with an Android build target so I can build out to Android devices. Any help is greatly appreciated!

user-becdcb 06 March, 2023, 08:10:43

Hi, I'm trying to get gaze data, and I'm using C# because I'll be doing some AR and rendering work and will probably use Unity. I'm stuck on a problem and can't get any gaze data: line 27 is null, so ReceiveMultipartMessage gives me null. I want to check whether it connected or subscribed successfully, but I don't know how and couldn't find out. Could you please help me?

Chat image

nmt 06 March, 2023, 16:40:59

Hey @user-2a0b36! Check out hmd-eyes; this repo contains the building blocks to implement Pupil with Unity3D: https://github.com/pupil-labs/hmd-eyes. Note that Pupil Core software is not compatible with an Android build target.

user-2a0b36 06 March, 2023, 16:44:48

Thanks for the response. Does that mean it would require a tethered experience?

I also have one additional question. Is it possible to create a calibration routine without looking at a known target (i.e., β€œGary, I need you to look up, now down, now left, now right.”)?

nmt 06 March, 2023, 16:43:29

Did you already check that you can subscribe to the gaze stream with our Python example (https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py) ?
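
(For reference, a minimal sketch of what that example boils down to: ask Pupil Remote for the SUB port over REQ/REP, then subscribe to gaze topics over PUB/SUB. It assumes Capture is running locally with Pupil Remote on its default port 50020.)

```python
import zmq
import msgpack

ctx = zmq.Context()

# REQ/REP: ask Pupil Remote (default port 50020) for the SUB port
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# PUB/SUB: subscribe to all gaze topics on the IPC backbone
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.subscribe("gaze.")

while True:
    # Gaze messages are two-part: topic string, then msgpack payload
    topic, payload = sub.recv_multipart()
    print(topic.decode(), msgpack.unpackb(payload))
```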

user-becdcb 07 March, 2023, 00:54:18

I hadn't seen that example. After a couple of tries I'm still getting null. I tried subscribing to any topic, and ReceiveString still gave me null. I can get the SUB port, so REQ/REP is working, but PUB/SUB isn't. Do I need to change some settings?

nmt 06 March, 2023, 16:56:36

Yes, it would be a tethered experience. We do have a demo calibration choreography in the hmd-eyes repo. You can modify its parameters: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#calibration

user-2a0b36 06 March, 2023, 17:14:56

I appreciate the quick responses. Thank you @nmt. You were a great help

nmt 07 March, 2023, 13:27:01

Can you confirm whether the Python example works?

user-becdcb 08 March, 2023, 09:56:49

I will check it tomorrow; because of the time zone difference, my responses take some time. Sorry for that.

user-b3b1d3 08 March, 2023, 09:29:44

Hello everybody. I have a question about the transformation of scan path surface coordinates into pixel coordinates of the image: https://github.com/pupil-labs/pupil-tutorials/blob/master/03_visualize_scan_path_on_surface.ipynb

user-b3b1d3 08 March, 2023, 09:30:10

Is there an example to do the same thing but for the coordinates of both eyes separately?

nmt 08 March, 2023, 12:52:15

We don't have an existing solution for this I'm afraid

user-b3b1d3 08 March, 2023, 09:30:18

Thanks in advance!

user-b3b1d3 08 March, 2023, 13:12:23

Ok, it could be fairly important for the research we want to do; I will post updates on whatever solution we come up with. As a simple approach, I was thinking of somehow combining the surface_gaze table with the normal gaze table, using the world index as the reference. I need to study this more, because it might be completely wrong. Other solutions might be more complex and require more time. Anyway, thank you very much.

nmt 08 March, 2023, 16:41:17

The Surface Tracker transforms '2d gaze' from scene camera to surface coordinates. The problem with gaze normals is that they are in 3D camera space: we find the 2D gaze coordinates by taking the nearest intersection of the gaze normals in 3D space (the lines going through each eyeball centre and the object being looked at) and projecting it back onto the 2D scene camera plane. What might help you is this dual monocular gazer: https://gist.github.com/papr/5e1f0fc9ef464691588b3f3e0e95f350. Off the top of my head, I'm not sure how the gaze coordinates for each eye are stored, however. Worth taking a look.
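
(As a rough sketch of that geometry, with hypothetical helper names: take the midpoint of the shortest segment between the two gaze-normal lines, then project it through a pinhole model. A real implementation would guard against parallel lines and use the calibrated scene camera intrinsics.)

```python
import numpy as np

def nearest_intersection(p0, d0, p1, d1):
    """Midpoint of the shortest segment between two 3D lines,
    each given by an origin p and a unit direction d."""
    w = p0 - p1
    a, b, c = d0 @ d0, d0 @ d1, d1 @ d1
    d, e = d0 @ w, d1 @ w
    denom = a * c - b * b  # approaches 0 only for parallel lines
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return ((p0 + s * d0) + (p1 + t * d1)) / 2

def project_to_scene(point_3d, fx, fy, cx, cy):
    """Pinhole projection of a 3D point (scene camera space) to 2D pixels."""
    x, y, z = point_3d
    return np.array([fx * x / z + cx, fy * y / z + cy])
```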

user-b3b1d3 13 March, 2023, 09:25:04

Thank you. I will take a look.

nmt 08 March, 2023, 16:44:19

This is a graphical representation of what I described:

Chat image

user-becdcb 09 March, 2023, 00:19:40

The Python version has the same problem: REQ/REP works, but SUB does not. The `topic = sub.recv_string()` line gives no error but the program hangs at that line; even though the while-True loop keeps the program running, it gives me no result. (Sorry for the pictures, but my research center restricts its computers, so I could only take photos.) @nmt

P.S. pyzmq version 25.0.0, msgpack 1.0.5 (the version in the documentation is deprecated), Python version 3.9.

Chat image

nmt 09 March, 2023, 07:36:42

First thing to note is that you'll need to wear and calibrate Pupil Core before obtaining gaze data. It's also worth mentioning that you either need to run the script on the same computer as Pupil Capture, or the computer must have a network connection to Capture that isn't blocked by a firewall.
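
(In the subscription sketch above, running against a different machine just means replacing localhost with the Capture host's address; the IP below is hypothetical.)

```python
# Connect to Pupil Remote on the machine running Capture (hypothetical IP)
pupil_remote.connect("tcp://192.168.1.42:50020")
```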

user-becdcb 09 March, 2023, 07:39:05

I see. Thank you for your help, mate. I will test these things; have a nice day!

user-9c3ec3 10 March, 2023, 22:34:59

Aside from your hardware, have folks adapted this to work with [albeit awkward] smartphone cameras?

nmt 13 March, 2023, 08:59:36

Hi @user-9c3ec3. Core software is compatible with third-party cameras that are UVC compliant. Check out this Discord message for details: https://discord.com/channels/285728493612957698/285728493612957698/747343335135379498. You might also be interested in Pupil DIY: https://docs.pupil-labs.com/core/diy/#diy. It's difficult to say whether you'd have success with the vast array of smartphone cameras πŸ€”

user-dd0489 16 March, 2023, 14:11:54

Hi, I was wondering if there's a way of getting the data for both eyes after using the dual monocular gazer plugin and later defining a surface. It seems like the surface plugin gives back the data of both eyes merged again, rather than as independent streams.

nmt 16 March, 2023, 17:39:44

Yes, the surface tracker will do that. You'd essentially need to modify it to transform the dual gaze estimates into surface space
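
(Loosely sketched, and not the plugin's actual code: per frame the Surface Tracker derives a homography from the detected markers that maps scene-image coordinates to surface coordinates. Applying it to each eye's 2D gaze point separately, instead of to the merged binocular estimate, would yield independent per-eye surface coordinates. Helper name and inputs are hypothetical.)

```python
import numpy as np
import cv2

def gaze_to_surface(gaze_points_px, img_to_surface):
    """Map 2D gaze points (scene-image pixels) into surface coordinates
    via a 3x3 homography. Illustrative helper, not the plugin's API."""
    pts = np.asarray(gaze_points_px, dtype=np.float32).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, img_to_surface).reshape(-1, 2)
```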

user-d9bb5a 16 March, 2023, 18:09:51

Good evening. The camera won't turn on; what should I do? Maybe a particular driver is needed?

Chat image

nmt 16 March, 2023, 19:15:07

Please share the capture.log file. Search your machine for 'pupil_capture_settings'; inside that folder is the .log file.

user-d9bb5a 16 March, 2023, 19:18:30

capture.log

user-d9bb5a 16 March, 2023, 20:18:07

The camera is an R200.

user-d9bb5a 16 March, 2023, 21:53:26

Please help me set up my camera

nmt 17 March, 2023, 07:41:02

RealSense

user-9532c9 18 March, 2023, 23:04:31

Hello! I'm a student and I want to use your program in my academic year project. Is there any chance I can use your program with my laptop's default camera, or a USB camera, instead of your headset? I am trying to do this, but I get this error. Thank you.

Chat image

wrp 20 March, 2023, 06:26:15

Question: What is the scope of your project? Are you building your own wearable eye tracker with your own eye and scene cameras?

You can use other UVC compliant cameras. If they are UVC compliant, they should automatically appear in the Activate Camera drop-down.

user-9532c9 20 March, 2023, 07:47:20

That's exactly what I'm trying to do in my project. I can see my camera in the Activate Camera menu, but every time I try to choose it, I get the error "this camera is already used or blocked".

wrp 20 March, 2023, 08:20:02

Custom camera with Pupil Capture

user-50567f 20 March, 2023, 19:06:55

Hello,

How is the heatmap calculated from the surface data? Is there a formula for it? Is there code I can access?

nmt 22 March, 2023, 17:10:43

Hi @user-50567f πŸ‘‹. You can see the source code here: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/surface_tracker/surface.py#L609
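
(For a rough idea of the approach: the linked code bins the gaze points that fall on the surface into a 2D histogram over normalized surface coordinates and smooths it with a Gaussian. The sketch below illustrates that idea with made-up grid and sigma values, not the exact parameters used in surface.py.)

```python
import numpy as np
import cv2

def gaze_heatmap(norm_xs, norm_ys, grid=(64, 64), sigma=3.0):
    """Bin normalized surface gaze (0..1) into a 2D histogram,
    then smooth with a Gaussian. Illustrative, not the exact plugin code."""
    hist, _, _ = np.histogram2d(
        norm_ys, norm_xs, bins=grid, range=[[0, 1], [0, 1]]
    )
    heat = cv2.GaussianBlur(hist.astype(np.float32), (0, 0), sigma)
    if heat.max() > 0:
        heat /= heat.max()  # normalize to 0..1 for colormapping
    return heat
```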

user-91a92d 23 March, 2023, 09:42:20

Hello, I have an issue running the script https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py with all messages. After the frame.eye.0 message I receive the following error:

    frame.eye.0: {'topic': 'frame.eye.0', 'width': 192, 'height': 192, 'index': 112322, 'timestamp': 2725.910855, 'format': 'bgr'}
    Traceback (most recent call last):
      File "/home/mathieu/pupilabs/test_reception.py", line 35, in <module>
        topic = sub.recv_string()
      File "/home/mathieu/pupilabs/lib/python3.10/site-packages/zmq/sugar/socket.py", line 927, in recv_string
        return self._deserialize(msg, lambda buf: buf.decode(encoding))
      File "/home/mathieu/pupilabs/lib/python3.10/site-packages/zmq/sugar/socket.py", line 826, in _deserialize
        return load(recvd)
      File "/home/mathieu/pupilabs/lib/python3.10/site-packages/zmq/sugar/socket.py", line 927, in <lambda>
        return self._deserialize(msg, lambda buf: buf.decode(encoding))
    UnicodeDecodeError: 'utf-8' codec can't decode byte 0x89 in position 34581: invalid start byte

I think it is related to the image data; other messages are received fine. I am using the dev branch, with zmq == '25.0.2' and msgpack == 1.0.0.

nmt 23 March, 2023, 11:55:42

To receive frames, please use the receive frames helper script: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py

user-91a92d 23 March, 2023, 14:44:39

Ok, thanks. My purpose was to explore all the data sent over the network; that's why this bug was frustrating.

nmt 23 March, 2023, 14:55:41

You can of course receive the topic name with the filter message script, as it's a string. But the payload is different and would need handling as such. See https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py#L49 and https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py#L91
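
(A minimal sketch of that handling, assuming the Frame Publisher is enabled with BGR format and `sub` is a SUB socket connected as in the earlier sketch and subscribed to 'frame.world'. Frame messages are multi-part: string topic, msgpack metadata, then the raw image bytes, which is why decoding them as UTF-8 fails.)

```python
import msgpack
import numpy as np

parts = sub.recv_multipart()  # [topic, metadata, raw image bytes, ...]
topic = parts[0].decode()
meta = msgpack.unpackb(parts[1])
raw = parts[2]                # binary pixel buffer; not valid UTF-8

# With format == 'bgr', the buffer is a height x width x 3 uint8 image
img = np.frombuffer(raw, dtype=np.uint8).reshape(meta["height"], meta["width"], 3)
```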

user-91a92d 23 March, 2023, 14:59:17

Hi, I am trying to use the data from the head pose tracker via the messages sent by the Network API, but the data I receive makes no sense to me. Looking at the Head Pose Tracker visualizer, the result looks great, but I can't manage to get the correct sign along the different axes with the Network API. I am extracting the translation from the camera pose matrix with functions from ROS. Am I missing something? Edit: It seems I have at least managed a coherent result once, but with some lag.

nmt 23 March, 2023, 15:01:36

Hi @user-91a92d πŸ‘‹. I'm not sure I fully understand your question. Can you elaborate a bit on what your output is? What doesn't look right etc.?

user-91a92d 23 March, 2023, 15:06:16

I was looking at the signs of the x, y, z values, and they did not match what I was observing in the visualizer. Without changing anything, it now seems to be alright, except for a delay of 0.5–1 s.

user-00cc6a 29 March, 2023, 15:37:17

Hello everyone, could someone tell me how to map gaze data onto dynamic AOIs?

user-480f4c 30 March, 2023, 06:58:00

Hi @user-00cc6a! If by dynamic AOIs you mean, for example, scrolling down a website on your laptop, you can do that using AprilTags and the Surface Tracker plugin; you'd just need to put the markers at the corners of your laptop screen.

user-00cc6a 29 March, 2023, 23:10:34

@nmt ??

End of March archive