πŸ₯½ core-xr


user-8779ef 01 February, 2023, 11:25:19
import av
import numpy as np
from tqdm import tqdm

def get_timing_info(file_path):

    container_in = av.open(file_path, mode="r")
    in_stream = container_in.streams.video[0]
    num_frames = in_stream.frames
    average_fps = in_stream.average_rate

    container_out = av.open("vid_out.mp4", mode="w")
    # add_stream takes the frame rate via the `rate` keyword
    stream = container_out.add_stream("libx264", rate=average_fps)
    stream.options["crf"] = "20"
    stream.pix_fmt = in_stream.pix_fmt
    stream.time_base = in_stream.time_base

    packet_pts_out = []
    pts_out = []
    dts_out = []
    frame_time_base_out = []
    relative_time_out = []

    for raw_frame in tqdm(container_in.decode(video=0), desc="Working.", unit="frames", total=num_frames):

        pts_out.append(raw_frame.pts)
        dts_out.append(raw_frame.dts)
        frame_time_base_out.append(raw_frame.time_base)
        # presentation time in seconds, relative to the stream start
        relative_time_out.append(np.float32(raw_frame.pts * raw_frame.time_base))

        for packet in stream.encode(raw_frame):
            packet_pts_out.append((raw_frame.pts, packet.pts))
#             container_out.mux(packet)

    # flush any frames still buffered in the encoder
    for packet in stream.encode():
        packet_pts_out.append((None, packet.pts))

    container_out.close()
    container_in.close()

    return pts_out, dts_out, frame_time_base_out, relative_time_out, packet_pts_out
user-8779ef 01 February, 2023, 12:55:15

user-8779ef 01 February, 2023, 12:55:30

The source vid, just in case.

user-8779ef 01 February, 2023, 12:55:44

Gotta run, but I'll check in later. Thanks again, Pablo.

user-626bf2 01 February, 2023, 13:28:34

hey, is there any way to get gaze location on a 2D scene in Unity? Everything I could find was about 3D scenes in VR.

user-8779ef 02 February, 2023, 15:52:05

I don't work with Pupil Labs, but I'll try to help. I assume you're referring to a panorama. Is this a static image, or a movie? If it's static, then you might consider https://docs.unity3d.com/ScriptReference/RaycastHit-textureCoord.html. You will have to figure out the relationship between UV (texture) coordinates and image location, because they are dependent on a few settings. If you're unsure whether your solution is accurate, you might try it out on a test texture with a few fixation targets on a solid color background.

If it's a movie, then you can take the same approach, but will need to be careful to account for timing. Perhaps save out unity timestamps, pupil timestamps, and movie timestamps on every frame, so you can sync gaze to unity, and then unity to the movie.
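As a rough sketch of that syncing step (all arrays here are hypothetical stand-ins for the per-frame logs described above; this assumes pupil time has already been converted to Unity time, e.g. via a measured clock offset):

```python
import numpy as np

# Hypothetical per-frame logs saved during playback:
# unity_ts[i] - Unity time when movie frame i was displayed
# movie_ts[i] - presentation time of frame i on the movie's own timeline
unity_ts = np.array([0.000, 0.033, 0.066, 0.100])
movie_ts = np.array([0.000, 0.033, 0.066, 0.100])

# Gaze sample times, already mapped onto the Unity clock
gaze_unity_ts = np.array([0.016, 0.050, 0.090])

# Map each gaze sample onto the movie timeline by linear interpolation
gaze_movie_ts = np.interp(gaze_unity_ts, unity_ts, movie_ts)
```

With real logs the two timestamp arrays will not be identical, and `np.interp` handles the uneven spacing between logged frames.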

user-d407c1 07 February, 2023, 07:46:38

Hi @user-89d824 Since your question was related to HMD-eyes, let's move it to the VR-AR channel to keep this tidy.

The exception you are showing in the logs is from NetMQ (a ZMQ client written in C#); it occurs when you send several requests without receiving a response. In principle, both sockets need to go through the full send/receive cycle as part of the communication. In simpler words, Capture and Unity need to reply to each other before another signal is sent.

I recommend that you look at SetupOnSceneManagement.cs in the Scene Management demo. It shows how the gazeCtrl gameObject is enabled once the new scene is loaded. The SubscriptionsController shows how this is connected in OnEnable() and disconnected in OnDisable(), thereby disconnecting and reconnecting through NetMQ (ZMQ) cleanly between scenes.

Why does it fail sometimes for you? Without context it is hard to say, since I don't know your configuration, which SceneManager you use, or how you check for the button click. My guess is that the button click is checked in an Update function, and since the RecordingController also checks in Update whether the Boolean has changed, a couple of frames are needed to actually stop the recording. The scene manager will load the next scene ASAP, so you might be changing the scene without having received the response from Capture.

In general, you can wait to check if the recording has stopped, with IsRecording, then disable the gameObjects to ensure that the sockets close, and then change the scene and enable them to reconnect.

user-89d824 09 February, 2023, 10:36:37

Thank you for your reply.

After I posted that question, the error appeared during the first attempt at recording when I ran my next experiment... that was new.

I'm currently working on something else but will drop a question (or two) again if I need more help.

Thanks again!

user-8779ef 07 February, 2023, 13:44:01

Hey @papr , have you moved the postHocVr gazer to the dev branch?

papr 07 February, 2023, 13:49:38

yes

user-8779ef 07 February, 2023, 13:44:48

I stepped away from running from source for a bit, and now see the issue with Py 3.6 deprecation is preventing me from running the post-hoc-vr-gazer branch.

user-8779ef 07 February, 2023, 13:49:57

Thanks πŸ™‚

papr 07 February, 2023, 13:50:21

I recommend using the latest develop branch + creating a new virtual environment based on the requirements.txt file

user-8779ef 07 February, 2023, 13:51:36

Trying with 3.9

user-8779ef 07 February, 2023, 13:53:48
Traceback (most recent call last):
  File "D:\Github\pupil\pupil_src\main.py", line 36, in <module>
    from version_utils import get_version
  File "D:\Github\pupil\pupil_src\shared_modules\version_utils.py", line 37, in <module>
    ParsedVersion = T.Union[packaging.version.LegacyVersion, packaging.version.Version]
AttributeError: module 'packaging.version' has no attribute 'LegacyVersion'
user-8779ef 07 February, 2023, 14:00:28

@papr Looks like they removed packaging.version.LegacyVersion

user-8779ef 07 February, 2023, 14:00:47

I changed to ParsedVersion = T.Union[packaging.version.Version, packaging.version.Version] and it runs. Hope that doesn't screw anything up πŸ™‚
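For context, packaging 22.0 removed LegacyVersion entirely, so on newer releases parse() always returns a Version and raises InvalidVersion for strings that are not valid PEP 440 versions (rather than silently absorbing them as before). A quick sketch:

```python
from packaging.version import parse, Version, InvalidVersion

# On packaging >= 22.0, parse() only ever returns a Version...
v = parse("3.9.1")
assert isinstance(v, Version)

# ...and non-PEP-440 strings raise instead of becoming LegacyVersion.
try:
    parse("not-a-pep440-version!")
    legacy_accepted = True   # would only happen on packaging < 22.0
except InvalidVersion:
    legacy_accepted = False
```

This is why collapsing the Union to just `Version` runs: the second Union member no longer exists, and on current packaging it can never be produced anyway.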

user-8779ef 07 February, 2023, 14:12:13

Eww. Not working, but for unrelated reasons. I'll take this over to πŸ’» software-dev

papr 07 February, 2023, 14:14:04

Your develop branch is not up-to-date. Make sure to pull the latest changes and try again

papr 07 February, 2023, 14:14:31

The fix for this issue has been added already https://github.com/pupil-labs/pupil/blob/develop/requirements.txt#L14

user-8779ef 07 February, 2023, 14:16:15

HRUM. I thought I was up to date. Thanks!

user-8779ef 07 February, 2023, 14:23:09

@papr FileNotFoundError: Could not find module 'C:\Users\gabri\anaconda3\envs\PupilLabs_2\lib\site-packages\pupil_apriltags\lib\apriltag.dll' (or one of its dependencies). Try using the full path with constructor syntax.

Pupil_apriltags 1.0.4 has been installed

Chat image

papr 07 February, 2023, 14:34:03

Try pip install pupil-apriltags -U

user-8779ef 07 February, 2023, 14:26:27

It looks like apriltag.dll exists in that location.

user-8779ef 07 February, 2023, 14:26:49

...so, it must be missing a dependency. requirements.txt has been installed.

user-8779ef 07 February, 2023, 14:30:58

Just going to go ahead and comment out the related lines in player.py

user-8779ef 07 February, 2023, 15:32:26

@papr Getting a few of these with a file:

Chat image

papr 07 February, 2023, 15:32:56

is this from an unmodified recording?

user-8779ef 07 February, 2023, 15:32:33

is this something I can ignore?

user-8779ef 07 February, 2023, 15:32:59

yes

user-8779ef 07 February, 2023, 15:33:07

This is about 50% through finding pupils.

papr 07 February, 2023, 15:33:33

these are decoding errors, prob. safe to ignore

user-8779ef 07 February, 2023, 15:33:54

thx. So, these are from the same file I have been trying to transcode

user-8779ef 07 February, 2023, 15:34:27

Well, same pupil folder. The errors here are related to the eye video, not to the world video.

user-8779ef 07 February, 2023, 15:36:23

FWIW, I've given up on my efforts to view the optic flow transcode in pupil player. It would be nice, but... oh well.
I'll assume gaze timings are consistent relative to the start of the video and will overlay gaze using my own code, independent of PPlayer. My only fear is that temporal mismatches will accrue over the 20 minute recording.
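One way to limit that drift is to index frames by their actual presentation times (e.g. the relative_time_out list collected by get_timing_info above) instead of assuming a constant frame rate. A minimal sketch, with illustrative variable names:

```python
import bisect

def frame_index_for(gaze_t, frame_times):
    """Return the index of the frame whose presentation time is
    nearest to gaze_t. frame_times must be sorted ascending."""
    i = bisect.bisect_left(frame_times, gaze_t)
    if i == 0:
        return 0
    if i == len(frame_times):
        return len(frame_times) - 1
    # pick the closer of the two neighbouring frames
    return i if frame_times[i] - gaze_t < gaze_t - frame_times[i - 1] else i - 1

frame_times = [0.0, 1 / 30, 2 / 30, 3 / 30]   # e.g. relative_time_out
frame_index_for(0.04, frame_times)            # -> 1 (the frame at t ~= 0.033 s)
```

Because every lookup goes through the per-frame timestamps, a variable or slightly off nominal frame rate cannot accumulate error over a 20-minute recording.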

user-8779ef 07 February, 2023, 15:37:13

That is - the gaze overlay won't align with the visual information that was visible at the time the eye movements were made.

user-d0c376 09 February, 2023, 22:34:30

Hi. I did a sample test using one of the demo scenes in the starter project with Pupil Capture, and I found that the recorded data contained some data with a sampling frequency higher than the advertised 200Hz. For many readings it was ~360Hz sampling frequency. Is this possible for the hardware, or is it likely an error?

nmt 10 February, 2023, 14:51:44

Hi @user-d0c376 πŸ‘‹. The VR Add-on has free-running eye cameras, which has two implications for sampling frequency:

Pupil data - contains datums from both the left and right eyes. If you don't filter for a specific eye, you can end up with more samples than you might initially have expected.

Gaze data - we employ a pupil matching algorithm that tries to match eye image pairs for gaze estimation. Read more about that here: https://docs.pupil-labs.com/developer/core/overview/#pupil-data-matching
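To illustrate the filtering point with synthetic datums (the field names here mirror Pupil's exports, where each pupil datum carries an eye id, 0 = right and 1 = left, but the values are made up):

```python
# Synthetic pupil datums standing in for a Pupil Capture export
datums = [
    {"timestamp": 0.001, "id": 0, "diameter_3d": 3.10},
    {"timestamp": 0.003, "id": 1, "diameter_3d": 3.02},
    {"timestamp": 0.006, "id": 0, "diameter_3d": 3.12},
    {"timestamp": 0.008, "id": 1, "diameter_3d": 3.05},
]

# Keep only one eye; the effective per-eye rate is roughly half
# of the combined stream's apparent sampling frequency.
right_eye = [d for d in datums if d["id"] == 0]
```

This is why an unfiltered stream can look like ~360 Hz even though each camera runs at up to 200 Hz.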

user-44c93c 10 February, 2023, 20:27:32

Hello. The L/R diameter_3d values in our data output alternate but are rarely reported for the same timestamp. In other words, L eye timestamps are inconsistent with the R eye timestamps. Is there any way to set it up such that L/R diameter_3d values are both given for each sample with the exact same timestamp? More or less, we are wondering if it’s possible to make both eye cameras capture data at the exact same time, which would result in the two circled numbers (and each subsequent pair of rows) being the identical timestamp? This is from the HTC Vive Binocular Add-on hardware running through pupil capture. Thanks!

Chat image

nmt 13 February, 2023, 10:09:07

The eye cameras are free-running, i.e. not perfectly in sync. Read more about the data matching algorithm in my message above πŸ™‚
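Since the hardware cannot be forced to sample both eyes on the same tick, a common post-hoc workaround is to pair each left-eye sample with the nearest-in-time right-eye sample yourself. A sketch with made-up (timestamp, diameter) tuples:

```python
import bisect

def pair_nearest(left, right):
    """For each (t, value) sample in `left`, find the sample in `right`
    with the closest timestamp. Both lists must be sorted by timestamp."""
    r_ts = [t for t, _ in right]
    pairs = []
    for t, v in left:
        i = bisect.bisect_left(r_ts, t)
        # clamp to valid indices and choose the closer neighbour
        candidates = [j for j in (i - 1, i) if 0 <= j < len(right)]
        j = min(candidates, key=lambda k: abs(r_ts[k] - t))
        pairs.append(((t, v), right[j]))
    return pairs
```

Whether the residual time offset (a few milliseconds at 200 Hz) matters depends on how fast pupil diameter changes in your paradigm.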

user-9eed85 15 February, 2023, 15:53:03

Is there any impact of putting high values of gravity and mass into the virtual object?

user-886683 20 February, 2023, 07:09:42

Does anyone have experience preprocessing and plotting pupil diameter with the VR eye tracker?

nmt 26 February, 2023, 06:58:40

Hey! Recommend checking out this message: https://discord.com/channels/285728493612957698/285728493612957698/869558681363177472

user-886683 26 February, 2023, 14:50:20

@nmt Thanks!

user-8779ef 27 February, 2023, 11:50:26

I can double check that, but I don't believe it was.

user-8779ef 27 February, 2023, 11:50:47

My default is always to opt for the larger recording

user-b14f98 27 February, 2023, 11:52:49

Sorry - you sent the message to an account I don't use much anymore.

user-8779ef 27 February, 2023, 11:53:33

I mostly use this one. bok_bok is on vacation.

user-31ef4c 27 February, 2023, 16:14:11

Hope I'm asking in the right chat room. I'm looking to acquire the HTC Vive Pro 2 system and integrate eye tracking within the hardware. Is your HTC Vive add-on compatible with the Vive Pro 2 system? (https://docs.pupil-labs.com/vr-ar/htc-vive/)

nmt 27 February, 2023, 16:43:10

Hi @user-31ef4c πŸ‘‹ This is the right channel! Unfortunately, our existing Vive add-on is not compatible with Vive Pro 2. We can offer a de-cased version of our Add-on with cameras and cable tree for custom prototyping. If you are interested, please send an email to [email removed]

End of February archive