πŸ₯½ core-xr


user-8779ef 01 February, 2023, 11:25:19
import av
import numpy as np
from tqdm import tqdm

def get_timing_info(file_path):

    container_in = av.open(file_path, mode="r", timeout=None)
    time_base_in = container_in.streams.video[0].time_base
    num_frames = container_in.streams.video[0].frames
    average_fps = container_in.streams.video[0].average_rate

    container_out = av.open("vid_out.mp4", mode="w", timeout=None)
    stream = container_out.add_stream("libx264", framerate=average_fps)
    stream.options["crf"] = "20"
    stream.pix_fmt = container_in.streams.video[0].pix_fmt
    stream.time_base = container_in.streams.video[0].time_base

    packet_pts_out = []
    pts_out = []
    dts_out = []
    frame_time_base_out = []
    relative_time_out = []

    count = 0

    for raw_frame in tqdm(container_in.decode(video=0), desc="Working.",
                          unit="frames", total=num_frames):

        pts_out.append(raw_frame.pts)
        dts_out.append(raw_frame.dts)
        frame_time_base_out.append(raw_frame.time_base)
        relative_time_out.append(np.float32(raw_frame.pts * raw_frame.time_base))

        # Encode the frame and record the packet pts alongside the frame pts.
        for packet in stream.encode(raw_frame):
            packet_pts_out.append((raw_frame.pts, packet.pts))
            # container_out.mux(packet)

        count = count + 1
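
[Editor's note] The relative_time computation in the snippet above multiplies a frame's pts by the stream's time_base. A minimal stdlib sketch of that conversion, using hypothetical values (a 30 fps stream with the common 1/15360 time base, so consecutive frames are 512 pts apart):

```python
from fractions import Fraction

def pts_to_seconds(pts, time_base):
    """Convert a presentation timestamp to seconds via the stream time base."""
    return float(pts * time_base)

# Hypothetical values, not read from a real file:
time_base = Fraction(1, 15360)
frame_pts = [0, 512, 1024, 1536]
seconds = [pts_to_seconds(p, time_base) for p in frame_pts]  # ~1/30 s apart
```

This is the same arithmetic PyAV performs when you multiply `raw_frame.pts * raw_frame.time_base`; `time_base` is a `Fraction`, so the product stays exact until the final float conversion.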
user-e3f20f 01 February, 2023, 11:26:40

But you would not use this to play back in Player. Please use the raw recorded scene video as input.

user-8779ef 01 February, 2023, 11:27:15

This is from the raw scene video

user-e3f20f 01 February, 2023, 11:28:24

Got it, my mistake. Let me have a closer look

user-8779ef 26 February, 2023, 22:26:20

@user-e3f20f I'm at a critical point now where I have to decide whether to give up on encoding a video that is compatible with Player. To remind you: we were under the impression that the right way to handle this was to set my packet pts and frame pts to the same values as the decoded values. That makes sense, but no luck yet. In our final interaction, we found that even for a world.mp4 generated by Pupil Labs, the packet pts != frame pts. I sent a vid and you said you would check it out.

Have you made any progress, or do you plan to look more closely in the near future? Shall I abandon all hope on this quest?

user-8779ef 01 February, 2023, 11:28:55

No prob, and thanks. If I haven't already sent this along, let me know if you want access to this repo

user-8779ef 01 February, 2023, 12:55:15

user-8779ef 01 February, 2023, 12:55:30

The source vid, just in case.

user-8779ef 01 February, 2023, 12:55:44

Gotta run, but I'll check in later. Thanks again, Pablo.

user-626bf2 01 February, 2023, 13:28:34

hey, is there any way to get gaze location on a 2D scene in Unity? Everything I could find was about 3D scenes in VR.

user-8779ef 02 February, 2023, 15:52:05

I don't work with Plabs, but I'll try to help. I assume you're referring to a panorama. Is this a static image, or a movie? If it's static, then you might consider https://docs.unity3d.com/ScriptReference/RaycastHit-textureCoord.html . You will have to figure out the relationship between UV (texture) coordinates and image location, because they are dependent on a few settings. If you're unsure if your solution is accurate, you might try it out on a test texture with a few fixation targets on a solid color background.

If it's a movie, then you can take the same approach, but will need to be careful to account for timing. Perhaps save out unity timestamps, pupil timestamps, and movie timestamps on every frame, so you can sync gaze to unity, and then unity to the movie.
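
[Editor's note] A minimal sketch of the syncing idea suggested above. All names and values here are hypothetical; it only illustrates estimating a constant offset between two clocks that were sampled together once per frame:

```python
import statistics

def estimate_clock_offset(unity_ts, pupil_ts):
    """Estimate a constant offset between two clocks sampled on the same frames.

    Assumes one (unity, pupil) timestamp pair was saved per frame and that
    the clocks differ by an approximately constant offset.
    """
    diffs = [u - p for u, p in zip(unity_ts, pupil_ts)]
    return statistics.median(diffs)  # median is robust to a few jittery frames

# Hypothetical per-frame timestamps: the pupil clock lags unity by ~2.5 s.
unity = [10.0, 10.016, 10.033, 10.050]
pupil = [7.5, 7.516, 7.534, 7.550]
offset = estimate_clock_offset(unity, pupil)
```

The same offset estimate, applied twice (unity-to-pupil and unity-to-movie), lets you carry gaze samples onto movie time.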

user-d407c1 07 February, 2023, 07:46:38

Hi @user-89d824 Since your question was related to HMD-eyes, let's move it to the VR-AR channel to keep this tidy.

The exception you are showing in the logs is from NetMQ (a ZMQ client written in C#), and it occurs when you attempt several send requests without receiving a response. In principle, both sockets need to go through the full cycle of sending/receiving as part of the communication. In simpler words, Capture and Unity need to reply to each other before another signal is sent.

I recommend that you look at SetupOnSceneManagement.cs in the Scene Management Demo. This shows you how the gazeCtrl gameObject is enabled once the new scene is loaded. The SubscriptionsController shows you how this is connected in OnEnable() and disconnected in OnDisable(), thereby disconnecting and reconnecting properly through NetMQ (ZMQ) between scenes.

Why does it fail sometimes for you? Without context it is hard to say, since I don't know your configuration, which SceneManager you use, or how you check for the button click. My guess is that the button click is checked in an Update function, and since the RecordingController also checks in Update whether the Boolean has changed, you need a couple of frames to stop the recording. The scene manager will load the next scene ASAP, and you might be changing the scene without having received the response from Capture.

In general, you can wait to check if the recording has stopped, with IsRecording, then disable the gameObjects to ensure that the sockets close, and then change the scene and enable them to reconnect.
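
[Editor's note] The send/receive cycle described above is the REQ/REP state machine that ZMQ/NetMQ enforces. A toy model in Python (this is an illustration only, not the NetMQ API): a REQ socket must strictly alternate send() and recv(), which is why firing several requests from Unity without reading Capture's replies raises an exception.

```python
class ReqSocketModel:
    """Toy model of a ZMQ/NetMQ REQ socket's send/recv state machine."""

    def __init__(self):
        self.awaiting_reply = False

    def send(self, msg):
        # A real REQ socket raises a state-machine error here, too.
        if self.awaiting_reply:
            raise RuntimeError("REQ socket: cannot send again before recv()")
        self.awaiting_reply = True

    def recv(self):
        if not self.awaiting_reply:
            raise RuntimeError("REQ socket: nothing to receive")
        self.awaiting_reply = False
        return b"reply"

sock = ReqSocketModel()
sock.send(b"R")      # e.g. a start-recording request
reply = sock.recv()  # must read the reply first...
sock.send(b"r")      # ...before the next request is legal
```

Changing scenes before the recv() step is exactly the "several send requests without receiving a response" failure mode.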

user-89d824 09 February, 2023, 10:36:37

Thank you for your reply.

After I posted that question, the error appeared during the first attempt at recording when I ran my next experiment... that was new.

I'm currently working on something else but will drop a question (or two) again if I need more help.

Thanks again!

user-8779ef 07 February, 2023, 13:44:01

Hey @user-e3f20f , have you moved the postHocVr gazer to the dev branch?

user-e3f20f 07 February, 2023, 13:49:38

yes

user-8779ef 07 February, 2023, 13:44:48

I stepped away from running from source for a bit, and now I see that the Python 3.6 deprecation issue is preventing me from running the post-hoc-vr-gazer branch.

user-8779ef 07 February, 2023, 13:49:57

Thanks πŸ™‚

user-e3f20f 07 February, 2023, 13:50:21

I recommend using the latest develop branch + creating a new virtual environment based on the requirements.txt file

user-8779ef 07 February, 2023, 13:50:54

Uh, what Python version?

user-8779ef 07 February, 2023, 13:50:33

Just about to do that.

user-8779ef 07 February, 2023, 13:51:36

Trying with 3.9

user-8779ef 07 February, 2023, 13:53:48
Traceback (most recent call last):
  File "D:\Github\pupil\pupil_src\main.py", line 36, in <module>
    from version_utils import get_version
  File "D:\Github\pupil\pupil_src\shared_modules\version_utils.py", line 37, in <module>
    ParsedVersion = T.Union[packaging.version.LegacyVersion, packaging.version.Version]
AttributeError: module 'packaging.version' has no attribute 'LegacyVersion'
user-8779ef 07 February, 2023, 14:00:28

@user-e3f20f Looks like they removed packaging.version.LegacyVersion

user-8779ef 07 February, 2023, 14:00:47

I changed to ParsedVersion = T.Union[packaging.version.Version, packaging.version.Version] and it runs. Hope that doesn't screw anything up πŸ™‚
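
[Editor's note] The duplicated `Union` member above works because `typing` collapses identical members, but since `packaging` >= 22 removed `LegacyVersion`, the alias can simply be the `Version` class. A minimal sketch of that local patch (the upstream requirements bump linked below makes it unnecessary):

```python
import packaging.version
from packaging.version import parse

# packaging >= 22 removed LegacyVersion, so the old Union no longer imports.
ParsedVersion = packaging.version.Version

# parse() now returns Version for any PEP 440 string, including local
# version labels like the ones Pupil's version tags use:
v = parse("3.5+49.vd9e2a4ef")
assert isinstance(v, ParsedVersion)
```

With `LegacyVersion` gone, strings that are not valid PEP 440 raise `InvalidVersion` instead of falling back, which is the behavior change to watch for.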

user-8779ef 07 February, 2023, 14:12:13

Eww. Not working, but for unrelated reasons. I'll take this over to πŸ’» software-dev

user-e3f20f 07 February, 2023, 14:14:04

Your develop branch is not up-to-date. Make sure to pull the latest changes and try again

user-e3f20f 07 February, 2023, 14:14:31

The fix for this issue has been added already https://github.com/pupil-labs/pupil/blob/develop/requirements.txt#L14

user-8779ef 07 February, 2023, 14:16:15

HRUM. I thought I was up to date. Thanks!

user-8779ef 07 February, 2023, 14:23:09

@user-e3f20f FileNotFoundError: Could not find module 'C:\Users\gabri\anaconda3\envs\PupilLabs_2\lib\site-packages\pupil_apriltags\lib\apriltag.dll' (or one of its dependencies). Try using the full path with constructor syntax.

Pupil_apriltags 1.0.4 has been installed

Chat image

user-8779ef 07 February, 2023, 14:26:27

It looks like apriltag.dll exists in that location.

user-8779ef 07 February, 2023, 14:26:49

...so, it must be missing a dependency. Requirements.txt has been installed.

user-8779ef 07 February, 2023, 14:30:58

Just going to go ahead and comment out the related lines in player.py

user-e3f20f 07 February, 2023, 14:34:03

Try pip install pupil-apriltags -U

user-8779ef 07 February, 2023, 14:59:39

OK. I have it running now (without the head tracker plugin), but will try this out later tonight and report back

user-8779ef 07 February, 2023, 15:32:26

@user-e3f20f Getting a few of these with a file:

Chat image

user-e3f20f 07 February, 2023, 15:32:56

is this from an unmodified recording?

user-8779ef 07 February, 2023, 15:32:33

is this something I can ignore?

user-8779ef 07 February, 2023, 15:32:59

yes

user-8779ef 07 February, 2023, 15:33:07

This is about 50% through finding pupils.

user-e3f20f 07 February, 2023, 15:33:33

these are decoding errors, probably safe to ignore

user-8779ef 07 February, 2023, 15:33:54

thx. So, these are from the same file I have been trying to transcode

user-8779ef 07 February, 2023, 15:34:27

Well, same pupil folder. The errors here are related to the eye video, not to the world video.

user-8779ef 07 February, 2023, 15:36:23

FWIW, I've given up on my efforts to view the optic flow transcode in pupil player. It would be nice, but... oh well.
I'll assume gaze timings are consistent relative to the start of the video and will overlay gaze using my own code, independent of PPlayer. My only fear is that temporal mismatches will accrue over the 20 minute recording.

user-8779ef 07 February, 2023, 15:37:13

That is - the gaze overlay won't align with the visual information that was visible at the time the eye movements were made.

user-d0c376 09 February, 2023, 22:34:30

Hi. I did a sample test using one of the demo scenes in the starter project with Pupil Capture, and I found that some of the recorded data had a sampling frequency higher than the advertised 200Hz. For many readings it was ~360Hz. Is this possible for the hardware, or is it likely an error?

user-4c21e5 10 February, 2023, 14:51:44

Hi @user-d0c376 πŸ‘‹. The VR Add-on has free-running eye cameras, which has two implications for sampling frequency:
Pupil data - contains datums from both the left and right eye. If you don't filter for a specific eye, you can end up with more samples than you might initially expect.
Gaze data - we employ a pupil matching algorithm that tries to match eye image pairs for gaze estimation. Read more about that here: https://docs.pupil-labs.com/developer/core/overview/#pupil-data-matching
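
[Editor's note] A minimal sketch of the filtering point above, with hypothetical datums: when both eyes feed one stream, the combined rate looks roughly twice the per-camera rate, and filtering by eye id recovers the advertised rate.

```python
# Hypothetical pupil datums as (timestamp, eye_id) pairs; the real export has
# more columns, but the point is that both eyes land in one stream.
datums = [(0.000, 0), (0.002, 1), (0.005, 0), (0.007, 1), (0.010, 0), (0.012, 1)]

def effective_rate(timestamps):
    """Mean sampling rate in Hz over a sorted list of timestamps."""
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span

all_ts = [t for t, _ in datums]
eye0_ts = [t for t, eye in datums if eye == 0]

combined_hz = effective_rate(all_ts)   # ~2x the per-camera rate
per_eye_hz = effective_rate(eye0_ts)   # the rate one camera actually runs at
```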

user-44c93c 10 February, 2023, 20:27:32

Hello. The L/R diameter_3d values in our data output alternate but are rarely reported for the same timestamp. In other words, L eye timestamps are inconsistent with the R eye timestamps. Is there any way to set it up such that L/R diameter_3d values are both given for each sample with the exact same timestamp? More or less, we are wondering if it’s possible to make both eye cameras capture data at the exact same time, which would result in the two circled numbers (and each subsequent pair of rows) being the identical timestamp? This is from the HTC Vive Binocular Add-on hardware running through pupil capture. Thanks!

Chat image

user-4c21e5 13 February, 2023, 10:09:07

The eye cameras are free running, i.e. not perfectly in sync. Read more about the data matching algorithm in my message above πŸ™‚
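
[Editor's note] Since the cameras cannot be forced into hardware sync, one common workaround is to resample one eye's signal onto the other eye's timestamps. A minimal sketch with hypothetical values (linear interpolation; the real export columns differ):

```python
import numpy as np

# Hypothetical free-running L/R streams: timestamps never line up exactly.
left_ts = np.array([0.000, 0.005, 0.010, 0.015])
left_diam = np.array([3.10, 3.12, 3.11, 3.13])
right_ts = np.array([0.002, 0.007, 0.012, 0.017])
right_diam = np.array([3.20, 3.22, 3.21, 3.23])

# Resample the right eye onto the left eye's timestamps by linear
# interpolation, so every row carries one timestamp and both diameters.
right_on_left = np.interp(left_ts, right_ts, right_diam)
paired = np.column_stack([left_ts, left_diam, right_on_left])
```

Note that `np.interp` holds the edge values outside the right eye's time range, so the first and last rows are extrapolated rather than interpolated.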

user-9eed85 15 February, 2023, 15:53:03

Is there any impact of putting high values of gravity and mass into the virtual object?

user-886683 20 February, 2023, 07:09:42

Does anyone have experience with preprocessing and plotting pupil diameter from the VR eye tracker?

user-4c21e5 26 February, 2023, 06:58:40

Hey! Recommend checking out this message: https://discord.com/channels/285728493612957698/285728493612957698/869558681363177472

user-886683 26 February, 2023, 14:50:20

@user-4c21e5 Thanks!

user-e3f20f 27 February, 2023, 05:51:05

The only explanation I had for this at the time was that this was a compressed recording (the "less storage, more CPU" option in the recorder menu), correct? I have not created such recordings myself for a while.

I think you do need two things: packet pts == frame pts, and a video codec in which the packets are decoded in the order in which they were encoded, e.g. mjpeg.

It is possible that you need to increase the bitrate to get the same quality as with more modern video codecs.

Please see also the note that I sent you via DM.

user-8779ef 27 February, 2023, 11:50:07

This was not a compressed recording

user-8779ef 27 February, 2023, 11:50:26

I can double check that, but I don't believe it was.

user-8779ef 27 February, 2023, 11:50:47

My default is always to opt for the larger recording

user-b14f98 27 February, 2023, 11:52:49

Sorry - you sent the message to an account I don't use much anymore.

user-8779ef 27 February, 2023, 11:53:33

I mostly use this one. bok_bok is on vacation.

user-31ef4c 27 February, 2023, 16:14:11

Hope I'm asking in the right chat room. I'm looking to acquire the HTC Vive Pro 2 system and integrate eye tracking within the hardware. Is your HTC Vive add-on compatible with the Vive Pro 2 system? (https://docs.pupil-labs.com/vr-ar/htc-vive/)

user-4c21e5 27 February, 2023, 16:43:10

Hi @user-31ef4c πŸ‘‹ This is the right channel! Unfortunately, our existing Vive add-on is not compatible with Vive Pro 2. We can offer a de-cased version of our Add-on with cameras and cable tree for custom prototyping. If you are interested, please send an email to [email removed]

user-31ef4c 27 February, 2023, 16:44:21

Thanks for the fast response @user-4c21e5 , I appreciate it!

user-4c21e5 27 February, 2023, 17:15:51

No probs. I just realised I omitted the word not - now added.

End of February archive