🥽 core-xr


user-4169ac 04 October, 2019, 03:40:09

hey guys, I was wondering whether I could use the add-ons to capture a clear eye image in low-light conditions (when users are wearing the HMD)? Thanks for your help!

user-16802d 04 October, 2019, 07:47:00

@fxlange Has the team come to any conclusions about whether you want to make add-ons for the Oculus Quest and Valve Index? And in case you decide to make them, when would you guess they could be ready at the earliest?

user-09f6c7 04 October, 2019, 07:54:39

Hi~ I'm using the "unity_pupil_plugin_vr" project and recording the play window via the FramePublishing class. When I try to record, there is a problem: it cannot record UI elements in the "Canvas", only the eye0 and eye1 textures. Is there any solution for recording with the UI?

user-09f6c7 04 October, 2019, 08:10:07

@user-09f6c7 The solution is to change the Render Mode to World Space...

user-4ddde4 04 October, 2019, 17:15:01

Hello, I am trying to run the 'Pupil Labs VR Demo - v1.0.exe' hmd-eyes program, but it says not connected. I have Windows 10 x64 and all the required dependencies installed per the guides. I can see both cameras in Pupil Capture. Am I missing a step between opening Pupil Capture and starting the Unity exe example program? Thank you!

user-4ddde4 04 October, 2019, 17:37:03

The default port was in use by something else. I am able to get the unity project to work. Thanks!

user-141bcd 07 October, 2019, 09:17:50

@fxlange Is calibration data stored by hmd-eyes, and can it be reused for offline gaze detection?

user-c43af3 08 October, 2019, 19:45:41

Is anyone aware of a way to use hmd-eyes with other VR engines like babylon.js?

fxlange 09 October, 2019, 07:38:30

@user-141bcd - yes, in Pupil Player you can switch from showing recorded gaze data ("Gaze From Recording") to offline gaze detection ("Gaze From Offline Calibration"), which uses the recorded calibration data stored as part of the recording.

fxlange 09 October, 2019, 07:55:30

hi @user-c43af3, hmd-eyes is a plugin built specifically for Unity, but the underlying open network API used by hmd-eyes is provided by Pupil Capture and Pupil Service and can be used for remote control and realtime data access: https://docs.pupil-labs.com/developer/core/network-api/#network-api. the only dependencies are msgpack and zeromq, which are available in JS, for example.

besides hmd-eyes, I'm only aware of Vizard https://www.worldviz.com/vizard-virtual-reality-software supporting high-level communication with Pupil Capture/Service
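A minimal sketch of talking to that network API from C# (outside of hmd-eyes), assuming the NetMQ and MessagePack-CSharp v2 packages; 50020 is Pupil Remote's default port, and the topic/payload frame layout follows the linked docs:

```csharp
// Sketch: subscribe to gaze data via the Pupil network API.
using System;
using System.Collections.Generic;
using MessagePack;                 // MessagePack-CSharp v2
using MessagePack.Resolvers;
using NetMQ;
using NetMQ.Sockets;

class GazeListener
{
    static void Main()
    {
        // Ask Pupil Remote (REQ/REP, default port 50020) for the SUB port.
        using (var req = new RequestSocket())
        {
            req.Connect("tcp://127.0.0.1:50020");
            req.SendFrame("SUB_PORT");
            string subPort = req.ReceiveFrameString();

            using (var sub = new SubscriberSocket())
            {
                sub.Connect($"tcp://127.0.0.1:{subPort}");
                sub.Subscribe("gaze.");  // all gaze topics

                // Each message: frame 1 = topic string, frame 2 = msgpack payload.
                string topic = sub.ReceiveFrameString();
                byte[] payload = sub.ReceiveFrameBytes();
                var datum = MessagePackSerializer.Deserialize<Dictionary<string, object>>(
                    payload, ContractlessStandardResolver.Options);
                Console.WriteLine($"{topic}: confidence={datum["confidence"]}");
            }
        }
    }
}
```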

user-c43af3 09 October, 2019, 15:11:11

@fxl okay. That's pretty much what I expected... Do you have any information about calibration in a VR scene? I just received the Pupil Core and I've tried to test the calibration features with a standard worldcam/eye0/eye1 setup, but even that didn't work.

user-8779ef 09 October, 2019, 19:29:45

@fxlange Is there a suggested way to export arbitrary text data to file in a manner that is time-synchronized with the Pupil Capture recording?

user-8779ef 09 October, 2019, 19:30:10

This data could encode object positions, transformation matrices, flags describing the world state, etc.

user-8779ef 09 October, 2019, 20:43:28

Annotations?

fxlange 09 October, 2019, 20:59:22

hi @user-8779ef - yes, annotations (= timestamped labels) might be what you are looking for: https://docs.pupil-labs.com/core/software/pupil-capture/#annotations

so far hmd-eyes has no high-level support for remote annotations, but it is on our list.

user-b09747 10 October, 2019, 01:18:13

Hello, I have another problem. I built an application in Unity and run it on the HoloLens. When I play the app, turn on the HoloLens dev portal live stream, and start Pupil calibration, the calibration UI appears bigger than in the Unity Editor, so I can't complete the calibration. Why is that? Do you know how to solve it?

user-fc194a 10 October, 2019, 03:38:31

@fxl I found a way to send annotations by simply making custom methods in Request (based on SendRequestMessage but using the "topic" value instead of the "subject" as the first frame) and RequestController (based on Send). I am then sending a Dictionary with {"topic", "annotation"} as the first key-value pair. It now works perfectly. Thanks for the hints!

fxlange 10 October, 2019, 07:43:53

Hi @user-fc194a, great that you made it work, and thanks for pointing out the issue in the documentation. We will fix it asap.

fxlange 10 October, 2019, 12:23:41

@user-8779ef & @user-fc194a remote annotations also support custom data (not just the label): basically all primitive data types supported by msgpack, ideally flat and not nested. pupil player won't show the custom data but will export it. for example you could add the camera/head pose coords to your annotations and send them every frame. in the case of a vec3, it is recommended to add x, y, z separately instead of a float[].

user-8779ef 10 October, 2019, 12:25:02

I have to admit that I'm not incredibly familiar with msgpack. Do I send this as a key/value pair, so that the numerical data retains a label?

user-8779ef 10 October, 2019, 12:25:13

as a dict, perhaps?

fxlange 10 October, 2019, 12:27:04

the annotation is basically already a dictionary, with the required keys. on top you can add your data as additional key value pairs, yes.
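As a concrete illustration of the exchange above, a sketch of such an annotation dictionary with flat custom fields, sent the way user-fc194a described (first frame = the "topic" value, second frame = the msgpack-encoded dict). The required keys follow the annotation docs; the head-pose field names and the socket setup are hypothetical:

```csharp
using System.Collections.Generic;
using MessagePack;                 // MessagePack-CSharp v2
using MessagePack.Resolvers;
using NetMQ;
using UnityEngine;

public static class AnnotationSender
{
    // socket: a connected NetMQ request socket, set up as in the workaround
    // above; pupilTimestamp: the current pupil time as a double.
    public static void SendHeadPose(IOutgoingSocket socket, double pupilTimestamp, Transform head)
    {
        var annotation = new Dictionary<string, object>
        {
            { "topic", "annotation.head_pose" },  // required keys per the docs
            { "label", "head_pose" },
            { "timestamp", pupilTimestamp },
            { "duration", 0.0 },
            // custom data: flat primitives only; a vec3 split into x/y/z
            { "headPosX", head.position.x },
            { "headPosY", head.position.y },
            { "headPosZ", head.position.z },
        };
        byte[] payload = MessagePackSerializer.Serialize(
            annotation, ContractlessStandardResolver.Options);
        // first frame = topic value, second frame = msgpack payload
        socket.SendMoreFrame("annotation.head_pose").SendFrame(payload);
    }
}
```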

user-8779ef 10 October, 2019, 12:29:02

Ok, thanks. For write-out, it seems it would be handy to implement a helper script that one can drop on a game object, with radio buttons that allow you to select what to write out. For example...

user-8779ef 10 October, 2019, 12:29:21

"Position" "Matrix" "Albedo, ... who knows what else.

user-8779ef 10 October, 2019, 12:29:32

"Collision information?"

user-8779ef 10 October, 2019, 12:29:39

Hit locations etc.

fxlange 10 October, 2019, 12:31:05

working on an example today for v1.1, on top of the PR by @user-141bcd which we just merged. but probably not as feature-rich as your proposal 🙂

user-8779ef 10 October, 2019, 12:31:20

Hehe. Well, thanks for working on it, anyhow. Good timing.

user-8779ef 10 October, 2019, 12:32:31

Also, without playing more, I can't anticipate what kind of "scheduling" issues might crop up. It would be nice if everything were written out on Update()

user-8779ef 10 October, 2019, 12:33:12

...but, you can imagine situations where folks want more flexibility. This requires some thought.

user-fc194a 10 October, 2019, 13:29:16

What kind of performance hit do we get when sending annotations? (I don't expect a single or a few annotations will have much impact, but how about one or many on each Update? Or different sizes of dictionary?) How would you go about testing this? I am designing smooth pursuit tests with different paths, so recording the position of the target(s) is essential. I am wondering if storing the data and sending it at the end is a better alternative than a call on each update.

fxlange 10 October, 2019, 13:41:46

@user-fc194a Sending data via a pubsocket is very cheap, once/many* per frame is definitely not an issue. For example the Screencast component sends full screen captures as raw byte arrays up to every frame.

(*many in a reasonable way)

fxlange 10 October, 2019, 14:52:58

@user-fc194a one additional note: because of the framerate difference between capture and your VR application, in theory the pubsocket might have to drop data frames. I can't tell you how big a pubsocket buffer is and it is probably zmq implementation dependent. so just keep it reasonable 🙂

fxlange 10 October, 2019, 14:56:47

@user-8779ef back to your original question (cc @user-fc194a): besides annotations you can always store data via unity directly and keep it time-synced with pupil capture recordings by converting unity time into pupil timestamps via the TimeSync component (https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#time-conversion). pupil player will also export annotations as a .csv file.
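A sketch of what such direct logging could look like, assuming hmd-eyes' PupilLabs.TimeSync component and its ConvertToPupilTime method as described in the linked time conversion doc; the CSV path and columns are illustrative:

```csharp
using System.IO;
using PupilLabs;   // hmd-eyes namespace
using UnityEngine;

public class PoseLogger : MonoBehaviour
{
    public TimeSync timeSync;  // assign the scene's TimeSync component
    StreamWriter writer;

    void Start()
    {
        writer = new StreamWriter("pose_log.csv");
        writer.WriteLine("unityTime,pupilTime,posX,posY,posZ");
    }

    void Update()
    {
        float unityTime = Time.realtimeSinceStartup;
        // pupil time is a double; don't truncate it to float
        double pupilTime = timeSync.ConvertToPupilTime(unityTime);
        Vector3 p = transform.position;
        writer.WriteLine($"{unityTime},{pupilTime},{p.x},{p.y},{p.z}");
    }

    void OnDestroy() => writer?.Dispose();
}
```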

user-8779ef 11 October, 2019, 00:19:09

Alright! Thanks, fxl.

user-8779ef 11 October, 2019, 15:49:51

[email removed] When switching away from a scene with some pupil connection objects in it

user-8779ef 11 October, 2019, 15:50:02

...and then back, I see the connection objects double.

user-8779ef 11 October, 2019, 15:50:20

I assume that this is because you have used 'DontDestroyOnLoad,' but don't

user-8779ef 11 October, 2019, 15:50:43

check for doubles. So, every time I return to the scene, it loads a new pupil connection object.

user-8779ef 11 October, 2019, 15:53:19

See the Unity example here: https://docs.unity3d.com/ScriptReference/Object.DontDestroyOnLoad.html
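For reference, the duplicate check being suggested, following the pattern from the linked Unity docs (a sketch; the class name is illustrative):

```csharp
using UnityEngine;

public class PersistentConnection : MonoBehaviour
{
    static PersistentConnection instance;

    void Awake()
    {
        if (instance != null)
        {
            // a copy already survived an earlier scene switch
            Destroy(gameObject);
            return;
        }
        instance = this;
        DontDestroyOnLoad(gameObject);
    }
}
```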

user-b09747 15 October, 2019, 01:25:00

Hi, I don't have an answer yet, so I'm posting it again. If I run the app developed in Unity on the HoloLens, open the HoloLens dev portal on the connected PC, and turn on the live stream, the calibration UI turns into a picture. Have you ever seen anything like this? What should I do?

Chat image

user-8779ef 15 October, 2019, 18:28:04

@fxlange For what it's worth, I am having trouble understanding the developer notes for time conversion.

user-8779ef 15 October, 2019, 18:28:26

I'm about to start playing around with the new TimeSync utility. I'll let you know if I have any suggestions.

user-d77d0f 16 October, 2019, 07:58:55

Hi! I'm trying to get accuracy information via Pupil Capture, but I can't figure out how to use it. What is even supposed to appear in the "Pupil Capture - World" window? For me it is just a grey screen.

user-d77d0f 16 October, 2019, 09:00:57

I'm now realising this window is for a third "world" camera that I don't have, since this is a VR add-on. However, is there any other way to get accuracy information from the 2 eye cameras without this "world camera"?

user-3cf418 16 October, 2019, 18:53:32

Hi folks! I am new here. We have an HTC Vive, and we are trying to get a Pupil Labs add-on for the headset. I wonder if it's possible to write our own analytics plugin to extract features from raw eye data from Pupil Service and report/display the statistics through Unity on our VR headset in realtime 🙂 Any suggestion on the documentation or projects I should look at? Much appreciated!

fxlange 17 October, 2019, 07:18:59

Hi @user-d77d0f - you can access all kinds of data in realtime via the network API https://docs.pupil-labs.com/developer/core/network-api/#network-api.

For VR I would recommend using our Unity plugin hmd-eyes https://github.com/pupil-labs/hmd-eyes. Please check out the readme and developer docs for how to get started: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#getting-started

fxlange 17 October, 2019, 07:23:20

Hi @user-3cf418 - is your analytics plugin running in 1⃣ Pupil Capture/Service or in 2⃣ Unity? For 2⃣ you can just use hmd-eyes to prepare the data (gaze/pupil/...) for your Unity plugin and take it from there, but if I understand your question correctly that is not what you are doing 🙂 So for 1⃣ you could use the same network API mentioned above to publish your results (https://docs.pupil-labs.com/developer/core/network-api/#writing-to-the-ipc-backbone) and receive them in Unity via hmd-eyes (https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#low-level-data-acess).
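The Unity-side receiving end of option 1⃣ could look roughly like the following sketch, based on the low-level data access doc linked above ("my_analytics" is a made-up topic; the SubscribeTo signature is taken from the hmd-eyes docs):

```csharp
using System.Collections.Generic;
using PupilLabs;   // hmd-eyes namespace
using UnityEngine;

public class AnalyticsReceiver : MonoBehaviour
{
    public SubscriptionsController subsCtrl;  // assign in the inspector

    void Start()
    {
        // assumes the connection is already up; in practice hook this into
        // the request controller's connected callback
        subsCtrl.SubscribeTo("my_analytics", ReceiveAnalytics);
    }

    void ReceiveAnalytics(string topic, Dictionary<string, object> data, byte[] thirdFrame)
    {
        Debug.Log($"{topic}: {data.Count} fields");
    }
}
```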

user-3cf418 17 October, 2019, 15:12:53

@fxlange Hi, thank you, and that helps a lot! I planned to develop and run it in the Pupil Service, and I will check out the API 🙂

user-29e10a 18 October, 2019, 10:06:13

Has anybody tried putting the Vive add-on into the Vive Cosmos yet? https://www.vive.com/de/product/cosmos/ Looks like it could fit without modifications...!

user-8779ef 18 October, 2019, 13:54:01

Hey @fxlange , trying to align data exported from Unity to a CSV with the timestamps provided by pupil player. Having a bit of trouble.

user-8779ef 18 October, 2019, 13:54:41

Here's a frame from pupil player.

Chat image

user-8779ef 18 October, 2019, 13:56:20

now, I just want to sanity check. I wrote out a row of data for each frame of the Unity update(). Each row of data in this CSV includes the pupil labs timestamp. Now, I should be able to search through my rows for the timestamp shown in that video, right?

user-8779ef 18 October, 2019, 13:56:37

...and that row of gaze data should align with the video frame.

user-8779ef 18 October, 2019, 13:57:39

From the screenshot, that's a plTime of 100.905, so this frame should be approximately "np.where(dataFrame['plTimestamp']>100.9)[0][0]"

user-0eef61 18 October, 2019, 22:56:01

Hi all, does anyone know how to analyze/visualize number of blinks?

user-0eef61 19 October, 2019, 12:59:28

Is it correct to treat all the data that has confidence 0 as blinks?

fxlange 19 October, 2019, 13:52:31

Hi @user-0eef61 - Pupil Capture offers a blink detection plugin which you can access via hmd-eyes: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#blink https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#low-level-data-acess

user-0eef61 19 October, 2019, 19:26:37

@fxlange Thanks for your reply. Is that useful now that I have saved all the data from the eye tracker?

user-0eef61 19 October, 2019, 19:26:54

At the moment, I have saved the data and I want to make some analysis

user-0eef61 19 October, 2019, 19:28:03

My question is: what data gives information about blinks? For example, in pupil_positions.csv or gaze_positions.csv? I read that a loss of confidence can be taken as a blink, but I am not sure.

fxlange 21 October, 2019, 08:03:24

@user-0eef61 the blink detection plugin is also available in Pupil Player, so you could run it offline on your recordings.

But yes, blink detection is confidence-based and works with an onset and offset threshold. Use pupil_positions.csv if you want to analyze both eyes separately; otherwise gaze_positions.csv would work as well (for your custom solution).
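For a custom offline pass, a sketch of the onset/offset idea over an exported pupil_positions.csv; the column index and the 0.5/0.6 thresholds are illustrative assumptions, and Pupil's own blink detector additionally filters the confidence signal:

```csharp
using System;
using System.Globalization;
using System.IO;
using System.Linq;

class BlinkScan
{
    static void Main()
    {
        // assumed columns: 0 = pupil_timestamp, 3 = confidence
        var samples = File.ReadLines("pupil_positions.csv").Skip(1)
            .Select(line => line.Split(','))
            .Select(c => (time: double.Parse(c[0], CultureInfo.InvariantCulture),
                          conf: double.Parse(c[3], CultureInfo.InvariantCulture)));

        const double onset = 0.5, offset = 0.6;  // illustrative thresholds
        bool inBlink = false;
        double start = 0;
        foreach (var (time, conf) in samples)
        {
            if (!inBlink && conf < onset) { inBlink = true; start = time; }
            else if (inBlink && conf > offset)
            {
                inBlink = false;
                Console.WriteLine($"blink: {start:F3} -> {time:F3}");
            }
        }
    }
}
```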

fxlange 21 October, 2019, 08:15:59

Thanks for your feedback @user-8779ef, you are right that the scene switch demo is not set up to handle back-and-forth scene switches. The intention is to showcase how to maintain tracking while switching the scene, but you will probably have to adapt it to your use case. Still, it's probably better to check that the object doesn't already exist (as in the Unity example).

Btw. Pull Requests are very welcome 🙂

fxlange 21 October, 2019, 08:20:50

Regarding synchronizing your data: Yes, that sounds about right. Matching timestamps can be more complex than a simple ">" comparison (also due to latency and depending on your use case), but as long as you convert Unity's Time.realtimeSinceStartup to pupil time via the TimeSync component you should be able to find pairs this way. (Be aware that TimeSync uses double instead of float for pupil time.)

user-7ba0fb 21 October, 2019, 09:45:18

Is the add-on used in the Vive the same as in the Vive Pro?

user-bd800a 21 October, 2019, 11:50:15

Hi, I've been using the inserts with my HTC Vive, but while demoing them on different people I encountered several issues:

  • The image from the camera was either blurry or focused depending on the user's face geometry, which resulted in not-always-reliable pupil tracking. Is there any chance you will upgrade the insert with cameras similar to the 200Hz ones on the Pupil Core, as they don't need to be focused?

  • The camera mostly had the cheek in the lower part of the image and the eye at the top, meaning the lower eyelid was partly covering the pupil and messing with the tracking. Playing with the ROI and the algorithm parameters did help, but not in all cases. Is there a certain way to set the Vive on someone's face to ensure the cameras are positioned correctly? Or are you considering another placement of the cameras on the inserts?

  • The heat generated by the cameras can be quite hard to cope with, especially in a closed enclosure. While the cameras on the Pupil Core also heated up a lot, there was a greater airflow around them. The heat generated by the 200 Hz cameras is rather low I believe, so having them on the inserts would be quite beneficial :D

Thank you for the help,

user-8779ef 21 October, 2019, 17:54:25

@user-bd800a Out of curiosity, are you using the 120 Hz cameras? They have updated their integration once in the past year or so, to cameras that I believe can run at 200 Hz.

user-8779ef 21 October, 2019, 17:55:06

Check the webpage to see if your inserts look like the most recent update, and if the camera specs match.

wrp 22 October, 2019, 02:07:20

@user-bd800a - @user-8779ef is correct. The most recent versions of the Vive/Vive Pro add-ons (Since around March 2019) are 200Hz eye cameras. If you want to get the 200Hz version, or to discuss sales related topics please email sales@pupil-labs.com

user-bd800a 22 October, 2019, 07:10:00

Ah okay, I think the inserts I have are relatively ancient indeed. Thank you!

user-2693a7 23 October, 2019, 18:23:17

Hi, I'm using the Vive Pro inserts and Pupil Service with Unity. I noticed recently that while the x and y values from GazeDirection were very good, the z values needed a transformation of "1 - [val]" to get the value we were expecting. Is there an offset related to this? Is this a known issue? The transformation always uses the value of exactly 1.

user-7ba0fb 24 October, 2019, 01:01:13

Hi, whenever I try to run the ScreenCast demo, my Pupil Capture seems to stop working and shows no response. What's the problem?

fxlange 24 October, 2019, 11:46:41

Hi @user-7ba0fb, which version of Pupil Capture are you using? Running on windows I suppose? Could you please send me (via DM) your <user path>/pupil_capture_settings/capture.log after reproducing the crash.

fxlange 24 October, 2019, 11:52:58

Hi @user-2693a7, very strange. The GazeDirection is normalized and should always point in the direction you are facing. We are actually filtering it to make sure it does. Are you using the default unity camera or a camera of some sdk (steam vr)? Could it be that the transform used as reference for the GazeVisualization (or your custom script) is facing backwards?

user-7ba0fb 24 October, 2019, 12:25:13

@fxlange sure, thanks

user-8779ef 24 October, 2019, 13:08:59

@fxlange I've been playing with hmd-eyes 1.0. Great job, buddy. This thing is slick. Huge improvement.

user-8779ef 24 October, 2019, 13:09:59

Really looking forward to the use of annotations.

user-8779ef 24 October, 2019, 13:11:47

...and a longer-term request. It would be great if player allowed you to import arbitrary time-series data (like a pursuit velocity signal) and represent it there. I know that there is already some limited functionality there, but for it to be useful we really need to be able to add the data as an inset (or even pop-out window) and manipulate the axis range.

user-8779ef 24 October, 2019, 13:12:53

It is VERY powerful for a researcher to be able to export data, filter and compute new signals, import the data, and then compare them to the screencast.

user-8779ef 24 October, 2019, 13:13:19

Computed signals are almost illusory, and require constant sanity checks. This would provide those checks.

user-8779ef 24 October, 2019, 13:13:38

Also, @papr , since he's the pupil player guy 🙂

papr 24 October, 2019, 13:18:36

@user-8779ef This feature request already exists for a while 😉 https://github.com/pupil-labs/pupil/issues/1048

papr 24 October, 2019, 13:19:10

We still agree that it is a good idea, but unfortunately we have not had the resources to implement it yet.

user-8779ef 24 October, 2019, 14:53:07

Haha, well, whoever suggested that was very forward-thinking ;P

user-8779ef 24 October, 2019, 14:53:31

Thanks, and keep up the good work. We've been very impressed by everything lately.

user-7ba0fb 25 October, 2019, 07:54:52

I added the “screencast camera” component to my own Unity project as a child object of my VR camera, but I found that the origin transform of the “screencast camera” was not the same as my VR camera's. I tried to change its transform manually to be consistent with my VR camera, but the two cameras were still not in the same position (only close to each other) in the scene. In the Capture window, I can see the view of the “screencast camera”, and when I move my VR headset, it follows. Also, I failed to run the data recording demo: whenever I pressed R, Unity showed that recording did not start, while Capture showed recording done, but actually it didn't record anything.

user-8779ef 25 October, 2019, 11:17:52

Hey, @fxlange , does a 'pupil time' of zero coincide with the start of the recording, or the start of the server?

fxlange 25 October, 2019, 11:42:16

Hi @user-7ba0fb - regarding the recording, that's a bug which is fixed on the develop branch but not yet released. It is an easy fix though. There is a conflict between the RecorderComponent and the Recorder demo. Both are listening on "R". You can just remove this behavior from the demo script (better not from the component).

fxlange 25 October, 2019, 11:42:54

Regarding the screencast cam: when adding it as a child to your VR cam, the screencast cam transform should be 0,0,0 without any rotations.
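In code that could be a small setup step like the following sketch (field names are illustrative):

```csharp
using UnityEngine;

public class AttachScreencastCam : MonoBehaviour
{
    public Transform vrCam;
    public Transform screencastCam;

    void Start()
    {
        // parent the screencast cam to the VR cam and zero its local
        // transform so both render from the same pose
        screencastCam.SetParent(vrCam, worldPositionStays: false);
        screencastCam.localPosition = Vector3.zero;
        screencastCam.localRotation = Quaternion.identity;
    }
}
```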

fxlange 25 October, 2019, 11:43:49

hi @user-8779ef - thanks for your feedback yesterday 🙂 Annotations are also done on the develop branch and to be released very soon.

fxlange 25 October, 2019, 11:45:42

Pupil time of zero shouldn't match either of the two. If I'm not mistaken, it should be based on your OS clock.

user-8779ef 25 October, 2019, 11:46:55

Ah... OK. Here's what I'm trying to do - maybe you have an idea. I have a pupil recording. While recording the pupil video with screencast, I started logging my own data to CSV. I recorded the Unity time and Pupil time in each entry of my log.

user-141bcd 25 October, 2019, 11:47:28

@user-7ba0fb I don't know if that helps with your issue, but for me the camera Prefab in the Prefab folder didn't work. When I copied the Camera from the screencast demo it worked. (If I remember correctly, the physical camera settings on the prefab weren't set up in a way that would work - at least in my scene.)

user-8779ef 25 October, 2019, 11:47:50

Now, I would like to use the pupil video to find a frame of interest, and then inspect the corresponding data saved in my log file. The issue is that pupil player does not show Unity time, and the timestamp that it shows has no relevance to the pupil timestamp associated with gaze data.

user-8779ef 25 October, 2019, 11:48:23

So, any ideas of how can I use Pupil Player to find a corresponding frame of data in my log?

user-7ba0fb 25 October, 2019, 11:50:36

@user-141bcd thanks for your kind reply. I just tried what @fxlange has told me and I solved my problem now.

fxlange 25 October, 2019, 12:00:52

@user-8779ef I see, because the pupil player only shows a relative timestamp.

user-8779ef 25 October, 2019, 12:00:57

Yep.

user-8779ef 25 October, 2019, 12:01:19

The best I can do is to initiate my logging at the same time as the pupil recording. This way, I can calculate time relative to the first frame (time on frame f = pupilTime(f) - pupilTime(0)).

user-8779ef 25 October, 2019, 12:01:28

A bit restrictive.

fxlange 25 October, 2019, 12:01:36

In that case take the timestamp of your first sample of your recording + your relative timestamp of interest -> absolute timestamp.

user-8779ef 25 October, 2019, 12:02:15

How do I recover the timestamp of my recording's first frame?

fxlange 25 October, 2019, 12:02:41

By exporting via pupil player. https://docs.pupil-labs.com/core/software/pupil-player/#export

user-8779ef 25 October, 2019, 12:03:32

or, is it in pupil.timestamps.npy?

user-8779ef 25 October, 2019, 12:03:40

or gaze.timestamps.npy?

fxlange 25 October, 2019, 12:05:01

Better the pupil timestamps, but I'm not familiar with the *.npy files and would recommend exporting as described above.

fxlange 25 October, 2019, 12:06:11

@user-141bcd thanks for the input. I'll check the screencast prefab before the next release.

user-8779ef 25 October, 2019, 12:06:26

So, the export creates a csv/spreadsheet that has the timestamps in it then? It has been a while, and I forget the contents of the export 😛

fxlange 25 October, 2019, 12:07:23

Yes: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv should be what you are looking for.

user-8779ef 25 October, 2019, 12:07:56

Thanks!

user-8779ef 25 October, 2019, 12:08:42

fwiw, it looks like numpy.load('data/2019-10-24-15-26/pupil_timestamps.npy')[0] will do the same thing

user-8779ef 25 October, 2019, 12:09:16

Of course, all of this will be moot once annotations are enabled and reliable. Looking forward to that.

fxlange 25 October, 2019, 12:15:27

Yes, but keep in mind that sending a lot of data as annotations at a high frequency is not the intended use case for annotations. For example, I noticed that pupil capture will be pretty busy logging all the data on screen when sending 120 annotations per second. Not sure if that could become an issue.

user-8779ef 25 October, 2019, 12:23:55

Ah, interesting.

user-8779ef 25 October, 2019, 12:24:50

"logging data on the screen" - are annotations rendered over the scene image?

fxlange 25 October, 2019, 12:26:42

pupil capture will display a short version of them on top of the scene image: label and timestamp, if I remember correctly. same in pupil player if the annotations plugin is running.

user-8779ef 25 October, 2019, 12:26:53

Ok. So that is not surprising, then.

user-8779ef 25 October, 2019, 12:27:34

I suggest another similar but more minimalistic function via msgpack: data logging straight to file.

fxlange 25 October, 2019, 12:28:16

They are logged directly but on top of that displayed on screen.

user-8779ef 25 October, 2019, 12:28:34

How about a single bool with the annotation call to turn off rendering?

fxlange 25 October, 2019, 12:28:57

Great suggestion 🙂 @papr

user-8779ef 25 October, 2019, 12:29:52

I imagine that would speed it up quite a bit. As for the Unity-to-analysis pipeline, it would be nice if in Unity we had a simple helper script that can be dropped on any game object, with options for "export matrix" or "export position"... and perhaps "export hit upon collision", etc. What we feel is important enough to be included will likely expand over time.

fxlange 25 October, 2019, 12:30:03

But let's move the annotation handling in pupil capture to 💻 software-dev

user-8779ef 25 October, 2019, 12:30:11

Ah, Ok, will do.

user-8779ef 25 October, 2019, 12:30:13

Thanks!

user-0cedd9 26 October, 2019, 16:03:09

HTC Vive add-on connection problem

Chat image

user-0cedd9 26 October, 2019, 16:03:27

Hi guys, any idea how can I solve this problem?

user-8779ef 26 October, 2019, 17:38:07

Very likely an IP mismatch. Make sure the IP address in Pupil Capture matches the one in Unity.

user-8779ef 26 October, 2019, 17:38:44

In Capture, I think it's in the Pupil Remote settings. In Unity, it's on the connection game object.

user-7ba0fb 27 October, 2019, 03:16:16

hi, does anyone have ideas on generating heat maps in the VR environment (Unity) to show distributions of fixations?

wrp 28 October, 2019, 01:30:29

@user-0cedd9 What do you see in the Device Manager? All cameras (Pupil Cam *) should be listed under the libUSBK category. Please check and let us know. If cameras can be seen in Windows Camera App, then they are likely in the Cameras section of device manager instead of the libUSBK section. Please could you run Pupil Capture again as admin, to re-initiate the driver install process.

user-94ac2a 29 October, 2019, 09:46:42

Does the algorithm work if I put the two cameras under the eyes, or is there a specific location I have to put the cameras?

user-8779ef 29 October, 2019, 17:02:13

@fxlange Just gave my first workshop in which I explicitly recommended folks use Pupil Labs for VR. 30 or so attendees @ Giessen.

user-8779ef 29 October, 2019, 17:04:36

(Previously I just said it was still a bit of a work in progress). I was also very impressed by the PL HMD hardware version 2. Did a live data capture, and it went flawlessly.

user-09f6c7 30 October, 2019, 02:33:41

Hi~ I'm using the 'unity_pupil_plugin_vr' code for my Unity project, and when I run it in the editor there are no problems with FramePublishing and Recording.

Chat image

user-09f6c7 30 October, 2019, 02:35:23

But... when I build the exe file and run it, a problem emerges like this: FramePublishing and Recording are not working...

Chat image

user-09f6c7 30 October, 2019, 02:37:21

My debugging process is ongoing but I cannot find a good clue yet. I checked that the 'thirdFrame' argument is null. Has anybody had the same problem?

Chat image

wrp 30 October, 2019, 02:42:29

@user-94ac2a can you clarify please? You're seeking to have two cameras per eye? Or you're asking about the location of 1 eye camera relative to the eye? If it is about the location of a single eye camera, then the answer is that the position can be changed, but it is important that you have a good view of all eye movements within the image frame and that you do not move the eye camera too far away from the eye.

user-94ac2a 30 October, 2019, 02:44:55

@wrp I am referring to one eye camera relative to each eye. How do I determine what is too far? If I put it too close, the camera will be out of focus, I think?

wrp 30 October, 2019, 02:46:27

@user-94ac2a in most VR HMDs I don't think you will ever have the problem of "too far" come to think of it.

user-94ac2a 30 October, 2019, 02:48:13

@wrp thanks.

user-94ac2a 30 October, 2019, 02:49:42

Does the image need to contain the entire eye, or is only the pupil itself important for tracking?

wrp 30 October, 2019, 03:19:54

Ensure that the pupil is visible within all ranges of eye movement

fxlange 30 October, 2019, 08:24:41

hi @user-09f6c7 - looks like you are using an older version of hmd-eyes. Depending on your project status please consider upgrading to v1.0. Your issue could be caused by a missing shader (the one needed for displaying the eye images). You should get a warning when running the build (assuming you run a dev build).

Please try

"Edit\Project Settings\Graphics" -> "Always Included Shaders" -> Add "Unlit/Texture"

fxlange 30 October, 2019, 08:32:49

hi @user-7ba0fb - for any sort of heatmap/scan path you need to intersect gaze data with your VR environment. based on that you can accumulate intersections and apply general heatmap logic/visualizations

the following links should help you get started:
* https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#map-gaze-data-to-vr-world-space
* https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#default-gaze-visualizer
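A sketch of that accumulation step: turn each gaze sample into a world-space ray, intersect it with scene geometry, and collect the hit points as raw material for a heatmap. How the camera-local gaze direction arrives (e.g. from hmd-eyes' gaze listener) is left out here:

```csharp
using System.Collections.Generic;
using UnityEngine;

public class GazeHitAccumulator : MonoBehaviour
{
    public Camera vrCamera;
    public readonly List<Vector3> hits = new List<Vector3>();

    // call once per gaze sample; direction is in camera-local space
    public void OnGazeSample(Vector3 gazeDirLocal)
    {
        Vector3 origin = vrCamera.transform.position;
        Vector3 dirWorld = vrCamera.transform.TransformDirection(gazeDirLocal);
        if (Physics.Raycast(origin, dirWorld, out RaycastHit hit, 100f))
        {
            hits.Add(hit.point);  // bin these (e.g. per surface/UV cell) later
        }
    }
}
```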

user-94ac2a 30 October, 2019, 10:50:26

Can I run the exported Unity exe on a PC screen instead of the HMD?

user-29e10a 30 October, 2019, 13:10:39

@wrp I can confirm that the Pupil Vive add-ons fit into the Vive Cosmos. They are not as tight as in the Pro, but they will stay in place. Nevertheless, the Cosmos is crap. Good display and comfortable to wear, but crappy tracking, and one has to use additional software on top of SteamVR. Sucks!

user-29e10a 30 October, 2019, 13:11:25

But, I didn't expect it, there is also a spare USB port for the cameras... 👍🏻

user-4a2c61 30 October, 2019, 17:34:28

When I run the calibration in Unity, it seems to lock up Capture as it tries to run the calibration. I ran Pupil Capture as admin and that seems to have fixed it. Putting this here in case someone else has the issue.

user-09f6c7 31 October, 2019, 01:04:17

@fxl ok. I'll try those ways, thanks for the response 🖖

user-09f6c7 31 October, 2019, 07:54:17

@fxlange The gray texture problem is solved with "Unlit/Texture"... And I checked out v1.0 of hmd-eyes, which has an impressively improved and simplified structure. It is so amazing! Thank you!

fxlange 31 October, 2019, 09:57:04

That's great to hear @user-09f6c7, thanks for the feedback.

@user-29e10a Good to know, thanks a lot for sharing.

fxlange 31 October, 2019, 10:00:00

@user-94ac2a normally the Unity VR demo would run on the HMD and the PC screen. But what is your use case exactly: tracking eyes in the HMD, or of someone in front of the PC screen?

End of October archive