vr-ar


user-5874a4 01 August, 2017, 06:52:25

@user-5ca684 Thanks! That was some good info on threads and locks. So, in my case it should work well in the Update() method, as that runs on the main thread.

user-5ca684 01 August, 2017, 09:12:50

@user-5874a4 soon you can check out my dev branch for the single threaded version

user-abff35 02 August, 2017, 09:51:55

Hi Pupil Community! I am new to the Pupil software and am still getting my HTC Vive add-on fully up and running. However, after calibration the results are not ideal, and the three points also move around from time to time, not seeming to follow the pupil in real time.

user-489841 02 August, 2017, 10:19:07

Does anyone with knowledge in the Unity plugin have a second? Have a question regarding the accessible data.

user-489841 02 August, 2017, 11:36:34

Or can anyone point me in the direction of somewhere I can ask questions? 😃

wrp 02 August, 2017, 11:59:42

@user-489841 this is the right place. @user-5ca684 is the right person to target for questions regarding the Unity plugin

wrp 02 August, 2017, 12:00:11

@user-abff35 what calibration method?

user-489841 02 August, 2017, 12:00:56

Okay! Thanks @wrp! starts poking @user-5ca684 furiously

wrp 02 August, 2017, 12:01:09

@user-489841 you have access to all pupil and gaze data - also frame data is exposed via the plugin. What specifically are you looking for?

user-489841 02 August, 2017, 12:02:13

I want to know the difference between the confidence values gotten from Pupil.values.Confidences[] and Pupil.values.BaseData[].confidence, because they differ every frame.

wrp 02 August, 2017, 12:04:49

@mpk can you respond to this? This is not a Unity question but a Pupil question

user-489841 02 August, 2017, 12:05:06

Awesome. 😄

mpk 02 August, 2017, 12:05:28

Am I right that Pupil.values.Confidences[] is related to gaze?

mpk 02 August, 2017, 12:05:54

Pupil.values.BaseData[].confidence would then be the underlying pupil positions. These can have different confidences.

user-489841 02 August, 2017, 12:06:47

So they are not referring to the same thing? Is that what I'm misunderstanding?

mpk 02 August, 2017, 12:07:38

you have pupil positions; these are the positions of the pupil in the eye video.

mpk 02 August, 2017, 12:07:51

each datum comes with its own confidence.

mpk 02 August, 2017, 12:08:40

then we map one (monocular) or two (binocular) of these to a gaze datum (point you are observing in the world). this has its own computed confidence.

mpk 02 August, 2017, 12:09:04

all this under the condition that @user-5ca684 did not make changes to this structure in unity.

mpk 02 August, 2017, 12:09:37

and one last thing.

user-489841 02 August, 2017, 12:09:38

The plugin hasn't been updated in 2+ months, so I don't think so. :p

mpk 02 August, 2017, 12:10:02

gaze data has a basedata field that contains the pupil data that was used to make this gaze datum.

mpk 02 August, 2017, 12:10:26

what's confusing to me here is that what I would refer to as gaze is called Pupil.values

user-489841 02 August, 2017, 12:10:27

So Pupil.values.Confidences[] refers to the confidences in eye position, and Pupil.values.BaseData[].confidence refers to the confidence in the calculated gaze datum?

mpk 02 August, 2017, 12:10:36

the other way around.

mpk 02 August, 2017, 12:10:46

Pupil.values.Confidences[] -> gaze confidence

user-489841 02 August, 2017, 12:10:47

ah

mpk 02 August, 2017, 12:10:57

Pupil.values.BaseData[].confidence -> Pupil confidence

mpk 02 August, 2017, 12:11:06

AFAIK
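The split described above can be sketched like this (the field names mirror the discussion, not the exact Unity API, and the thresholds are illustrative):

```python
# Sketch: filtering gaze data by both confidence values discussed above.
# A gaze datum carries its own mapped confidence, plus base_data entries
# (the underlying pupil positions), each with its own confidence.
# Field names and thresholds are illustrative, not the plugin's API.

def is_reliable(gaze, min_gaze_conf=0.6, min_pupil_conf=0.6):
    """Accept a gaze datum only if the mapped gaze confidence AND
    every underlying pupil datum's confidence clear a threshold."""
    if gaze["confidence"] < min_gaze_conf:          # ~ Pupil.values.Confidences[]
        return False
    return all(p["confidence"] >= min_pupil_conf    # ~ Pupil.values.BaseData[].confidence
               for p in gaze["base_data"])

# Binocular gaze datum: mapped from two pupil positions.
gaze = {"confidence": 0.92,
        "base_data": [{"confidence": 0.88}, {"confidence": 0.41}]}
print(is_reliable(gaze))  # one eye's pupil confidence is too low -> False
```

This is why the two values differ every frame: the gaze confidence is computed from the mapping, while each base_data confidence comes straight from the pupil detector.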

user-489841 02 August, 2017, 12:11:49

Perfect. The reason I started looking into BaseData was that the pupil diameter value in Pupil.values.diameter never changes, but it does in BaseData.

mpk 02 August, 2017, 12:12:07

gaze does not have a diameter.

mpk 02 August, 2017, 12:12:18

I'm not sure why the struct would offer this field.

user-489841 02 August, 2017, 12:12:20

that makes sense in that context

user-489841 02 August, 2017, 12:12:24

Me neither

user-489841 02 August, 2017, 12:12:25

P

user-489841 02 August, 2017, 12:12:31

*:P

mpk 02 August, 2017, 12:12:37

yeah. @user-5ca684 please revise or correct my assumptions.

user-489841 02 August, 2017, 12:13:31

But anyway, thank you @mpk. That made sense to me. I'll hang around for when @user-5ca684 comes back at some point. 😃

mpk 02 August, 2017, 12:20:36

@user-489841 sounds good. Our plan is to continuously improve the Unity app, just like we do with the rest of the Pupil project!

user-489841 02 August, 2017, 12:22:52

I'm working in a research group that is in love with the Pupil Labs eye trackers for HDMs, so we are following you guys very closely. We are looking forward to that very much. 😉

user-489841 02 August, 2017, 12:23:02

*HMDs

mpk 02 August, 2017, 12:24:25

@user-489841 that's great to hear! thanks for the love 😃

user-05da2f 02 August, 2017, 14:29:39

How would you guys recommend I troubleshoot connectivity issues from the Unity3D side? I am trying to use hmd-eyes but I cannot connect to the pupil service

user-05da2f 02 August, 2017, 14:29:52

Wireshark output

Chat image

user-05da2f 02 August, 2017, 14:30:12

I turned off all firewalls, I can ping but I cannot connect to pupil_capture running on a remote host 192.168.2.1

user-05da2f 02 August, 2017, 14:30:36

I started "wireshark" to sniff packets but it seems like hmd-eyes on the unity side does not generate ANY traffic on tcp port 50020

user-05da2f 02 August, 2017, 15:53:10

The funny thing is that from an earlier version of hmd-eyes I can connect fine. It's just this latest version that is giving me trouble. Oh well.. just my 2 cents.
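One way to narrow down connectivity problems like this is to take Unity and NetMQ out of the equation and first test plain TCP reachability of the Pupil Remote port. A small sketch using only the Python standard library (the host and port are the ones mentioned in the messages above; adjust them to your setup):

```python
# Quick connectivity sanity check, independent of Unity/NetMQ:
# can we open a plain TCP connection to the Pupil Remote port at all?
import socket

def can_connect(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. can_connect("192.168.2.1", 50020) run from the machine running Unity.
# True  -> the service is reachable; the problem is on the plugin side.
# False -> look at firewalls, the bound interface, or the service itself.
```

If this returns True but Wireshark still shows no traffic from Unity, the plugin is likely never attempting the connection (wrong address string, or the connect code path isn't being reached).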

user-abff35 03 August, 2017, 01:10:46

@wrp i use the 2D calibration in HMD

user-fe23df 03 August, 2017, 05:45:03

@everyone does anyone know how I can turn off the gaze marker? Thanks!

user-fe23df 03 August, 2017, 05:46:03

@user-05da2f I had that problem at some point as well, but then it just started to magically work 😂

user-5874a4 03 August, 2017, 08:51:16

@user-fe23df CalibrationGL.cs, line 47: comment out the call to the Marker method. That's where the marker is drawn on the screen. I found this while debugging the code. I don't know if there is a better way to accomplish this.

user-fe23df 03 August, 2017, 09:11:01

@user-5874a4 thanks a lot! 😃

wrp 03 August, 2017, 09:14:18

@here if you find an issue in hmd-eyes, please make an issue on the repo: https://github.com/pupil-labs/hmd-eyes/issues so that it can be tracked/fixed

user-56b6a5 07 August, 2017, 07:02:24

Hi. I'm trying to do 3D calibration and get gaze data from the Unity HMD plugin, but it doesn't work… First, I tried 2D calibration; it worked. However, I failed to do 3D calibration. When I started calibration in 3D mode, I could only see a white circle target following my eyes, and nothing else happened. What should I do? I would really appreciate it if you answer.

user-56b6a5 07 August, 2017, 22:46:19

I think in 3D calibration it doesn't go to the next marker. Even though I changed the confidence condition from 0.4f to 0.0f, the white target didn't move. What's the problem?

user-fe23df 08 August, 2017, 08:45:40

@user-56b6a5 I also had a similar problem before, I think you should check the eye capture app if it's running properly and both eyes are detected well in the popped-up windows.

user-56b6a5 08 August, 2017, 08:59:11

@user-fe23df Thank you for your reply! When I did 2D calibration using pupil service, it seemed to be okay. The white target moved well. And both eyes were detected well in both 2D and 3D modes I think... Could you give me any advice, if you have?

user-fe23df 08 August, 2017, 09:08:34

@user-56b6a5 sorry, I have no idea then. In fact, I can't even do 2D calibration as the script switches to 3D whenever I press play. And I just realized our problems are not the same actually. Calibration ran ok for me, it's just that after calibration the marker was right on the camera. And that magically got fixed after a few days.

user-56b6a5 08 August, 2017, 09:11:42

@user-fe23df I envy you... 😂 Thank you very much!

user-fe23df 08 August, 2017, 09:12:15

@user-56b6a5 no problem at all, sorry I couldn't help :/

user-56b6a5 08 August, 2017, 09:27:18

@user-fe23df 😊

user-5874a4 08 August, 2017, 21:10:09

@user-56b6a5 You could also verify that in Pupil Capture, Calibration -> Method is set to "HMD Calibration 3D" and General -> detection and mapping mode is set to "3d". I noticed that sometimes these don't get set automatically during communication between Unity and Pupil Capture; the calibration mode change sometimes doesn't get reflected in Pupil Capture.

user-56b6a5 09 August, 2017, 02:07:37

@user-5874a4 Thank you for your reply! I use version 0.8.7 on Windows 10 and there is no HMD Calibration 3D option. Could that be the reason for my problem?

user-56b6a5 09 August, 2017, 02:08:09

Chat image

user-5ca684 09 August, 2017, 08:22:24

@user-5874a4 it is planned to implement a custom-inspector toggle to turn off the markers; however, our refactor also separates the visualization + calibration from the data-receiving part

user-5ca684 09 August, 2017, 08:29:33

@user-fe23df there is a dev branch that you can check out if you are interested

user-fe23df 09 August, 2017, 08:30:00

@user-5ca684 yeah, will do, thanks! 😃

user-5ca684 09 August, 2017, 08:31:02

@user-fe23df it is under heavy development but it should work, and I would love to hear your feedback

user-abff35 09 August, 2017, 13:41:56

Can you tell me how to display the tracking marker on the screen? I want to display a static marker in the middle of the screen to check the accuracy of the tracking. Where do I need to modify that part of the code? I didn't find it.

user-5874a4 10 August, 2017, 23:15:40

@user-56b6a5 Mine is version 0.9.6. That's probably the issue for you. Maybe you can update and see if the issue resolves

user-fe23df 11 August, 2017, 20:26:49

hi, just to make sure: gazepoint3d and gazenormals, like eyecenters, are values in local (headset) space right?

user-fe23df 11 August, 2017, 20:26:56

@user-5ca684

user-72b0ef 15 August, 2017, 09:33:41

Hi everyone, I am looking at the List3D calibration points, because we noticed that during calibration some of the points were out of our eyes' reach. Now we were wondering whether it is smarter to change the depth values of the calibration points rather than the x and y values of the points. Also we would like to know whether there is a reason every point is calibrated on 100f depth instead of every point changing in depth. Thanks!

user-72b0ef 15 August, 2017, 09:47:06

As for the last question, we can imagine that because we are changing the X and Y values to calibrate different angles of the eyes, we want to calibrate on different calibration depths too

user-8779ef 15 August, 2017, 23:25:44

Hi guys - big supporter here. Keep it open source. So, what's the word on the hope of getting the cameras behind the HMD optics, with hot plates to align the optical axis of the camera with the optics? After some use, it seems clear that the off-axis camera angles are costing accuracy inside the HMD

user-abff35 16 August, 2017, 12:19:54

I want to show the pupil data in the picture. I don't know the mapping between the panorama sphere and the image. Can someone give me some information?

papr 16 August, 2017, 12:38:47

@user-5ca684 Can you advise on the issue above?

user-abff35 16 August, 2017, 12:40:03

@papr 😀 okay

user-5ca684 16 August, 2017, 14:26:42

@user-72b0ef As far as I know it should not make a difference

user-fe23df 16 August, 2017, 14:27:30

@user-5ca684 I am recording from Unity but I keep getting this error:

FiniteStateMachineException: Req.XSend - cannot send another request
NetMQ.Core.Patterns.Req.XSend (NetMQ.Msg& msg)
NetMQ.Core.SocketBase.TrySend (NetMQ.Msg& msg, TimeSpan timeout, Boolean more)
NetMQ.NetMQSocket.TrySend (NetMQ.Msg& msg, TimeSpan timeout, Boolean more)
NetMQ.OutgoingSocketExtensions.Send (IOutgoingSocket socket, NetMQ.Msg& msg, Boolean more)
NetMQ.OutgoingSocketExtensions.SendFrame (IOutgoingSocket socket, System.String message, Boolean more)
PupilGazeTracker.GetPupilTimestamp () (at Assets/Scripts/PupilGazeTracker.cs:1773)
FFmpegOut.CameraCapture.OnRenderImage (UnityEngine.RenderTexture source, UnityEngine.RenderTexture destination) (at Assets/FFmpegOut/CameraCapture.cs:159)

user-fe23df 16 August, 2017, 14:28:18

this basically makes my experiment frame rate drop from 300 to 30 fps and eventually crashes Unity

user-fe23df 16 August, 2017, 14:28:41

Do you have any idea how I could fix it? thanks!

user-5ca684 16 August, 2017, 14:30:12

@user-fe23df I will take a look

user-fe23df 16 August, 2017, 14:34:14

@user-5ca684 thanks a lot!

user-fe23df 16 August, 2017, 14:41:07

oh yeah and following right after is "OnRenderImage() possibly didn't write anything to the destination texture!"

user-fe23df 16 August, 2017, 14:41:13

I suppose this is due to the error above

user-fe23df 16 August, 2017, 14:41:34

and this is probably why my videos in recordings are always broken...

user-5ca684 16 August, 2017, 14:41:51

@user-fe23df actually I just realized that the recording is being forced to 30

user-5ca684 16 August, 2017, 14:42:09

on purpose

user-5ca684 16 August, 2017, 14:42:30

and so is the whole of Unity

user-5ca684 16 August, 2017, 14:42:48

you can try to change that where it is being set

user-5ca684 16 August, 2017, 14:42:55

but I have to say it is risky

user-72b0ef 16 August, 2017, 14:43:16

@user-5ca684 So changing the depth of the calibration markers also does not change how well depth is perceived at the markers?

user-fe23df 16 August, 2017, 14:43:31

the whole OnRendering function is for recording the unity view right?

user-72b0ef 16 August, 2017, 14:43:40

@user-5ca684 Sometimes it seems the markers are more accurate on objects that are far away rather than close by

user-fe23df 16 August, 2017, 14:43:48

If I only need the eyetracking data then I suppose I can just not use it?

user-72b0ef 16 August, 2017, 14:43:59

further*

user-fe23df 16 August, 2017, 14:44:06

because I'm thinking of resorting to commenting out the part that causes the error

user-fe23df 16 August, 2017, 14:45:00

In OnRenderImage there's the timeStampList variable that is updated with PupilTimeStamp, I wonder what it is for exactly

user-5ca684 16 August, 2017, 14:45:33

@user-72b0ef honestly I think @mpk knows more about accuracy, but as far as I know the depth of the markers should not make a difference in calibration

user-5ca684 16 August, 2017, 14:46:18

@user-fe23df if you only need the eye track data then you could try

user-fe23df 16 August, 2017, 14:46:30

@user-5ca684 Thanks!

user-72b0ef 16 August, 2017, 14:52:34

@mpk Hi!! Could you tell me a bit more about the depth of the calibration markers? It seems that after calibration at 100f depth, objects that are further away tend to be more accurate for the marker. Is it possible to calibrate at varying depths?

mpk 16 August, 2017, 14:53:33

@user-72b0ef the 2d calibration is accurate for a certain depth. In your case my guess is that moving the markers further away helps, since your scene is generally further away as well.

mpk 16 August, 2017, 14:54:00

for 3d this applies as well. But here you can present markers at mixed depths.

mpk 16 August, 2017, 14:54:25

it is always advisable to calibrate at the range you are going to look at afterwards.

user-72b0ef 16 August, 2017, 14:55:23

@mpk I see, can I get an advantage by doubling the amount of calibration points and calibrating for 100f and for example 25f?

mpk 16 August, 2017, 14:55:38

yes. For 3d this is certainly true.

user-72b0ef 16 August, 2017, 14:55:55

@mpk That is really good news

mpk 16 August, 2017, 14:56:04

for 2d you will have to more or less decide on one depth.
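The advice above (double the points and calibrate at, e.g., 100f and 25f) can be sketched as a target generator. The grid layout and values are illustrative, not the plugin's actual point list:

```python
# Sketch: building 3d calibration targets at mixed depths, as suggested
# above (the same grid repeated at depths 100 and 25). The 3x3 layout
# and extent are illustrative, not the plugin's actual List3D points.

def calibration_points(depths=(100.0, 25.0), extent=0.3):
    """3x3 grid of offsets, repeated at each depth.
    Returns (x, y, z) tuples in a headset-local convention."""
    offsets = (-extent, 0.0, extent)
    return [(x, y, d) for d in depths for y in offsets for x in offsets]

pts = calibration_points()
print(len(pts))  # 9 points per depth x 2 depths -> 18
```

For 2d calibration the same generator would be called with a single-entry `depths` tuple, matching the "decide on one depth" advice.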

user-fe23df 16 August, 2017, 14:56:21

@mpk what would be the correct construct of a gaze cast for each individual eye? is it from eyecenter to Vector3 (-gazenormal.x, -gazenormal.y, gazenormal.z)?

user-fe23df 16 August, 2017, 14:56:52

as I do this I get the gaze generally going into the right direction where my eyes cast, but the two rays don't really converge

mpk 16 August, 2017, 14:57:34

@user-fe23df make sure that you are using the gaze_norm and eye center data from the gaze datum, NOT the pupil datum (but I must say I'm not sure if the naming is the same in Unity.)

user-fe23df 16 August, 2017, 14:58:11

I'm using the data that was specified from the ReadMe, which is under pupil.values

mpk 16 August, 2017, 14:58:14

the rays may not converge, but we also report the closest convergence point; it's the 3d gaze point.

mpk 16 August, 2017, 14:59:04

I would make the rays have the same magnitude as the 3d gaze point. That should roughly have the desired effect.
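The geometry behind this: the two per-eye rays are generally skew and never quite intersect, so the reported 3d gaze point is the point of closest approach between them. A pure-math sketch, independent of the plugin's API:

```python
# Geometry sketch for the discussion above: two per-eye rays generally
# don't intersect, so the "3d gaze point" can be taken as the midpoint
# of the shortest segment between them. Pure math, no plugin API.

def closest_point_between_rays(o0, d0, o1, d1):
    """Midpoint of the shortest segment between two rays o + t*d."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def add(a, b): return tuple(x + y for x, y in zip(a, b))
    def mul(a, s): return tuple(x * s for x in a)

    # Solve for t0, t1 minimizing |(o0 + t0*d0) - (o1 + t1*d1)|^2.
    w = sub(o0, o1)
    a, b, c = dot(d0, d0), dot(d0, d1), dot(d1, d1)
    d, e = dot(d0, w), dot(d1, w)
    denom = a * c - b * b            # assumes the rays are not parallel
    t0 = (b * e - c * d) / denom
    t1 = (a * e - b * d) / denom
    p0 = add(o0, mul(d0, t0))
    p1 = add(o1, mul(d1, t1))
    return mul(add(p0, p1), 0.5)

# Two eye centers 6 cm apart, both rays verging toward a point 1 m ahead:
gaze_point = closest_point_between_rays(
    (-0.03, 0.0, 0.0), (0.03, 0.0, 1.0),
    (0.03, 0.0, 0.0), (-0.03, 0.0, 1.0))
print(gaze_point)  # -> (0.0, 0.0, 1.0)
```

Scaling each eye's ray to the magnitude of this point, as suggested above, makes the ray endpoints land near the convergence point even when the rays themselves never meet.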

user-fe23df 16 August, 2017, 14:59:18

Yes, but when I look at a canvas at a certain depth and mark down where the 3 rays hit (left, right and main gaze point), they all are hitting at very different places

mpk 16 August, 2017, 14:59:43

That being said we are working on some super exciting updates for the 3d model tracker that will boost accuracy.

user-fe23df 16 August, 2017, 14:59:58

ah okay, I should try the magnitude trick then. Thanks!

mpk 16 August, 2017, 15:01:09

@user-fe23df If you are running Pupil Capture after 3d calibration, you can turn on the gaze mapping visualizer. It should give you some more insight!

user-abff35 17 August, 2017, 02:10:19

@user-5ca684 Could you advise on the issue above? 😩 😩

user-fe23df 17 August, 2017, 08:32:07

@user-5ca684 could you help me figure out the NetMQ bug I mentioned above? I would actually still want to get a video capture, and the error freezes Unity

user-fe23df 17 August, 2017, 12:12:00

also, it would be great if I can keep the framerate from dropping after pressing record

user-fe23df 17 August, 2017, 12:12:36

I tried to change the values and prevent the CameraCapture from changing the framerate, but it still gets dropped to around 30-40

user-fe23df 17 August, 2017, 12:13:39

I wonder why you guys want to keep it that low, since for any VR stuff the framerate should be at least 90 fps

user-72b0ef 17 August, 2017, 12:58:54

Hey again! I would like to ask a question about the calibration and tracking part. When we use 2D we can follow each eye independently and put on gaze markers (red, green, blue). We work with patients that suffer from brain injuries, so it is really useful to track each eye separately. However, we would like to step up to 3D but still show markers for each eye independently. The problem now is that with 3D only one marker shows up (because the eyes will always intersect with normal 3D tracking), but we would like to track each eye separately and still make use of the depth parameter. My question: can it be done?

mpk 17 August, 2017, 13:20:11

Yes, this can be done: you will have to look at the eye normal and eye position for each eye in the 3d gaze datum dicts. However, the 3d calibration assumes that the subject's binocular vergence is functional.

mpk 17 August, 2017, 13:20:40

if you want to calibrate each eye individually you will have to modify the source code to run two monocular calibrations separately.

mpk 17 August, 2017, 13:21:41

I think I should add that when looking at eye_normal in the gaze dicts you will have to assume a depth since it can no longer be inferred from vergence.

user-72b0ef 17 August, 2017, 13:22:09

exactly!

mpk 17 August, 2017, 13:22:11

in summary: You will get two gaze directions not two 3d gaze points.
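For the dual-monocular case this means choosing a depth yourself and placing each eye's gaze point along its own ray. A small sketch (the names are illustrative, not the plugin's API):

```python
# Sketch for the dual-monocular case above: with one gaze direction per
# eye, depth can't come from vergence, so you pick an assumed depth and
# place each eye's gaze point along its own ray. Names are illustrative.
import math

def gaze_point_at_depth(eye_center, gaze_normal, depth):
    """Point at `depth` along the (normalized) gaze direction from the eye."""
    norm = math.sqrt(sum(c * c for c in gaze_normal))
    return tuple(e + depth * g / norm
                 for e, g in zip(eye_center, gaze_normal))

print(gaze_point_at_depth((0.03, 0.0, 0.0), (0.0, 0.0, 2.0), 1.0))
# -> (0.03, 0.0, 1.0)
```

The assumed depth would typically match the distance of whatever surface or stimulus the subject is looking at, per the calibration-range advice earlier in the thread.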

user-72b0ef 17 August, 2017, 13:23:46

@mpk Sounds good! Thanks for the answer. I can imagine it is hard to assume a depth when somebody suffers from "cross-eyes"

user-72b0ef 17 August, 2017, 13:24:10

Also calibrating with cross eyes in binocular mode would be impossible

user-72b0ef 17 August, 2017, 13:24:23

so I think double monocular calibration is the way to go

user-72b0ef 17 August, 2017, 13:26:11

still what would be the benefit of choosing 3D calibration over 2D calibration if we are using dual monocular plugin?

user-fe23df 17 August, 2017, 13:35:27

@user-72b0ef I use individual eye data for 3d. What I did was shoot raycasts with eyecenters as the origin and gazenormals as the direction of the ray ((-x, -y, z) though), and when one of the rays hits a collider you can position a new game object on the hit point; those should be your individual eye markers 😃
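The per-eye raycast described above, reduced to plain math: flip the gaze normal's x and y as reported (a convention worth verifying for your plugin version), then intersect the ray with a plane standing in for the collider:

```python
# Sketch of the per-eye "ray hits a surface" idea above, as plain math:
# flip the gaze normal's x/y as described (reportedly needed for this
# plugin -- verify for your version), then intersect the ray with a
# plane standing in for the Unity collider.

def flip_xy(v):
    return (-v[0], -v[1], v[2])

def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Intersection of ray origin + t*direction with a plane, or None."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None                      # ray parallel to plane
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:
        return None                      # plane is behind the eye
    return tuple(o + t * d for o, d in zip(origin, direction))

# Right eye at x = +0.03, looking straight ahead at a wall 2 m away:
hit = ray_plane_hit((0.03, 0.0, 0.0), flip_xy((0.0, 0.0, 1.0)),
                    (0.0, 0.0, 2.0), (0.0, 0.0, -1.0))
print(hit)  # -> (0.03, 0.0, 2.0)
```

In Unity itself this corresponds to `Physics.Raycast` with the eye center as origin; the sketch just makes the coordinate flip and the hit-point math explicit.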

user-fe23df 17 August, 2017, 13:35:37

I hope this helps somehow

user-fe23df 17 August, 2017, 13:36:15

I have a bit of a problem with the accuracy of the hits, so if you implement in this direction it would be nice to hear how it performs for you

user-fe23df 17 August, 2017, 13:36:19

😄

user-72b0ef 17 August, 2017, 13:36:36

@user-fe23df Yes, this is very helpful! Thank you :). However, we already do this for 2D

user-72b0ef 17 August, 2017, 13:36:43

The results for me are pretty accurate!

user-fe23df 17 August, 2017, 13:37:28

what do you mean you already do it for 2D? as in you project your 2D data into 3D space?

user-72b0ef 17 August, 2017, 13:37:43

yep

user-fe23df 17 August, 2017, 13:38:07

yeah it seems like doing 2D then converting to 3D is more accurate than everything 3D for me

user-fe23df 17 August, 2017, 13:38:24

I'd attribute that to the binocular/dual monocular mapping

user-72b0ef 17 August, 2017, 13:40:34

Exactly my point

user-fe23df 17 August, 2017, 14:02:49

@mpk when this error happens while the GetPupilTimeStamp function is being called (FiniteStateMachineException: Req.XSend - cannot send another request), does that mean the stream of data is overloaded?

user-fe23df 17 August, 2017, 14:51:01

@user-72b0ef how did you deal with the framerate drop in VR?

mpk 17 August, 2017, 14:53:37

@user-fe23df I don't think that is the case. My guess is something is going on with the Unity plugin

user-fe23df 17 August, 2017, 14:54:08

@mpk so even when you hit record the frame rate doesn't change at all?

mpk 17 August, 2017, 14:54:09

@user-5ca684 is the one with knowledge on this.

user-fe23df 17 August, 2017, 14:54:23

oh sorry I thought you were @user-72b0ef

mpk 17 August, 2017, 14:54:27

@user-fe23df I was referring to the FiniteStateMachineException: Req.XSend - cannot send another request that occurs when the GetPupilTimeStamp function is called.

user-fe23df 17 August, 2017, 14:55:45

@mpk yeah I suppose so, because pupil capture works fine without any problem

user-5ca684 18 August, 2017, 07:20:19

@user-fe23df As far as I know the framerate is actually set when you start recording. This error message, however, looks to me like at a certain point the plugin is sending more requests than it can handle
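The exception in question is characteristic of ZeroMQ/NetMQ REQ sockets: a REQ socket must strictly alternate send and receive, and a second send before the previous reply has been read is rejected by the socket's state machine. A minimal model of that rule (illustrative only, not NetMQ's implementation):

```python
# The FiniteStateMachineException above is characteristic of ZeroMQ REQ
# sockets: a REQ socket must strictly alternate send -> receive, and a
# second send before the reply arrives is rejected. A tiny model:

class ReqSocketModel:
    """Minimal model of a ZeroMQ REQ socket's send/recv state machine."""
    def __init__(self):
        self.awaiting_reply = False

    def send(self, msg):
        if self.awaiting_reply:
            raise RuntimeError("cannot send another request")  # ~ Req.XSend
        self.awaiting_reply = True

    def recv(self):
        if not self.awaiting_reply:
            raise RuntimeError("cannot receive before sending")
        self.awaiting_reply = False
        return "reply"

s = ReqSocketModel()
s.send("t")          # ok
try:
    s.send("t")      # second send without recv -> rejected
except RuntimeError as e:
    print(e)
```

One plausible reading of the stack trace earlier in the thread: GetPupilTimestamp() sends on a shared REQ socket from OnRenderImage every frame, so a second send can race a still-pending reply and trigger exactly this error.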

user-5ca684 18 August, 2017, 07:20:56

@user-fe23df how do you have your Unity set up? Have you tinkered with the framerates in any way?

user-5ca684 18 August, 2017, 07:21:20

@user-fe23df have you tried recording out of the box?

user-e14fbd 18 August, 2017, 12:41:13

Hi everybody. For our Unity project, we currently want to detect both the duration of a blink of one eye and the gaze direction of the other eye simultaneously. Any ideas on how to do that?

user-e14fbd 18 August, 2017, 12:41:40

We're currently testing the blink detection plugin, but we're open to any input or advice we can get

user-72b0ef 18 August, 2017, 14:39:14

@user-fe23df I don't have any frame drops at all 😮

user-72b0ef 18 August, 2017, 14:39:34

@user-fe23df Only the capture rate goes down a bit, but after calibrating it is good again

user-fe23df 18 August, 2017, 14:57:02

@user-5ca684 yes I believe the framerate drop and the error message are two different problems

user-fe23df 18 August, 2017, 14:58:23

@user-72b0ef ah okay. And this is when you record right?

user-fe23df 18 August, 2017, 14:58:33

also, thanks a lot for your answers! 😄

user-72b0ef 18 August, 2017, 15:10:17

I don't use the FFmpeg plugin to record, if that is what you mean....

user-fe23df 18 August, 2017, 15:10:38

Ah yeah, that's what I mean 😃

user-72b0ef 18 August, 2017, 15:10:47

I only know about the capture rate decreasing when calibrating

user-72b0ef 18 August, 2017, 15:10:58

ah, forgive me then, I thought you were talking about something else

user-fe23df 18 August, 2017, 15:11:09

it's okay, it's great to hear though :))

user-72b0ef 18 August, 2017, 15:11:42

I think this is because, since we require 120 samples and we use 120 Hz cameras, we don't want calibration to end within 1 second 😛

user-72b0ef 18 August, 2017, 15:12:05

so the capture rate reduces to about 30

user-72b0ef 18 August, 2017, 15:12:24

so it takes 4 whole seconds of good eye detections in order to complete one calibration point
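The arithmetic in the last few messages, spelled out:

```python
# 120 required samples at the full 120 Hz capture rate would finish a
# calibration point in ~1 second; at the reduced ~30 Hz rate it takes
# ~4 seconds per point, matching the behavior described above.
samples_required = 120
full_rate_hz = 120
reduced_rate_hz = 30

print(samples_required / full_rate_hz)     # -> 1.0 (second per point)
print(samples_required / reduced_rate_hz)  # -> 4.0 (seconds per point)
```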

user-fe23df 18 August, 2017, 15:12:55

I don't really notice any drop during calibration for our project

user-fe23df 18 August, 2017, 15:13:06

maybe there is a drop also, but not too drastic

user-72b0ef 18 August, 2017, 15:13:11

its not a framedrop

user-72b0ef 18 August, 2017, 15:13:16

its a capturerate drop

user-fe23df 18 August, 2017, 15:13:21

oh yeah sorry

user-fe23df 18 August, 2017, 15:13:43

I didn't notice that one either

user-fe23df 18 August, 2017, 15:14:06

the pupil capture's rate is always kept at 30 and each eye camera's is around 80-90

user-fe23df 18 August, 2017, 15:14:19

for each eye, not sure what the rate is during calibration

user-72b0ef 18 August, 2017, 15:14:27

we have got values of around 240

user-72b0ef 18 August, 2017, 15:14:46

since we've got 2x 120 Hz cameras, it can go up to 2x 120 capture rate

user-fe23df 18 August, 2017, 15:14:58

hmm interesting

user-72b0ef 18 August, 2017, 15:15:11

I can make a screenshot of it if you would like

user-fe23df 18 August, 2017, 15:15:11

I should take a look again when I get back to the lab

user-fe23df 18 August, 2017, 15:15:19

oh, that would be great of course! 😃

user-72b0ef 18 August, 2017, 15:15:53

somebody is playing space pirates now, so it is great for experimenting with the cameras lol

user-72b0ef 18 August, 2017, 15:20:37

hmmm, seems like it goes up to 360 for just a few seconds after calibration

user-72b0ef 18 August, 2017, 15:20:42

but then it settles back to 120

user-fe23df 18 August, 2017, 15:20:59

hmm interesting

user-fe23df 18 August, 2017, 15:21:10

I now wonder

user-fe23df 18 August, 2017, 15:21:40

maybe the 30 that I remember is only during recording with the ffmpeg plugin

user-fe23df 18 August, 2017, 15:22:01

I do recall seeing 120 fps at some point for capture as well

user-5874a4 19 August, 2017, 05:41:10

We want to study how the perception of depth differs in VR compared to the normal world. Has anyone used the depth (z-coordinate) accurately? Is it really working? I don't see it anywhere near accurate. I placed two balls at varying depths and focused on them separately. Ideally, shouldn't you expect the markers to be drawn at the places the balls are placed? I don't see that happening.

mpk 19 August, 2017, 06:08:57

@user-5874a4 the 3d calibration is at a proof-of-concept stage for vr.

mpk 19 August, 2017, 06:09:23

it works much better using a normal headset in the real world.

mpk 19 August, 2017, 06:09:54

We are working on improving the 3d calibration in vr. Any contributions and feedback are always welcome!

user-5874a4 19 August, 2017, 06:21:19

@mpk Thanks for letting me know. Just curious: what is it that makes 3d calibration and computing depth in VR difficult? I understand that in the normal case we compute depth by the intersection of gaze normals.

user-5874a4 19 August, 2017, 06:21:48

Also, how does it work better using a normal headset in the real world?

mpk 19 August, 2017, 10:17:44

@muzammil#8393 I think we are still doing something not quite right with the position of the eyes in vr space.

wrp 24 August, 2017, 02:30:27

@user-1eecd4 questions migrated from core channel. Questions:

1 - Are Pupil Service and Pupil Capture the same? If not, which one of them is more suitable to work with Unity? I'm running them and Unity locally on Windows 10.

2 - What are the differences between the Camera Image, ROI, and Algorithm modes? Which one is more suitable to work with Unity? Is it possible to modify the green circle that shows up around the eye's boundary?

3 - I have adjusted my HMD to fit my face properly; however, during the 3d calibration I won't be able to see the white markers properly when they are at the corners.

4 - How could the fixation and blinking plugins work in Unity?
wrp 24 August, 2017, 02:33:06

Responses: 1. Pupil Service and/or Pupil Capture with hmd-eyes - You can use either Pupil Service or Pupil Capture. Pupil Service was designed to be used by other clients (hence "service"). Pupil Service is designed to be controlled via messages sent over the network. Therefore you should make sure that you properly close Pupil Service (otherwise you may have zombie processes of Pupil Service running in the background). Pupil Capture also works and can be used with the unity plugin.

wrp 24 August, 2017, 02:39:21
  2. Eye image modes - (1) Algorithm view visualizes the pupil detection algorithm - you should only enable this mode when modifying the pupil detection parameters, as it is more cpu intensive. (2) ROI - sets a region of interest for the pupil detection algorithm. You can drag the blue circles to change the ROI. (3) Camera Image is the default mode and what should be used. Green circle - the green circle you see in the eye window is a visualization of the eyeball sphere from the 3d pupil detector.
wrp 24 August, 2017, 02:40:05
  3. Calibration in Unity3d - @user-5ca684 can you provide some tips on moving the position of the calibration marker?
wrp 24 August, 2017, 02:40:52
  4. Blink detection, fixation, and plugins in Unity - You need to subscribe to these messages generated by Pupil Capture or Pupil Service in order to use them in your Unity application
user-1eecd4 24 August, 2017, 09:36:39

Thanks a lot @wrp for the clarification. I hope you don't mind if I come back with other questions 😀

wrp 24 August, 2017, 09:47:12

@user-1eecd4 you're welcome. Please feel free to ask questions - that is one of the main purposes of these chats 🙂

user-1eecd4 24 August, 2017, 09:49:51

glad to hear that @wrp

user-abff35 30 August, 2017, 10:11:56

Hi, I have a problem. I used the 2D calibration mode in the HMD. And when I finished the calibration, the tracking results did not seem to be very good and the error was large. For example, in this picture I see the green dot in the HMD, but Pupil reads the dots at the red dot. What can I do to improve it?

user-abff35 30 August, 2017, 10:12:10

Chat image

user-8779ef 30 August, 2017, 15:15:07

Hey guys, I'm curious - what's holding you back from using an IR hotplate and placing the camera inside to improve the eye imagery? Is it the lensing? Space? The complicated installation process?

mpk 30 August, 2017, 15:36:53

All three! We tried but it was not possible to do it as an add-on

user-5874a4 31 August, 2017, 05:27:54

Guys, I am not sure how to use the recording feature in Unity. When I click "Record" after calibration, I see a square-shaped transparent box blinking in the middle of my screen, which goes away after the recording is stopped. Now, I see an error in my console about a missing directory. When I create that directory, the error goes away. Instead, I see a debug log saying it's creating the video file with a certain name. The application gets stuck saying "Processing…" and I see no video file actually being created. Can anyone point out if I am doing anything wrong?

user-9a0a2b 31 August, 2017, 12:20:57

Chat image

user-9a0a2b 31 August, 2017, 12:29:00

Hello, we're trying to set up the PupilGazeTracker for Unity; it successfully connects and gives a capture rate, but does not update the left eye / right eye in the on-screen UI. However, when calling Pupil.values.GazePoint3D it does give us the Vector3 (position). Is this position in local or world coordinates? In any case it does not seem to reflect the correct location in the scene, and the update rate is not very good. Any suggestions as to which method we should use to convert the input to a world-space position in Unity?

user-c68bba 31 August, 2017, 13:46:13

the camera always reinits, why? 😅

Chat image

mpk 31 August, 2017, 13:57:04

most likely the software configured to read from only one camera.

End of August archive