πŸ‘“ neon



mpk 29 November, 2022, 11:28:43

For the last ~3 years we have quietly been working on new hardware. We took everything we have learned over the past 10 years of making eye tracking tools and talking with researchers in all kinds of domains to make something that truly empowers them. As with Pupil Invisible we wanted to push the boundary of what had been done and was possible… and we think we arrived at something special.

We are excited to finally lift the veil and show what we have come up with! Go check out Neon https://pupil-labs.com/products/neon

We would love to hear what you think. Looking forward to your questions right here in this chat.

user-d551e1 29 November, 2022, 11:35:07

Entombed? Find a better word to use there.

wrp 29 November, 2022, 11:36:10

On it, thanks πŸ™‡

user-d551e1 29 November, 2022, 11:43:54

Researchers care about latency, triggers, speed, accuracy, Psychtoolbox integration, and using high performance display screens. Maybe help clarify these items with some hard numbers and information? I ultimately did not use Pupil for research because 200 Hz is not enough, there was no way that I found to integrate my external hardware triggers into the pupil data, and the current integration with Psychtoolbox and external screens was not mature enough. I hope you can fix these problems because I would love to have a sub-$35,000 eye tracker that really can work in research. Thanks for listening to an old researcher’s rantings πŸ™‚

marc 29 November, 2022, 12:53:34

Thanks for the feedback @user-d551e1! We are aware that some research areas require very high sampling rates beyond 200 Hz. This is quite difficult to accommodate in head-mounted eye trackers using current camera technology though. We may be able to make progress on that in future hardware generations, but until then we need to point you to remote eye trackers if that is a strict requirement!

Regarding integration with external hardware triggers, there are a couple of options:
- You could use LSL via our relay: https://pupil-invisible-lsl-relay.readthedocs.io/en/stable/
- You could use the real-time API, e.g. using events (see the sketch below): https://docs.pupil-labs.com/invisible/how-tos/integrate-with-the-real-time-api/introduction/ and https://docs.pupil-labs.com/invisible/how-tos/integrate-with-the-real-time-api/track-your-experiment-progress-using-events/
- To achieve super low latency you could connect the eye tracker via a network cable to other devices: https://docs.pupil-labs.com/invisible/explainers/glasses-and-companion-device/#using-a-usb-c-hub
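
As a rough illustration of the events option, a minimal sketch using the pupil-labs-realtime-api Python package might look like this (the event name "stimulus_onset" is just an example):

```python
# Minimal sketch: send a trigger/event to the Companion device using the
# pupil-labs-realtime-api simple client (pip install pupil-labs-realtime-api).
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()  # find a Companion device on the local network

# "stimulus_onset" is an illustrative event name; the event is timestamped
# on the device and saved alongside the recording.
device.send_event("stimulus_onset")

device.close()
```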

More precise numbers on accuracy, speed, etc. will be available as we get closer to the official release! We will prepare a white paper with a thorough evaluation similar to what we had for Pupil Invisible: https://arxiv.org/pdf/2009.00508

user-057596 29 November, 2022, 13:31:04

Very interesting development! When do you see the AR aspect being available? This would be an extremely useful addition for what we want to do in medical training.

user-057596 29 November, 2022, 14:37:41

Also, if Neon can go directly into Pupil Capture, does that mean that our software will work directly with Neon? It was integrated for Capture but didn’t work with Invisible. Furthermore, the handset for Neon is the OnePlus 8T, which is the same as for Invisible, so can we use the ones we already have with the Neon glasses, and if yes, will the earlier versions of the OnePlus also work?

nmt 29 November, 2022, 16:03:38

Hey @user-057596 πŸ‘‹
- The AR headset visualisation is just a concept. If you want to build prototypes, you can get the module and attach it to your AR device with a DIY mount.
- Neon will be compatible with the Core pipeline when connected to Pupil Capture, yes. Note, however, that NeonNet will only run on the Neon Companion Device, not in Pupil Capture.
- Neon supports the OnePlus 8 and OnePlus 8T. If you bought Pupil Invisible after Spring 2021, you can use the Companion that came with it for Neon as a second device πŸ™‚

user-b14f98 29 November, 2022, 15:38:06

Hey folks - Any comments on VR integrations?

user-3cff0d 29 November, 2022, 16:03:42

How detailed is the whitepaper for the gaze estimation NN going to be?

user-92dca7 30 November, 2022, 08:46:04

Hey @user-3cff0d, the final form of the white paper is still shaping up. You can expect a similar level of detail as in our white paper on Pupil Invisible (https://arxiv.org/abs/2009.00508).

user-3cff0d 29 November, 2022, 16:04:01

In terms of architecture

nmt 29 November, 2022, 16:08:07

Hey @user-b14f98! Similar to AR, if you want to build prototypes, you can get the module and attach it to your VR device with a DIY mount πŸ‘

user-b14f98 29 November, 2022, 16:33:30

Any initial idea of which current headsets the unit fits inside without headset modification?

user-057596 29 November, 2022, 16:17:55

Hi Neil, thanks for that. Will you be able to stream the world view camera and eye tracking data to Pupil Capture, and if yes, will it be in the same format as it was for the Pupil Core glasses, so we can use our own analysis software without having to modify it for the new system?

nmt 29 November, 2022, 16:39:36

Yes, the procedure would be the same as with Core: connect Neon to a computer running Pupil Capture (via USB cable), calibrate, and record in the native Pupil Capture format.
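
For existing analysis tooling, a minimal sketch of reading gaze from Pupil Capture, assuming Neon data is exposed through the same Network API (ZMQ + msgpack) as Pupil Core, could look like this:

```python
# Minimal sketch, assuming Neon in Pupil Capture streams via the same
# Network API as Pupil Core. Requires pyzmq and msgpack.
import zmq
import msgpack

ctx = zmq.Context()

# Pupil Remote listens on port 50020 by default
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# Ask Pupil Remote for the SUB port, then subscribe to gaze topics
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.unpackb(payload)
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])
```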

user-057596 29 November, 2022, 16:41:07

Excellent πŸ‘πŸ» thanks Neil

nmt 29 November, 2022, 16:48:34

Note that one of the major strengths of Neon is that it will provide eye-state and pupillometry in addition to robust gaze estimation via the Neon Companion App (unlike Pupil Invisible). So there is no hard requirement for Pupil Capture, unless you have existing workflows of course!

mpk 30 November, 2022, 07:53:47

I want to add here that we will roll this feature out via an app update in Q1/Q2.

user-7ff310 24 January, 2023, 19:33:54

Hi! Regarding the eye 3d pose data, I noticed your website says "in Pupil Cloud". Does that mean it will not be available via the real-time API? The reason I'm asking is that for my application I need low-latency eye pose data. Ideally I would process it in real time via my own custom app installed on the Companion device. Would that be possible? Thanks!

user-c6ead9 29 November, 2022, 23:17:53

Regarding Neon; are the frames interchangeable? If so, how much would just a frame cost?

mpk 30 November, 2022, 07:55:40

Yes, you can change them; all you need is a small hex (Allen) key. We will sell the frames without the module as well. The cost is not yet final, but I would say between ~200€ and ~600€ depending on the type.

user-057596 30 November, 2022, 10:34:37

Hi, will you still be able to stream the world view camera and control the Companion device from an iPad using the QR code?

marc 30 November, 2022, 10:38:23

Yes, the monitor app behind the QR code and the real-time API will be the same as for Pupil Invisible!
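
For reference, a minimal streaming sketch, assuming Neon works with the same pupil-labs-realtime-api simple client as Pupil Invisible, might look like this:

```python
# Minimal sketch, assuming Neon uses the same pupil-labs-realtime-api package
# as Pupil Invisible: receive matched scene frames and gaze over the network.
import cv2
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()

while True:
    frame, gaze = device.receive_matched_scene_video_frame_and_gaze()
    img = frame.bgr_pixels
    # Draw the gaze point (scene-camera pixel coordinates) onto the frame
    cv2.circle(img, (int(gaze.x), int(gaze.y)), 20, (0, 0, 255), 4)
    cv2.imshow("Scene camera with gaze", img)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

device.close()
```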

user-057596 30 November, 2022, 10:45:09

Thanks Marc πŸ‘πŸ»

user-b14f98 30 November, 2022, 16:23:17

Hey guys, I assume that you have tested its compatibility with some existing consumer grade HMDs for VR. Can you say whether the unit does or does not fit inside any of them without headset modification?

nmt 30 November, 2022, 17:26:10

Did you have a particular HMD in mind?

user-b14f98 30 November, 2022, 17:45:04

Index, Vive Pro 2, Quest Pro. Really, any PC-connected HMD.

End of November archive