We are using Windows, on a (high-spec) laptop. We are also using it on a desktop PC, so there is a chance to compare the two.
One possible explanation is that the laptop's CPU resources are being maxed out when streaming the data. You can check this with a resource monitor.
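For reference, a minimal sketch of how you could log CPU load while the streaming script runs (this assumes the third-party psutil package is installed; the interval and duration are arbitrary):

```python
# Log overall CPU utilisation while your streaming script is running.
# Requires the third-party package psutil (pip install psutil).
import time
import psutil

def log_cpu_load(duration_s=30, interval_s=1.0):
    """Print CPU utilisation once per interval for duration_s seconds."""
    end = time.time() + duration_s
    while time.time() < end:
        # cpu_percent blocks for interval_s seconds and returns the average load in %
        load = psutil.cpu_percent(interval=interval_s)
        print(f"CPU load: {load:.1f}%")

if __name__ == "__main__":
    log_cpu_load()
```
If the laptop sits near 100% while the desktop does not, that would support the CPU-bound explanation.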
Hi - any thoughts on why the Neon folders (Neon and Neon Export) don't include any of the .csv files - gaze, imu, etc?
Hi Neil - I'm only looking at the files on the internal storage of the phone at this point (using my PC linked to the phone via USB). Nothing else on the phone besides what it came with and Neon Companion.
Also, I'm recording this directly to phone storage - I'm not live streaming anything during records.
Hi @user-c10c09! I wasn't responding to your message 🙂
ha! I thought it applied!
Apologies - need my eyes tested 🙂
Hey @user-f6ec36 👋! Responding to your message (https://discord.com/channels/285728493612957698/285728493612957698/1125395841155338331) here.
Regarding your question, Neon relies on a deep learning approach and provides calibration-free gaze estimation. However, if you notice a subject-specific offset that you would like to correct, you have this option in the Companion App on the phone.
Please also note that we offer a 30-day return, such that you can try Neon and return it if it doesn't meet your requirements. You pay shipping costs, we refund the price of the hardware.
I hope this helps!
No problem! The recommended workflow is to upload the recordings to Pupil Cloud, where you can generate and download the .csv files you mention. Read more about that here: https://docs.pupil-labs.com/neon/getting-started/understand-the-ecosystem/#pupil-cloud. The files stored on the phone are intermediate/binary formats that aren't human readable.
Perfect - thanks Neil!
Hi, this is about Neon.
Hi Neil
Nadia, I would not want to return it. I want to buy more 🙂. Just asking about the support system, and whether the glasses are sturdy.
That is great to hear @user-f6ec36! To have a better idea of Neon's performance and accuracy, I encourage you to have a look at Neon's website, where we have posted videos of different use cases (e.g., climbing, biking, playing tennis or piano among others): https://pupil-labs.com/products/neon/technology.
Thanks Nadia, checked. Is there any place I can look at the metrics in Excel as part of the output?
Yes, you can see the full list of exported data in our documentation https://docs.pupil-labs.com/export-formats/recording-data/neon/
I can see what it can do but not any metrics..
The link I sent you in my previous message provides you with an overview of the csv files you can obtain and the associated metrics, e.g., the gaze.csv (https://docs.pupil-labs.com/export-formats/recording-data/neon/#gaze-csv) provides you with the gaze x,y coordinates among others, the fixations.csv (https://docs.pupil-labs.com/export-formats/recording-data/neon/#fixations-csv) provides you with the fixation data, etc. If you want a more high-level overview of the data streams provided by Neon, please check out this link: https://docs.pupil-labs.com/neon/basic-concepts/data-streams/
Nadia, I'm unable to download the CSV files - it's only directing me to the page. Can I download them please?
Thank you Nadia. And what is the normal ship-out period post order confirmation?
The current order fulfilment time for Neon is around 4 weeks.
OK cool, and how long is shipping to India based on previous experience, please?
Hi @user-f6ec36 👋! Apologies for the delayed response. Shipping to India should take an extra 3-5 days.
Hi there, we are testing the rate of data flow of the eye video on both a PC and a laptop with the Pupil Neon glasses, using the simple code examples of the real-time API. It is around 60 samples per second on the laptop and 200 per second on the PC. We would like to know how we can get 200 samples per second on the laptop. Our laptop is an Alienware; info attached:
Is your laptop connected to the network in the same way the PC is (wifi, ethernet, etc)?
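For reference, here is a minimal sketch of how such a samples-per-second measurement might look with the real-time API (assuming the pupil-labs-realtime-api Python package; it counts gaze samples, which should arrive at up to 200 Hz, and the 10-second window is arbitrary):

```python
# Count how many gaze samples per second arrive over the real-time API.
# Assumes the pupil-labs-realtime-api package is installed and the Companion
# device is reachable on the same network.
import time
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()  # blocks until a Companion device is found

n_samples = 0
t_start = time.time()
while time.time() - t_start < 10.0:  # measure for ~10 seconds
    device.receive_gaze_datum()      # blocks until the next sample arrives
    n_samples += 1

print(f"Received {n_samples / (time.time() - t_start):.1f} samples/s")
device.close()
```
Running the same script on both machines, connected to the network in the same way, keeps the comparison fair.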
Hi
Def today!
I requested a Neon quote from the sales team yesterday. When can I expect a reply?
Thanks Moritz
Hi, I'm setting up our Neon device, but when I try to stream it to the iPad I'm getting the gaze circle but the world view is gray.
No worries. We have not heard from Miguel from the sales team about the quote and some queries we had. Can you chase this up please?
Hi @user-f6ec36! Apologies for the delay. We will make sure you have the email with this information today. Thank you for your understanding!
No worries.
Got it Nadia, thanks. I have a query before we choose our bundle. Do clients typically add the power adjustment package in the first order itself, to be safe, or is the standard fine?
Hi @user-f6ec36! Could you please clarify? I am not sure I understand your question. Are you referring to the "I can see clearly now" frame that is designed to hold prescription lenses?
Yes Nadia that is the one.
Thank you for clarifying @user-f6ec36. This frame is designed to hold prescription lenses, and it has a quick-change feature to easily switch out the lenses for different prescriptions. This makes it easy to adjust to new subjects quickly. Lens kits will be available covering a wide range of prescription values (-3 up to +3 diopters in steps of 0.5). You can see it on our website (it's called "I can see clearly now"): https://pupil-labs.com/products/neon/shop. Please let me know if this information answers your question! 🙂
Yes nkp .. Clarified
Can the eye tracking module be easily taken in and out of the Just Act Natural frame? We're interested in trying the Neon with the JAN frame, but we want the ability to make our own frame if the JAN does not work for us
See also this page in the docs with info on both swapping frames and designing your own frame! https://docs.pupil-labs.com/neon/glasses-and-companion/hardware-handling/frames/
Hi @user-edef2b 👋 It is indeed feasible to remove it by loosening two screws located on the back of the module. To help you develop your own custom frame, we offer the Bare Metal nest, which includes just the electrical interface + open geometries to assist you in this process. If you're interested, please send an email to info@pupil-labs.com
Thank you!
I could not; it is only taking me to a page.
Just for confirmation, here is the link again that brings you to the page with the zip folders that you can download: https://drive.google.com/drive/folders/1Jg-IyTe4nvJdY9tdYHos6VEqc8XEFkDT Can you please confirm that you can access the folders?
Yes thanks, I could. The Mac is having trouble with the csv files - it only prompts and doesn't open the tables. Let me work something out. Thanks so much.
@user-f6ec36 which browser are you using? If you are using Safari, please have a look at this link if you keep having problems with downloading the data: https://docs.pupil-labs.com/invisible/troubleshooting/#my-enrichment-download-contains-only-an-info-json-file-and-nothing-else
OK got it .. Nadia thanks
Hi PL, I recently received an error while recording with the Neon device saying that the extimu device stopped working. I could not find the error code in the documentation. Can you please help me understand what might have caused this error for future cases?
Hey @user-ccf2f6 👋! Which version of the Neon App are you using? Please upgrade to the Neon Companion App version 2.6.30-prod.
Hey! I'm having trouble recording in a darker area, on a flight simulator. I need the Neon glasses to be able to record the information on the dash of the aircraft, yet it glares and looks quite blurry. Any recommendations?
In low-light conditions, camera sensors need a longer exposure time in order to capture a good image. The effect of this is motion blur. You can mitigate the blur by setting a manual exposure to a relatively low value, but this comes at the cost of having darker images. It's a bit of a balancing act in low-light, and essentially impossible to get the best of both worlds.
A colleague points out to me that, with auto-exposure still enabled, you may find improved results by tweaking the backlight compensation value instead.
When you invite someone to be a member of a particular workspace and they click the "join workspace" link in the sent email, are they meant to be taken straight into that workspace, or do they have to log in with the account's details?
The new member has to be logged in to accept the invitation. If they are not logged in already when clicking the accept link, they will land at a login form.
Hi @marc I'm confused, do they have to be logged into the account that is inviting them to join a workspace, as this would give them access as an owner?
No, the invitee needs to be logged into their own account, not into the inviter's account!
So can a student set up an account without having purchased glasses and then be invited to a particular workspace of a University department's account at one of the designated access levels?
Yes! That's the intended use case!
Another question! Is there a way to see the metadata collected by the Neon? When it comes to research we need the metadata to support findings in our papers! Thanks all!
Hi @user-a5c587 👋! What kind of metadata are you looking for? Is the data in the info.json sufficient? https://docs.pupil-labs.com/export-formats/recording-data/neon/#info-json
Is it possible to change the resolution of the eye cameras to get higher-resolution frames?
Hi @user-25fb27 👋! In the latest version of the Neon Companion App you have the option in the Settings to change the Gaze data rate. The highest is 200 Hz.
@user-25fb27, I'm sorry, I misunderstood your question. Correction: The resolution of the eye cameras cannot be modified.
Hi team, is there a way to download the eye videos on their own? Additionally, is there an enrichment where we can download the world video with the eye videos overlayed? Thank you!
Hello, any news about the Q3 pupillometry release? 🙂
Hi @user-f77049 👋! I know how excited we all are about robust pupillometry with Neon, and I understand your anticipation. Unfortunately, we will need a little more time, and the feature won't be ready until later in the third quarter (Q3). We apologise for any inconvenience and disappointment this may cause you.
I can assure you that our team is working tirelessly to ensure that we deliver a product that lives up to your expectations.
Once it's ready we will make an announcement here and in our social media so you don't miss it! Have a nice day!
Hi Pupil Labs team, is there any way to use C++ for the real-time API or does anyone else have any experience using C++ instead of python?
Hi @marc thanks for setting that straight as regards students being able to be invited to access particular workspaces.
Then you are eligible for an onboarding call with one of our product specialists to get you started.
How can I schedule that in?
Please contact info@pupil-labs.com to request the onboarding
Hi PL, I understand that Workspaces are shareable with collaborators but is it / will it be possible to share individual Projects within a workspace?
Hi @user-7ee596 👋! Currently, only Workspaces are shareable with collaborators. Sharing individual Projects within a workspace is currently not on the roadmap.
Hi everyone! I'm new to eye tracking devices and I have a question related to virtual reality. Currently, I'm working with a Neon device and would like to know if it's possible to use it with virtual reality glasses.
Hi @user-ff2367 👋! Welcome to the community 🙂 Regarding your question, VR/AR prototyping is possible with Neon + our "Bare Metal" kit, but note that this is not a turnkey solution. There are currently two main considerations:
Hello @user-480f4c, thank you so much!! I'm really excited about all the capabilities of Neon. I will look into your considerations, thank you!
Hey, we have encountered a technical problem with our Neon glasses. A factory reset was performed on the phone delivered with the glasses, and we tried installing the Neon Companion app. Unfortunately, the app is now stuck at updating FPGA. What are we supposed to do? Thank you!
Hi @user-e0a71c! Try unplugging Neon, force-closing the app, and reconnecting it; the update should restart.
Let us know if that solved the issue.
Hey, the model 'Better safe than sorry' specifies 'Safety Rating in progress'. Might I ask what the difference is, especially compared to 'Just act natural'. Thank you!
Hi @user-80c70d! While the "Just act natural" frame is like a regular pair of glasses, the "Better safe than sorry" (BSTS) frame acts as protective glasses.
The safety ratings we are going for with BSTS are ANSI Z87.1 and CE EN166, which include things like blunt impact and liquid splashing protection. The certification process is still ongoing.
This safety rating is relevant in some settings, e.g. on some factory floors you need to wear such glasses.
Note that ALL our frames are tested for electrical safety and photobiological safety.
Hey @user-48526c 👋! Regarding your message https://discord.com/channels/285728493612957698/285728493612957698/1130390293775122504, could you please elaborate on the issue? When you are connecting the glasses, can you preview the scene video or record data? If not, could you please send a screenshot of the App when the Neon glasses are connected to the phone?
Thanks for your prompt response... I will send the first recording video to Pupil Cloud and a screen image of the app.
Thanks for sharing the recording @user-48526c. I have some follow-up questions/points.
In your first message, you mentioned that "After setting up the device and successfully completing the first recording, the device is no longer being recognized on the Neon app. The screen prompt to plug in remains even after I have plugged the device in using the USB cables that were provided by you."
When you connect the glasses to the phone you should see some pop-up windows prompting you to allow Neon Companion to access Neon Scene Camera v1 and Neon Sensor Module v1. You should hit OK or click on the tickbox to always open Neon Companion when the glasses are connected.
Having done that, could you please check if you can 1) preview the scene camera by clicking on the camera symbol on the bottom right part of the app and 2) start a recording?
When I connect the glasses to the phone, no pop-up windows appear.
@user-48526c, I'm gonna be following up with an email with debugging steps! Thank you for your understanding and patience! 🙂
Should I reset the Neon Companion App?
Or how do I set up the phone the correct way?
what is the meaning of "Waiting for dns service"?
There is no pop-up window prompt.
OK...thanks.
Hey pupil labs team, just some quick questions concerning the recent updates to the companion app:
1.) Does the app reduce the scene camera quality when the phone is over-heating? (I think this comes from long recordings + uploading in the background) 2.) Does the app's performance become reduced when the phone overheats while a recording is still in progress? Or does the phone reduce the app's performance in this case? If so, is there a way to override/prevent this? 3.) Is there an automatic backlight compensation like the one in Pupil Core (compared to manual compensation directly from the app settings)?
thanks!
Hi @user-07e923 👋! Thanks for reaching out. Regarding your questions, please see my points below:
1) We have not seen a reduction in scene camera quality when the phone is overheating. Would you be able to share a recording with us showing such a decrease in quality? 2) Regarding app performance, during overheating, the gaze data rate might drop if the gaze rate is set to 200 Hz. 3) There is no automatic backlight compensation - you can only manually adjust it in the app settings.
Hi again @user-07e923 - I'm following up to provide a more detailed reply to your first question. The quality of the scene camera does not drop as a function of recording duration, overheating or any other factor. What can indeed happen though is that the colours of the scene camera might appear as washed out for some recordings if the ambient lighting changes and the parameters you set in the Neon App remain the same. This is what seems to have happened in your recordings as well. To explain a bit better what can happen, consider the following scenario:
You make two recordings in the same setup (e.g., room with sunlight) and with the same settings in the Neon App. The only difference is that the first recording is made at 3pm and the second one at 9pm. The recordings made at 3pm will look fine, but the recordings made later in the evening might have washed out colours because as the sun is going down, the room is getting darker, and therefore the scene camera is overexposed and washed out.
The good news is that this can be easily fixed by adjusting some of the Neon App parameters. We recommend either adding more ambient light in the room, switching to manual exposure to optimise the scene camera exposure, or changing the Backlight Compensation in the App settings.
Hope this helps!
3) This was also the case with PI. It was always on.
Both Invisible and Neon have AE. But with Neon you can choose the AE target via the Backlight Compensation slider.
Hi Nadia, are you there?
I just sent you an email.
Hi, we are using the "duration" value from the Neon recording info.json to calculate the recording latency: the iMotions app uses the API to start/stop recording, so we expect the difference between the iMotions study duration and the Neon recording duration should give us the latency (up to networking latency). It worked for a while, but recently we found that it gives wrong results after the Companion update. So I am wondering if you changed the algorithm for the duration calculation recently? Related issue: I am using ffmpeg to see the Neon recording video duration, and I found that the video duration is always less than the duration from info.json (for example, info.json duration 85541 ms, while ffmpeg gives 83389 ms video duration). What does the info.json duration include? Video duration + some internal latency?
Hello, the duration in info.json is the time from the user pressing the record button to the time they press the stop button (in the UI it's the same button). That is not directly related to the actual video duration. First, it is not necessary to have Neon connected to start a recording; the hardware will automatically start streaming and recording if it is connected to an app that is already recording. Second, the time each sensor takes to start varies, so it can be expected that in general the video duration is less than the info.json duration even if the hardware is connected the whole time.
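For anyone wanting to check this on their own recordings, a small sketch comparing the two durations (the file paths and the unit of the info.json "duration" field are assumptions here - some recording formats store it in nanoseconds rather than milliseconds, so adjust the conversion accordingly; ffprobe must be on your PATH):

```python
# Compare the recording duration in info.json with the scene-video container
# duration reported by ffprobe.
import json
import subprocess
import sys

def video_duration_s(video_path):
    """Return the container duration in seconds using ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "default=noprint_wrappers=1:nokey=1", video_path],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip())

info_path, video_path = sys.argv[1], sys.argv[2]
with open(info_path) as f:
    info = json.load(f)

# Assumption: "duration" is stored in milliseconds; if your info.json uses
# nanoseconds, divide by 1e9 instead.
print(f"info.json duration  : {info['duration'] / 1e3:.3f} s")
print(f"scene video duration: {video_duration_s(video_path):.3f} s")
```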
If you carry out a Marker Mapper enrichment, is it automatically applied to all other recordings in a project, as with the Gaze Overlay enrichment? And, like the Gaze Overlay enrichment, is the Marker Mapper analysis for each recording downloaded collectively in the same download folder?
Hi @user-057596 👋! The Marker Mapper enrichment can only be applied to recordings that include visible markers. As long as the recordings in your project have visible markers, the enrichment will be applied. The .csv file you will download after applying the Marker Mapper enrichment will include data from all recordings within your project that had visible markers (i.e., recordings where the enrichment was applied successfully). I hope this helps!
Hi @user-480f4c thank you, that definitely helps. 💪
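As an aside, if you ever need the enrichment data separated per recording again, a minimal sketch along these lines should work (the "recording id" column name is an assumption - check the header of your downloaded gaze.csv):

```python
# Split a combined Marker Mapper gaze.csv into one CSV per recording.
import pandas as pd

gaze = pd.read_csv("gaze.csv")
for rec_id, rec_gaze in gaze.groupby("recording id"):
    rec_gaze.to_csv(f"gaze_{rec_id}.csv", index=False)
    print(rec_id, len(rec_gaze), "rows")
```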
Hi Pupil Labs, it's me again. Just wanted to quickly report an odd bug during my data collection. One of the videos could not be played on the Companion app, and there was a message saying that an error occurred during recording. This file is uploaded to Pupil Cloud. This has happened only once during my recordings since updating the Companion app.
I have the latest companion app and firmware on the glasses.
Thanks for reporting it! We will investigate what happened. To facilitate that, could you answer some additional questions?
Hello Miguel, the error message was simply "We have detected an error during recording!". I did not really pay attention to it. The recording is uploaded to the Cloud.
The glasses serial number: 766202 Firmware version: 18 FPGA: 6
Log files
Hi. I wanted to know what the state of the IMU calibration procedure is. Will there be any notification of the calibration procedure or status? (I've seen other IMUs provide an acc, gyro and mag variable, e.g., 0 = not calibrated, 3 = fully calibrated.) What would be the best practice to know if the IMU is running OK when recording starts and as the recording runs for a few minutes? Not planning on recording over 10 minutes. Thanks!
@user-b55ba6 we don't have a notification implemented; the best thing to do is to rotate the glasses around all three axes once connected and you should be good to go.
once calibrated the IMU should remain calibrated.
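If you want a quick post-hoc sanity check that the IMU was running throughout a recording, something like the sketch below could help (the "timestamp [ns]" column name is an assumption based on the other Neon export files - verify it against your imu.csv):

```python
# Sanity-check the IMU stream by computing its effective sample rate
# from the exported imu.csv of a timeseries download.
import pandas as pd

imu = pd.read_csv("imu.csv")
ts = imu["timestamp [ns]"].to_numpy()
duration_s = (ts[-1] - ts[0]) / 1e9
print(f"{len(ts)} IMU samples over {duration_s:.1f} s "
      f"(~{len(ts) / duration_s:.0f} Hz)")
```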
Hello. I am trying to get our neon hardware to interface with pupil capture. I am able to see the world view camera, but the eye cameras cannot be found. When I look in the device manager, it says it doesn't have any drivers installed. However, I am unable to find any drivers through this modal or on the pupil labs website. Any help would be appreciated.
Hi @user-44c93c! If you want to run Neon with Capture, you'll need to run from source: https://github.com/pupil-labs/pupil/tree/neon-support. You can read more about that here: https://docs.pupil-labs.com/neon/how-tos/data-collection/using-capture/#use-the-neon-module-with-pupil-capture Note that this solution currently only works on Mac and Linux.
Hi, I'm considering purchasing from the Neon range, and have some questions about prescription lens fitting. Can "Just act natural" accommodate a prescription lens of my choice, ordered from an external prescription glazing house? Is the process of inserting and removing lenses similar to regular prescription eyewear? Can "All fun and games" accommodate a prescription lens at all? Thanks
Hi, yes, you can add your own lenses into "Just act natural". I would recommend removing the Neon module before you add the lenses; it's easy to do, and we can give extra guidance if needed.
At present the "All fun and games" frame does not have the required lens groove. We will change that soon, however; if you indicate this during your order, we will hold it back until the new revision is ready.
Finally, we also have "I can see clearly now" a frame that comes with snap in lenses.
Is it possible to view the eye cameras live in the Neon Monitor app? I can see how to access them on the Neon Companion app, but not on the Monitor app. Thanks!
The eye cameras can be accessed via the real-time API: https://github.com/pupil-labs/realtime-python-api
But at present this is not implemented in the Monitor app. You can however use this example script that shows eye and scene video: https://github.com/pupil-labs/realtime-python-api/blob/main/examples/simple/stream_scene_eyes_and_gaze.py
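For a condensed impression of what that example does, here is a minimal sketch that previews the scene camera with the current gaze point overlaid (attribute names like bgr_pixels follow the published examples - double-check against your installed client version; the linked script additionally streams the eye cameras, whose helper name may differ between versions, so it is omitted here):

```python
# Preview the scene camera with the current gaze point overlaid,
# using the simple real-time client and OpenCV.
import cv2
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()
try:
    while True:
        frame, gaze = device.receive_matched_scene_video_frame_and_gaze()
        img = frame.bgr_pixels
        cv2.circle(img, (int(gaze.x), int(gaze.y)), 30, (0, 0, 255), 4)
        cv2.imshow("Neon scene + gaze", img)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break
finally:
    device.close()
    cv2.destroyAllWindows()
```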
Also, is it possible to set the world view as a different camera feed than the forward-facing camera built into the Neon?
I'm not sure I understand the question. Do you want to stream a camera feed that is not coming from the Neon Module?
Alright, thanks for clarifying. That is not possible. What do you want to stream there instead? If it's the phone's camera or some VR feed, you might be able to build that by making additions to our real-time API Python client I linked above.
Thanks Moritz. We are having much more success with the realtime API python client. Appreciate the help.
Hello, I am interested in developing an application using eye-tracking data. I want to create our own Android companion app that can directly receive and process data from neon via a cable. Are there any instructions related to this? Or is there a way to connect neon with other computing devices like Raspberry Pi and process the data directly on these devices? Thank you in advance.
Hi @user-d569ee 👋! It's great to hear that you are interested in Neon for building an application with eye tracking data. May I ask why you'd like to build a companion app from scratch rather than using the one provided?
Yes, Neon can be connected to other devices (Linux, macOS), but it won't be using NeonNet (the neural network that estimates gaze); likewise, if you do build your own Companion App, you would lose this capability, which does the heavy lifting of estimating gaze.
If it is sensitive and you would not like to discuss the reasoning here, feel free to send us an email to info@pupil-labs.com
I'll add a bit to this. If you want to use an Android phone anyway, you can just use the Neon Companion device. It will run the Neon Companion app. From there you can install your own custom app on the same device and have the two apps communicate via the real-time network protocol. This is based on RTSP and REST.
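To give a feel for that protocol, here is a minimal sketch of driving the Companion app over its REST interface from your own code (the endpoint paths follow the published real-time network API; the phone IP is a placeholder and the default port is assumed to be 8080 - verify both against the API docs for your app version):

```python
# Talk to the Companion app's REST interface directly (no Python client).
import requests

PHONE = "http://192.168.1.42:8080"   # placeholder: your Companion phone's IP

# Query the device status.
print(requests.get(f"{PHONE}/api/status").json())

# Start a recording, run your task, then stop and save it.
requests.post(f"{PHONE}/api/recording:start")
# ... your experiment runs here ...
requests.post(f"{PHONE}/api/recording:stop_and_save")
```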
I have an additional question. Do you have plans to support other phones for the companion app? I currently have the Galaxy S23 Ultra as a development phone, and I'm wondering if I need to purchase a OnePlus phone as well. Thank you
We will support individual additional phones in the future once the OnePlus 10 Pro becomes less available. A lot of work is necessary to get the app adequately compatible with each model, and additionally not every model fulfills the hardware/software requirements to run the Companion app. Thus, we will not be able to increase the range of supported phones to more than 1-2 currently available models. We cannot yet say which phone will be added to the pool next.
When you buy Neon, it will come with a OnePlus 10.
I didn't know that. Good to hear. Thanks!
Hi, could I ask if there's a manual for detecting a 'blink' with the Neon device through Pupil Cloud? Thanks in advance!!
Hey @user-839466 👋! Please have a look at our documentation on blinks, where you can also find our blink detector whitepaper: https://docs.pupil-labs.com/neon/basic-concepts/data-streams/#blinks
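Once you have a Pupil Cloud timeseries download, a minimal sketch for working with the detected blinks could look like this (the column names are taken from the export format docs - verify them against your own blinks.csv):

```python
# Load blinks.csv from a timeseries download and compute a simple blink rate.
import pandas as pd

blinks = pd.read_csv("blinks.csv")
span_s = (blinks["end timestamp [ns]"].iloc[-1]
          - blinks["start timestamp [ns]"].iloc[0]) / 1e9
print(f"{len(blinks)} blinks, ~{60 * len(blinks) / span_s:.1f} blinks/min")
print(f"mean blink duration: {blinks['duration [ms]'].mean():.0f} ms")
```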
Hello! I was wondering if anybody was having trouble downloading the Neon Companion App on the Android phone (OnePlus 10 5G). I've tried switching the wi-fi, clearing the cache on Google Play, and turning the phone on and off, but none of these methods seem to work. It keeps downloading to 68% and just fails. I keep getting an error saying "can't download neon companion". It's a completely new device that I just turned on. There is plenty of storage on the phone and I've cleared the cache.
Hi @user-328c63 👋! What is your Android Operating System version? Is it Android 11, 12, or 13? You can locate this information in your phone's General Settings, specifically under the About Device section. I ask this because not all Android versions are compatible; please read more here: https://docs.pupil-labs.com/neon/glasses-and-companion/companion-device/#android-os - That said, the fact that the download stops halfway through might point to network issues.
Thanks @user-664e7a! I was able to get the app to download. I apologize if this is a very trivial question, but the Android device seems to be over 1000 ms off from my device executing the code. When I would have this problem with the Pupil Invisible trackers, I could toggle the use network-provided time button by going to System -> Date & Time -> toggle network-provided time. However, on this OnePlus 10 with Android 12, I cannot find the toggle for the network-provided time. Is it still available for this Android version? (We did not order the Android device from a third party.)
Hi @user-328c63 👋! Apologies for the delayed response. The OnePlus 10 Pro comes with Android version 12, which displays the settings slightly differently. For Android 11 and lower the settings appear as you said: System > Date & Time > Use network-provided time. The equivalent for Android 12 is Date & Time > "Set time automatically". You can find this information here as well: https://source.android.com/docs/core/connect/time I hope this helps!
Hello, I've observed that the downloaded video contains a few gray frames, and there's a red circle "gaze" appearing at the end of these gray frames. Could you kindly provide an explanation for the presence of these gray frames? Additionally, I'm interested to know when the gaze data, exported in the CSV file, started to be collected. For instance, did it begin at the start of the gray frame, in the middle of the gray frame, or at the end of the gray frame?
Hi @user-d76d7c! I am assuming those gray frames are at the beginning of the recording? The Neon module contains a range of sensors including the scene camera, eye cameras, IMU, etc. When you hit the record button, all those sensors are first initialized and then start recording. Some sensors take longer to initialize than others. The scene camera usually takes the longest to initialize, so for the first couple of seconds of a recording there are no scene camera frames yet. When playing the recording back in Pupil Cloud, gray frames are shown instead, and the already existing gaze data is rendered on top of them.
Both gaze data and scene video frames are timestamped indicating when they have been generated. Since the framerate of the gaze data is 200 Hz and the scene video is only 30 Hz, some gaze samples will be generated at the beginning of a scene video frame, while others are generated towards the end. You need to compare the respective timestamps to check this in detail.
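A minimal sketch of that timestamp comparison, using the gaze.csv and world_timestamps.csv from a timeseries download (column names follow the export format docs - verify them against your files):

```python
# Assign each gaze sample to the nearest scene-video frame by timestamp.
import pandas as pd

gaze = pd.read_csv("gaze.csv").sort_values("timestamp [ns]")
frames = pd.read_csv("world_timestamps.csv").sort_values("timestamp [ns]")
frames["frame index"] = range(len(frames))

matched = pd.merge_asof(
    gaze, frames[["timestamp [ns]", "frame index"]],
    on="timestamp [ns]", direction="nearest",
)
# Gaze samples recorded before the first scene frame (the gray frames)
# simply get matched to frame 0 here; filter them out if you prefer.
print(matched[["timestamp [ns]", "frame index"]].head())
```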
Hi! I have a quick question on 1) what does the diameter of each fixation circle indicate? / 2) how can we export the fixation data from Pupil Labs? / 3) does the center point of each circle in the image indicate the point which participants are currently focusing on? (I mean, what should I do if I would like to discern two fixation points within the same circle?) Thank you so much in advance for your assistance!!
Hi @user-839466 👋! Thanks for reaching out! Regarding your questions:
1) The diameter of the circle indicates the fixation duration - the larger the blue circle, the longer the duration of this fixation. 2) To download the fixation data, you can right-click on the recording and download the timeseries data. This will trigger a download of a folder that will contain a fixations.csv file among others. The data included in this csv file is listed here in detail: https://docs.pupil-labs.com/export-formats/recording-data/neon/#fixations-csv 3) Fixations can happen at points very close in space, which might cause two fixation circles to slightly overlap. This depends on the gaze behaviour of the user, the task, etc. For a better visualization, you could consider a) the Gaze Overlay enrichment, which allows you to modify the parameters (e.g., size, shape) of the gaze estimation circle (the red circle) (see here: https://docs.pupil-labs.com/enrichments/gaze-overlay/) and b) our scanpath tutorial, which allows you to change the size of the fixation circle (see here: https://docs.pupil-labs.com/alpha-lab/scanpath-rim/)
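To illustrate points 1) and 3), here is a very basic scanpath sketch where circle size scales with fixation duration, built from the fixations.csv mentioned above (column names follow the export format docs - verify them against your own file):

```python
# Basic scanpath plot: one circle per fixation, area scaled by duration.
import matplotlib.pyplot as plt
import pandas as pd

fix = pd.read_csv("fixations.csv")
plt.scatter(
    fix["fixation x [px]"], fix["fixation y [px]"],
    s=fix["duration [ms]"],      # marker area proportional to duration
    alpha=0.4, edgecolors="blue",
)
plt.plot(fix["fixation x [px]"], fix["fixation y [px]"], lw=0.5, color="gray")
plt.gca().invert_yaxis()         # image coordinates: origin at top-left
plt.title("Scanpath (circle size ~ fixation duration)")
plt.show()
```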
How can I copy or move a recording from one Workspace to another?
Hi @user-39737f 👋! It is currently not possible to move recordings between workspaces. You can find more information about Workspaces & Projects here: https://docs.pupil-labs.com/neon/basic-concepts/projects-and-workspaces/#frequently-asked-questions
Thanks Nadia! About 1/3 of our videos seem to be failing to collect gaze data. Is there a way to lower the rate at which this happens?
Hi again @user-328c63 👋 Sorry to hear that you have been experiencing issues with your recordings. Could you please share the ID of one of the affected recordings? You can send them via DM on Discord to me! You can find the ID by right-clicking on the recording and selecting View recording information. Could you please also share the version of the Neon App that you are using (you can find that in the Neon App under Settings > About Neon Companion)?
Hi, I just wanted to ask if you are doing anything other than just recording; your failure rate is abnormally high. Did you get errors during the recording?