Hi, I'm trying to use the ndsi package in Python 3.6.8. I installed the package from the .whl on GitHub and added the ffmpeg and libjpeg DLLs to my system path. Unfortunately, I still get this error. I would be glad about any suggestions.
@user-19a759 Unfortunately, your screenshot does not tell which dll is missing. Please use this program to find out which one is missing: https://github.com/lucasg/Dependencies/releases
After downloading, open Dependencies_Gui.exe and drag and drop the frame.[...].pyd
file (you can find it in your site-packages folder; the path is visible in your screenshot) onto it. It should list the missing DLLs in red.
Thanks. It tells me that avutil, avformat, avcodec and swscale dlls are missing. These are the ffmpeg dlls, right?
correct
Thanks! The problem was that I downloaded a version of ffmpeg that had no "lib"-folder. Downloading another version helped.
@papr Taking this to this channel now to avoid clutter
It seems to be enabled now! However, I'm getting an error from line 58 of pupil_detector_plugins/visualizer_2d.py
if pupil_detection_result_3d["model_confidence"] <= 0.0:
KeyError: 'model_confidence'
This error is thrown even when using the above example 2d detector
But not when using the built-in 2d detector?
Correct, the built-in 2d detector throws no such error
Do you see the following warning in the logs?
"Required 2d pupil detection input not available. Returning default pye3d datum."
Only when I explicitly disable all detectors except for the custom one and the pye3d one does that error appear
Yes
The default datum causes the issue. Looks like your detector is also being disabled or its datum does not have the correct method field
Before the return result
within the detector, the line I have is
result["method"] = "2d c++"
Are you sure result
is returned and not datum
? That would explain why setting it manually is not working. I am inferring this from the variable names in the 2d detector; I do not know your exact implementation
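The suspected mix-up can be sketched in plain Python. The function and field values below are hypothetical and only mirror the datum/result naming discussed above:

```python
def detect_buggy(frame):
    datum = {"confidence": 0.9}    # this dict is what actually gets returned
    result = dict(datum)           # a separate copy
    result["method"] = "2d c++"    # field set on the wrong object
    return datum                   # "method" never reaches the caller

def detect_fixed(frame):
    result = {"confidence": 0.9}
    result["method"] = "2d c++"    # field set on the dict that is returned
    return result

assert "method" not in detect_buggy(None)
assert detect_fixed(None)["method"] == "2d c++"
```

If the plugin returns `datum` while the `method` field was set on `result`, the 3d stage never sees the field, which would produce exactly the symptom above.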
@user-3cff0d which order does your plugin have?
order = 0.99
for the custom one I tried it with, and
order = 0.9
for the example one I downloaded
@user-3cff0d what is the value of the pupil_detection_method
attribute?
I do not see that in any plugin I have tried. Is that meant to be alongside order
and icon_chr
etc?
it is defined in the base class and fills the method field
please try setting it to 2d c++ instead of setting the method field manually
To confirm, I should set self.pupil_detection_method
within the __init__
method of the class that extends PupilDetectorPlugin
?
Setting self.pupil_detection_method
in my plugin like it is in detector_2d_plugin.py
did not work
Ok, we will try to fix this for our custom example plugins on monday. The solution should transfer to your code as well
Previously I was not using the create_pupil_datum function itself; I just duplicated the last few lines of detector_2d_plugin.py's detect() method, passing values into the function exactly like detector_2d_plugin.py does. Now it returns the datum with the ellipse added to it right before the return. The issue is still there
Yeah, one of our assumptions is not working as expected. We will have to debug the 3d plugin to see what is going wrong, what data it is actually receiving, and why.
I can upload the example Python file I downloaded, with the slight modification that makes it override the default 2d plugin and send its results to the 3d plugin, if that makes it easier for you
That would be helpful.
So you can try it with the exact file I'm trying it with
@papr
This is on the most recent version of the master
branch from the github page as well, I downloaded it fresh a few hours ago to make sure it wasn't a versioning issue
Hi, @papr , I think this is the place to ask about debugging. I have updated the RealSense2 backend for Pupil Capture v2.3+ as you said (thanks, as always, for your advice). Unfortunately, it still has unloaded plugins. Below is my current environment:
OS: Windows 10, Python: 3.6.8, Pupil: 3.0.7
C:.
└─plugins
  └─pyrealsense2
    ├─bin
    │  └─__pycache__
    ├─pyrealsense2
    │  └─__pycache__
    └─pyrealsense2-2.41.0.2666.dist-info
Those are the errors I've picked out, and I also attached the full Capture logs (due to Discord's message length limit, I cut off the eye1 log):
...
- eye0 - [DEBUG] plugin: Unloaded Plugin: <pupil_detector_plugins.pye3d_plugin.Pye3DPlugin object at 0x00000222C85CA978>
- eye0 - [DEBUG] plugin: Unloaded Plugin: <pupil_detector_plugins.detector_2d_plugin.Detector2DPlugin object at 0x00000222C85CA7F0>
- eye0 - [DEBUG] plugin: Unloaded Plugin: <roi.Roi object at 0x00000222C84CDD68>
- eye0 - [DEBUG] plugin: Unloaded Plugin: <video_capture.uvc_backend.UVC_Source object at 0x00000222C84E7CF8>
- eye0 - [DEBUG] plugin: Unloaded Plugin: <video_capture.file_backend.File_Manager object at 0x00000222C84CDF28>
- eye0 - [DEBUG] plugin: Unloaded Plugin: <video_capture.ndsi_backend.NDSI_Manager object at 0x00000222C84CDE10>
- eye0 - [DEBUG] plugin: Unloaded Plugin: <video_capture.uvc_backend.UVC_Manager object at 0x00000222C85E7438>
...
I am assuming the pye3d plugin and the 2d detector plugin are not working properly?
Hey, the selected messages are expected during an application shutdown. The application shut down due to an exception:
2021-01-11 10:17:29,215 - world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
File "launchables\world.py", line 736, in world
File "launchables\world.py", line 485, in handle_notifications
File "shared_modules\plugin.py", line 398, in add
File "C:\Users\goqba\pupil_capture_settings\plugins\realsense2_backend.py", line 230, in __init__
self.context = rs.context()
AttributeError: module 'pyrealsense2' has no attribute 'context'
It looks like pyrealsense2 is again not installed 100% correctly. Could you please list the files in pyrealsense2/bin
and pyrealsense2/pyrealsense2
?
Also, on the page https://github.com/pupil-labs/pupil/blob/master/docs/dependencies-windows.md the guide asks me to download FFMPEG v4.3 via "Download FFMPEG v4.3 Windows shared binaries", but the linked URL https://github.com/BtbN/FFmpeg-Builds/releases/download/autobuild-2020-12-08-13-03/ffmpeg-n4.3.1-26-gca55240b8c-win64-lgpl-shared-4.3.zip is not found, and I am unable to download it. I assume the page was deleted or something.
Hey, @papr thanks for letting me know. here is the directory of the full pyrealsense2.
└─pyrealsense2
├─bin
│ │ align-depth2color.py
│ │ export_ply_example.py
│ │ opencv_viewer_example.py
│ │ python-rs400-advanced-mode-example.py
│ │ python-tutorial-1-depth.py
│ │
│ └─__pycache__
│ align-depth2color.cpython-37.pyc
│ export_ply_example.cpython-37.pyc
│ opencv_viewer_example.cpython-37.pyc
│ python-rs400-advanced-mode-example.cpython-37.pyc
│ python-tutorial-1-depth.cpython-37.pyc
│
├─pyrealsense2
│ │ pybackend2.cp37-win_amd64.pyd
│ │ pyrealsense2-net.cp37-win_amd64.pyd
│ │ pyrealsense2.cp37-win_amd64.pyd
│ │ _version.py
│ │ __init__.py
│ │
│ └─__pycache__
│ _version.cpython-37.pyc
│ __init__.cpython-37.pyc
│
└─pyrealsense2-2.41.0.2666.dist-info
INSTALLER
LICENSE
METADATA
RECORD
REQUESTED
top_level.txt
WHEEL
just installed with command:
pip install -t [...]\pupil_capture_settings\plugins\pyrealsense2
You can go to https://github.com/BtbN/FFmpeg-Builds/releases and choose an appropriate release, e.g. ffmpeg-n4.3.1-29-g89daac5fe2-win64-lgpl-shared-4.3.zip
@user-d4549c Please see this message ☝️
Thank you for providing the files. It looks like you installed the package for Python 3.7 instead of Python 3.6 (required to run with the bundle). An alternative is to run Pupil from source, as you are already attempting.
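The version mismatch is visible in the `cp37` tag of the `.pyd` filenames listed above. As a quick sanity check, one could parse that tag; the parsing function below is a generic sketch, the filenames are the ones from the listing:

```python
import re

def pyd_python_version(filename):
    """Extract the CPython version tag from an extension-module filename,
    e.g. 'pyrealsense2.cp37-win_amd64.pyd' -> (3, 7).
    Returns None if no tag is present."""
    match = re.search(r"\.cp(\d)(\d+)-", filename)
    if match is None:
        return None
    return int(match.group(1)), int(match.group(2))

# Filenames from the directory listing above:
assert pyd_python_version("pyrealsense2.cp37-win_amd64.pyd") == (3, 7)
# The Pupil Capture bundle runs Python 3.6, so a cp37 module cannot load there:
assert pyd_python_version("pyrealsense2.cp37-win_amd64.pyd") != (3, 6)
```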
Hey @papr, I finally figured out why I got it wrong. First, the Python version should be 3.6.8, as you said. Second, installing the requirements to the location you pointed out, right under the plugins folder, was right. That means, if someone is using pip to install the plugin, they should install it with this command:
pip install -t ~/pupil_capture_settings/plugins pyrealsense2
Thanks again for all of your help.
Hi everyone, I'm trying to send annotations from MATLAB to Pupil Capture, using the scripts at https://github.com/pupil-labs/pupil-helpers/tree/master/matlab. I'm confused about how exactly the functions are to be used from my own script. I need to see some example usages of the functions, like send_annotation.m, what exact arguments it takes, etc.
Please see https://docs.pupil-labs.com/developer/core/network-api/#remote-annotations for reference. There is also a link to the python equivalent of the script which might be more explicit about the remote annotation usage
Thanks, I've already gone through the Python scripts, but to implement it in MATLAB, when trying to run pupil_remote_control.m, I get the error:
Error using zmq.core.ctx_new Attempt to execute SCRIPT ctx_new as a function: C:...\matlab-zmq-master\lib+zmq+core\ctx_new.m
Could you please try this adjustment:
- ctx = zmq.core.ctx_new();
+ ctx = zmq.core.ctx_new;
Are you executing the script directly or are you calling it from external code?
I tried both and both give this error.
That I had tried too! still the same..
This is unexpected. This change should definitely (85% confidence) change the behavior. Unfortunately, I am no Matlab expert and I do not know what the exact issue is or how to solve it. 😕
I understand.. Is there a way I can contact the person who has coded the MATLAB scripts?
I would contact the author of the zmq matlab library.
I did 😅
Ah ok 😆 sure, thanks 😊
Another (personal) recommendation would be to not use Matlab at all. It makes your research far less accessible to other people compared to e.g. Python code. But it always depends on the dependencies that you need down the line, so one is not always free to choose.
Yes I am coming to this decision after so many trials and errors. I will, thanks.
Just to give this one final shot, let me ask about the order I have to use the functions and scripts. I'm not sure if I'm using the scripts correctly at all.
1) Not as a function, no. You should see it as an example script. Generally, you should create a zmq context and socket at the beginning of your program, and close both at the end. This socket can be used repeatedly during your program. "control of Pupil Core" happens by sending requests to Capture through the socket instance.
2) From the pupil_remote_control.m
example:
% start recording
pause(1.0);
zmq.core.send(socket, uint8('R'));
result = zmq.core.recv(socket);
fprintf('Recording should start: %s\n', char(result));
% stop recording:
zmq.core.send(socket, uint8('r'));
result = zmq.core.recv(socket);
fprintf('Recording stopped: %s\n', char(result));
3) Correct.
4) See socket
from pupil_remote_control.m
5) While pupil_remote_control.m
sends control messages to Core, it cannot receive realtime data streams. That is what filter_messages.m
does. It subscribes to a specific data stream and uses the function defined in recv_message.m
to parse incoming stream messages
Perfect, I'm getting it now.
So far, it's all clear, except the usage of filter_messages.m
After creating the zmq context and socket at the beginning of my program, should I immediately create the "req socket" as in filter_messages.m ?
I need to know in what order this sending of the annotations and receiving the realtime data stream happens.
In other words, first there's creating the zmq context and socket at the beginning and at the end, and then, 'in between', there's creating and disconnecting the sub socket (or req socket), again only once. Right?
You need a req socket for the subscription port (changes on Pupil Capture start). Once you have that, you can create and connect the sub socket (can use the same context as the req socket). You can leave the sub socket opened until the end.
You should call recv_message()
regularly (as often as you expect the data to come in) if you want to process the real time data stream (e.g. pupil or gaze data).
Sending annotations via the req socket is independent of that.
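As a sketch of the fields an annotation payload needs (mirroring the MATLAB struct used later in this conversation), in Python; note the real message is msgpack-encoded and sent over the REQ socket, and the `make_annotation` helper and its label values are hypothetical:

```python
def make_annotation(label, timestamp, duration=0.0):
    """Build an annotation payload dict. The real client msgpack-encodes
    this dict and sends it over the REQ socket to Pupil Remote."""
    return {
        "topic": "annotation",      # topic must start with "annotation"
        "label": label,             # free-form event name
        "timestamp": timestamp,     # must be a float (Pupil time), not a string
        "duration": duration,       # seconds, 0.0 for instantaneous events
    }

annotation = make_annotation("trial_start", timestamp=90210.5)
assert isinstance(annotation["timestamp"], float)
assert annotation["topic"].startswith("annotation")
```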
What do you mean by "regularly"? I would need continuous data throughout my whole run. It would keep receiving once it's called, right?
Also, if I want all pupil/gaze/notifications/loggings, etc., should I set them seperately, like:
zmq.core.setsockopt(sub_socket, 'ZMQ_SUBSCRIBE', 'pupil.');
zmq.core.setsockopt(sub_socket, 'ZMQ_SUBSCRIBE', 'gaze.');
zmq.core.setsockopt(sub_socket, 'ZMQ_SUBSCRIBE', 'notify.');
zmq.core.setsockopt(sub_socket, 'ZMQ_SUBSCRIBE', 'logging.');
or all in one?
Your program is likely to have some kind of event loop. One part of the event loop should be calling recv_message()
Be aware that this will be a lot of data. Your program needs to be fast enough to keep up. Else the background queue will fill.
Ah ok, that's true. I do have several trials and I can record only during those intervals rather than the whole run. But does recv_message.m have a beginning and an ending? The duration is defined by bufferLength, right? Can you give me examples of bufferLength values, like in seconds or what?
bufferLength is in bytes. See this as an example https://github.com/pupil-labs/pupil-helpers/blob/master/matlab/filter_messages.m#L68
Should you get the error message that the buffer length is too small, increase the value.
recv_message() returns immediately, if data is available in the background queue. Else it blocks until it receives data or the timeout of 1 second has been reached. See https://github.com/pupil-labs/pupil-helpers/blob/master/matlab/filter_messages.m#L50
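That receive behavior (return immediately if data is queued, otherwise block up to a timeout) can be mimicked in plain Python with a queue. This is purely an analogy, not the actual zmq implementation; the function is named after the MATLAB helper for familiarity:

```python
import queue

def recv_message(q, timeout=1.0):
    """Return the next queued message immediately if one is available;
    otherwise block for up to `timeout` seconds. None signals a timeout."""
    try:
        return q.get(timeout=timeout)
    except queue.Empty:
        return None

background_queue = queue.Queue()
background_queue.put(("pupil.0", {"confidence": 0.99}))

# Data already queued: returns at once.
assert recv_message(background_queue) == ("pupil.0", {"confidence": 0.99})
# Queue now empty: blocks ~0.1 s, then returns None.
assert recv_message(background_queue, timeout=0.1) is None
```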
I'm happy to announce I solved the MEX problem. It literally took me from that last message here until right now 😩
Now I get this error:
Error using recv Resource temporarily unavailable
Error in practice_run (line 196) result = zmq.core.recv(socket);
Know anything about it?
Hi again, do you know if this issue has been looked into any further?
My apologies. We have not looked into this further yet. Thank you for reminding me.
👍
@nmt took the time to look into this and was able to find the issue. The value needed to be smaller than 0.1
, not smaller than 1.0
. Therefore, increasing the order value to 0.9
moved your plugin back in the execution sequence. Everything should work as expected by changing the order to 0.09
.
Thanks, this fixed it right up!
Good afternoon! I have been playing with the python client and I have an issue when I try downloading the recordings. if I execute the script below:
from pupilcloud import Api, ApiException
api_key = XXX
api = Api(api_key=api_key, host="https://api.cloud.pupil-labs.com")
data = api.get_recordings().result
last_recording = data[-1]
saved_path = api.download_recording_zip(last_recording.id)
I get
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/enrico/.conda/envs/ReperioConda/lib/python3.8/site-packages/pupilcloud/api/recordings_api.py", line 277, in download_recording_zip
    return self.download_recording_zip_with_http_info(recording_id, **kwargs)  # noqa: E501
  File "/Users/enrico/.conda/envs/ReperioConda/lib/python3.8/site-packages/pupilcloud/api/recordings_api.py", line 345, in download_recording_zip_with_http_info
    return self.api_client.call_api(
  File "/Users/enrico/.conda/envs/ReperioConda/lib/python3.8/site-packages/pupilcloud/api_client.py", line 338, in call_api
    return self.call_api(resource_path, method,
  File "/Users/enrico/.conda/envs/ReperioConda/lib/python3.8/site-packages/pupilcloud/api_client.py", line 170, in call_api
    response_data = self.request(
  File "/Users/enrico/.conda/envs/ReperioConda/lib/python3.8/site-packages/pupilcloud/api_client.py", line 363, in request
    return self.rest_client.GET(url,
  File "/Users/enrico/.conda/envs/ReperioConda/lib/python3.8/site-packages/pupilcloud/rest.py", line 235, in GET
    return self.request("GET", url,
  File "/Users/enrico/.conda/envs/ReperioConda/lib/python3.8/site-packages/pupilcloud/rest.py", line 223, in request
    r.data = r.data.decode('utf8')
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc8 in position 10: invalid continuation byte
I have tried with different files getting the same error, maybe I am not using the correct ID? As a workaround I have tested the following which works fine:
url = last_recording.download_url
cmd = 'curl -O "'+url+'" -H "accept: application/octet-stream" -H "api-key:'+api_key+'"'
proc = subprocess.Popen(cmd, shell=True)
I am able to reproduce the issue. I think the client tries to decode the downloaded zip as utf-8 text, which expectedly fails. I will forward this to our cloud development team.
@user-0e7e72 thank you for reporting the issue, this will be fixed in the next version, until then a workaround would be to do:
with open('filename.zip', 'wb') as f:
f.write(api.download_recording_zip(last_recording.id, _preload_content=False).data)
Hi again! When developing plugins for Pupil Player, I've noticed that pupil detector plugins don't seem to share a g_pool
with normal Player plugins. When looking at the g_pool.plugin_by_name
values of a PupilDetectorPlugin
plugin and those of a Plugin
plugin, they're different- PupilDetectorPlugin
can only see fellow detector plugins. Is this intentional? The plugins I'm developing require some communication between the two types of plugins.
That is because PupilDetectorPlugin
s live in the eye process and normal plugins live in the player/world process. Processes do not share memory.
Ah, I see
Does this order value determine which plugin is sent to the 3d detector? If so, does a higher or lower order
value increase its priority?
Since pye3d uses the first datum, the lower the better
Great, thanks! Is changing the order
value after initialization meant to change anything about what plugin's data gets sent to pye3d? I've experimented with changing it dynamically to switch between plugins and it doesn't appear to do anything
The order is only relevant on load. It decides in which order the plugins are called
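Since pye3d consumes the first matching 2d datum, the lowest-order detector effectively wins, which is also why the 0.09 vs 0.1 distinction mentioned earlier matters. A minimal sketch (plugin names and order values are hypothetical):

```python
# Plugins run in ascending `order`; pye3d takes the first 2d datum whose
# method matches, so the detector loaded earliest in the sequence wins.
plugins = [
    {"name": "Detector2DPlugin", "order": 0.1},
    {"name": "CustomDetector", "order": 0.09},
]

execution_sequence = sorted(plugins, key=lambda p: p["order"])

# With order 0.09 < 0.1, the custom detector runs first and its datum
# is the one pye3d picks up.
assert execution_sequence[0]["name"] == "CustomDetector"
```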
fixations = events['fixations']
print('fixations***:', fixations)
time = fixations['duration']
normx = fixations['norm_pos'][0]
normy = fixations['norm_pos'][1]
x_pos = int(normx * 1280)
y_pos = int((1 - normy) * 720)
events
is of type dict
which can be accessed via keys, e.g. "fixations"
.
events["fixations"]
is of type list
which can be accessed via indices (e.g. [0]
or via enumeration for x in y:
). This is empty if there is no fixation.
A possible fixation is again of type dict
within events["fixations"]
.
Therefore, you want to do the following:
if "fixations" in events: # always true if the fixation detector is active
fixation_list = events["fixations"]
for fixation_dict in fixation_list:
time = fixation_dict["duration"]
...
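For completeness, a runnable version of that pattern with a made-up events dict; the fixation values are invented, as a real Capture callback would supply them:

```python
# Hypothetical stand-in for the `events` dict Pupil Capture passes in.
events = {
    "fixations": [
        {"duration": 280.0, "norm_pos": (0.25, 0.75)},
    ]
}

if "fixations" in events:  # always true if the fixation detector is active
    for fixation_dict in events["fixations"]:  # empty list: loop body skipped
        time_ms = fixation_dict["duration"]
        normx, normy = fixation_dict["norm_pos"]
        # Convert normalized coordinates to pixels (1280x720 world frame,
        # flipping y so the origin is at the top left):
        x_pos = int(normx * 1280)
        y_pos = int((1 - normy) * 720)

assert (x_pos, y_pos) == (320, 180)
```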
Does it mean that I should use fixations[0] instead of fixations?
From my results, it seems that some fixation lists are empty and some are not.
Oh, I got it! I will try this method. Thanks for your patient help.
Should you ever end up in a similar situation, there are a few tricks to keep in mind.
- Containers are accessed via container[key], where the type of key depends on the type of container.
- lists are visualized using square brackets, e.g. [x, y, z, ...]
- tuples (immutable lists) are visualized using round brackets, e.g. (x, y, z)
- dicts are visualized with curly brackets containing key-value pairs, e.g. {key1: value1, key2: value2, ...}
- Use the type() function to find out of what type your object is, e.g. type(events) or type(events["fixations"]). Should you not know how to access an object, knowing its class will help you find out by looking up the documentation for that particular class/type.
Is there a plan to allow dynamic switching between installed pupil detector plugins, in regards to which one gets its results processed by the 3d detector? I think that with the introduction of easily-installable external pupil detector plugins via the plugin API, such a feature would be pretty valuable to those of us who want to work with more than one detector plugin.
Wouldn't it be a better idea to have different pye3d instances that each prioritize a specific 2d detector? Especially since pye3d is based on a time series of 2d detections.
Hi, I have a question PI recording format in this document: https://docs.google.com/spreadsheets/d/1e1Xc1FoQiyf_ZHkSUnVdkVjdIanOdzP0dgJdJgt0QZg/edit#gid=254480793
I was trying to read the gaze data with Python. What is the character encoding format for gaze data in "gaze ps1.raw"? I tried standard latin-1 and utf-x. They result in decoding error or unreadable text.
Is that possible to do on my end?
Yes, via a custom pye3d plugin class that does not look for "2d c++" but for a name matching your custom detector's method field. I can provide an example on Monday.
I see, like by just making a near-empty class that inherits the pye3d class?
Exactly, but I guess with a few more adjustments. Not sure if I caught all of them though. Overwriting detect()
is the important part to identify which 2d data you are interested in
class KevinsPye3DPlugin(Pye3DPlugin):
    pupil_detection_identifier = "3d-kevin"
    label = "Kevin's Pye3D"

    def detect(self, frame, **kwargs):
        self._process_camera_changes()
        previous_detection_results = kwargs.get("previous_detection_results", [])
        for datum in previous_detection_results:
            # NOTE: CHANGE
            if datum.get("method", "") == "2d kevin":
                datum_2d = datum
                break
        else:
            logger.warning(
                "Required 2d pupil detection input not available. "
                "Returning default pye3d datum."
            )
            return self.create_pupil_datum(
                norm_pos=[0.5, 0.5],
                diameter=0.0,
                confidence=0.0,
                timestamp=frame.timestamp,
            )

        result = self.detector.update_and_detect(
            datum_2d, frame.gray, debug=self.is_debug_window_open
        )
        norm_pos = normalize(
            result["location"], (frame.width, frame.height), flip_y=True
        )

        template = self.create_pupil_datum(
            norm_pos=norm_pos,
            diameter=result["diameter"],
            confidence=result["confidence"],
            timestamp=frame.timestamp,
        )
        template.update(result)
        return template

    @classmethod
    def parse_pretty_class_name(cls) -> str:
        return "Kevin's Pye3D Detector"
But with a custom __init__ or wherever it is that looks for the 2d c++
Hello guys, I am just starting to play around with your devices (Pupil Invisible and Pupil Core) and I am curious if there's a possibility to stream data from the Mobile app to another app within the mobile device.
Yes, via the network api https://docs.pupil-labs.com/developer/invisible/#network-api
Thanks, I'll give this a shot! Is that meant to be Pye3DPlugin(PupilDetectorPlugin)
or is it meant to inherit directly from whatever the Pye3D class name is?
Oh, right, I adjusted the example
Okay, thanks @papr ... And can the mobile app only be used with the Invisible model, or does it not matter?
Pupil Core is meant to be used with Pupil Core software https://github.com/pupil-labs/pupil/releases
The Pupil Invisible Companion app is only meant to be used with the Pupil Invisible Glasses.
Okay, and regarding the Invisible app: does it work such that I install the Invisible Companion app on my Android phone, it runs in the background, and then I can write my own Android app in Java to communicate with it via the network API?
Please be aware that even though the network API definition (NDSI
) is open source, there is only a client implementation in Python (pyndsi
). You would have to implement a Java client yourself.
Correct.
Okay, thank you for info.
Hello again, I am trying to run the pyndsi example but I cannot get past the ffmpeg codec. The link provided in the README is no longer hosted. Do you have a workaround?
thanks!
Hi! Because of some dependencies, I have to use MATLAB to communicate with Pupil Capture.
Is there a way to start a validation with ZMQ the same way one can start a calibration procedure (like zmq.core.send(socket, uint8('C'))
)?
Then, can I retrieve the Angle accuracy
and Angle precision
from the Accuracy visualizer plugin by listening to some messages? For example, I'm subscribing to the notify.
topic but I don't see those values coming up, so is there a way to retrieve them with ZMQ?
Regarding your first point: Unfortunately, not. You will have to send a notification [1], see [2]
[1] https://github.com/pupil-labs/pupil-helpers/blob/master/matlab/send_notification.m [2] https://discord.com/channels/285728493612957698/285728493612957698/786147188416970762
Regarding your second point: The result is not published as notification, just as a log. You can subscribe to logging.info
to receive it.
Thanks a lot @papr !
What socket type is it? I might need to see more of the code in order to judge what is going wrong.
Oh it was only because Pupil Capture was not running! Now it's working! Thanks 😀
I'm still a little confused about recv_message(). I receive this error, even if I call recv_message() immediately after starting recording:
`Error using recv Result too large
Error in recv_message (line 7) topic = char(zmq.core.recv(socket));
Error in practice_run (line 369) [topic, note] = recv_message(sub_socket, 1024);`
To be able to record pupil diameter continuously during trials of 10 seconds in a 20-trial run in which visual stimuli are presented, how can I use recv_message so that I capture all 10 seconds of each trial?
The first few seconds after starting the recording seem not to be valid for data processing, right? How many seconds would you say must be left out at the beginning?
Try setting the buffer length larger.
Re 1. Streaming data does not give you a guarantee that every datum is sent. In order to receive as much pupil data as possible, subscribe to pupil
, and do roughly something like this:
while trial:
[topic, note] = recv_message(sub_socket, 1024);
data = extract_info_from(note)
store_data_for_post_processing(data)
do_post_processing()
The loop should run as quickly as possible in order to be able to process all data. If you are not quick enough, data might get dropped.
If you require data reliability, I suggest recording with Pupil Capture and extracting the data from the recording. If you need the data in real time, make sure that the real time processing of the data is quicker than the incoming data.
I tried up to 1024000 and still the same error, should I go higher?
That seems much larger than necessary. Not sure why you would get this message, unless you are also trying to receive the video stream
Is there already a ZMQ publisher for the image streams?
Yes, there is. I can link you the example later.
Is this a correct way of using send_annotation()?
annotation.topic = "annotation.topic";
annotation.label = "event";
annotation.timestamp = 'T';
annotation.duration = 0.0;
event1 = send_annotation( socket, annotation );
If not, can you tell me how to define the keys for annotation?
annotation.timestamp = 'T'; The timestamp should be a float value, not a string
corrected it to:
annotation.topic = "annotation.hi";
annotation.label = "event1";
annotation.timestamp = 1.0;
annotation.duration = 0.0;
event1 = send_annotation( socket, annotation );
gives me this error:
`Error using send_annotation Too many output arguments.
Error in practice_run (line 367) event1 = send_annotation( socket, annotation );`
send_annotation
does not return anything. The event1 =
prefix causes the error.
with the code like this:
annotation.topic = "annotation.topic";
annotation.label = "label";
annotation.timestamp = 1.0;
annotation.duration = 0.0;
send_annotation( socket, annotation );
I get this error:
`Error using dumpmsgpack>dump (line 74) Unknown type "string"
Error in dumpmsgpack>dumpmap (line 223) valuestuff = dump(values{n});
Error in dumpmsgpack>dump (line 72) msgpack = dumpmap(data);
Error in dumpmsgpack (line 22) msgpack = dump(data);
Error in send_annotation (line 9) payload = dumpmsgpack(annotation);
Error in practice_run (line 367) send_annotation( socket, annotation );`
Do you know what it's about?
Try converting your string objects to uint8 arrays. e.g. annotation.topic = uint8('annotation.topic')
.
Actually, normal strings should work, looking at https://github.com/pupil-labs/pupil-helpers/blob/master/matlab/pupil_remote_control.m#L53 Does Matlab differentiate between "
and '
(double vs single quotes)?
No, I don't think MATLAB differentiates between ' and ".
I changed it to:
annotation.topic = "annotation.topic";
annotation.label = "label";
annotation.timestamp = 1.0;
annotation.duration = 0.0;
send_notification(socket, containers.Map({'subject'}, {'event'}))
result = zmq.core.recv(socket);
fprintf('Notification received: %s\n', char(result));
does not give any error and goes on. But I need to make sure it's actually sending the event information. Where can I see the notifications or annotations in the csv files?
This sends a notification but not an annotation! Could you provide the full code creating the annotation object? Enable the Annotation
plugin in capture. You should see a log message pop up when an annotation is received.
Yes, can I send my script to you directly?
Yes