Hi all 👋 I would like to share our paper, in which we use (and cite) the Pupil Labs eye tracker to predict user intention so that a robot arm can assist users who may have mobility limitations. The paper comes with a nice video showing three use cases, targeted at able-bodied, paraplegic, and tetraplegic users, by reconfiguring the system to use suitable Human-Machine Interfaces (e.g. the eye tracker), assistive strategies, etc.:
https://ieeexplore.ieee.org/document/9495281
S. Iregui, J. De Schutter and E. Aertbeliën, "Reconfigurable Constraint-Based Reactive Framework for Assistive Robotics With Adaptable Levels of Autonomy," in IEEE Robotics and Automation Letters, vol. 6, no. 4, pp. 7397-7405, Oct. 2021, doi: 10.1109/LRA.2021.3098950.