FORW-RD Research

Robot-Interfaces for High-Skill Tasks and Training

NRT Faculty: Cagdas Onal
Project Description: This research aims to answer the following fundamental questions: How can humans learn high-skill tasks in a virtual environment, how can a human operator provide guidance via a virtual environment to a robot performing high-skill tasks, and how can the robot cooperate? With this research, we study approaches that eliminate the need for costly real-world training programs and democratize training for tasks that humans and robots may perform together in the future workplace. These tasks may require a high level of hand-eye coordination and cognitive skill, typically informed by tactile sensations during manual operation, such as the maintenance or manipulation of fragile objects. Such tasks may be difficult or dangerous to train for in a real environment. Our approach is to combine haptic devices (commercial and our custom-developed wearable systems) with embedded user motion sensing and high-fidelity virtual simulation environments to achieve the degree of realism required to effectively train human workers in high-skill manual tasks. Moreover, we seek to enable a human operator to guide a robot, via haptic interaction with the virtual environment, to perform such high-skill tasks more effectively and with less skill required of the operator, i.e., with the robot assisting through low-level autonomous actions. We will especially consider compliant and soft robots for flexibility and operational safety. Our work will focus on canonical manipulation challenges that may include insertion tasks, deformable object interaction, and trajectory tracking within complex boundaries.
Student(s) on the project:
– Raagini Rameshwar (PhD, RBE)
– Mark Robinson (MS, RBE)
Relevant publication(s):
– E.H. Skorina, C.D. Onal, “A soft robotic wearable wrist device for kinesthetic haptic feedback”, Frontiers in Robotics and AI, (2018).
– E.H. Skorina, R. Rameshwar, S. Pirasmepulkul, T.K. Khuu, A. Caracappa, P. Luxsuwong, M. Luo, W.R. Michalson, C.D. Onal, “Soft Robotic Glove System for Wearable Haptic Teleoperation”, Waste Management Symposium, (2018).
– S. Pirasmepulkul, T.K. Khuu, A. Caracappa, P. Luxsuwong, M. Luo, W.R. Michalson, C.D. Onal, “Haptic glove as a wearable force feedback user interface”, US Patent App. No. 15/586,684, published Nov. 9, 2017.
– S. Li, R. Rameshwar, A.M. Votta, C.D. Onal, “Intuitive Control of a Robotic Arm and Hand System With Pneumatic Haptic Feedback”, IEEE Robotics and Automation Letters (RA-L), 4(4): 4424-4430, (2019).


Optimizing Inkjet Printing of Silver Ink for Flexible Miniaturized Circuits

NRT Faculty: Pratap Rao
Project Description: This project addresses reliability barriers that have prevented inkjet printing from being adopted for digital printing of electronic circuits and sensors, and develops the capability to print circuit traces narrower than 30 micrometers to enable miniaturized circuits. Ink-substrate combinations and printing process parameters are being evaluated for commercially available conductive, dielectric, and encapsulant inks on flexible substrates, using a commercially available printing platform with a high-throughput printhead. This printing capability is being investigated for digital prototyping of flexible and stretchable circuits and sensors for various applications, including human-robot interaction. The project is funded and advised by the NextFlex Manufacturing USA Institute. Raytheon Technologies Research Center, Carpe Diem Technologies, and Eastman Chemical Co. are participating in the project, while Boeing Company is serving as a project advisor.
Student(s) on the project:
– Nick Pratt (PhD)


User Interaction and Assistance with Soft Robotic Systems

NRT Faculty: Cagdas Onal
Project Description: Soft materials are particularly suited for wearable systems as they provide a safe and comfortable means for human-robot interaction. In this research, we apply soft robotics to the problem of robot teleoperation, particularly regarding haptic feedback.
We have developed a teleoperation system with safe, realistic force feedback for intuitive control of a robotic arm and an anthropomorphic robotic hand as its end effector. The system interfaces with the user via a novel haptic data glove. This glove detects the state of the hand using inertial measurement units (IMUs) and custom curvature sensors and employs pneumatic muscles to provide force feedback. The glove itself weighs only 58 grams, and the glove combined with IMUs and tether weighs 213 grams. We used this glove to control a Kinova Jaco robotic arm and a custom 3D printed hand with embedded force sensors.
To provide realistic force feedback, we have developed “haptic muscles”: soft pneumatic pouches that fit around the user’s fingers. When a pouch inflates, it applies gentle pressure to the user’s knuckles, holding the hand open. This imitates how a grasped object would prevent the user’s fingers from closing, by instead pushing the fingers open.
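As a minimal sketch of the feedback path described above, a fingertip force measured at the robot hand can be mapped to a pouch inflation pressure and clamped to a comfortable maximum. The force and pressure ranges and the linear mapping below are illustrative assumptions, not the project's calibrated values.

```python
# Sketch: force-to-pressure mapping for a haptic pouch.
# MAX_FORCE_N and MAX_PRESSURE_KPA are illustrative assumptions,
# not the project's calibrated hardware limits.

MAX_FORCE_N = 10.0       # fingertip force treated as a "full grasp"
MAX_PRESSURE_KPA = 30.0  # comfortable upper bound for pouch inflation


def pouch_pressure(force_n: float) -> float:
    """Linear force-to-pressure mapping, clamped to [0, MAX_PRESSURE_KPA]."""
    force = min(max(force_n, 0.0), MAX_FORCE_N)
    return (force / MAX_FORCE_N) * MAX_PRESSURE_KPA
```

Clamping at both ends keeps the pouch from inflating past a safe pressure on hard contacts while guaranteeing zero pressure (a fully free hand) when no grasp force is sensed.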
Currently, our work involves using this teleoperation system as a learning-from-demonstration tool, to impart human perception of intricate forces onto an autonomous robotic system. The goal is for an experienced user to complete complicated tasks, made possible by the haptic feedback, and for the robot to use these demonstrations to learn how to do similar complicated tasks.
This project will continue to explore the physical interaction, collaboration, and assistance capabilities of soft robotic systems, including soft manipulators such as our origami-inspired continuum robots and soft mobile robots such as our soft robotic snakes.
Student(s) on the project:
– Raagini Rameshwar (PhD, RBE)
– Robin Hall (PhD, RBE)
– Yinan Sun (PhD, RBE)
– Shou-Shan Chiang (PhD, RBE)
Relevant publication(s):
– H. Mao, J. Santoso, C.D. Onal, J. Xiao, “Sim-to-real Transferable Object Classification through Touch-based Continuum Manipulation”, International Symposium on Experimental Robotics (ISER), (2018).


Assisted tele-manipulation interface for nursing robots

Comparing representative tele-nursing interfaces by performance, workload, and user learning effort

NRT Faculty: Jane Li
Project Description: Tele-nursing robots provide a safe approach to patient care in quarantine areas. Effective nurse-robot collaboration requires ergonomic teleoperation interfaces that are intuitive and reduce physical and cognitive workload. We propose an evaluation framework for control interfaces that supports the iterative development of an intuitive, efficient, and ergonomic teleoperation interface. We first use pre-defined objective and subjective metrics to evaluate three representative designs of contemporary teleoperation interfaces. The results indicate that teleoperation via human motion mapping outperforms the gamepad and stylus interfaces; the trade-off is the non-trivial physical fatigue that motion mapping induces. To better understand the impact of this heavy physical demand, we propose an objective assessment of physical workload in robot teleoperation using electromyography (EMG). We find that physical fatigue occurs mostly in actions that involve precise manipulation and steady posture maintenance. We further implement teleoperation assistance in the form of shared autonomy to eliminate the fatigue-causing components of motion-mapping teleoperation. Experimental results show that this simple autonomous feature effectively reduces physical effort while improving the efficiency and accuracy of the teleoperation interface.
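The EMG-based workload assessment described above can be illustrated with a standard fatigue indicator from the surface-EMG literature: the decline of the power-spectrum median frequency during sustained contraction. The sketch below is a minimal, generic implementation of that indicator; the window length, sampling rate, and overall pipeline are illustrative assumptions, not this project's actual analysis.

```python
# Sketch: a generic EMG fatigue indicator (median-frequency decline).
# Window length and sampling rate are illustrative assumptions,
# not the parameters used in this project's studies.
import numpy as np


def median_frequency(window, fs):
    """Median frequency (Hz) of one EMG window's power spectrum."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    cumulative = np.cumsum(spectrum)
    # Frequency below which half of the total spectral power lies.
    idx = np.searchsorted(cumulative, cumulative[-1] / 2.0)
    return freqs[idx]


def fatigue_trend(emg, fs, window_s=1.0):
    """Median frequency per non-overlapping window; a downward trend
    over a sustained contraction is a common sign of muscle fatigue."""
    n = int(window_s * fs)
    return [median_frequency(emg[i:i + n], fs)
            for i in range(0, len(emg) - n + 1, n)]
```

A falling trend across windows would flag the fatigue-prone actions (precise manipulation, steady posture maintenance) that the shared-autonomy assistance targets.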
Student(s) on the project:
– Tsung-Chi Lin (PhD), Achyuthan Unni Krishnan (PhD)
Relevant publication(s):
– Tsung-Chi Lin, Achyuthan Unni Krishnan and Zhi Li, “Intuitive, Efficient and Ergonomic Tele-Nursing Robot Interfaces: Design Evaluation and Evolution”, submitted to ACM Transactions on Human-Robot Interaction (THRI), 2020.
– Tsung-Chi Lin, Achyuthan Unni Krishnan and Zhi Li, “Shared Autonomous Interface for Reducing Physical Effort in Robot Teleoperation via Human Motion Mapping”, in 2020 IEEE International Conference on Robotics and Automation (ICRA), 2020.
– Tsung-Chi Lin, Achyuthan Unni Krishnan and Zhi Li, “Physical Fatigue Analysis of Assistive Robot Teleoperation via Whole-body Motion Mapping”, in 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2240–2245, IEEE, 2019.


Perception-action coordination in the usage of active telepresence cameras

NRT Faculty: Jane Li
Project Description: This work aims to (1) understand human preference in camera selection and control during the teleoperation of a complex robot system with multiple perception and action components, and (2) develop robot autonomy for camera selection and control to assist teleoperation. We conducted a user study to investigate natural human perception-action coordination in the usage of various active telepresence cameras, in coordination with gross and precise manipulation. The lessons learned from this human movement study helped us identify the camera selection and control objectives, and inspired the design of the human-robot interface, autonomous teleoperation assistance, and best practices for teleoperator training.
Student(s) on the project:
– Open to prospective PhD students
Relevant Publication(s):
– Alexandra Valiton, Hannah Baez, Naomi Harrison, Justine Roy, and Zhi Li, “Active Telepresence Assistance for Supervisory Control: A User Study with a Multi-Camera Tele-Nursing Robot”, submitted to 2021 IEEE International Conference on Robotics and Automation (ICRA).
– Alexandra Valiton and Zhi Li, “Perception-Action Coupling in Usage of Telepresence Cameras”, in 2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 3846–3852, IEEE, 2020. Finalist, Best Paper in Human-Robot Interaction.


Assisted navigation interfaces for high-mobility nursing robots

NRT Faculty: Jane Li
Project Description: This project aims to develop assisted teleoperation interfaces that enable nursing robots to operate in cluttered human environments at human operational speed. The proposed interface will leverage state-of-the-art robot autonomy used for driving assistance in (semi-)autonomous vehicles and for assisted navigation of various mobile robots (e.g., UAVs, UGVs). Such interfaces will enable nursing robots to efficiently perform high-mobility tasks such as fetching and delivering medical supplies, patient room cleaning and disinfection, and mobile active telepresence for patient evaluation and social communication. We will also investigate the integration of visual, haptic, and verbal communication interfaces, and explore the mutual adaptation between assisted navigation interfaces and teleoperators.
Student(s) on the project:
– Zhuoyun Zhong (PhD)
Relevant Publication(s):
– TBA


Supervisory control interfaces for individual and team nursing robots

NRT Faculty: Jane Li
Project Description: This project aims to develop supervisory control for individual and team nursing robots. So far, we have developed a graphical user interface that supervises tele-manipulation tasks and handles errors caused by unreliable autonomy using action-level commands. We have also developed a prototype interface for the supervisory control of a nursing robot team. Future work will extend these prototype interfaces to the supervisory control of comprehensive nursing tasks that involve the coordination of manipulation, navigation, and active telepresence.
Student(s) on the project:
– Open to prospective PhD students
Relevant Publication(s):
– Samuel White, Keion Bisland, Michael Collins, and Zhi Li, “Design of a High-level Teleoperation Interface Resilient to the Effects of Unreliable Robot Autonomy”, in 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 11519–11524, IEEE, 2020.


Learning physical properties for human-robot collaboration in object manipulation

NRT Faculty: Jing Xiao
Project Description: This research aims to enable a robot to learn the physical parameters of unknown objects in order to assist a human worker in picking up and manipulating those objects.
Student(s) on the project:
– Sean McGovern (2nd year Ph.D., RBE)


Soft Psychophysiology: An Investigation of Soft Robotic Sensors as Psychophysiological Measuring Tools

NRT Faculty: Cagdas Onal and Jeanine Skorinko
Project Description: This project creates a soft robotic psychophysiological sensor to detect stress in the trapezius muscle, integrating psychophysiology and soft robotics.
Student(s) on the project:
– Shannon Carey (PSY/RBE)
– Sam Milender
– Ryan Breuer
Relevant publication(s):
– MQP REPORT: https://web.wpi.edu/Pubs/E-project/Available/E-project-051520-050050/