
Wearable Interface for Teleoperation of Robot Arms - WITRA

 
Vinicius Bazan Pinto Fernandes
Universidade de São Paulo
São Carlos, Brazil
 
Entry date: 03-Jun-2014
Final Submission: 01-Jan-2015
WITRA (Wearable Interface for Teleoperation of Robot Arms) allows the user to teleoperate a manipulator robot in an intuitive manner. By capturing the movement of the user's arm and calculating the position of the hand, the system uses the data to generate the trajectory of a robot arm, so that the user has control over the movement of the robot's end effector in Cartesian space. Because this is done remotely, the user does not need to share the same environment as the robot.
 

Project details


1. Project Motivation and Overview

The idea of developing WITRA was motivated by the concept of robot teleoperation. Intended to be more intuitive than existing products, the teleoperation of the robot arm is based on the movements of the human arm. Instead of using joysticks or smart devices such as tablets, the user operates the robot simply by moving his/her arm, in a very natural way. The goal is for the robot arm to act like an extension of the user's body, moving through space along a trajectory that corresponds to the trajectory of the user's hand. Furthermore, the wearable interface (WITRA) is intended to be fully wireless, providing the user with a more natural experience, without cables lying around.

Intuitiveness is the keyword here. A study performed at the NYU Polytechnic School of Engineering by Vinicius Bazan, head of this project, showed that using a wearable interface for robot arm operation gives better performance than using other intuitive devices, such as videogame controllers, smartphones and tablets. WITRA is thus intended to enable any user - with or without a technological background - to teleoperate a robot arm with the same degree of success as using his/her own arm to perform tasks.

Teleoperation tasks include bomb defusal, operation in hazardous environments such as radioactive rooms, and deep-ocean equipment maintenance, among many others.

A further use for WITRA is in situations where the user is not strong enough to lift an object and can perform the task using a robot arm, while doing so as if he/she were picking up the object himself/herself.

The components used in this project are described in detail below.

2. Components

2.1. Getting the user's arm movement

The human arm has essentially 7 rotational degrees of freedom (DOF): 3 at the shoulder, 1 at the elbow and 3 at the wrist. However, unless the robot arm is anthropomorphic and built with exactly the same DOFs as the human arm, the robot and the user's arm won't have the same geometry. An interesting approach to the implementation of WITRA is to make it possible to operate any type of robot, not only anthropomorphic ones.

Degrees of Freedom of the Human Arm

To make this possible, an initial idea would be to measure the angle of rotation of certain joints of the user's arm (different DOFs) and use this information as the reference for the position of each rotational joint of the robot arm. However, as said before, this would only make sense for an anthropomorphic robot. For, say, a SCARA or cylindrical robot, it would be no more natural than using a tablet app to operate the robot, since it would consist of a simple correlation of actions (rotating one joint of the user's arm or, for example, tilting the tablet) and results (the rotation of a single joint of the robot arm).

To this end, we've implemented a spatial correlation between the user's hand and the robot's end effector. This works by measuring each DOF of the human arm and using this information to calculate its forward kinematics, resulting in an (x,y,z) position of the user's hand. On the other side, we want to control the robot's trajectory with Cartesian coordinates as the reference. This will be explained shortly.

In order to measure each rotation of the human arm, inertial measurement units (IMUs) are used. Three of them are used: one at the user's hand, one at his/her forearm and the third at his/her upper arm. Each IMU is capable of calculating its yaw, pitch and roll, so we have 9 angle measurements. With this (redundant) information, one can determine the value of each rotation of the 7 DOFs of the user's arm in order to calculate the forward kinematics.
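As a rough illustration of this step, the sketch below (Python, with assumed segment lengths and axis conventions; not the project's actual code) sums the three segment vectors, each rotated by its IMU's absolute orientation, to obtain the hand position relative to the shoulder:

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Z-Y-X (yaw, pitch, roll) rotation matrix, angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def hand_position(imu_angles, segment_lengths=(0.30, 0.25, 0.08)):
    """Forward kinematics of the arm: each IMU reports the absolute
    orientation of one segment (upper arm, forearm, hand), so the hand
    position relative to the shoulder is the sum of the segment vectors
    rotated into the world frame.

    imu_angles      -- three (yaw, pitch, roll) tuples in radians
    segment_lengths -- upper arm, forearm, hand lengths in metres (assumed values)
    """
    position = np.zeros(3)
    for (yaw, pitch, roll), length in zip(imu_angles, segment_lengths):
        # Assume each segment points along its local x-axis at zero rotation.
        position += rotation_matrix(yaw, pitch, roll) @ np.array([length, 0.0, 0.0])
    return position

# Example: arm fully stretched forward (all angles zero)
print(hand_position([(0.0, 0.0, 0.0)] * 3))   # -> [0.63 0.   0.  ]
```

Because each IMU already provides an absolute orientation, no Denavit-Hartenberg parameters or joint-by-joint transformation chain is needed for this simplified version.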

The IMUs used are the Razor 9DoF model.

IMU Razor 9DoF

2.2. Using the Toradex module and processing the data

The data from the three sensors is sent to a Toradex module over a serial interface and processed by a real-time application. As previously mentioned, once we have the measurements of the 7 DOFs of the user's arm, it is possible to calculate the forward kinematics of the human arm. This gives an (x,y,z) position, which, after proper mapping, is the input to the inverse kinematics of the robot arm. In other words, given a desired position for the end effector of the robot, it is possible to calculate the corresponding set of joint rotation angles.
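The mapping step can be pictured as a simple axis-by-axis rescaling from the user's reachable volume to the robot's work envelope. The sketch below is only illustrative; the limits are assumptions, not the project's calibration values:

```python
import numpy as np

# Illustrative workspace limits in metres; the real project derives these
# from the user's reach and from the SCARA work envelope.
USER_MIN,  USER_MAX  = np.array([-0.6, -0.6, -0.6]), np.array([0.6, 0.6, 0.6])
ROBOT_MIN, ROBOT_MAX = np.array([0.2, -0.4, 0.0]),   np.array([0.6, 0.4, 0.2])

def map_to_robot(hand_xyz):
    """Linearly map a hand position from the user's workspace to the robot's
    work envelope, axis by axis, clamping so the target stays reachable."""
    hand = np.clip(hand_xyz, USER_MIN, USER_MAX)
    scale = (ROBOT_MAX - ROBOT_MIN) / (USER_MAX - USER_MIN)
    return ROBOT_MIN + (hand - USER_MIN) * scale

# Example: the centre of the user's workspace maps to the centre of the robot's.
print(map_to_robot(np.array([0.0, 0.0, 0.0])))   # -> [0.4 0.  0.1]
```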

Block Diagram of the information

Figure 1: Block diagram of the information flow

Finally, the calculated rotation angles of each robot joint are sent over Ethernet to the controller of the actual robot, which can be miles away from the user, thus making teleoperation possible.
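As a sketch of that last step, the snippet below sends one set of joint set-points over a TCP socket; the host, port and message format are placeholders rather than the actual controller protocol:

```python
import json
import socket

def send_joint_targets(angles, host="192.168.0.10", port=5000):
    """Send one set of joint set-points to the robot controller as a JSON
    line over TCP. Host, port and message format are placeholders; the
    real controller protocol may differ."""
    message = json.dumps({"joints": list(angles)}) + "\n"
    with socket.create_connection((host, port), timeout=1.0) as sock:
        sock.sendall(message.encode("utf-8"))

# Example (requires a listening controller/server at the address above):
# send_joint_targets([1.05, 0.52, 0.10, 0.0])
```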

Using the Toradex module will make the project totally mobile and wireless. A previous version of the project used a PC instead of the Toradex module, so many cables were necessary. The biggest problem was that the PC was not very mobile and made it difficult to take the project anywhere. With an embedded computer, everything will be readily at hand for the user.

2.3. Robot Arm

The robot used in this project is an IBM 7545 SCARA. This is a very practical robot, since it can reach a wide range of positions in space more conveniently than a Cartesian robot.

SCARA

The robot: a SCARA (Selective Compliance Assembly Robot Arm) 7545 from IBM


This robot has 4 degrees of freedom (DOF): 3 rotational and 1 prismatic. This configuration allows the robot to reach a great variety of positions in space, creating a versatile work envelope, and also makes it faster than a Cartesian robot.

schematic

Figure 2: Schematic of the SCARA.

The wide range of positions that this robot can reach in space is due to the wide range of its joints. The first rotational joint can rotate from 0 to 200 degrees; the second, from 0 to 135 degrees; the prismatic joint can move from 0 millimeters (upper position) to 200 millimeters (lower position); and the last joint (tool rotation) can rotate from -180 to 180 degrees.
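For reference, a geometric inverse-kinematics sketch for this kind of SCARA is shown below (Python, with placeholder link lengths rather than the measured IBM 7545 dimensions):

```python
import math

# Placeholder link lengths in metres; the real values come from the IBM 7545 geometry.
L1, L2 = 0.40, 0.25

def scara_ik(x, y, z):
    """Geometric inverse kinematics of a SCARA: the two revolute joints place
    the end effector at (x, y) in the horizontal plane and the prismatic
    joint sets the height z directly. Returns one of the two solution branches."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if abs(c2) > 1.0:
        raise ValueError("target outside the work envelope")
    theta2 = math.acos(c2)                                   # branch with theta2 in [0, pi]
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2), L1 + L2 * c2)
    d3 = z                                                   # prismatic stroke, 0 to 0.200 m
    return theta1, theta2, d3

# Example: a point straight ahead, near full extension.
print(scara_ik(0.60, 0.10, 0.05))
```

The joint limits listed above would then be applied as range checks on theta1, theta2 and d3 before a command is accepted.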

3. System and Software

The software is responsible for acquiring data from the IMUs, processing it (i.e., filtering, calculating the direct kinematics of the human arm and mapping to the robot's workspace) and sending the data over TCP/IP. The basic scheme is shown below:

software
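Of these steps, the filtering has not been detailed above. A minimal sketch of a first-order low-pass filter on the IMU angles is shown below; the filter type and coefficient are illustrative assumptions, not necessarily what runs on the Colibri:

```python
class LowPassFilter:
    """First-order exponential low-pass filter used to smooth noisy IMU
    angle readings before the kinematics step. alpha near 1.0 follows the
    raw signal closely; smaller values smooth more (value is illustrative)."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None

    def update(self, sample):
        """Feed one sample (a sequence of angles) and return the filtered values."""
        if self.state is None:
            self.state = list(sample)
        else:
            self.state = [self.alpha * new + (1.0 - self.alpha) * old
                          for new, old in zip(sample, self.state)]
        return self.state

# Example: smoothing successive yaw/pitch/roll readings from one IMU.
lpf = LowPassFilter(alpha=0.3)
print(lpf.update([10.0, 0.0, 5.0]))
print(lpf.update([12.0, 1.0, 4.0]))   # values pulled partway toward the new sample
```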

 

4. Social Media

Make sure you also follow our news and updates on our social channels below. 

YouTube  |  Facebook  |  Email


You can also watch our videos on the playlist below:

Updates

Update #10108  |  03 Jul 2014
We've tested the LPMS-B IMU from LP-Research, which features a Bluetooth connection to a host.
DeLock Bluetooth USB Adapter
Update #10109  |  09 Aug 2014
Bluetooth dongles are a bit difficult to find because almost every laptop/device already has built-in Bluetooth capability. We tested a few adapters and found one that works well with the Toradex Colibri running Windows CE 7.
IMU Razor 9DoF
Update #10113  |  19 Sep 2014
In this update we present the Razor 9DoF IMU. This is the IMU we are going to use in our project from now on. We are using a Bluetooth modem to make it wireless.
Human Arm
Update #10117  |  08 Oct 2014
A key factor in this project is the kinematics of the human arm. We use it in order to convert the angles measured by the IMUs to the position of the user's wrist.
Update #10123  |  05 Nov 2014
In this update I show you how I'm acquiring real-time data from the IMU on the Toradex Colibri module and then sending it to MATLAB.
SCARA schematic
Update #10126  |  11 Nov 2014
We show here how we calculate the inverse kinematics of the IBM SCARA 7545 robot in a simple geometric way.
Update #10127  |  11 Nov 2014
We show how to use the Robotics Toolbox from Peter Corke to generate a 3D animation of the SCARA 7545 robot in MATLAB.
SCARA work envelope
Update #10128  |  12 Nov 2014
The goal of this update is to describe how the work envelope of the user is geometrically mapped to that of the robot.
Update #10130  |  16 Nov 2014
See how WITRA reached the next step: one IMU connected to the Colibri + Iris sends data wirelessly to MATLAB to control the SCARA's 3D animation.
Update #10135  |  26 Nov 2014
Impedance control is an effective way to keep the user and the robot's surroundings safe, preventing hard collisions.
Update #10137  |  28 Nov 2014
In this video you can see how we send the end-effector's Cartesian position over TCP/IP from a PC to the SCARA's server.
arm axes
Update #10140  |  01 Dec 2014
A simpler way to calculate the direct kinematics of the human arm: no Denavit-Hartenberg convention, no transformation matrices and fewer calculations.
Update #10141  |  02 Dec 2014
Watch the video to see our second teleoperation test before we move to the actual robot. Now we are getting the user's hand position with 2 IMUs.
SCARA connections
Update #10142  |  04 Dec 2014
It's time to show the actual robot being teleoperated. After validating the teleoperation data, it is now possible to use WITRA with the SCARA, for which impedance control is implemented.
Update #10144  |  06 Dec 2014
We present here a quick video that shows the new addition to WITRA: the third (and last) IMU, at the user's hand. Now we have finer measurements.
Qualisys
Update #10148  |  13 Dec 2014
Qualisys is a motion capture (or mocap) system. It was used to capture the movement of the human arm along with WITRA, so that we can compare the two motion capture systems.
SolidWorks
Update #10151  |  17 Dec 2014
We used a 3D printer to print cases for the IMUs. This provides better attachment of the IMUs to the user's arm, as well as protecting the sensors.
hardware
Update #10152  |  23 Dec 2014
This update shows in detail the final version of WITRA in terms of hardware. WITRA uses 3 IMUs, a button to control the robot gripper, a Toradex Colibri + Iris and a Wi-Fi adapter.
software
Update #10153  |  24 Dec 2014
Take a look at this update to learn in detail how the code works. You can also watch the video, in which I guide you through the code.
Update #10154  |  24 Dec 2014
We happily reached the end of our participation in the Toradex Challenge and now present all the cool things we've achieved during this time.
