Things used in this project
Hardware components:
- STM32F7 Discovery board
- 3x 9-axis IMU sensors
- Servomotors (3 DOF robotic arm)
- Battery and PCB
Hand tools and fabrication machines:
- 3D printer (200 x 200 x 200 mm)
Software Tools:
- IAR Embedded Workbench for ARM
- Visual Studio (C#)
This project was started to explore human-robot interaction, in particular intuitive control based on IMU sensors and sensor fusion. The main goals of the project are:
- Focus on inverse kinematics driven by the user's hand position.
- Build a prototype to assess the feasibility of such a system and develop the required software and hardware.
- Keep the platform modular so it can suit most wearable-robotics applications.
- Evaluate IMU sensors as an interface between wearable robots and humans.
- Identify and solve the difficulties involved in designing and developing wearable robots.
The user's hand position is obtained from 3 IMU sensors placed on the arm. Using their 9-axis data, a model of the arm is built in a C# application, and from this model the hand position is calculated. Finally, the position is sent to the STM32F7.
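The arm model amounts to chaining the three segment vectors from the shoulder. A minimal sketch, assuming each IMU reports its segment's absolute pitch and yaw after sensor fusion (the segment lengths, method names and class here are illustrative, not the project's actual code):

```csharp
using System;

class ArmModel
{
    // Illustrative segment lengths in cm: upper arm, forearm, hand.
    static readonly double[] Lengths = { 30.0, 25.0, 8.0 };

    // Hand position in the shoulder frame: sum of each segment's
    // direction vector (from its IMU's pitch/yaw) times its length.
    static double[] HandPosition(double[] pitch, double[] yaw)
    {
        double x = 0, y = 0, z = 0;
        for (int i = 0; i < Lengths.Length; i++)
        {
            x += Lengths[i] * Math.Cos(pitch[i]) * Math.Cos(yaw[i]);
            y += Lengths[i] * Math.Cos(pitch[i]) * Math.Sin(yaw[i]);
            z += Lengths[i] * Math.Sin(pitch[i]);
        }
        return new[] { x, y, z };
    }

    static void Main()
    {
        // Arm stretched straight forward: all pitch/yaw angles are zero,
        // so the hand sits at the sum of the segment lengths on the x axis.
        var p = HandPosition(new double[3], new double[3]);
        Console.WriteLine($"{p[0]} {p[1]} {p[2]}");
    }
}
```

With all angles at zero this prints `63 0 0`, the fully extended arm. The real application would feed fused quaternion or Euler data from the IMUs into the same chaining step.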
GUI of the C# application
Inverse kinematics and robotic arm model
Once the hand position is received from the IMU suit, the STM32F7 runs an inverse-kinematics solver to position the servomotors accordingly.
Denavit-Hartenberg model for the robotic arm and inverse kinematics formulas
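For reference, the two-link, law-of-cosines solution this kind of 3 DOF arm uses can be written out as follows (a standard reconstruction consistent with the C# routine in this write-up; theta1 is the base rotation, theta2 the shoulder, theta3 the elbow, a2 and a3 the link lengths, and (X0, Y0, Z0) the shoulder position):

```latex
\theta_1 = \operatorname{atan2}(y,\, x)

D = \frac{(x - X_0)^2 + (y - Y_0)^2 + (z - Z_0)^2 - a_2^2 - a_3^2}{2\, a_2 a_3}

\theta_3 = \operatorname{atan2}\!\left(-\sqrt{1 - D^2},\ D\right)

\theta_2 = \operatorname{atan2}\!\left(z - Z_0,\ \sqrt{(x - X_0)^2 + (y - Y_0)^2}\right)
         - \operatorname{atan2}\!\left(a_3 \sin\theta_3,\ a_2 + a_3 \cos\theta_3\right)
```

Here D is the cosine of the elbow angle, obtained from the law of cosines on the triangle formed by the two links and the shoulder-to-target line; the negative square root selects the elbow-down configuration.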
A 3D-printed case was designed to house the STM32F7 Discovery board, the battery and a PCB. 3D printing was also used to build the 3 DOF arm.
There are several targeted use cases:
- Heavy-lifting assistant (e.g. building construction, factory work…).
- "Backpack assistant": integrated into a backpack, the robot can exchange objects with the user.
- "Smart camera stand": a camera stand with intuitive control using IMU sensors.
Planned improvements:
- Implement force control to enable heavy-lifting applications.
- Add flexion sensors on the hand to extend control over the robotic arm (pointing/grasping movements…).
- Implement fuzzy logic, neural-network training and an ANFIS network.
ANFIS (Adaptive Neuro-Fuzzy Inference System) example on a 2 DOF arm
- To be uploaded later
- Inverse kinematics algorithm in C#
// Calculate the angles of the three servomotors so the end effector
// reaches the given (x, y, z) coordinates.
private Vec4 inverseKinematics3D(double x, double y, double z)
{
    Vec4 servoAngles = new Vec4();
    double theta1, theta2, theta3;
    double X0, Y0, D;
    double Z0 = 5;   // shoulder height
    double a1 = 5;   // base link length
    double a2 = 35;  // upper link length
    double a3 = 20;  // forearm link length

    // Base rotation: Atan2 handles x = 0 and picks the correct quadrant.
    theta1 = Math.Atan2(y, x);

    // Horizontal position of the shoulder joint in the base frame.
    X0 = Math.Sqrt(Math.Pow(a1, 2) - Math.Pow(Z0, 2)) * Math.Cos(theta1);
    Y0 = Math.Sqrt(Math.Pow(a1, 2) - Math.Pow(Z0, 2)) * Math.Sin(theta1);

    // Law of cosines: D = cos(theta3). Note the squared link lengths.
    D = (Math.Pow(x - X0, 2) + Math.Pow(y - Y0, 2) + Math.Pow(z - Z0, 2)
         - Math.Pow(a2, 2) - Math.Pow(a3, 2)) / (2 * a2 * a3);

    // Elbow-down solution; Atan2 avoids division by zero when D = 0.
    theta3 = Math.Atan2(-Math.Sqrt(1 - Math.Pow(D, 2)), D);

    theta2 = Math.Atan2(z - Z0, Math.Sqrt(Math.Pow(x - X0, 2) + Math.Pow(y - Y0, 2)))
             - Math.Atan2(a3 * Math.Sin(theta3), a2 + a3 * Math.Cos(theta3));

    servoAngles.x = imuzDraw1.Rad2Deg(theta1);
    servoAngles.y = imuzDraw1.Rad2Deg(theta2);
    servoAngles.z = imuzDraw1.Rad2Deg(theta3);
    return servoAngles;
}
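The geometry can be sanity-checked by feeding the solved angles back through forward kinematics and verifying that the target point is recovered. A standalone sketch using the same constants (a1 = 5, a2 = 35, a3 = 20, Z0 = 5) and the law-of-cosines form with squared link lengths; the target point is illustrative:

```csharp
using System;

class IkCheck
{
    const double a1 = 5, a2 = 35, a3 = 20, Z0 = 5;

    static void Main()
    {
        // Illustrative reachable target.
        double x = 30, y = 10, z = 20;

        // Inverse kinematics (elbow-down solution).
        double theta1 = Math.Atan2(y, x);
        double r = Math.Sqrt(a1 * a1 - Z0 * Z0); // shoulder offset (0 here)
        double X0 = r * Math.Cos(theta1), Y0 = r * Math.Sin(theta1);

        double d2 = Math.Pow(x - X0, 2) + Math.Pow(y - Y0, 2) + Math.Pow(z - Z0, 2);
        double D = (d2 - a2 * a2 - a3 * a3) / (2 * a2 * a3); // cos(theta3)
        double theta3 = Math.Atan2(-Math.Sqrt(1 - D * D), D);
        double theta2 = Math.Atan2(z - Z0, Math.Sqrt(Math.Pow(x - X0, 2) + Math.Pow(y - Y0, 2)))
                      - Math.Atan2(a3 * Math.Sin(theta3), a2 + a3 * Math.Cos(theta3));

        // Forward kinematics with the solved angles should land on the target.
        double planar = a2 * Math.Cos(theta2) + a3 * Math.Cos(theta2 + theta3);
        double fx = X0 + planar * Math.Cos(theta1);
        double fy = Y0 + planar * Math.Sin(theta1);
        double fz = Z0 + a2 * Math.Sin(theta2) + a3 * Math.Sin(theta2 + theta3);

        Console.WriteLine(Math.Abs(fx - x) < 1e-9
                       && Math.Abs(fy - y) < 1e-9
                       && Math.Abs(fz - z) < 1e-9);
    }
}
```

This prints `True` for any target within the arm's reach; a target outside the workspace makes D fall outside [-1, 1], which a production version should detect before taking the square root.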
Schematics and circuit diagrams
CAD - Enclosures and custom parts