Finally, Intrinsic (a spin-off of Google X) has revealed the product they have been working on with the help of the Open Source Robotics Corporation team (among others): Flowstate!
What is Flowstate?
Introducing Intrinsic Flowstate | Intrinsic (image copyright by Intrinsic)
Flowstate is a web-based tool designed to simplify the creation of software applications for industrial robots. It provides a user-friendly visual environment where blocks can be combined to define the desired behavior of an industrial robot for specific tasks.
Good points
Flowstate offers a range of features, including simulation testing, debugging tools, and seamless deployment to real robots.
It is based on ROS, so we should be able to use our favorite framework and all the existing software with it, including Gazebo simulations.
It has a behavior-tree-based system to graphically control the flow of the program, which simplifies creating programs to just moving blocks around. It is also possible to switch to an expert mode and edit the code manually.
It has a library of existing robot models and hardware ready to be added, but you can also add your own.
Additionally, the application provides pre-built AI skills that can be used as modules to achieve complex AI results without manual coding.
One limitation (which I actually consider a good point) is that the tool is aimed at industrial robots, not service robots in general. This is good because it gives the product focus, especially for this initial release.
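Flowstate's internals are not public, so purely as an illustrative sketch (all names and the node structure below are invented for the example), this is the control-flow idea behind a behavior tree: a sequence node ticks its children in order and fails fast, so a task like pick-and-place becomes a chain of reusable blocks.

```python
# Toy behavior tree: a Sequence node runs its children in order and
# stops at the first failure. This only illustrates the control-flow
# idea behind tools like Flowstate; everything here is invented.

class Action(object):
    def __init__(self, name, func):
        self.name = name
        self.func = func

    def tick(self):
        return self.func()  # returns "SUCCESS" or "FAILURE"

class Sequence(object):
    def __init__(self, children):
        self.children = children

    def tick(self):
        for child in self.children:
            if child.tick() != "SUCCESS":
                return "FAILURE"
        return "SUCCESS"

log = []

def make_step(name, ok=True):
    # Build a leaf action that records its name when ticked
    def step():
        log.append(name)
        return "SUCCESS" if ok else "FAILURE"
    return Action(name, step)

pick_and_place = Sequence([
    make_step("move_to_part"),
    make_step("close_gripper"),
    make_step("move_to_bin"),
    make_step("open_gripper"),
])

result = pick_and_place.tick()
print(result, log)
```

Swapping a block or reordering steps only changes the tree, not the leaf implementations, which is what makes this style of programming attractive for non-experts.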
Flowstate | Intrinsic (image copyright by Intrinsic)
Based on the official post and the keynote released on Monday, May 15, 2023 (available here: https://intrinsic.ai/blog/posts/introducing-intrinsic-flowstate/), this is the information we have so far. However, we still lack a comprehensive understanding of how the software works, its complete feature set, and its potential limitations. To gain more insight, we must wait until July of this year; I hope to be among the lucky participants selected for the private beta (the open call for the beta is still available here: https://intrinsic.ai/beta/).
Unclear points
Even though I find Intrinsic's proposal interesting, I have identified three potential concerns:
Interoperability across different hardware and software platforms poses a challenge. The recruitment of the full OSRC team by Intrinsic appears to address this issue, given that ROS is currently the closest system in the market to achieve such interoperability. However, widespread adoption of ROS by industrial robot manufacturers is still limited, with only a few companies embracing it.
Ensuring hardware interoperability requires robot manufacturers to adopt a common framework, which is currently a distant reality. What we ROS developers aim for right now is to have somebody build the ROS drivers for the robotic arm we want to use (for example, the manufacturer of the robot or the ROS-Industrial team). However, manufacturers generally hesitate to develop ROS drivers due to potential business limitations and their aim of customer lock-in. Unless a platform dedicates substantial resources to developing and maintaining drivers for supported robots, the challenge of hardware interoperability cannot be solved by a platform alone (that is, in fact, one of the goals ROS-Industrial is trying to achieve).
Google possesses the potential to unite hardware companies towards this goal; as Wendy Tan White, the CEO of Intrinsic, mentioned: “This is an ecosystem effort.” However, it is crucial for the industrial community to perceive tangible benefits and value in supporting this initiative beyond merely helping others build their businesses. The specific benefits that the ecosystem stands to gain by supporting this initiative remain unclear.
Flowstate | Intrinsic (image copyright by Intrinsic)
The availability of pre-made AI skills for robots is a complex matter. Consider the widely used skills in ROS, such as navigation or arm path planning, exemplified by Nav2 and MoveIt, which offer excellent functionality. However, integrating these skills into new robots is not plug-and-play. In fact, dedicated courses exist to teach users how to effectively use the different components of navigation within a robot. This highlights the challenges of implementing such skills for robots in general, so it is reasonable to anticipate similar difficulties with pre-made skills in Flowstate.
A final point that is not clear to me (because it was not addressed in the presentation) is how the company is going to do business with Flowstate. This is very important for every robotics developer because we don't want to be locked into proprietary systems. We understand that companies must have a business, but we want to understand clearly what that business is, so we can decide whether it is convenient for us in both the short and the long run. For instance, RoboMaker from Amazon did not gain much traction because it forced developers to pay for the cloud while running RoboMaker, when they could do the same thing (with less fancy stuff) on their own local computers for free.
Conclusion
Overall, while Flowstate shows promise, further information and hands-on experience are required to assess its effectiveness and address potential challenges.
I have applied to the restricted beta. I hope to be selected so I can have first-hand experience and report on it.
Following up on our previous post about how to control a robot arm with ROS, today we are going to learn how to set up a MoveIt package for controlling the arm.
In short, in this post we will learn the following:
How to create a basic MoveIt package for your own robot
How to test that package by moving and planning with MoveIt
Special thanks to Clarkson University, and especially James Carrol and his team, for lending us the physical Open Manipulator robot.
Getting the code
We are going to create the MoveIt package which uses some URDF files. In order to get the URDF files, please get the following ROSject: http://www.rosject.io/l/b368f5c/
You can open it by clicking on the Open button or you can download it by clicking on the download button, the ones pointed below by the red and green arrows respectively.
The ROSJect mentioned here basically contains the following git repositories on it:
When you open a ROSject, by default you have the Jupyter Notebook automatically open, but if that doesn’t happen, you can manually open it by clicking on Tools -> Jupyter Notebook as shown in the image below:
Now on the Jupyter Notebook window let’s click openmanipulator_morpheus_chair_notebooks, then click on Ep3_MoveIt!_First_Steps.ipynb to open it. That notebook contains the instructions we are going to follow in this post.
The URDF file we are going to use
If you opened or downloaded the ROSject, you will see that we have a package called open_manipulator_support_description with a .xacro file called open_manipulator_support.urdf.xacro. If you opened the ROSject on ROSDS (ROS Development Studio), the full path to that file is: ~/simulation_ws/src/open_manipulator_tc/open_manipulator_support_description/urdf
The content of open_manipulator_support.urdf.xacro is:
Now that we know which urdf/xacro file we are going to use, let's create a MoveIt package. On the ROSject we are using, that package is already created for you and is available at /home/user/catkin_ws/src/openmanipulator_ep2_movit_config, but we are going to show how we created it.
In order to open the MoveIt Setup Assistant, the program used to create our first MoveIt package, we used the following commands:
cd ~/catkin_ws/src
roslaunch moveit_setup_assistant setup_assistant.launch
MoveIt Setup Assistant is a graphical application, so, after running the aforementioned command, on ROSDS you have to open the Graphical Tools by clicking Tools -> Graphical Tools as shown below:
After clicking on that button we should see MoveIt Setup Assistant as in the image below:
If you want the Graphical Tools to be opened in a different tab of the web browser, you can simply click on the Detach button as shown below:
Creating the MoveIt package from scratch
With the MoveIt Assistant open, let’s click on the Create New MoveIt Configuration Package button as shown in the image below:
Now we need to select our URDF file. For that, let’s click on the Browse button under the section Load a URDF or Collada Robot Model and select our URDF file mentioned earlier, which is on the path:
then let’s click Load Files as in the image below.
Now you should have the URDF loaded. Make sure you can see it properly as in the right side of the image below:
Generate Collision Matrix
The first thing that is vital for a robot arm when it moves is to NOT HIT ITSELF. That may seem obvious, but it is a really common way to break a thousand-euro robot arm that doesn't have the correct safety features, like peak-torque detection or some kind of external perception.
We have to generate what is called the Self-Collision Matrix. To do this we use the Self-Collision Matrix Generator. We need this matrix because:
It detects which links will collide with each other when moving.
It also detects the links that will never collide with each other, so we can remove the self-collision calculations for those pairs, lowering the processing burden.
It detects which links will always be in collision; we assume that is normal, and therefore disable those calculations too.
It disables the checks for links that are adjacent in the kinematic chain, whose self-collisions we obviously don't need to compute.
If you look on the left side image above, we are at the Start section. Let’s click on the Self-Collisions button that appears on the left and then click Generate Collision Matrix.
The buttons we clicked are shown below:
The Sampling Density value of 10,000 is how many random robot positions are checked for self-collision. The higher it is, the better the collision matrix it generates, but the more time and processing power we need.
The Min. Collisions value of 95% means that, for a pair of links to be considered always colliding, it has to collide in at least 95% of all the random positions tested.
After clicking Generate Collision Matrix we have the matrix generated as in the image below.
The links are listed in a Linear View. Let's select the Matrix View, which allows a better understanding.
The green cells in the image above mean “Never Collide”.
Note that because the matrix generation is based on random sampling, it can look slightly different every time you generate it:
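The sampling logic described above can be sketched in a few lines. This is not MoveIt's actual implementation (the links, the adjacency set, and the stand-in collision test below are all invented for the example), but it shows how random sampling classifies link pairs into adjacent, never colliding, and always colliding:

```python
import itertools
import random

# Toy sketch of a self-collision matrix generator. MoveIt's real generator
# runs geometric collision checks on the robot meshes for every sampled
# joint state; here a fake test stands in for the geometry.

LINKS = ["base", "link1", "link2", "gripper"]
ADJACENT = {("base", "link1"), ("link1", "link2"), ("link2", "gripper")}

def in_collision(pair, sample):
    # Stand-in for a geometric collision check: in this toy model,
    # "base"/"gripper" collide in roughly 10% of random samples.
    if pair == ("base", "gripper"):
        return sample < 0.1
    return False

def classify_pairs(samples=10000, min_collision_fraction=0.95):
    disabled = {}
    for pair in itertools.combinations(LINKS, 2):
        if pair in ADJACENT:
            disabled[pair] = "adjacent"          # adjacent links: always disable
            continue
        hits = sum(in_collision(pair, random.random()) for _ in range(samples))
        fraction = float(hits) / samples
        if fraction == 0.0:
            disabled[pair] = "never_colliding"   # safe to skip at runtime
        elif fraction >= min_collision_fraction:
            disabled[pair] = "always_colliding"  # assumed normal, skip too
    return disabled

result = classify_pairs()
print(result)
```

Pairs that collide only sometimes (like the toy "base"/"gripper" pair) stay out of the disabled set: those are exactly the collisions the planner must keep checking at runtime.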
Virtual Joints
A robot that moves around a plane is specified using a planar virtual joint that connects the world frame to the frame of the robot. A robot that doesn’t move, will use a fixed joint to connect the base frame to the world.
In our case we select:
Name: virtual_joint (just so we know it's a virtual joint)
Parent: world
Child: base_link, which we want to connect to world.
Joint Type: Fixed, because our robot won't move around.
You can play with this: what if we select as parent a link of another robot, like a TurtleBot that moves around? We will go deeper into these things when the time comes. DON'T FORGET TO HIT SAVE.
Planning Groups
Doing inverse kinematics (IK) is computationally very intensive, which means that the simpler the kinematics to solve, the better. That's why we normally divide a robot into as many groups as allow a correct and easy IK calculation. For example, a robot with TWO arms will normally be divided into LEFT_ARM and RIGHT_ARM, because we don't need to solve the inverse kinematics of both at once, but it all depends on the use case.
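To make the idea of inverse kinematics concrete, here is a toy analytic solver for a 2-link planar arm (the link lengths and function names are invented for the example; a real 6-DOF arm generally needs a numeric solver such as KDL). Forward kinematics is used to verify the answer:

```python
import math

# Toy analytic IK for a 2-link planar arm. For this simple chain a
# closed-form solution exists; longer chains usually require iterative
# numeric solvers, which is where the computational cost comes from.

L1, L2 = 1.0, 1.0  # link lengths in meters (invented for the example)

def ik_2link(x, y):
    """Return (shoulder, elbow) angles in radians reaching point (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)  # law of cosines
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # one of the two possible solutions
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

def fk_2link(shoulder, elbow):
    """Forward kinematics, used here to verify the IK answer."""
    x = L1 * math.cos(shoulder) + L2 * math.cos(shoulder + elbow)
    y = L1 * math.sin(shoulder) + L2 * math.sin(shoulder + elbow)
    return x, y

shoulder, elbow = ik_2link(1.2, 0.8)
x, y = fk_2link(shoulder, elbow)
```

Note that even this 2-link arm already has two valid solutions (elbow up and elbow down); every extra joint multiplies the ambiguity, which is another reason to keep planning groups small.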
We will choose the following:
Solver: kdl_kinematics_plugin/KDLKinematicsPlugin as the kinematics solver. This is the plugin in charge of calculating the inverse kinematics. It's a generic kinematics solver that will be OK for now, and it is the default one in MoveIt. It only works with serial kinematic chains (tensegrities and things like that won't work).
Name: openmanipulator_arm seems appropriate.
Kin Solv. Attempts: 3 seems reasonable.
Planner: for now we leave this as None.
Now we add the joints!
In our case, we select ALL the Joints except the gripper.
Now we add the gripper through joints also:
joint8
joint9
id_6
NOTE that we are NOT adding LINK7 anywhere; that's because it serves no purpose in the planning for the moment.
We can now store premade robot poses to set the robot in safe positions, calibration poses, frequently used positions, etc.
We added four poses:
Two for the OpenManipulator_ARM group
Two for the Gripper group.
End Effector:
We can now add the gripper as an end effector. This unlocks some functionality related exclusively to end effectors:
We set:
Name: gripper
Group ENDEffector: gripper
Parent Link: Link5
Parent Group: not necessary to state it here.
Things we won’t set:
These are elements that we might enable after, but for now, we leave them unset because we don’t need them:
Passive Joints: these are for caster wheels and other joints that aren't actuated.
3D Perception: we don't have any sensors for the moment (we may add them afterwards).
Simulation: when executed, it tells us there is nothing to change.
ROS controllers:
Here we have two options:
Add the controllers manually: This allows us to select which type of control we are adding.
Auto-add FollowJointTrajectory controllers
For the moment we will auto-generate the FollowJointTrajectory controllers. If we need to change them, we just have to re-edit the MoveIt package.
Author: Add the Author info
FINALLY, generate the package
Now is the time to generate the package that we will use:
Select the location: hit Browse and create a new folder with the name of the package you want. In our case, we will call it open_manipulator_morpheuschair_movit_config.
So, that is the post for today. We hope you enjoyed it. Remember that we also have a video showing everything in this post; please have a look at it if some of the things explained here were not clear. If you liked the content, please subscribe to our channel and leave your comments in the comments section of the video, which is available at:
Welcome to Morpheus Chair, the series where you learn how to build and control real robots like the Open Manipulator that we are going to use today.
In this post, we are going to take the first steps with the servos used to control a robot arm, so that you can build whatever you want.
We are going to see how to move the arm with Python scripts, which will be our first steps toward MoveIt, the great library that we are going to use in the next posts.
How to start a position control for Dynamixel Servos
How to move the joints publishing in Topics and Services of ROS
How to capture complex movements so we can reproduce them later.
Special thanks to Clarkson University, and especially James Carrol and his team, for lending us the real robot used in this post.
Let’s get started, shall we?
Where to find what we need
Before we start, it is worth mentioning that we need to be careful when working with the robot, because the servo motors are quite strong. If your fingers or face get in the trajectory of the arm, you can get hurt. Just be careful.
You can download the ROSject and run all the commands in your own computer, but if you prefer, you can also run some of the commands online by using ROS Development Studio (ROSDS).
By clicking on http://www.rosject.io/l/b368f5c/ you will have a copy of the ROSject. You should see something like the image below:
You can now open the ROSject by clicking on the Open button or you can download it by clicking on the download button, the ones pointed below by the red and green arrows respectively.
The ROSJect mentioned here basically contains the following git repositories on it:
When you open a ROSject, by default you have the Jupyter Notebook automatically open, but if that doesn’t happen, you can manually open it by clicking on Tools -> Jupyter Notebook as shown in the image below:
Now on the Jupyter Notebook window let’s click openmanipulator_morpheus_chair_notebooks, then click on Ep2_Dynamixel_Control.ipynb to open it. That notebook contains the instructions we are going to follow in this post.
Once again, be careful when working with real robots: the servo motors used today don't have force feedback, so if you somehow get a finger stuck there, it can really hurt you.
Connecting our robot to our local computer
We are assuming you already have the robot. Now let’s connect it to our computer:
Once the USB cable is connected, let's check that our device is recognized. If we run the command below:
dmesg -wH
and then disconnect the USB cable, we will see messages saying so:
[ +16,678853] usb 3-1: USB disconnect, device number 9
[ +0,000491] ftdi_sio ttyUSB0: FTDI USB Serial Device converter now disconnected from ttyUSB0
[ +0,000059] ftdi_sio 3-1:1.0: device disconnected
If we now connect our robot again, we should see messages confirming that:
[ +7,356777] usb 3-1: new high-speed USB device number 9 using xhci_hcd
[ +0,134790] usb 3-1: New USB device found, idVendor=0403, idProduct=6014
[ +0,000011] usb 3-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[ +0,000006] usb 3-1: Product: USB <-> Serial Converter
[ +0,000005] usb 3-1: Manufacturer: FTDI
[ +0,000004] usb 3-1: SerialNumber: FT2H2ZXW
[ +0,003952] ftdi_sio 3-1:1.0: FTDI USB Serial Device converter detected
[ +0,000086] usb 3-1: Detected FT232H
[ +0,000876] usb 3-1: FTDI USB Serial Device converter now attached to ttyUSB0
Note that our device name is called ttyUSB0 according to the logs. The name is important because we have to use it when setting up our program.
Another way of getting more information about the connected device (robot) is with the command below:
lsusb
which will give us the following output when our robot is not connected.
$ lsusb
Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 006: ID 0cf3:3005 Atheros Communications, Inc. AR3011 Bluetooth
Bus 001 Device 004: ID 04f2:b2cf Chicony Electronics Co., Ltd
Bus 001 Device 003: ID 04d9:a067 Holtek Semiconductor, Inc.
Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
If we now connect the robot again, we can see it now:
$ lsusb
Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
**Bus 003 Device 010: ID 0403:6014 Future Technology Devices International, Ltd FT232H Single HS USB-UART/FIFO IC**
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 006: ID 0cf3:3005 Atheros Communications, Inc. AR3011 Bluetooth
Bus 001 Device 004: ID 04f2:b2cf Chicony Electronics Co., Ltd
Bus 001 Device 003: ID 04d9:a067 Holtek Semiconductor, Inc.
Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
By comparing the messages when our device is connected with the messages when it is not connected, we can clearly see that our robot is the Future Technology Devices International, Ltd FT232H Single HS USB-UART/FIFO IC entry.
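If you want to automate that comparison, a small sketch like the following (a hypothetical helper, using sample lines taken from the logs above) diffs the two outputs on their `ID vendor:product description` part, so that changed bus/device numbers don't hide the real difference:

```python
# Hypothetical helper that finds the new device by diffing `lsusb` output
# captured before and after plugging the robot in. In practice you would
# capture the output with subprocess.check_output(["lsusb"]).

def new_usb_entries(before, after):
    # Compare only the "ID vendor:product description" part, since bus and
    # device numbers can change every time a device is reconnected.
    def ids(lines):
        return set(line.split("ID ", 1)[1] for line in lines if "ID " in line)
    return sorted(ids(after) - ids(before))

before = [
    "Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub",
]
after = [
    "Bus 003 Device 010: ID 0403:6014 Future Technology Devices International, Ltd FT232H Single HS USB-UART/FIFO IC",
    "Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub",
]

print(new_usb_entries(before, after))
```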
Running the controllers
If you downloaded the ROSject and extracted it into your home folder (/home/your_username), you should have a simulation_ws folder there. In that folder we will find a package called dynamixel_workbench_controllers, more precisely at the ~/simulation_ws/src/open_manipulator_tc/dynamixel-workbench/dynamixel_workbench_controllers path.
On that package, you can find a launch file called position_control.launch, which contains the following content.
Note that on the second line we have the name of our device, /dev/ttyUSB0. If on your computer the device has a different name, please update that line with the correct value. You can use the dmesg -wH command to see your device's name.
On the third line we have the baud_rate, which in our case is 3000000. To get the baud rate of our device, we just run the command below with the robot connected:
stty < /dev/ttyUSB0
which in our case showed that our baud rate is 3000000.
To start the controllers, run roslaunch dynamixel_workbench_controllers position_control.launch. If everything goes OK, the robot should now be stiff: you shouldn't be able to move it by hand.
If, after running the roslaunch command, you get an error like the one below, it may be because you have a different baud rate. If that is the case, please update the device_name and baud_rate variables in the launch file with the correct values.
[ERROR] [1562837899.024503906]: Not found Motors, Please check scan range or baud rate
================================================================================
REQUIRED process [position_control-2] has died!
process has died [pid 14723, exit code -11, cmd /home/rdaneel/ros_playground/open_manipulator_ws/devel/lib/dynamixel_workbench_controllers/position_control __name:=position_control __log:=/home/rdaneel/.ros/log/9c5dd618-a3bf-11e9-ba51-9cb70d2d9370/position_control-2.log].
log file: /home/rdaneel/.ros/log/9c5dd618-a3bf-11e9-ba51-9cb70d2d9370/position_control-2*.log
Initiating shutdown!
================================================================================
If you open a new bash terminal and run rostopic list you should see at least the topics below:
$ rostopic list
/dynamixel_state
/goal_dynamixel_position
/joint_states
/rosout
/rosout_agg
We can also check the services we have. With rosservice list we should have at least the services below:
If you followed the instructions correctly, running rostopic list and rosservice list on ROSDS should give you the same output as when we ran them on our own computer, to which the robot is connected.
Diving deeper
With rostopic list we can see that, besides /rosout and /rosout_agg, we have basically three topics: /dynamixel_state, /goal_dynamixel_position and /joint_states.
These are the basic topics that should always exist for any robot using the controller we are using here.
If we want more information about a specific topic, like /dynamixel_state for example, we can just run rostopic info /dynamixel_state. With this command we will see the list of publishers, subscribers, and the message type, which for this topic is:
$ rostopic info /dynamixel_state
Type: dynamixel_workbench_msgs/DynamixelStateList
With rostopic echo /dynamixel_state -n1 we can see the model_name of each servo motor, as well as the present_position of each joint in raw units. By printing the messages published on the /joint_states topic instead, we can see the positions in radians:
The /goal_dynamixel_position topic has only subscribers, no publishers. That happens because the robot is listening on that topic and nobody is telling it to move yet. Let's publish our first message there so that our arm can move.
Moving the arm
In order to move the arm, we have to publish a message on the /goal_dynamixel_position topic. For that we can use the command below:
We will fill in the name, position, velocity, and effort fields with the same data we got from rostopic echo /joint_states -n1. We are just going to change the value of the first joint to 0.7. In the end, our message will be something like:
Bear in mind that unit can be ‘rad‘, ‘raw’ or nothing. If we select raw or nothing, the effect is the same.
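As a rough sketch of how ‘raw’ and ‘rad’ relate, assuming a 12-bit position encoder with 4096 ticks per revolution centered at tick 2048 (typical of Dynamixel X-series servos, but this is an assumption; check your servo's e-manual before relying on it):

```python
import math

# Rough raw <-> radian conversion for a 12-bit Dynamixel position encoder
# (0-4095 ticks over one full turn, 0 rad at the center tick 2048).
# ASSUMPTION: your servo model actually uses this resolution and center.

TICKS_PER_REV = 4096
CENTER_TICK = 2048
RAD_PER_TICK = 2.0 * math.pi / TICKS_PER_REV

def raw_to_rad(raw):
    return (raw - CENTER_TICK) * RAD_PER_TICK

def rad_to_raw(rad):
    return int(round(rad / RAD_PER_TICK)) + CENTER_TICK

center = raw_to_rad(2048)        # center position in radians
quarter = rad_to_raw(math.pi / 2)  # a quarter turn from center, in ticks
```

This kind of conversion is what the controller does for you when you pass unit='rad' instead of raw ticks.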
Creating a python ROS Node to move the servos
To move the arm with python code we have two options:
We move it with a service client (/joint_command): we can only move one joint at a time, but it waits until the movement has finished.
We move it by publishing on a topic (/goal_dynamixel_position): this moves everything at the same time, but there is no waiting.
We are going to create a python class that allows us to use both. If you have the ROSJect, the ROS Package is already created for you at ~/simulation_ws/src/openmanipulator_morpheus_chair/openmanipulator_morpheus_chair_tutorials. We created that package with the commands below:
cd ~/simulation_ws
source devel/setup.bash
rospack profile
cd src
catkin_create_pkg openmanipulator_morpheus_chair_tutorials rospy dynamixel_workbench_msgs sensor_msgs
cd openmanipulator_morpheus_chair_tutorials
mkdir scripts
touch scripts/move_openmanipulator.py
chmod +x scripts/move_openmanipulator.py
On the move_openmanipulator.py file we have the following content:
#! /usr/bin/env python
import time
import rospy
from sensor_msgs.msg import JointState
from dynamixel_workbench_msgs.srv import JointCommand, JointCommandRequest
from std_msgs.msg import Header
class OpenManipulatorMove(object):
def __init__(self):
rospy.loginfo("OpenManipulatorMove INIT...Please wait.")
# We subscribe to the joint states to have info of the system
self.joint_states_topic_name = '/joint_states'
self._check_join_states_ready()
sub = rospy.Subscriber(self.joint_states_topic_name, JointState, self.joint_states_callback)
# We start the Publisher for the positions of the joints
self.goal_dynamixel_position_publisher = rospy.Publisher('/goal_dynamixel_position',
JointState,
queue_size=1)
# Wait for the service client /joint_command to be running
joint_command_service_name = "/joint_command"
rospy.wait_for_service(joint_command_service_name)
# Create the connection to the service
self.joint_command_service = rospy.ServiceProxy(joint_command_service_name, JointCommand)
rospy.loginfo("OpenManipulatorMove Ready!")
def joint_states_callback(self,msg):
"""
rosmsg show sensor_msgs/JointState
std_msgs/Header header
uint32 seq
time stamp
string frame_id
string[] name
float64[] position
float64[] velocity
float64[] effort
:param msg:
:return:
"""
self.joint_states_msg = msg
def _check_join_states_ready(self):
self.joint_states_msg = None
rospy.logdebug("Waiting for "+self.joint_states_topic_name+" to be READY...")
while self.joint_states_msg is None and not rospy.is_shutdown():
try:
self.joint_states_msg = rospy.wait_for_message(self.joint_states_topic_name, JointState, timeout=5.0)
rospy.logdebug("Current "+self.joint_states_topic_name+" READY=>")
except:
rospy.logerr("Current "+self.joint_states_topic_name+" not ready yet, retrying ")
def move_all_joints(self, position_array):
rospy.logwarn("move_all_joints STARTED")
# We check that the position array has the correct number of elements
number_of_joints = len(self.joint_states_msg.name)
if len(position_array) == number_of_joints:
if self.check_gripper_pos_safe(position_array[6]):
new_joint_position = JointState()
h = Header()
h.stamp = rospy.Time.now() # Note you need to call rospy.init_node() before this will work
h.frame_id = self.joint_states_msg.header.frame_id
new_joint_position.header = h
new_joint_position.name = self.joint_states_msg.name
new_joint_position.position = position_array
# These values arent used, so they dont matter really
new_joint_position.velocity = self.joint_states_msg.velocity
new_joint_position.effort = self.joint_states_msg.effort
rospy.logwarn("PUBLISH STARTED")
self.goal_dynamixel_position_publisher.publish(new_joint_position)
rospy.logwarn("PUBLISH FINISHED")
else:
rospy.logerr("Gripper position NOT valid=" + str(position_array[6]))
else:
rospy.logerr("The Array given doesnt have the correct length="+str(number_of_joints))
rospy.logwarn("move_all_joints FINISHED")
def move_one_joint(self, joint_id, position, unit="rad"):
"""
rossrv show dynamixel_workbench_msgs/JointCommand
string unit
uint8 id
float32 goal_position
---
bool result
:param joint_id:
:param position:
:param units:
:return:
"""
joint_cmd_req = JointCommandRequest()
joint_cmd_req.unit = unit
joint_cmd_req.id = joint_id
joint_cmd_req.goal_position = position
if joint_id == 7:
rospy.logwarn("CHECKING Gripper Value is safe?")
if self.check_gripper_pos_safe(position):
# Send through the connection the name of the object to be deleted by the service
result = self.joint_command_service(joint_cmd_req)
rospy.logwarn("move_one_joint went ok?="+str(result))
else:
rospy.logwarn("Gripper Value Not safe=" + str(position))
else:
# Send through the connection the name of the object to be deleted by the service
result = self.joint_command_service(joint_cmd_req)
rospy.logwarn("move_one_joint went ok?=" + str(result))
def get_joint_names(self):
return self.joint_states_msg.name
def check_gripper_pos_safe(self, gripper_value):
"""
We need to check that the gripper pos is -1.0 > position[6] > -3.14
Otherwise it gets jammed
:param gripper_value:
:return:
"""
return (-0.5 > gripper_value > -2.0)
def movement_sequence_test():
openman_obj = OpenManipulatorMove()
# NOD
joint_position_home = [0.08743690699338913, 1.0385050773620605, -2.345456600189209, -0.016873789951205254,
-1.4818254709243774, 0.0015339808305725455, -1.0599807500839233]
joint_position1 = [0.8897088766098022, 0.6059224009513855, -1.4419419765472412, -0.016873789951205254,
-1.4818254709243774, 0.0015339808305725455, -1.0599807500839233]
joint_position2 = [0.8912428617477417, 0.5859806537628174, -1.6060779094696045, -0.016873789951205254,
-0.8191457390785217, 0.004601942375302315, -1.0599807500839233]
joint_position3 = [0.8897088766098022, 0.6028544902801514, -1.8745245933532715, -0.015339808538556099,
0.5292233824729919, 0.003067961661145091, -1.0599807500839233]
# SAY NO
joint_left = [0.44332045316696167, 1.0630487203598022, -2.345456600189209, 0.5568350553512573, -1.483359456062317,
0.004601942375302315, -1.0599807500839233]
joint_right = [-0.20862139761447906, 1.0906603336334229, -2.3071072101593018, -0.6488738656044006,
-1.483359456062317, -0.4417864680290222, -1.0599807500839233]
joint_middle = [0.0076699042692780495, 1.1274758577346802, -2.325515031814575, 0.3344078063964844,
-1.4848934412002563, 0.46172821521759033, -1.0599807500839233]
# Pendulum
pend_left = [0.46479618549346924, 0.13345633447170258, -1.728796362876892, 1.5907381772994995, -1.6797089576721191, 0.004601942375302315, -1.0799225568771362]
pend_middle = [0.39269909262657166, 0.1595340073108673, -2.0984857082366943, -0.09817477315664291, -1.0615147352218628, -0.0015339808305725455, -1.0799225568771362]
pend_right = [0.006135923322290182, 0.42337870597839355, -1.8806605339050293, -1.306951642036438, -1.0661166906356812, -0.004601942375302315, -1.0799225568771362]
# Note: the original code appended the "say no" positions to
# joint_position_sequence_nod by mistake; they belong in
# joint_position_sequence_say_no, which is the list iterated below.
joint_position_sequence_nod = [joint_position_home,
                               joint_position1,
                               joint_position2,
                               joint_position3,
                               joint_position2,
                               joint_position3,
                               joint_position1,
                               joint_position_home]
joint_position_sequence_say_no = [joint_position_home,
                                  joint_left,
                                  joint_middle,
                                  joint_right,
                                  joint_left,
                                  joint_middle,
                                  joint_right,
                                  joint_left,
                                  joint_right,
                                  joint_left,
                                  joint_position_home]
joint_position_sequence_say_pendulum = [joint_position_home,
                                        pend_left,
                                        pend_middle,
                                        pend_right,
                                        pend_left,
                                        pend_middle,
                                        pend_right,
                                        pend_left,
                                        pend_middle,
                                        pend_right,
                                        joint_position_home]
for joint_position_array in joint_position_sequence_nod:
    openman_obj.move_all_joints(joint_position_array)
    time.sleep(0.5)
for joint_position_array in joint_position_sequence_say_no:
    openman_obj.move_all_joints(joint_position_array)
    time.sleep(0.2)
for joint_position_array in joint_position_sequence_say_pendulum:
    openman_obj.move_all_joints(joint_position_array)
    time.sleep(0.5)
def move_joints_test():
    """
    This is for getting the positions of the joints without testing them
    live, which is quite dangerous!
    :return:
    """
    openman_obj = OpenManipulatorMove()
    joint_names = openman_obj.get_joint_names()
    rospy.logwarn("Starting Moving Joints GUI...")
    while not rospy.is_shutdown():
        rospy.logwarn("#######" + str(joint_names) + "#####")
        joint_id = int(raw_input("Joint ID="))
        joint_position = float(raw_input("Joint Position Radians="))
        openman_obj.move_one_joint(joint_id, joint_position, unit="rad")
        rospy.logwarn("####################")
if __name__ == "__main__":
    rospy.init_node('move_openmanipulator_node', log_level=rospy.WARN)
    # move_joints_test()
    movement_sequence_test()
Now, in order to move the arm using our script, we can simply run it:
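The exact command is not shown above. Assuming the script lives in the same package tree we have been using (the package name open_manipulator_tc is an assumption; substitute your own), a typical invocation would be:

```shell
# Make the script executable once, then launch it through rosrun.
# Package name and script path are assumptions; adapt them to your workspace.
chmod +x ~/simulation_ws/src/open_manipulator_tc/scripts/move_openmanipulator.py
rosrun open_manipulator_tc move_openmanipulator.py
```

Running the script directly with python move_openmanipulator.py also works, as long as the controllers launched earlier are still running.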
If everything went as expected, your arm should have moved to different places.
With the command above we called the servers responsible for moving the arm. Remember that we launched our controller with roslaunch dynamixel_workbench_controllers position_control.launch, which basically runs the code defined on ~/simulation_ws/src/open_manipulator_tc/dynamixel-workbench/dynamixel_workbench_controllers/src/position_control.cpp. Please have a look at that file if you want to understand how the Service Server and the Topic Subscriber move the arms.
Manually extracting the servo positions
At this point, you might be wondering how we know the radian value for the positions we want the arm to move to. We do that by disabling the controllers, moving the arm to the desired position and then checking the joint states, as explained here:
Step 1: With the control off, position the arm in the desired pose and hold it there (with the help of someone if needed).
Step 2: Turn on the control (roslaunch dynamixel_workbench_controllers position_control.launch); now the joints are stiff.
Step 3: rostopic echo /joint_states/position -n1
And there you will have the exact position of all the joints. You now just have to feed those values into move_openmanipulator.py.
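To save some typing, the one-line echo output can be parsed straight into a Python list. This is a minimal sketch (the helper name is ours); it assumes the default rostopic echo formatting, i.e. a bracketed, comma-separated list of floats, optionally followed by a "---" message separator:

```python
def parse_joint_positions(echo_output):
    """Parse the output of `rostopic echo /joint_states/position -n1`
    into a list of floats ready to paste into move_openmanipulator.py.

    Assumes the default formatting: a bracketed, comma-separated list,
    optionally followed by a `---` message separator.
    """
    # Keep only the part between the first "[" and the first "]".
    start = echo_output.index("[")
    end = echo_output.index("]")
    values = echo_output[start + 1:end]
    return [float(v) for v in values.split(",")]

sample = "[0.8912, 0.5859, -1.6060, -0.0168, -0.8191, 0.0046, -1.0599]\n---"
print(parse_joint_positions(sample))
```

The resulting list can be assigned directly to one of the joint_position variables in the script above.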
What can I do when my gripper doesn’t move?!
If you have followed the gripper design that we show here, it's possible that it got jammed and stopped moving. Just power everything OFF and gently rotate the gripper axis actuator to a more open pose. NEVER send the gripper a command outside the range -2.0 (OPEN) to -0.5 (CLOSE).
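To avoid jamming it from code in the first place, it can help to clamp every gripper command to the safe range before sending it. A small sketch (the helper name is ours; the -2.0/-0.5 limits come from the warning above):

```python
GRIPPER_OPEN = -2.0   # fully open limit, in radians (from the warning above)
GRIPPER_CLOSE = -0.5  # fully closed limit, in radians

def safe_gripper_command(value):
    """Clamp a gripper command to the safe [-2.0, -0.5] range
    so an out-of-range value can never jam the gripper."""
    return max(GRIPPER_OPEN, min(GRIPPER_CLOSE, value))

print(safe_gripper_command(-3.0))  # clamps to -2.0
print(safe_gripper_command(0.2))   # clamps to -0.5
print(safe_gripper_command(-1.2))  # already safe: -1.2
```

Passing every gripper target through this helper before calling move_one_joint is a cheap safeguard against typos in the position values.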
Rate doesn’t work properly?
rateobj = rospy.Rate(10.0)
for joint_position_array in joint_position_sequence_nod:
    openman_obj.move_all_joints(joint_position_array)
    rateobj.sleep()

rateobj2 = rospy.Rate(1.0)
for joint_position_array in joint_position_sequence_say_no:
    openman_obj.move_all_joints(joint_position_array)
    rateobj2.sleep()
You would think that this would make the first movement go fast and the second go slow. But it seems that the first Rate object is the one that counts. If that happens to you, it is preferable to use time.sleep() instead.
So, that is the post for today. Remember that we have a video showing all the content of this post. Please have a look and leave your thoughts in the comments section of the video. Also, feel free to subscribe to our YouTube channel, where we publish ROS-related content every day.