Whether for free and flexible movements or for defined sequences – thanks to its modular design, the lightweight pneumatic robot can be used for numerous applications. In combination with various adaptive pneumatic grippers, it can pick up and manipulate a wide variety of objects and shapes. At the same time, it is designed to yield and poses no danger to the user, even in the event of a collision.
As a result, the BionicSoftArm meets two essential requirements for tomorrow’s collaborative workspaces, in which the strict separation between the worker’s manual tasks and the robot’s automated actions is increasingly being eliminated. This means that in the future, humans and machines will be able to work on the same workpiece or component at the same time.
To achieve this, automated robot solutions must, on the one hand, be able to interact with humans directly and safely – without the two having to be isolated from each other for safety reasons. On the other hand, such open workspaces will primarily demand robots that can be easily customized and that adapt independently to different products and scenarios.
The BionicSoftArm owes its flexibility to its modular design, which makes it possible to combine several pneumatic bellows segments and rotary actuators. Depending on the requirements, the length of the BionicSoftArm can be varied with up to seven pneumatic actuators, thus offering maximum flexibility in terms of range and mobility. This makes it very easy to implement applications that are difficult to achieve with a standard robot.
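To illustrate this modular idea, the sketch below shows one possible way of describing such an arm configuration in software. The module names, segment lengths and the data structure itself are purely illustrative assumptions for this article, not Festo’s actual configuration model; only the limit of seven pneumatic actuators comes from the description above.

```python
# Illustrative sketch only: a hypothetical description of a modular arm made of
# bellows segments and rotary actuators (names and dimensions are assumptions).
from dataclasses import dataclass

MAX_ACTUATORS = 7  # per the article: up to seven pneumatic actuators


@dataclass
class Module:
    kind: str        # "bellows" or "rotary"
    length_mm: float


def build_arm(modules: list[Module]) -> list[Module]:
    """Validate a configuration against the modular limit and return it."""
    if len(modules) > MAX_ACTUATORS:
        raise ValueError("configuration exceeds the maximum of seven actuators")
    return modules


# Example: a short three-module arm for a confined workspace
arm = build_arm([
    Module("bellows", 250.0),
    Module("rotary", 80.0),
    Module("bellows", 250.0),
])
print("total reach (mm):", sum(m.length_mm for m in arm))
```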
As a result, the BionicSoftArm can also work around obstacles in extremely tight spaces. This makes direct human-robot collaboration just as possible as its use in traditional SCARA applications, such as pick & place tasks. The elimination of costly safety devices such as cages or light barriers shortens modification times and allows the robot to be used flexibly at different locations – making production processes extremely versatile and cost-effective.
The Bionic Learning Network team incorporated numerous findings and technologies from previous projects into the development of the BionicSoftArm: like its two predecessors, the Bionic Handling Assistant and the BionicMotionRobot, the BionicSoftArm’s movements and functionality are inspired by an elephant’s trunk. With its pneumatic bellows structure, the BionicSoftArm effortlessly performs the flowing movements of its natural model.
The bellows are made of robust elastomer. Each one is covered with a special 3D knitted fabric consisting of two layers. A soft knitted fabric lies directly on the bellows to protect them from friction and wear. The layer of high-strength fibers above it is oriented so that it allows the bellows structures to expand in the required direction of movement while restricting movement in the other directions. This innovative fiber technology is what makes it possible to exploit the potential force of the entire kinematics system.
The BionicSoftArm’s software architecture is also based on the Bionic Learning Network’s previous projects, so it is controlled via the intuitive Robotic Suite. The graphical user interface was developed specifically for Festo’s lightweight bionic robots, and was first used with the BionicCobot. The user can easily teach the robot the actions to be performed and set their parameters using a tablet.
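As a sketch of what such a taught sequence might look like once it leaves the tablet, the structure below models a few waypoints with simple parameters. The field names, poses and values are illustrative assumptions, not the actual data format of the Robotic Suite.

```python
# Hypothetical representation of a taught pick & place sequence
# (field names and values assumed, not the Robotic Suite's real format).
taught_sequence = [
    {"name": "approach", "pose": [0.30, 0.10, 0.25], "speed": 0.5, "gripper": "open"},
    {"name": "pick",     "pose": [0.30, 0.10, 0.05], "speed": 0.2, "gripper": "close"},
    {"name": "transfer", "pose": [0.10, 0.35, 0.25], "speed": 0.5, "gripper": "close"},
    {"name": "place",    "pose": [0.10, 0.35, 0.05], "speed": 0.2, "gripper": "open"},
]

for step in taught_sequence:
    print(f"{step['name']}: move to {step['pose']} "
          f"at speed {step['speed']}, gripper {step['gripper']}")
```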
The commands are implemented by a Festo Motion Terminal VTEM, which is what makes it possible to control and adjust the complex kinematics in the first place. The internal control algorithms of its motion apps and the built-in piezo valves allow flow rates and pressures to be set precisely and varied as desired across several channels simultaneously. This enables motion sequences that are powerful and fast as well as soft and sensitive.
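The following sketch only illustrates the idea of setting independent pressure setpoints on several channels at once; the MotionTerminal class and its methods are hypothetical placeholders, not the real VTEM interface, which is addressed through Festo’s motion apps.

```python
# Hypothetical placeholder for a valve terminal interface – not the real VTEM API.
class MotionTerminal:
    def __init__(self, channels: int):
        self.setpoints = [0.0] * channels  # pressure setpoints in bar

    def set_pressures(self, pressures: dict[int, float]) -> None:
        """Apply several channel pressures in a single update cycle."""
        for channel, bar in pressures.items():
            self.setpoints[channel] = bar


terminal = MotionTerminal(channels=8)

# A soft, sensitive motion: low pressures on the distal bellows segments
terminal.set_pressures({0: 1.2, 1: 1.0, 2: 0.8})

# A fast, powerful motion: higher pressures on the proximal segments
terminal.set_pressures({3: 4.5, 4: 4.0})

print(terminal.setpoints)
```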
The open-source ROS (Robot Operating System) platform serves as the interface between the tablet GUI and the Festo Motion Terminal, and is used to calculate the paths of the kinematic system. To do so, ROS interprets the incoming code from the tablet and forwards the resulting axis coordinates to the Motion Terminal.
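A minimal sketch of this bridging role is shown below using the ROS 1 Python client library. The topic names and message contents are assumptions made for illustration, and the actual path calculation is reduced to a trivial pass-through.

```python
#!/usr/bin/env python
# Minimal ROS 1 node sketch: receives taught commands from the tablet GUI and
# publishes resulting axis coordinates towards the Motion Terminal side.
# Topic names and the placeholder "planning" step are illustrative assumptions.
import rospy
from std_msgs.msg import String
from sensor_msgs.msg import JointState


def plan_axes(command: str) -> JointState:
    """Placeholder for the path calculation; here it emits fixed coordinates."""
    msg = JointState()
    msg.name = [f"segment_{i}" for i in range(7)]
    msg.position = [0.0] * 7
    return msg


def on_command(msg: String, pub: rospy.Publisher) -> None:
    rospy.loginfo("received command from GUI: %s", msg.data)
    pub.publish(plan_axes(msg.data))


if __name__ == "__main__":
    rospy.init_node("bionicsoftarm_bridge")
    axis_pub = rospy.Publisher("/motion_terminal/axis_setpoints",
                               JointState, queue_size=10)
    rospy.Subscriber("/robotic_suite/command", String,
                     on_command, callback_args=axis_pub)
    rospy.spin()
```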