Sept 2016-Nov. 2016
Silicone, PLA 3D print
KUKA robot, 3D printing, Arduino
Interaction and machine vision
Machine vision is the ability of a computer to “see.” A machine-vision system employs one or more video cameras, analog-to-digital conversion (ADC), and digital signal processing (DSP). The resulting data goes to a computer or robot controller. Machine vision is similar in complexity to voice recognition.
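The core of such a pipeline can be sketched in a few lines: digitized pixel values are processed to extract something the robot controller can act on, such as the position of a bright object. The tiny synthetic frame and the `locate_object` helper below are illustrative assumptions, not part of the project's actual vision stack; a real system would capture frames from a camera first.

```python
# Minimal machine-vision step: threshold a grayscale frame and locate
# the bright object's centroid. The 5x5 "frame" is synthetic, standing
# in for a digitized camera image.

def locate_object(frame, threshold=128):
    """Return the (row, col) centroid of pixels brighter than threshold, or None."""
    hits = [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value > threshold]
    if not hits:
        return None
    rows, cols = zip(*hits)
    return (sum(rows) / len(rows), sum(cols) / len(cols))

# A dark frame with a bright 2x2 blob in the lower-right corner.
frame = [
    [10, 10, 10, 10, 10],
    [10, 10, 10, 10, 10],
    [10, 10, 10, 10, 10],
    [10, 10, 10, 200, 200],
    [10, 10, 10, 200, 200],
]
print(locate_object(frame))  # -> (3.5, 3.5)
```

The centroid (or a richer feature, such as a contour) is then what gets passed on to the robot controller.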
We focus on integrating and refining the concept in terms of the interactivity, vitality, and dynamism of the design. This means we employ techniques from machine vision and integrate external devices, such as the Leap Motion, to control the transformation of the design based on what those devices perceive.
The aim of this project is to design an end tool for a KUKA robot that can grasp and handle an assigned volumetric 3D-printed object, using soft-robotics technology as well as machine vision, triggering the robot's motion with different sensors. The tool's design should trace the morphology of the target object.
We then examined the geometric properties of the assigned object. Through a series of sections and curvature analysis, we found that the best way to lift the object is a two-finger gripper, taking advantage of the volumetric shape with its positive and negative surfaces. Based on this analysis, we designed a robotic tool that can shift its shape between two distinct phases, acting like a gripper.
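The curvature analysis above can be sketched numerically: given a section curve sampled as 2D points, the sign of the turn at each interior point separates convex from concave regions, which is where positive and negative surfaces (candidate gripping zones) show up. The sample curve below is invented for illustration and is not the project's actual object geometry.

```python
import math

def turn_signs(points):
    """Sign of the turn (cross product z-component) at each interior point
    of a polyline: +1 for a left/convex turn, -1 for right/concave, 0 if flat."""
    signs = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        cross = (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
        signs.append(1 if cross > 0 else -1 if cross < 0 else 0)
    return signs

# A wavy section sampled from a sine curve: it bends one way, then the other.
curve = [(x, math.sin(x)) for x in
         [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5]]
print(turn_signs(curve))  # -> [-1, -1, -1, -1, -1, -1, 1, 1]
```

Runs of consistent sign mark the concave and convex regions a two-finger gripper could press against from opposite sides.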
We were inspired by the morphology of hands across different species, each living in a different scenario and environment. The shape, outline, and skeleton of a hand are a kind of response from the species to its physical environment: they enhance its capacity to survive and enable it to adapt to climatic change.
If we look at the components of a hand, we find rigid parts serving as the skeleton while soft parts form the envelope. We designed our soft robotic hand with this same identity: a soft-robotic construction combining a strong, stable skeleton with customized soft joints.
By integrating a finger-motion sensor, the Leap Motion, we can capture the movement of fingers and translate it into a digital sequence. We are then able to control a vacuum pump to manipulate the inflation and deflation of the silicone joints, resulting in the bending of the robot finger.
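The control chain can be sketched as a simple mapping from a measured finger bend to a pump action. The threshold values and command strings below are invented for illustration; in the actual setup, the bend angle would come from the Leap Motion SDK and the commands would be sent over serial to an Arduino switching the pump's valves.

```python
# Hedged sketch: map a tracked finger's bend angle (degrees) to a pump
# action for one silicone joint. Thresholds and command names are
# assumptions, not the project's actual protocol.

def pump_command(bend_degrees, inflate_above=60.0, deflate_below=20.0):
    """Choose a pump action for one silicone joint from a finger bend angle."""
    if bend_degrees >= inflate_above:
        return "INFLATE"   # pressurize the joint -> robot finger bends
    if bend_degrees <= deflate_below:
        return "DEFLATE"   # vent the joint -> robot finger straightens
    return "HOLD"          # dead band: keep the current pressure

# In the real rig the command would go to the Arduino over serial, e.g.:
#   serial.Serial("/dev/ttyUSB0", 9600).write(pump_command(angle).encode())
print(pump_command(75.0))  # -> INFLATE
print(pump_command(40.0))  # -> HOLD
print(pump_command(5.0))   # -> DEFLATE
```

The dead band between the two thresholds keeps the pump from chattering when the tracked finger hovers near a threshold.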