E-va is an interactive robotic hand that mimics a user's hand position. She also has a voice recognition system that lets her react to voice commands, and she can interact and communicate with the user via an LCD display.

Parts list

| Qty | Product | Part number |
|-----|---------|-------------|
| 1 | Arduino Uno Rev 3 | 715-4081 |
| 1 | Breadboard Prototyping Board 80 x 60 x 10mm | 102-9147 |
| 5 | Mini 180 Degree Resin Gear Servo SG90 | 215-3180 |
| 1 | MG995 Metal 360 Continuous Servo | MG995-360 |
| 1 | 5VDC HC-SR04 Ultrasonic Sensor | SN-HC-SR04 |
| 1 | I2C 1602 Serial LCD for Arduino & RPI | DS-LCD-162A-I2C |

In the late 80s and early 90s, sci-fi classics such as 'Back to the Future' painted the 2010s as a technological utopia, with flying cars and hoverboards. Yet here we are in 2022 with few of the technological wonders promised to us by these films. With that, it's no wonder that the current generation lacks the drive and passion for STEM that previous generations had. E-va serves to reignite that passion in the younger generation by demonstrating that a technological utopia is not as far away as it may first seem. E-va also serves as a stepping stone, demonstrating many complex concepts and features which can be applied in a vast array of other projects.

Overview

E-va is a robotic hand built in the likeness of a human hand. She has five fingers of varying sizes, each controlled by a separate servo mechanism so that each finger can move independently. The wrist has one degree of freedom (DOF) to allow for a waving motion. E-va also has an integrated LCD display that lets her converse with the user, as well as an ultrasonic sensor that lets her detect the presence of a user in her vicinity.

Design

  1. Fingers


    Figure 1: (a) Driving pulley system scheme. (b) Cable-driven system scheme.

    A pulley mechanism is used to drive the finger movement. It is ideal for the three-joint pulley system to have a lightweight design and to prevent unnecessary slipping during motion. A comparison between two pulley designs is illustrated in Figure 1: design (a) has bulkier joints with multiple loops within each joint, which may be more prone to slipping, while design (b) shows a lighter structure with only one tendon locking point at the tip of the finger. For E-va's design we opted for design (b), as it allows a more compact build and reduces the power the servo needs to drive the fingers.

  2. Wrist


    Figure 2: Wrist cam-follower mechanism

    The wrist is driven by a 360-degree servo attached to a cam-follower mechanism, as shown in Figure 2. The cam-follower allows a rhythmic waving motion to be generated. The mechanism shown in Figure 2 then fits into the PVC casing which acts as the wrist. When both cams are in line (rotated 90 degrees from what is shown in the image) the wrist is flat and the hand upright. When the servo is activated, the cams rotate such that the peak of one cam aligns with the trough of the other, forming the waving motion.

  3. Palm


    Figure 3: CAD drawing of the palm

    The palm of the hand houses all the finger servo motors and connects to the wrist. The front of the palm is fitted with an acrylic sheet so the servo motors can be seen while the hand is operational. The acrylic plate can also be removed for easy access to the servos and the pulley strings.
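The tendon geometry behind the cable-driven design in the Fingers section can be sketched with simple arc lengths: the cable pulled through a finger as its joints bend is the sum of r x theta over the joints, and the servo horn must reel in that much cable. This is only an illustrative model; the joint radii, bend angles, and horn radius below are assumptions, not E-va's measured dimensions.

```python
import math

def tendon_travel(joint_radii_mm, joint_angles_deg):
    """Cable pulled through the finger when each joint bends by the given
    angle: the sum of arc lengths r * theta over the joints."""
    return sum(r * math.radians(a)
               for r, a in zip(joint_radii_mm, joint_angles_deg))

def servo_horn_angle(travel_mm, horn_radius_mm):
    """Servo rotation (degrees) needed to reel in `travel_mm` of cable
    on a horn of the given radius."""
    return math.degrees(travel_mm / horn_radius_mm)

# Assumed joint radii (mm) and a 90-degree bend at each of the three joints.
travel = tendon_travel([5.0, 4.0, 3.0], [90.0, 90.0, 90.0])
```

A smaller horn radius means more servo rotation for the same curl, which is one reason the single-tendon design (b) helps keep the required servo power down.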
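The cam-follower wave in the Wrist section can be modelled as the difference in lift between the two cams: equal lifts leave the wrist flat, and a peak opposite a trough tilts it. The sinusoidal cam profile and the amplitude here are illustrative assumptions rather than the actual cam geometry.

```python
import math

def cam_lift(theta, amplitude):
    """Follower lift for an assumed sinusoidal cam profile:
    0 at the trough (theta = 0), `amplitude` at the peak (theta = pi)."""
    return amplitude * (1 - math.cos(theta)) / 2

def wrist_tilt(theta, phase, amplitude):
    """Tilt of the wrist plate, taken as the difference in lift between
    the two cams, which sit `phase` radians apart on the shaft."""
    return cam_lift(theta, amplitude) - cam_lift(theta + phase, amplitude)
```

With the cams in line (phase = 0) the lifts cancel at every shaft angle and the wrist stays flat; with a peak opposite a trough (phase = pi) the tilt swings between -amplitude and +amplitude as the servo rotates, producing the rhythmic wave.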

Features

  1. Mimicking

    E-va can mimic the finger positions of a user by adjusting the angle of each individual finger servo. First, the user places their hand in front of their laptop's camera. Once the hand is detected, the system calculates the distance from the tip of each finger to its base, and the maximum distance for each finger is recorded. When a finger closes, the tip-to-base distance shrinks towards 0. This change in distance is fed into an algorithm which converts it to a set angle for the servo.

  2. Voice Recognition

    When E-va boots up, the energy threshold for ambient noise levels is calibrated. Whenever the energy threshold is exceeded, a stream of audio data is captured. This data is then processed and converted into text via an API. The text is scanned for keywords or phrases, and when a designated keyword or phrase is detected, a key is sent to run the preset voice command.
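The distance-to-angle conversion described under Mimicking can be sketched as a linear map from the calibrated distance range onto the servo's range. The linear form and the 0-180 degree span are assumptions; the article does not give the exact algorithm.

```python
def distance_to_angle(distance, max_distance, angle_min=0, angle_max=180):
    """Linearly map a tip-to-base distance onto a servo angle.
    `max_distance` is the per-finger maximum recorded during calibration:
    a fully open finger maps to `angle_max`, a closed one to `angle_min`."""
    distance = max(0.0, min(distance, max_distance))  # clamp noisy readings
    fraction = distance / max_distance
    return round(angle_min + fraction * (angle_max - angle_min))
```

Clamping keeps jittery hand-tracking readings from driving the servo past its limits.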
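The keyword scan described under Voice Recognition can be sketched as a lookup over the recognised text. The phrases and key numbers below are placeholders; the article only says that designated keywords map to preset commands.

```python
# Placeholder command table: the real phrase list is not given in the article.
COMMANDS = {
    "wave": 1,
    "rock paper scissors": 2,
    "thumbs up": 3,
}

def command_key(text, commands=COMMANDS):
    """Return the key of the first command phrase found in `text`,
    or 0 (neutral) when no phrase matches."""
    text = text.lower()
    for phrase, key in commands.items():
        if phrase in text:
            return key
    return 0
```

Returning 0 as the neutral value matches the "no voice command" convention used in the data format described in the Software section.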

Hardware


Figure 4: Circuit diagram

Figure 4 shows the circuit diagram of all the components (note that the 360-degree servo is not differentiated in the diagram). Each component can be wired directly to the Arduino and powered from the Arduino's 5V and ground rails. The HC-05 Bluetooth module provides serial communication between the Arduino and the user's laptop. The LCD display shows text so that E-va can communicate better with the user. The ultrasonic sensor handles proximity detection, which is used for interactive features such as rock, paper, scissors, as shown in the E-va Voice Recognition Demonstration video. The six servos control the five fingers and the wrist.

Software


Figure 5: Block diagram of E-va’s system


Figure 6: Data sent to the Arduino

Figure 5 shows the block diagram of E-va to better visualise the flow of data and information from start to finish. The visual and audio data is sent as one list of 6 elements, as shown in Figure 6. Elements 1 to 5 represent each finger from thumb to pinky respectively; their values range from 0 to 180, representing the angle of the finger servos. Element 6 represents the voice command: 0 is neutral, i.e. no voice command sent, while 1 to 5 each represent a voice command. More voice commands can be added, so this range may grow.
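The 6-element list has to be serialised before it goes over the Bluetooth serial link. A minimal sketch, assuming a comma-separated, newline-terminated ASCII format (the article does not specify the actual wire format):

```python
def make_packet(finger_angles, command=0):
    """Pack five finger angles (0-180, thumb to pinky) and a command key
    (0 = neutral) into one ASCII line for the serial link."""
    if len(finger_angles) != 5:
        raise ValueError("expected five finger angles, thumb to pinky")
    if not all(0 <= a <= 180 for a in finger_angles):
        raise ValueError("finger angles must be in the range 0-180")
    return ",".join(str(v) for v in (*finger_angles, command)) + "\n"
```

An ASCII line like this is easy to parse on the Arduino side with `Serial.parseInt()` calls, one per element.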

Flow of visual data

The Python code uses OpenCV to acquire visual data from the user's laptop camera. This data is passed to the Mediapipe library to identify whether a user's hand is in the frame. When a hand is detected, the x, y coordinates of the tip and base of each finger are extracted, and an algorithm computes the distance between the tip and the base of each finger. This is then compared against the operating range of the servo motors to obtain the servo angle corresponding to the user's finger. Once this is done for each finger, the data is sent to the Arduino via the HC-05 module at a baud rate of 9600, and the Arduino moves each servo accordingly.
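The tip-to-base distance step can be sketched as a pure function over the 21 hand landmarks Mediapipe returns per detected hand. The tip indices (4, 8, 12, 16, 20) follow Mediapipe's hand-landmark numbering; pairing each tip with the knuckle landmarks below is an assumption about the article's algorithm.

```python
import math

# (tip, base) landmark index pairs in Mediapipe's 21-point hand model,
# thumb to pinky. The choice of base joint is an assumption.
FINGER_PAIRS = [(4, 2), (8, 5), (12, 9), (16, 13), (20, 17)]

def finger_distances(landmarks):
    """Euclidean tip-to-base distance per finger, given a list of 21
    (x, y) landmark coordinates for one hand."""
    return [math.hypot(landmarks[t][0] - landmarks[b][0],
                       landmarks[t][1] - landmarks[b][1])
            for t, b in FINGER_PAIRS]
```

In the real pipeline the (x, y) values would come from Mediapipe's normalised landmark list for the current camera frame; here the function is kept pure so it can be tested with synthetic coordinates.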

Flow of audio data

When the code is launched, the energy threshold is set based on the ambient noise. Whenever the system detects a spike above the energy threshold, it records the audio. The audio is then sent to a Google API to be processed into text, and the text is compared against the list of preset voice commands. If there is a match, a key is sent to the Arduino via the Bluetooth module, and the Arduino reacts accordingly. While a voice command is running, the mimicking feature is temporarily disabled so the command can execute smoothly.

If you have any ideas for improvements, or decide to make your own version of E-va, feel free to drop me a message on my Instagram! I'd love to hear and see what you guys think about E-va.
