
SOFTWARE DESIGN


Full source code available on BitBucket.

Communicating between Android and Arduino

 

We decided to use Bluetooth rather than WiFi or USB to communicate between the smartphone and our Arduino board, because Bluetooth is widely supported and allowed us to simplify our circuitry. To let the Arduino send and receive Bluetooth signals, we attached an HC-05 Bluetooth module to it. All data is sent over the standard Bluetooth serial port.

 

When an incoming call is detected on the smartphone, a background listener sends a special wake-up signal to the Arduino board. The phone then continuously transmits the angle offset to the Arduino, from which point the control system takes over to correct the error.
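
As a rough sketch of the Arduino side of this exchange, the HC-05 can be treated as the board's hardware serial port (it sits on the RX/TX pins, as noted in the Circuit Design section). The 9600 baud rate, the 'W' wake-up marker and the newline-terminated integer angles below are illustrative assumptions rather than the exact framing our app uses.

    bool awake = false;        // set once the phone has sent the wake-up signal
    int  angleOffset = 0;      // latest angle offset received from the phone

    void setup() {
      Serial.begin(9600);      // assumed baud rate for the HC-05 data link
    }

    void loop() {
      if (!awake) {
        // Wait for the wake-up signal sent when the phone detects an incoming call.
        if (Serial.available() > 0 && Serial.read() == 'W') {
          awake = true;
        }
        return;
      }

      // After waking up, the phone streams angle offsets; parse them as integers.
      if (Serial.available() > 0) {
        angleOffset = Serial.parseInt();   // e.g. "-12\n" becomes -12 degrees
        // ...hand angleOffset to the control loop (see the Control System section)...
      }
    }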

 

Once the robot has arrived, the user can command it to return to the charging station with a simple tap of a button. The same feedback control system then guides the robot back to the station.

 

Image Recognition

 

We employed OpenCV4Android, an open-source computer vision library, to perform color blob detection in our Android app. Red was chosen as our target color because of its contrast with other colors and its infrequent occurrence in the environment (particularly since the environment happens to be UC Berkeley). In our demo, the target user therefore had a piece of red cloth tied to his or her ankle.

 

We took advantage of the smartphone's front-facing camera to act as "eyes" for our robot. We filtered out hues of red from the video feed using a specified range of HSV values and drew contours around the red objects. To improve accuracy, we then weighted each contour by its y-distance from the bottom of the camera frame and by its size relative to the whole frame, and selected the contour with the highest weight as our target object. We weighted objects closer to the ground more heavily because both of our targets, the red cloth tied to the user's ankle and the charging station, sit close to the ground.
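
The pipeline can be sketched roughly as follows. Our app calls the OpenCV4Android (Java) bindings; the snippet uses the equivalent OpenCV C++ API for brevity, and the HSV thresholds and the way the size and height terms are combined are illustrative choices rather than our tuned parameters.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Returns the index of the most likely target contour, or -1 if none is found.
    int findTargetContour(const cv::Mat& frameBgr,
                          std::vector<std::vector<cv::Point> >& contours) {
      cv::Mat hsv, maskLow, maskHigh, mask;
      cv::cvtColor(frameBgr, hsv, cv::COLOR_BGR2HSV);

      // Red hues wrap around 0 in HSV, so threshold two bands and combine them.
      cv::inRange(hsv, cv::Scalar(0, 100, 100),   cv::Scalar(10, 255, 255),  maskLow);
      cv::inRange(hsv, cv::Scalar(170, 100, 100), cv::Scalar(180, 255, 255), maskHigh);
      cv::bitwise_or(maskLow, maskHigh, mask);

      cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

      int bestIdx = -1;
      double bestWeight = 0.0;
      const double frameArea = static_cast<double>(frameBgr.cols) * frameBgr.rows;

      for (int i = 0; i < static_cast<int>(contours.size()); ++i) {
        cv::Rect box = cv::boundingRect(contours[i]);

        // Size term: fraction of the frame covered by the blob.
        double sizeTerm = cv::contourArea(contours[i]) / frameArea;

        // Height term: blobs whose bottom edge sits near the bottom of the frame
        // score higher, since both targets are close to the ground.
        double lowTerm = static_cast<double>(box.y + box.height) / frameBgr.rows;

        double weight = sizeTerm + lowTerm;   // how the terms combine is an assumption
        if (weight > bestWeight) {
          bestWeight = weight;
          bestIdx = i;
        }
      }
      return bestIdx;
    }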

 

Once the contour of the target has been identified, we calculate its center and the center's x-offset from the middle of the frame, convert that offset to an angle, and transmit the angle to the Arduino via Bluetooth.
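
A rough version of that calculation is shown below, again using the OpenCV C++ API in place of the Java bindings; the 60-degree horizontal field of view is an assumed figure for a typical phone camera, not a measured property of our device.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Converts the winning contour into a heading angle in degrees.
    double contourToAngle(const std::vector<cv::Point>& contour, int frameWidth) {
      const double kHorizontalFovDeg = 60.0;     // assumed phone-camera field of view

      // Centroid of the contour from its image moments.
      cv::Moments m = cv::moments(contour);
      if (m.m00 == 0.0) {
        return 0.0;                              // degenerate contour: no correction
      }
      double centerX = m.m10 / m.m00;

      // Signed offset from the middle of the frame, normalized to [-0.5, 0.5].
      double offset = (centerX - frameWidth / 2.0) / frameWidth;

      // Approximate angle of the target; this value is sent over Bluetooth.
      return offset * kHorizontalFovDeg;
    }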

 

 

Control System

 

[Block diagram of the closed-loop controller]
During movement, the software continues polling the camera to ensure that Mobile Dock is moving directly toward the user, continuously sending an updated angle offset so the robot can adjust its path. This effectively creates a closed-loop feedback system that accounts for noise or for the user moving around. In practice, we found that this also allows the robot to follow the user even when he or she walks away from the robot.

 

To turn Mobile Dock so that it faces the user or the charging station, a closed-loop controller is used. The block diagram in the figure above shows how the controller is implemented. All the wheels are set to run at roughly half of full speed when Mobile Dock is travelling along a straight path; this corresponds to a PWM value of about 128, half of the 8-bit maximum of 255. If the angle of the target with respect to the forward direction of the phone is positive, the right wheel moves faster in proportion to the angle and the rear wheel turns clockwise; if the angle is negative, the left wheel moves faster and the rear wheel turns counterclockwise. The measured output of the loop is the angle, and the reference input is 0. The sensor is the phone mounted on Mobile Dock, which sends the angle to the Arduino.
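
One way to write this wheel-speed mixing on the Arduino is sketched below. The pin assignments, the single rear-direction pin and the scaling of the correction term are simplifications for illustration; on the real board each motor is driven through a two-input TA7291 H-bridge, as described under Circuit Design.

    const int LEFT_PWM   = 5;   // assumed PWM pins for the three wheel motors
    const int RIGHT_PWM  = 6;
    const int REAR_PWM   = 9;
    const int REAR_DIR   = 7;   // assumed rear-wheel direction pin

    const int BASE_SPEED = 128; // roughly half of the 0-255 analogWrite range

    void setup() {
      pinMode(REAR_DIR, OUTPUT);
    }

    // correction is the signed output of the controller acting on the angle offset.
    void driveWheels(int correction) {
      int left  = BASE_SPEED;
      int right = BASE_SPEED;

      if (correction > 0) {
        // Positive angle: speed up the right wheel and spin the rear wheel clockwise.
        right = constrain(BASE_SPEED + correction, 0, 255);
        digitalWrite(REAR_DIR, HIGH);
      } else if (correction < 0) {
        // Negative angle: speed up the left wheel and spin the rear wheel
        // counterclockwise.
        left = constrain(BASE_SPEED - correction, 0, 255);
        digitalWrite(REAR_DIR, LOW);
      }

      analogWrite(LEFT_PWM,  left);
      analogWrite(RIGHT_PWM, right);
      analogWrite(REAR_PWM,  BASE_SPEED);
    }

    void loop() {
      // In the full sketch the correction comes from the PI controller acting on
      // the angle offset received over Bluetooth; 0 here just drives straight.
      driveWheels(0);
    }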

 

Because the angle read by the phone is discretized, the derivative term of the PID controller was not used (Kd = 0). To obtain a faster response with less overshoot, at the cost of some precision in the steady-state value, the integral gain (Ki) was tuned to be small. In short, only Kp and Ki were used, and they were tuned using the Ziegler-Nichols "magic formula" approach to give the most direct path with the least jittery motion.
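
A minimal version of the resulting PI update is sketched below; it is meant to plug into the loop of the previous sketch, and the gains and timing shown are placeholders rather than our tuned Ziegler-Nichols values.

    float Kp = 1.0f;                 // placeholder proportional gain
    float Ki = 0.05f;                // placeholder (deliberately small) integral gain
    float integralSum = 0.0f;
    unsigned long lastUpdate = 0;

    // angleOffset is the error signal: the setpoint is 0 degrees (target centered).
    // The return value replaces the placeholder correction passed to driveWheels().
    int piController(int angleOffset) {
      unsigned long now = millis();
      float dt = (now - lastUpdate) / 1000.0f;   // seconds since the last update
      lastUpdate = now;

      integralSum += angleOffset * dt;
      float output = Kp * angleOffset + Ki * integralSum;

      return constrain((int)output, -127, 127);  // keep the wheel speeds in range
    }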

 

 

Circuit Design

 

There are no inputs wired directly to the Arduino; instead, commands arrive via the HC-05 Bluetooth module on the RX and TX pins (D0 and D1). The Arduino uses these commands to control 3 separate motors and 3 servos via the PWM pins. The motors are driven through TA7291 H-bridges, which both power the motors and buffer the Arduino outputs. The PWM outputs from the Arduino let us control the direction and duty cycle of the motors through the H-bridges. Using H-bridges is highly advantageous because it decouples the low-voltage (5 V) circuit for the sensors and controllers from the higher-voltage (12 V) circuit for the motors. As a result, separate buffer stages are unnecessary, and the circuit is greatly simplified.
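
As an illustration, driving a single motor through a TA7291-style H-bridge can look roughly like the following. The pin numbers are placeholders, and the exact wiring (for instance, whether the PWM signal goes to one of the logic inputs or to the Vref pin) may differ on our board.

    const int IN1 = 10;   // H-bridge input 1 (assumed, PWM-capable pin)
    const int IN2 = 11;   // H-bridge input 2 (assumed, PWM-capable pin)

    void setup() {
      pinMode(IN1, OUTPUT);
      pinMode(IN2, OUTPUT);
    }

    // speed: 0-255 duty cycle; forward: direction flag
    void setMotor(int speed, bool forward) {
      if (forward) {
        analogWrite(IN1, speed);   // duty cycle on one input sets the motor speed
        digitalWrite(IN2, LOW);
      } else {
        digitalWrite(IN1, LOW);
        analogWrite(IN2, speed);   // reversed polarity spins the motor the other way
      }
    }

    void loop() {
      setMotor(128, true);         // example: run forward at roughly half speed
    }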
