HUMANOIDS FINAL PROJECT
Autonomous Drone Delivery System
COMPUTER VISION
ABOUT THE PROJECT
Autonomous Drone Delivery System
For our humanoids project, we developed an autonomous drone delivery system. The goal of the project was for a drone to detect an object, pick it up, and ultimately deliver it to its destination. Using the Parrot Anafi drone, which has built-in Python support, we planned to use computer vision to detect the object and to autonomously control the drone and an electromagnet to lift the object from the ground.
HARDWARE
To accomplish our goal of creating a miniature version of an autonomous drone delivery system, multiple hardware components had to be bought or manufactured.
RASPBERRY PI
The Raspberry Pi acts as the brain of the whole operation, controlling when, where, and how the drone moves, as well as switching the electromagnet on and off.
MAGNET
The electromagnet acts as the claw that picks up each object; it turns on when the Raspberry Pi tells it to.
CRICKIT HAT
The Crickit HAT is a hardware add-on that attaches on top of the Raspberry Pi and allows it to power the electromagnet without overloading the Pi's power output.
BATTERY
The battery powers the Raspberry Pi and the electromagnet.
DRONE RENDERING
Above one can see a rendering of how the drone would look with the manufactured components attached. A magnet holder sits at the bottom; one side of the drone holds the Raspberry Pi, while the other holds the battery. All of the pieces are held together with Velcro strips, and the Raspberry Pi is screwed onto its manufactured holder.
SOFTWARE
Detecting Colors
The code above takes each individual frame from the video and converts it into an HSV (Hue-Saturation-Value) image, a color space that uses hue to describe color. The conversion is applied to every pixel in the frame. Given these values, we check whether each pixel lies within a defined color range; pixels inside the range are highlighted in the mask (the white pixels in the video).
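The per-pixel check can be sketched in plain Python using the standard-library `colorsys` module. This is a simplified illustration, not our actual OpenCV pipeline: the hue range, saturation, and value thresholds below are hypothetical values for a blue object, and `colorsys` uses 0-1 scales for H, S, and V (OpenCV instead uses H in 0-179 and S/V in 0-255).

```python
import colorsys

# Hypothetical HSV range for a blue object, on colorsys's 0-1 scales.
HUE_RANGE = (0.55, 0.75)
SAT_MIN = 0.4
VAL_MIN = 0.3

def pixel_in_mask(r, g, b):
    """Convert one RGB pixel (components 0-255) to HSV and test it against the range."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return HUE_RANGE[0] <= h <= HUE_RANGE[1] and s >= SAT_MIN and v >= VAL_MIN

def mask_frame(frame):
    """Build a binary mask (True = in range) for a frame given as rows of RGB tuples."""
    return [[pixel_in_mask(*px) for px in row] for row in frame]
```

In the real pipeline, OpenCV performs the same conversion and range test on the whole frame at once (`cv2.cvtColor` followed by `cv2.inRange`), which is far faster than looping over pixels in Python.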
DRONE FUNCTIONALITY
One of the reasons we chose the Anafi drone was its compatibility with Python: Olympe provides a Python controller programming interface for Parrot drones. Using this software we could easily program the drone's functionality. In terms of software, the drone needed to be able to move along all three axes. Combined with the computer vision software, the drone would use this information to detect an object in the room. As the drone approached the object, it would slowly rotate the camera gimbal downwards until the object was directly below the drone. The drone would then descend, pick up the object, and carry it to the finishing point.
Drone Movement
Using the Olympe documentation, we can easily control the drone's forward/backward, up/down, and left/right movement. Using this functionality we can keep the drone moving and stop it when a specific action is triggered. Our plan was to move the drone until it was directly above the object and then set the drone's flying state to hovering.
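One way to connect the vision output to movement is to turn the object's pixel offset from the frame center into small relative displacements for Olympe's `moveBy(dX, dY, dZ, dPsi)` command (dX forward, dY right, in meters). The helper below is a hypothetical sketch: the frame size, the `METERS_PER_PIXEL` gain, and the centering tolerance are assumed values that would need calibration on the real drone, and it assumes the camera is looking down so that "up" in the image corresponds to "forward".

```python
# Assumed frame geometry and gains; all three would need calibration in practice.
FRAME_W, FRAME_H = 1280, 720
METERS_PER_PIXEL = 0.002   # hypothetical conversion from pixel error to meters
CENTER_TOLERANCE = 20      # pixels; inside this box the object counts as centered

def step_toward(obj_x, obj_y):
    """Map the object's pixel position to a (dX, dY) displacement for moveBy.
    Returns None when the object is centered, signalling the drone should hover."""
    err_x = obj_x - FRAME_W / 2   # positive: object is to the right of center
    err_y = obj_y - FRAME_H / 2   # positive: object is below center (behind the drone)
    if abs(err_x) <= CENTER_TOLERANCE and abs(err_y) <= CENTER_TOLERANCE:
        return None
    return (-err_y * METERS_PER_PIXEL, err_x * METERS_PER_PIXEL)
```

The control loop would call this once per processed frame, issue `drone(moveBy(dx, dy, 0, 0)).wait()` while it returns a displacement, and switch to hovering once it returns `None`.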
Gimbal Movement
As the drone approached the object, the camera would slowly rotate downwards. When the camera reached an angle of approximately 90 degrees and the object was directly in the camera's frame of view, this would indicate that the object was directly below the drone, and we could begin the descent toward the object.
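The "start descending" condition above can be expressed as a small predicate. This is a hedged sketch rather than our flight code: the pitch convention (negative pitch meaning the gimbal points down, as in Olympe's gimbal attitude messages), the tolerance values, and the frame size are all assumptions.

```python
# Hypothetical thresholds for deciding when to begin the descent.
PITCH_DOWN_DEG = -90.0   # gimbal pointing straight down (assumed sign convention)
PITCH_TOLERANCE = 5.0    # degrees of slack around straight-down
FRAME_W, FRAME_H = 1280, 720
CENTER_TOLERANCE = 20    # pixels

def ready_to_descend(gimbal_pitch_deg, obj_x, obj_y):
    """True when the gimbal points roughly straight down and the detected
    object sits near the frame center, i.e. directly below the drone."""
    pointing_down = abs(gimbal_pitch_deg - PITCH_DOWN_DEG) <= PITCH_TOLERANCE
    centered = (abs(obj_x - FRAME_W / 2) <= CENTER_TOLERANCE
                and abs(obj_y - FRAME_H / 2) <= CENTER_TOLERANCE)
    return pointing_down and centered
```

Once this predicate holds, the drone would descend in small vertical steps, energize the electromagnet, and then climb back to cruising altitude.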