
High-Speed Videography for Accurate Drone Rotor RPM Estimation

So the notion is that in order to proceed scientifically (and practically) towards the design of rotor dynamics identification algorithms that will run in real time, I first need to model the effects of rotor failure through experimentation.

The two best ways of doing this are: (1) buy an R2k+ tachometer and record (probably by filming the display) the speed of the propeller at a pre-determined PWM value, or (2) use the high-speed capability of an action cam (such as a GoPro) and some clever algorithms to compute the propeller RPM. The latter is the cheaper (and the geekier) option of the two.

The setup was such that contrast was created by laying a black matte cloth on the setup table and painting the opposing blades of the propeller black and white (see below). An RGB (red-green-blue) adaptive algorithm was developed to mitigate glare on the blade, which would otherwise give the algorithm a false reading.

Each frame was analyzed by creating a circular ring of data points around the center of the rotor (chosen through user input) and then analyzing the distribution of RGB values along the circumference in order to locate the position of the blade (see below). The PWM commands were generated by an Arduino Uno script that allowed a 2-second delay between commands in order to generate enough data points for the estimation algorithm.
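The ring-sampling step can be sketched roughly as follows. This is a minimal illustration, not the actual code used: it assumes a grayscale frame, a user-chosen rotor centre and a hand-picked ring radius, and it simply takes the brightest ring sample as the white blade's position (the full version works on RGB values and handles glare).

```python
import numpy as np

def blade_angle(frame, cx, cy, radius, n_samples=360):
    """Estimate the angular position of the white-painted blade.

    Samples pixel intensities on a ring of n_samples points around the
    user-chosen rotor centre (cx, cy) and returns the angle (radians) of
    the brightest sample, i.e. where the white blade crosses the ring.
    """
    angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    xs = (cx + radius * np.cos(angles)).astype(int)
    ys = (cy + radius * np.sin(angles)).astype(int)
    ring = frame[ys, xs]              # intensities along the circumference
    return angles[np.argmax(ring)]

# Synthetic 200x200 frame: dark cloth, a bright patch where the blade is
frame = np.zeros((200, 200))
true_angle = np.deg2rad(45)
bx = int(100 + 60 * np.cos(true_angle))
by = int(100 + 60 * np.sin(true_angle))
frame[by - 2:by + 3, bx - 2:bx + 3] = 255.0

est = blade_angle(frame, 100, 100, 60)
print(np.rad2deg(est))   # close to 45 degrees, within the ring's resolution
```

With 360 samples the angular resolution is 1 degree per sample, which is one reason the RPM estimate later averages over many frames.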


The RPM was computed (particularly at high RPM) by averaging the angular change between frames (under the assumption that the blurring effect on each subsequent frame is very similar) and then using the framerate (240 frames per second) to convert to revolutions per second. This was achieved as shown below.
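The averaging step above can be sketched as follows; this is an illustrative reconstruction (function name and details are mine), assuming the per-frame blade angles from the ring analysis are already available in radians.

```python
import numpy as np

def rpm_from_angles(angles, fps=240.0):
    """Estimate RPM from a sequence of per-frame blade angles (radians).

    The per-frame angular step is averaged (assuming the blurring effect
    on each subsequent frame is similar), unwrapped to handle the 0/2*pi
    wrap-around, then scaled by the frame rate:
    rev/s = mean_step * fps / (2*pi), RPM = 60 * rev/s.
    """
    unwrapped = np.unwrap(angles)              # remove 2*pi jumps
    mean_step = np.mean(np.diff(unwrapped))    # mean radians per frame
    rev_per_sec = mean_step * fps / (2.0 * np.pi)
    return 60.0 * rev_per_sec

# A rotor at 3600 RPM advances 90 degrees per frame at 240 fps
angles = np.deg2rad(np.arange(0, 3600, 90) % 360)
print(round(rpm_from_angles(angles)))  # 3600
```

Note the usual sampling caveat: at 240 fps the blade must advance less than half a revolution per frame for the step to be unambiguous, which bounds the measurable RPM.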



The RPM measurements were then filtered using a Butterworth filter and a total variation diminishing algorithm, which produced the results below. This shows that the method has great promise; the subsequent objective is to model the dynamics using high-order transfer functions (up to fourth order) in order to analyse the change in model parameters once the faults have been introduced. Your comments are welcome!
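A minimal sketch of that filtering stage is below. It is an assumption-laden illustration, not the actual pipeline: the filter order, cutoff, TV weight and step size are all hypothetical values, and the total variation step is a simple gradient-descent denoiser rather than whatever TVD variant was actually used.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_rpm(rpm, fs=240.0, cutoff=5.0, tv_weight=2.0, tv_iters=200):
    """Low-pass then TV-denoise a noisy RPM trace.

    A zero-phase 2nd-order Butterworth low-pass (cutoff in Hz, sample
    rate fs) removes high-frequency jitter; a simple gradient-descent
    total variation step then flattens the remaining ripple while
    keeping the step changes between PWM settings sharp.
    """
    b, a = butter(2, cutoff / (fs / 2.0), btype="low")
    x = filtfilt(b, a, rpm)            # forward-backward, no phase lag
    y = x.copy()
    for _ in range(tv_iters):
        grad_fit = y - x               # pull towards the filtered data
        d = np.diff(y)
        grad_tv = np.zeros_like(y)     # subgradient of sum(|y[i+1]-y[i]|)
        grad_tv[:-1] -= np.sign(d)
        grad_tv[1:] += np.sign(d)
        y -= 0.01 * (grad_fit + tv_weight * grad_tv)
    return y

# Noisy trace: 2 s at 3000 RPM then 2 s at 4000 RPM, sampled at 240 fps
rng = np.random.default_rng(0)
truth = np.concatenate([np.full(480, 3000.0), np.full(480, 4000.0)])
noisy = truth + rng.normal(0.0, 50.0, truth.size)
smoothed = smooth_rpm(noisy)
```

The appeal of the TV step here is that, unlike a pure low-pass filter, it does not round off the sharp RPM transitions that occur each time the Arduino steps the PWM command.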


