Before my GoPro camera arrived, I did a little experiment with my Y6 hexacopter from 3D Robotics: I repurposed a pocket camera for aerial images, set the drone to fly in a survey pattern over a limited area and processed the resulting photos using a free cloud app.
As a result, I was able to produce a partial 3D rendering of the area I surveyed.
The Canon S90 is a fabulous camera, and Canon has kept improving it in successive updates. The prospect of getting a nice still image sensor into the air was quite an exciting one.
In order for the S90 to take photos from altitude, I would either have to remote-control it, trigger it from the flight computer or use an intervalometer. A software-based intervalometer is standard on many cameras nowadays, but not on the S90. However, by installing third-party firmware not produced by Canon, the S90 could gain an intervalometer function as well as a host of other features.
CHDK was the answer – the Canon Hack Development Kit. CHDK is an open effort to expand the capabilities of Canon cameras. Find the right version of CHDK for your camera, copy it to the SD card, and the camera can load it at boot-up: it masquerades as a firmware update. Among CHDK's capabilities is the ability to run scripts, and one of the available scripts is an intervalometer.
I tested the intervalometer and set the interval to four seconds, which I believed would give me sufficient overlap between consecutive photos for stitching software to establish how the batch from the flight fits together.
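A quick back-of-the-envelope check of whether a given interval yields enough forward overlap, assuming a straight-down camera and level flight (the speed, altitude and field-of-view figures below are illustrative, not the mission's actual values):

```python
import math

def forward_overlap(speed_ms, interval_s, altitude_m, fov_deg):
    """Fraction of one frame that the next frame overlaps,
    for a straight-down camera flying level at constant speed."""
    # ground footprint of one frame along the flight direction
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    advance = speed_ms * interval_s  # ground distance covered per shot
    return max(0.0, 1 - advance / footprint)

# Illustrative numbers: ~5 m/s at 40 m altitude, ~60 degree field of view
print(round(forward_overlap(5, 4, 40, 60), 2))  # -> 0.57
```

At those assumed numbers, a four-second interval gives roughly 57% forward overlap; photogrammetry guides often recommend 60% or more, so the interval is on the tight side.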
The Droidplanner 2 app is nothing short of amazing. (It is free on Android, but users are encouraged to support development.) I use it to design missions, transfer the mission waypoints to the drone, activate drone behaviours during flight and receive real-time telemetry.
When planning a survey mission, slide the map to the area you are interested in, draw an outline and tell the app what kind of camera you are using. It will then calculate the fly-over pattern and advise on how many photos to shoot given a desired level of overlap.
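The same footprint geometry presumably drives the app's calculation. A sketch of how leg spacing and a rough photo count could be derived for a rectangular area, under the same straight-down assumptions (all numbers hypothetical):

```python
import math

def survey_estimate(width_m, length_m, altitude_m, fov_deg,
                    side_overlap, forward_spacing_m):
    """Leg spacing, leg count and rough photo count for a
    lawnmower survey over a width_m x length_m rectangle."""
    # ground footprint of one frame at the given altitude
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    spacing = footprint * (1 - side_overlap)   # distance between legs
    legs = math.ceil(width_m / spacing)
    photos = legs * math.ceil(length_m / forward_spacing_m)
    return spacing, legs, photos

# e.g. a 200 x 300 m field at 40 m altitude with 60% side overlap
spacing, legs, photos = survey_estimate(200, 300, 40, 60, 0.6, 20)
```

With these example inputs the sketch suggests 11 legs about 18.5 m apart, around 165 photos in total.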
I added a few waypoints in addition to the survey pattern: a 20-second holding point at 20 meters as waypoint 1, so that I could inspect how the drone was balanced and abort the mission if something didn't look right, and a waypoint 14 to make sure the drone traversed above the treetops to a point close to where I expected to launch from. From there, waypoint 15 was an auto-land waypoint.
A useful maneuver once you have transferred the flight plan to the craft is to ask Droidplanner to load the plan back from the craft. That way you know exactly which plan the drone will be executing.
The area I chose for my first fully autonomous mission was probably a bit boring for a mapping mission because it was mostly flat, though it did have some hilly terrain towards the southern perimeter. Had I moved less than a kilometer south-west, I would have had the opportunity to include a cliff drop-off in the surveyed area, but I chose to stage the flight in a less spectacular spot because I felt I would be better able to intervene and take control if there were problems with the autonomous mode. In the worst case, the flat plain also offered a better chance of recovery if the drone decided to auto-land because of low battery. After all, the S90 on board the Y6 was additional payload that would require continuous extra lift.
I had to start the camera's intervalometer before I strapped the camera to the underside of the craft; once it was strapped on, I would no longer have access to load the script. With a fully charged battery and a big enough SD card, the extra photos recorded before and after the survey didn't really pose any concerns.
The drone executed the survey pattern flawlessly. In Droidplanner I could follow the drone's path in real time and see it stay almost on top of the white lines. There was quite a bit of wind rocking the drone, but it produced no discernible deviation from the course as depicted in Droidplanner.
The survey pattern and the additional waypoints in the flight plan took just about 10 minutes to complete. The drone landed itself beautifully and shut down its motors.
What was really interesting about the flight I had just carried out was that from the moment I engaged Auto flight mode, the craft was independent of signals from the ground. It was executing according to a plan that was stored on-board in the Pixhawk flight computer. The drone continued to transmit telemetry which I picked up on Droidplanner, and I could have chosen to intervene and take control at any point (because I didn’t fly out of radio control range).
The 3D landscape model
There were approximately two hundred photos on the SD card, but after discarding incidental selfies from when I was mounting the camera, close-ups of grass before take-off and so on, 67 photos were from the actual survey. Without any editing or lens correction, I loaded them into Autodesk 123D Catch, a free cloud application that specializes in creating 3D models from photos.
Note that the Canon S90 doesn't have GPS, nor did I synchronize the photos' metadata with flight position data in post-processing to embed positions in the photo EXIF. 123D Catch is essentially a photo-stitching app that outputs 3D models; the vertical dimension is inferred from perspective changes of identified objects.
My model didn’t turn out very well.
Looking through the 67 photos, most of them are blurred – probably due to vibrations or wind (I didn't use a gimbal). That makes it hard for the software to identify matching points in overlapping photos and difficult to identify elevated structures like trees, and so the 3D output has a gaping hole in the middle where there wasn't enough good data to produce a surface model. What is encouraging, however, is how accurately the model shows the hill on the border of the survey area, down to the fact that vegetation is greener on its slopes.
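One way to cull blurred frames before feeding a future batch into the stitcher – a common sharpness heuristic, not anything 123D Catch provides – is the variance of the image Laplacian: sharp images have strong local intensity changes, blurred ones don't. A dependency-free sketch on a grayscale pixel grid:

```python
def laplacian_variance(img):
    """Variance of the 4-neighbour Laplacian of a grayscale image
    (a list of rows of pixel values). Low variance suggests blur."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

# A hard edge scores much higher than a featureless (blurry) patch:
sharp = [[0] * 4 + [255] * 4 for _ in range(8)]
flat = [[128] * 8 for _ in range(8)]
assert laplacian_variance(sharp) > laplacian_variance(flat)
```

In practice you would compute this score for every survey photo and drop those below a threshold tuned on a few known-sharp frames.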
I couldn’t resist looking into what was logged by the flight recorder.
This is a plot of altitude above ground (at launch location), latitude and longitude superimposed.
It looks like the drone overshoots a bit upon arriving at waypoint 6 but otherwise keeps close to the flight plan.
(The sheer amount of data the Pixhawk captures during flight amazes me. Given the blurred images, perhaps I need to check the logged readings on rig vibration.)
I’ll be experimenting more with this kind of modeling. I now have a GoPro mounted on the Y6 using a gimbal, which makes it easy to take straight-down shots without blur. The GoPro’s lens does introduce a lot of distortion, but it is likely that most 3D stitching software includes profiles that correct for that behaviour. If not, I’ll still have the Canon S90 as an option.
It will also be worth trying different kinds of software. Agisoft and Pix4D are tools in which I have seen others build impressive models, and new options are appearing regularly. Some algorithms may be helped along by having geodata in the EXIF, and there are indeed ways to add it.
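One way to add that geodata would be to match each photo's timestamp against the GPS fixes in the Pixhawk log and write the nearest fix into the photo's EXIF (the actual EXIF writing could be done with a tool such as exiftool). A sketch of the matching step, with a hypothetical log layout and made-up coordinates:

```python
import bisect

def geotag(photo_times, log):
    """Match each photo timestamp to the nearest flight-log GPS fix.
    log: time-sorted list of (unix_time, lat, lon) tuples.
    Returns one (lat, lon) per photo."""
    times = [t for t, _, _ in log]
    out = []
    for pt in photo_times:
        i = bisect.bisect_left(times, pt)
        # pick whichever neighbouring fix is closer in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(log)]
        j = min(candidates, key=lambda k: abs(times[k] - pt))
        out.append((log[j][1], log[j][2]))
    return out

# Hypothetical log: a fix every two seconds
log = [(0, 59.00, 10.00), (2, 59.01, 10.01), (4, 59.02, 10.02)]
print(geotag([1.4, 3.9], log))  # -> [(59.01, 10.01), (59.02, 10.02)]
```

The camera clock and the flight computer clock would have to be synchronized (or the offset measured) for the matching to be trustworthy.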
I have seen claims that this approach to mapping can produce 3D maps with a resolution as fine as 5-10 cm per pixel. By comparison, satellite imaging provider DigitalGlobe, which delivers mapping imagery to Google Maps and others, will be providing 31 cm/pixel imagery when it launches its WorldView-3 satellite later this year.
With the use of autonomous missions and without many constraints regarding the choice of camera, mapping a landscape with your drone is a worthwhile activity. I suspect it is not too difficult to get better results than I did. The Ardupilot wiki has a good description of what you need.