
ELC: Nano-copters!

By Jake Edge
March 6, 2013

The Embedded Linux Conference often has talks about interesting Linux-powered devices, which should come as no surprise, but talks about devices that fly tend to attract a larger audience. Gregoire Gentil's presentation on a video "nano-copter" was no exception. While there was no free-flying demo, there were several tethered demos that clearly showed some of the possibilities of the device.

Gentil started his talk by showing a YouTube video of the marketing-level pitch from his company, Always Innovating (AI), for the MeCam device. The MeCam is a small quad-copter that allows people to video their every move as it follows them around. The video will stream to their mobile phone, where it can be uploaded to Facebook, Twitter, and the like.

[MeCam]

The device itself, pictured at right, is a "flying Pandaboard" with "lots of modifications", Gentil said. The copter has four propellers, a camera, and can communicate via WiFi. He had one of the copters tethered to a stand so that it could go up and down, but not "kill anybody if something goes wrong", he said with a laugh.

The copter runs an OMAP4 with dual Cortex-A9 processors. It uses pulse-width modulation (PWM) to control each of the four propeller motors. The camera has 1080p resolution and uses CSI-2 for the data interface as USB is not fast enough, he said. There are also a "bunch of sensors", including Euler angle sensors, altitude sensors, and wall-detection sensors.

There is a single battery powering both the CPU and the motors, which is different from some radio-controlled (RC) copters, he said. The motors are quite small, and run at 10,000-15,000 RPM. That can create a lot of noise in the electrical system, so many RC devices use two separate batteries. Instead, AI has added a lot of filtering on the power, so that it could avoid the extra weight of an additional battery.

[Gregoire Gentil]

The MeCam is not an RC device; instead it has an auto-pilot that is voice controlled and uses facial recognition to position itself. AI considered three different possibilities for running the auto-pilot code. The first was to run a standard Linux kernel, which would give them a "mature full environment" in which to develop and run the code. The downside is the latency. There were times when the motors would not be serviced for 50ms, which was enough time to cause the MeCam to "crash against the wall", Gentil said.

The second option was to use the realtime Linux kernel, which has "much better latency", but is "less mature than the standard kernel". That is the approach being used now, but AI is pursuing another approach as well: writing a custom realtime operating system (RTOS) for the Cortex-M3 which is present on the OMAP4. That will allow the system to have "perfect latency", he said, but it "will be complex to develop and is not mainstream".

[Demo]

For the demos, and until the Cortex-M3 RTOS version is working, the MeCam is running a PREEMPT_RT kernel on the Cortex-A9s. The auto-pilot process is given a priority of 90 using the SCHED_FIFO scheduling class.

The auto-pilot uses Euler angles (i.e. roll, pitch, and yaw), but, due to the gimbal lock effect, those alone are not sufficient for navigating. The solution to that problem is to use quaternions, which represent orientation as four numbers in an extension of the complex numbers. That requires a math library and floating-point arithmetic, which is a problem for the Cortex-M3 version because that core has no floating-point unit. There are plans to use a fixed-point library to work around that.

To control the movement once the desired direction has been calculated, the MeCam uses a proportional-integral-derivative (PID) controller. The PID controller uses a feedback loop that produces movement that smoothly narrows in on the goal location without overcompensating. In addition, its "implementation is very straightforward", Gentil said. There are constants used in the PID algorithm, which can either be derived experimentally or calculated theoretically using a program like MATLAB. AI chose the experimental approach, and he recommended the PID without a PhD article for those interested.

There is an ultrasonic altitude sensor that uses frequencies above 40kHz to determine how far the copter is above the ground so that it can maintain a constant height. It uses the time for an echo return to determine its height. Someone asked about it getting confused when flying past a cliff (or off the edge of a table), but Gentil said there is a barometer that is also used for more coarse altitude information and that it would detect that particular problem.

The OMAP4 has a "bunch of video coprocessing stuff" that is used by the MeCam. The camera data is routed to two different tasks, one for streaming to the phone, the other for face detection. The MeCam uses the Video4Linux2 (V4L2) media controller API to configure the camera and its output. He mentioned yavta (Yet Another V4L2 Test Application) as an excellent tool for testing and debugging.

The camera sensor provides multiple outputs, which are routed to resizers and then to the live streaming and face detection. With the OMAP4 and V4L2, "you can definitely do crazy things on your system", he said. For streaming, the MeCam uses GStreamer to produce Real Time Streaming Protocol (RTSP) data.

Gentil had various demos, including the copter operating from a four-way stand (tethered at each corner) and from a two-way stand (tethered at opposite corners) to show the PID algorithm recovering from his heavy-handed inputs, as well as video streaming to attendees' laptops if their browser supported RTSP. There is still plenty of work to do, it would seem, but quite a bit is functioning already. Current battery life is around 15 minutes, but "I think we can do better", Gentil said. One can imagine plenty of applications for such a device, well beyond the rather self-absorbed examples shown in the marketing video.

[ I would like to thank the Linux Foundation for travel assistance to attend ELC. ]



ELC: Nano-copters!

Posted Mar 7, 2013 9:33 UTC (Thu) by alex31 (subscriber, #67059) [Link]

Why rewrite an RTOS for the M3 side?
There are already mature, easy-to-use, open-source
RTOSes for the Cortex-M family of MCUs: ChibiOS, FreeRTOS,
and numerous others!

RTOS for the Cortex-M3

Posted Mar 7, 2013 10:51 UTC (Thu) by rvfh (subscriber, #31018) [Link]

Indeed, FreeRTOS is already up and running on the OMAP4 M3 cluster, using RPMsg to communicate with the A9 [1] (using only one of the two M3s though, as FreeRTOS is UP). The main issue is the floating-point math, which makes porting the code to the M3 less straightforward.

[1] https://github.com/g-aubertin/ducati_FreeRTOS

ELC: Nano-copters!

Posted Mar 7, 2013 9:57 UTC (Thu) by epa (subscriber, #39769) [Link]

That's the most absurd-looking heatsink I've ever seen!

MeCam name

Posted Mar 7, 2013 10:03 UTC (Thu) by jnareb (subscriber, #46500) [Link]

There is a different project with the same name: http://www.mecam.me/

Abusing nano-copters?

Posted Mar 7, 2013 11:44 UTC (Thu) by NAR (subscriber, #1313) [Link]

Sounds like something the paparazzi would want (if it's not too noisy). Could be useful for the army too, for in-house reconnaissance. It also reminds me of the hunter-seeker from Dune.

Abusing nano-copters?

Posted Mar 7, 2013 12:04 UTC (Thu) by dskoll (subscriber, #1630) [Link]

The British army already uses tiny helicopters for reconnaissance in Afghanistan: http://www.kurzweilai.net/british-army-deploys-tiny-helicopters

ELC: Nano-copters!

Posted Mar 7, 2013 15:47 UTC (Thu) by Richard_J_Neill (subscriber, #23093) [Link]

For anyone who just wants to play with one, R/C variants are available branded as "Hubsan", for about £40. To my amazement, the "toy" technology is now really advanced.

ELC: Nano-copters!

Posted Mar 19, 2013 11:19 UTC (Tue) by wookey (subscriber, #5501) [Link]

How much of this stuff is free software? Can I get it and play with it?

I've been pondering the problem of remote conference attendance and the fact that nearly all the interesting/important discussion happens in the corridors, not the sessions. We try to connect up the sessions, but it's meeting people in the coffee slots that I usually find really useful.

I was wondering if a little quadcopter 'presence' could work to see who's around, listen in on conversations and talk to people. It's probably too noisy and intrusive, and a rolling robot would be a lot more energy efficient, but would need a mic/camera on a long pole to get to head height, and might just keep falling over. Even if this isn't actually very practical it'd be fun to try :-) 'copters have the advantage of being very light and shippable. Very short flight times are a problem, but often there is a table to land on.

Is anyone else wondering about this stuff?

ELC: Nano-copters!

Posted Mar 20, 2013 9:30 UTC (Wed) by jezuch (subscriber, #52988) [Link]

> It's probably too noisy and intrusive, and a rolling robot would be a lot more energy efficient

Has anyone tried mini-blimps?

ELC: Nano-copters!

Posted Mar 20, 2013 12:28 UTC (Wed) by wookey (subscriber, #5501) [Link]

That's actually quite a good idea. Should be do-able.

I've discovered that there are in fact a whole range of 'remote presence' robots already available: the VGO ($6000), Double Robotics 'Double' ($2000+iPad), Beam 'Texai' ($16000+support contract), Robodynamics 'TiLR' ($10000), QB Anybot ($10000).

The Double is the only one that isn't seriously expensive. It's really a Segway-style frame to clip an iPad to, which is quite a smart way to do things.

All of them are hopelessly proprietary of course, so no chance of common interfaces to whatever bot happens to be available at a site. I wonder if a double could be reverse-engineered to be driven by kosher software.

Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds