By Jake Edge
March 6, 2013
The Embedded
Linux Conference often has talks about interesting Linux-powered
devices, which
should come as no surprise, but talks about devices that fly tend to
attract a larger audience. Gregoire Gentil's presentation on a
video "nano-copter" was no exception. While there was no free-flying demo,
there were several tethered demos that clearly showed some of the
possibilities of the device.
Gentil started his talk by showing a YouTube video of the
marketing-level pitch from his company, AlwaysInnovating (AI), for the MeCam
device. The MeCam is a small quad-copter that allows people to video their
every move as it follows
them around. The video will stream to their mobile phone, where it can be
uploaded to Facebook, Twitter, and the like.
The device itself, pictured at right, is a "flying Pandaboard" with "lots
of modifications", Gentil said. The copter has four propellers, a camera,
and can communicate via WiFi. He had one of the copters tethered to a
stand so that it could go up and down, but not "kill anybody if something
goes wrong", he said with a laugh.
The copter runs an OMAP4 with dual Cortex-A9 processors. It uses
pulse-width modulation (PWM) to control each of the four propeller motors. The
camera has 1080p resolution and uses CSI-2 for the data interface as USB is
not fast enough, he said. There are also a "bunch of sensors", including
Euler angle
sensors, altitude sensors, and wall-detection sensors.
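Gentil did not go into how the motors are driven from user space, but, as a
rough illustration, here is a minimal sketch that sets a motor's speed
through the kernel's sysfs PWM interface. The chip and channel numbers, the
period, and the helper name are all assumptions, and the channel is assumed
to be already exported with its period configured:

    #include <stdio.h>

    /* Hypothetical helper: set one motor's PWM duty cycle (in
     * nanoseconds) via sysfs. Assumes pwmchip0/pwm<channel> has
     * already been exported and its period set. */
    static int set_motor_speed(int channel, unsigned long duty_ns)
    {
        char path[64];
        FILE *f;

        snprintf(path, sizeof(path),
                 "/sys/class/pwm/pwmchip0/pwm%d/duty_cycle", channel);
        f = fopen(path, "w");
        if (!f)
            return -1;
        fprintf(f, "%lu", duty_ns);
        fclose(f);
        return 0;
    }

    int main(void)
    {
        /* Spin motor 0 at 50% of a hypothetical 50us (20kHz) period. */
        return set_motor_speed(0, 25000) ? 1 : 0;
    }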
There is a single battery powering both the CPU and the motors, which is
different from some radio-controlled (RC) copters, he said. The
motors are quite small and run at 10,000-15,000 RPM. That can
create a lot of noise in the electrical system, so many RC devices use two
separate batteries. Instead, AI has added a lot of filtering
on the power, so that it could avoid the extra weight of an additional
battery.
The MeCam is not an RC device; instead, it has a voice-controlled
auto-pilot that uses facial recognition to position itself. AI considered
three different possibilities for running the auto-pilot code. The first
was to run a standard Linux kernel, which would give them a "mature full
environment" in which to develop and run the code. The downside is the
latency. There were times when the motors would not
be serviced for 50ms, which was enough time to cause the MeCam to "crash
against the wall", Gentil said.
The second option was to use the realtime Linux kernel, which has "much
better latency", but is "less mature than the standard kernel". That is
the approach being used now, but AI is pursuing another approach as well:
writing a custom realtime operating system (RTOS) for the Cortex-M3 which
is present on the OMAP4. That will allow the system to have
"perfect latency", he said, but it "will be complex to develop and is not
mainstream".
For the demos, and until the Cortex-M3 RTOS version is working, the MeCam
is running a PREEMPT_RT kernel on the Cortex-A9s. The
auto-pilot process is given a priority of 90 using the SCHED_FIFO
scheduling class.
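Setting that up takes only a few lines of C; a minimal sketch, where the
scheduling class and priority are as described in the talk and the
scaffolding around them is mine:

    #include <sched.h>
    #include <stdio.h>

    int main(void)
    {
        /* SCHED_FIFO at priority 90, as used for the auto-pilot. */
        struct sched_param sp = { .sched_priority = 90 };

        if (sched_setscheduler(0, SCHED_FIFO, &sp) == -1) {
            perror("sched_setscheduler");
            return 1;
        }
        /* ... the control loop would run here ... */
        return 0;
    }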
The auto-pilot uses Euler angles (i.e. roll, pitch, and yaw) but, due to
the gimbal lock effect,
those are not sufficient for navigation. The solution to that problem is to
use quaternions,
which represent an orientation with four numbers in an extension of the
complex numbers. That requires a math
library and floating-point numbers, which is a problem for the Cortex-M3
version because it doesn't have any floating-point support. There are
plans to use a fixed-point library to work around that.
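No code was shown in the talk, but the core quaternion operation is
compact. A minimal sketch of the Hamilton product, which composes two
rotations without any risk of gimbal lock; the struct layout is my
assumption, and on the Cortex-M3 the floats would become fixed-point
values:

    /* A quaternion: w + xi + yj + zk. */
    struct quat { float w, x, y, z; };

    /* Hamilton product: the combined rotation of b followed by a. */
    static struct quat quat_mul(struct quat a, struct quat b)
    {
        struct quat r;

        r.w = a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z;
        r.x = a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y;
        r.y = a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x;
        r.z = a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w;
        return r;
    }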
To control the movement once the desired direction has been calculated, the
MeCam uses a proportional-integral-derivative
(PID) controller. The PID controller uses a feedback loop that
produces movement that smoothly narrows in on the goal location without
overcompensating. In addition, its "implementation is very
straightforward", Gentil said. There are constants used in the PID
algorithm, which can either be derived experimentally or calculated
theoretically using a program like MATLAB. AI chose the experimental approach,
and he recommended the "PID
without a PhD" article for those interested.
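As he said, the implementation is short. A minimal sketch of a single-axis
PID update; the names are mine, and the kp, ki, and kd gains are the
experimentally tuned constants mentioned above:

    struct pid {
        float kp, ki, kd;   /* experimentally tuned gains */
        float integral;     /* accumulated error */
        float prev_error;   /* error from the previous step */
    };

    /* One control step: returns the correction to apply, given the
     * desired value, the measured value, and the time step dt. */
    static float pid_update(struct pid *p, float setpoint,
                            float measured, float dt)
    {
        float error = setpoint - measured;
        float derivative = (error - p->prev_error) / dt;

        p->integral += error * dt;
        p->prev_error = error;

        return p->kp * error + p->ki * p->integral + p->kd * derivative;
    }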
There is an ultrasonic altitude sensor that uses frequencies above 40 kHz
to determine how far the copter is above the ground so that it can maintain
a constant height; the time it takes an echo to return gives the
distance. Someone asked about it getting
confused when flying past a cliff (or off the edge of a table), but Gentil
said there is a barometer
that is also used for more coarse altitude information and that it would
detect that particular problem.
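The height calculation itself is simple: the echo's round trip covers twice
the distance to the ground. A sketch, assuming a speed of sound of roughly
343 m/s (it varies with temperature):

    /* Height from ultrasonic echo time: the pulse travels to the
     * ground and back, so the one-way distance is half the round
     * trip at the speed of sound (roughly 343 m/s in air). */
    static float height_from_echo(float echo_time_s)
    {
        const float speed_of_sound = 343.0f;  /* m/s */

        return speed_of_sound * echo_time_s / 2.0f;
    }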
The OMAP4 has a "bunch of video coprocessing stuff" that is used by the
MeCam. The camera data is routed to two different tasks, one for streaming
to the phone, the other for face detection. It uses Video4Linux2
(V4L2) media controller API to configure the camera and its output. He
mentioned yavta (Yet Another V4L2 Test Application) as an excellent tool
for testing and debugging.
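For a taste of the V4L2 side, here is a minimal sketch that opens a capture
device and requests a 1080p format. The device node and pixel format are
assumptions; the real pipeline on the OMAP4 is wired up through the media
controller, which takes considerably more setup:

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        struct v4l2_format fmt = { 0 };
        int fd = open("/dev/video0", O_RDWR);

        if (fd < 0) {
            perror("open");
            return 1;
        }

        /* Ask the driver for 1920x1080 in a packed YUV format. */
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 1920;
        fmt.fmt.pix.height = 1080;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;

        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
            perror("VIDIOC_S_FMT");

        close(fd);
        return 0;
    }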
The camera sensor provides multiple outputs, which are routed to resizers
and then to the live streaming and face detection. With the OMAP4 and
V4L2, "you can definitely do crazy things on your system", he said. For
streaming, the MeCam uses GStreamer to produce Real Time Streaming Protocol
(RTSP) data.
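He did not show the pipeline, but a small RTSP server can be built on
GStreamer's gst-rtsp-server library; in this sketch the mount point,
resolution, and choice of H.264 encoder are my assumptions:

    #include <gst/gst.h>
    #include <gst/rtsp-server/rtsp-server.h>

    int main(int argc, char *argv[])
    {
        GstRTSPServer *server;
        GstRTSPMountPoints *mounts;
        GstRTSPMediaFactory *factory;
        GMainLoop *loop;

        gst_init(&argc, &argv);
        loop = g_main_loop_new(NULL, FALSE);

        /* Serve the camera as H.264 over RTSP at rtsp://host:8554/cam */
        server = gst_rtsp_server_new();
        mounts = gst_rtsp_server_get_mount_points(server);
        factory = gst_rtsp_media_factory_new();
        gst_rtsp_media_factory_set_launch(factory,
            "( v4l2src device=/dev/video0 "
            "! video/x-raw,width=640,height=480 "
            "! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )");
        gst_rtsp_mount_points_add_factory(mounts, "/cam", factory);
        g_object_unref(mounts);

        gst_rtsp_server_attach(server, NULL);
        g_main_loop_run(loop);
        return 0;
    }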
Gentil had various demos, including the copter operating from a four-way
stand (tethered at each corner), from a two-way stand (tethered at opposite
corners) to show the PID algorithm recovering from his heavy-handed inputs,
as well as video streaming to attendees' laptops if their browser supported
RTSP. There is still plenty of work to do, it would seem, but quite a bit
is functioning already. Current battery life is around 15
minutes,
but "I think we can do better", Gentil said. One can imagine plenty of
applications for such a device, well beyond the rather self-absorbed
examples shown in the marketing video.
[ I would like to thank the Linux Foundation for travel assistance to attend ELC. ]