
LinuxCon Japan: OpenRelief launches

By Jake Edge
June 13, 2012

Shane Coughlan got hands-on experience in the kinds of problems faced by disaster aid teams while helping with relief efforts after Japan's earthquake, tsunami, and nuclear accident in 2011. That experience was part of a discussion on open source technical measures to assist such efforts at LinuxCon Japan 2011, but it also led to a new project, OpenRelief, that was announced at LinuxCon Japan 2012. The project is aimed at developing inexpensive—"disposable"—drone aircraft to assist relief teams in seeing over the horizon to detect people, changes in terrain, smoke, radiation, and other conditions in places that may be difficult or dangerous for on-the-ground exploration.

[OpenRelief and airplane]

OpenRelief co-founders Coughlan, who is a consultant based in western Japan, and Karl Lattimer, who is UK based, came to Yokohama to announce the project and to show off the prototype drone aircraft that Lattimer built. The plane itself was an eye-catching prop for the talk, but some of the most interesting parts of OpenRelief are on the inside: open source software for route following, image acquisition and processing, and so on. Much of that code comes from existing projects, with OpenRelief integrating it into the airframe to create a mobile reconnaissance platform.

Problems and solutions

Gathering information on the state of various locales was one of the biggest problems that Coughlan saw when bringing aid from western Japan to areas affected by the earthquake and tsunami. There were locations that had supplies and doctors, but didn't know where to take them. In addition, sometimes aid arrived at locations that were already fully stocked and had no more storage, so the aid had to be taken back to where it came from.

The situation was "like a big fog" covering the disaster zone, he said. OpenRelief is trying to help penetrate that fog using technical measures. It would act as a supplement to existing response mechanisms. The goal is "for the responders on the ground to do their job more effectively", Coughlan said, so they don't go where they aren't needed and do go where they are. That was "quite a challenge" last year in responding to the earthquake.

"So, obviously the solution is a robot airplane", he said with a chuckle. More seriously, a robot airplane can help answer some of the questions that are hard to get answers to like "can we get there?" or "do we need to go there?". There were situations where a car couldn't get through to a particular nearby location to assess the situation, but "an airplane could have gone there to see".

Robot airplanes (or drones) have gotten a bad reputation in places like Afghanistan, Coughlan said, but they can be "immensely useful". Unlike those in various war zones, these airplanes are "full of love and peace". They are intended to provide a low-cost solution for mapping and investigating disasters.

The plane will be able to take off and land from a foot path, fly a pre-programmed route, and gather various kinds of information while traveling. Using on-board processing, it will be able to recognize roads, people, and smoke. There are also a variety of sensors that can be deployed to collect weather data, radiation levels, or other kinds of environmental information; those sensors can relay their readings via the plane.

The plane and its capabilities are "not really news", Coughlan said, as the technology has been available for some time. OpenRelief has just tied together multiple off-the-shelf and open source pieces for its use case. The technology is "phenomenal and astonishingly cheap". With that, he turned things over to "someone who can build stuff", Lattimer that is, to describe more of the technical details of the plane.

The guts of the plane

It took about a week to assemble one of the drones, Lattimer said, and a few more days to finish it. The airframe has a simple design that comes mostly already constructed. It is made out of fiberglass and covered in plastic vinyl. The first one he built was "a challenging project", but the second was much easier.

The plane has an autopilot system, the Arduino-based ArduPilot, which uses a combination of GPS, airflow monitors, and air pressure sensors to fly the plane on pre-programmed flight plans. The flight plan can contain up to 600 3D waypoints that can be reprogrammed in flight from the Raspberry Pi main controller. It takes off using a standard radio controller, then the autopilot is turned on and the plane follows the flight plan.
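ArduPilot's tools consume flight plans in a plain-text waypoint format, so a small script on the main controller could, in principle, generate or rewrite a route before (or during) a mission. The following Python sketch is illustrative only: the coordinates and the "QGC WPL 110" file layout are assumptions based on the standard ArduPilot tooling, not anything OpenRelief-specific.

    #!/usr/bin/env python
    # Sketch: write a 3D flight plan in the "QGC WPL 110" text format
    # that ArduPilot mission tools understand.  Waypoints are hypothetical.

    # (sequence number, latitude, longitude, altitude in metres above home)
    WAYPOINTS = [
        (1, 35.4437, 139.6380, 100),
        (2, 35.4500, 139.6450, 120),
        (3, 35.4437, 139.6380, 100),
    ]

    MAV_CMD_NAV_WAYPOINT = 16      # "fly to this point"
    FRAME_GLOBAL_REL_ALT = 3       # altitude relative to the launch point

    def write_plan(path):
        with open(path, "w") as f:
            f.write("QGC WPL 110\n")
            # Item 0 is conventionally the home position.
            f.write("0\t1\t0\t16\t0\t0\t0\t0\t35.4437\t139.6380\t0\t1\n")
            for seq, lat, lon, alt in WAYPOINTS:
                fields = [seq, 0, FRAME_GLOBAL_REL_ALT, MAV_CMD_NAV_WAYPOINT,
                          0, 0, 0, 0, lat, lon, alt, 1]
                f.write("\t".join(str(x) for x in fields) + "\n")

    if __name__ == "__main__":
        write_plan("mission.waypoints")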

The Raspberry Pi is "ideal" for an initial test computer, Lattimer said, because of its low cost, but other, faster main CPUs are possible down the road. Another board used for testing (the Samsung Orion/Exynos) has a Mali graphics chip, which was reverse engineered by his employer, Codethink, and can be used to do a variety of image processing tasks. The main board runs Debian 6.0 ("Squeeze").

For imaging, the plane uses a CCD with a 170° fisheye lens that provides five separate images. Those feed into the Open Source Computer Vision (OpenCV) package on the Raspberry Pi for doing visual recognition of smoke, people, and roads. There are also plans to do "structure from motion" (SfM) processing to detect landscape changes (e.g. flooding, height changes) but that is fairly processor intensive and will likely require processing after the plane returns from its mission.
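OpenCV ships a pre-trained pedestrian detector (HOG features plus a linear SVM) that gives a feel for the kind of on-board recognition described above. The following is a minimal sketch, not OpenRelief's actual pipeline; the camera index and output handling are assumptions.

    import cv2

    # Pre-trained HOG + linear SVM pedestrian detector shipped with OpenCV.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(0)          # assumed camera index
    ok, frame = cap.read()
    if ok:
        # Returns bounding boxes and confidence weights for likely people.
        rects, weights = hog.detectMultiScale(frame, winStride=(8, 8),
                                              padding=(16, 16), scale=1.05)
        for (x, y, w, h) in rects:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("detections.jpg", frame)
    cap.release()

Smoke and road recognition would need their own classifiers, but the structure — grab a frame, run a detector, record the hits — stays the same.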

Lattimer also described a low-cost, ground-based radiation sensor that can be dropped or placed in areas of interest to relay its readings to the plane. He built the sensor inside of a treacle tin (for those lacking the UK context, it is a can that held a syrup not entirely unlike molasses). The sensor employs an ionization chamber that measures relative ionizing radiation, rather than absolute radiation levels like a Geiger counter would do. It uses a Nanode controller, which is an open hardware device with a long-range, low-power radio to communicate with the plane. Other types of sensors (for chemicals or weather conditions) could also be built using the Nanode.
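The radio protocol between the ground sensors and the plane wasn't described in the talk, but conceptually the plane-side job is just to read packets from the radio and timestamp them for later geotagging. A sketch of what that receive loop might look like, assuming a serially attached radio module and a simple "sensor-id,reading" line format (both assumptions, not the project's actual protocol):

    import time
    import serial   # pyserial

    # Assumed: the long-range radio shows up as a serial device and each
    # sensor packet arrives as a line like "radiation-03,742".
    radio = serial.Serial("/dev/ttyAMA0", 9600, timeout=1)

    def read_packets():
        while True:
            line = radio.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                sensor_id, reading = line.split(",", 1)
            except ValueError:
                continue            # ignore malformed packets
            yield {"sensor": sensor_id,
                   "reading": float(reading),
                   "time": time.time()}

    for packet in read_packets():
        print(packet)               # in practice: log and geotag for mission control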

Mission control

There is a need to tie the robot and sensors together, Coughlan said, and that is where "mission control" comes into play. Most of the technology to do that already exists, so OpenRelief has just integrated those pieces onto a laptop that runs Debian, Ubuntu, or Windows. For example, the autopilot has a sophisticated mission planning application that is written for .NET, but can be run with Mono on Linux.

The output from the mission control system is formatted to be compatible with existing disaster relief packages (e.g. Sahana Eden). The information gathered can be processed into geographic information system (GIS) formats that can be directly integrated into the maps used by those applications. Rather than trying to reinvent the disaster relief application wheel, OpenRelief is adding new capabilities to the existing systems, Coughlan said.
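Formats like GeoJSON make that kind of hand-off straightforward: each observation becomes a point feature that GIS-aware tools can import. A minimal sketch, with made-up observation data, of how gathered readings could be serialized (the field names are illustrative, not OpenRelief's schema):

    import json

    # Hypothetical observations gathered during a flight.
    observations = [
        {"lat": 35.4501, "lon": 139.6442, "kind": "smoke", "confidence": 0.8},
        {"lat": 35.4533, "lon": 139.6490, "kind": "radiation", "reading": 742},
    ]

    features = []
    for obs in observations:
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [obs["lon"], obs["lat"]]},
            # Everything except the coordinates goes into the properties bag.
            "properties": {k: v for k, v in obs.items()
                           if k not in ("lat", "lon")},
        })

    with open("mission.geojson", "w") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)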

That is one of the keys to the OpenRelief plan: integrating mostly available open hardware and software into the existing systems so that the drones can be put to work in the near future. The system uses OpenStreetMap data for its maps and can even contribute updates to that map repository for things that have changed. "Working alongside existing processes is critical", Coughlan said. During last year's discussions there was talk of redoing much of the disaster relief infrastructure, but that is not the route that the project took; "we just want to fit in with what's there".

The project started, at least conceptually, at the disaster relief panel in 2011. After thinking about airframes, autopilots, people recognition, and the like for a bit, development started in January. The team will be testing and refining the prototype with the hope of being production-ready in December.

The equipment is relatively inexpensive, with a retail bill of materials (BoM) for the prototype at around $750. Getting it into any kind of manufacturing process will make it "ridiculously cheap", he said. The target for the final BoM is $1000 which may include a more powerful main CPU (perhaps Tegra-based) and additional capabilities.

Help wanted

The team is around 25 people currently, consisting of a variety of specialties including engineers, makers, political scientists, mathematicians, and more. The team started out with "crazy ideas", but "those ideas turn out to not be crazy at all", Coughlan said. There is still lots of work to be done, but "it is doable" and the project is looking for help to get it done. The project is hoping to find people to donate time to develop, test, and improve the hardware, but it is looking beyond that as well.

OpenRelief is "kicking off with an ecosystem", he said. Coughlan's company Opendawn along with Codethink and Nanode have all made donations of hardware and time. But there are also a lot of individuals involved. "We want you to join our ecosystem". The project is looking for "your brains, not necessarily your money" (though he admitted it wouldn't turn down monetary contributions).

The ecosystem needs technologists as well as professional and volunteer relief workers to help refine and test the platform, and to help recognize the problems that need to be solved. In addition, there is a need for commercial enterprises to "make buckets of money" by building and selling the drones. The project's focus is to help save lives, but the platform could easily be repurposed for other uses, including for farmers or local governments. While it won't be useful "for anything naughty", Coughlan said, because it is a very visible and slow plane that won't be stealthy, there is a need for this kind of technology for various uses all over the world.

He invited everyone to check out the web site (which has been translated from English into several Asian languages), mailing lists, and various social media (Facebook, Twitter, and Pinterest) for more information. The slides [PDF] from the talk also give more technical details for those interested. Much of the code (with more coming) is on Gitorious and schematics are available at solderpad.com.

So far, the plane has only been flown in radio-controlled mode, at least partly because of regulations in Japan. Lattimer hopes to test with the autopilot sometime in the next month in a free-fly zone in the UK. Regulations on autonomous aircraft vary, but will be a challenge for some time, Coughlan said. He is hopeful that the disaster relief use case, as well as the very limited threat posed by a 3kg aircraft, will help change those regulations, though it will take time.

OpenRelief is an interesting project that combines a certain "geek appeal" with a useful function for reconnaissance in a disaster area. One can certainly imagine low-cost drone aircraft being employed in a variety of situations, but one that can potentially save lives by getting aid where it is needed most is more interesting still. By supplementing existing disaster relief systems—rather than trying to supplant them—OpenRelief's drones could be a nice addition to the disaster relief toolkit. One gets the sense that the drone is just the start, however, so it will be interesting to see what else the project comes up with down the road.

[ The author would like to thank the Linux Foundation for assistance with his travel to Yokohama. ]



LinuxCon Japan: OpenRelief launches

Posted Jun 14, 2012 6:26 UTC (Thu) by nhippi (subscriber, #34640) [Link]

> The Raspberry Pi has a Mali graphics chip, which was reverse engineered by his employer, Codethink

This is incorrect. rPI has Broadcom VideoCore IV GPU, which hasn't been reverse engineered (at least yet).

LinuxCon Japan: OpenRelief launches

Posted Jun 14, 2012 7:26 UTC (Thu) by jezuch (subscriber, #52988) [Link]

I wonder... Japan is famous for its robotics industry. Aren't there any Japanese projects along these lines? Or are they strictly focused on fancy human-like robots?

LinuxCon Japan: OpenRelief launches

Posted Jun 15, 2012 1:27 UTC (Fri) by euske (subscriber, #9300) [Link]

Contrary to popular belief, Japan is very conservative about using this kind of technology for disaster recovery. People tend to just avoid using these in a serious situation. It's partly because of the inefficiency of government red tape, but there's also a subtle distrust of machines among people (which might be surprising to you, in the country of automated toilets and creepy humanoids). I guess there are similar projects at some research organizations, but probably none of them have been put into practice. In last year's disaster, we had to import inspection robots for the stricken reactors, because none of those technologies were available at the time (and I doubt they are even now). Also, we had a fancy prediction system for the spread of radioactive materials, which most of the public didn't even know existed until months later.

Not stealthy?

Posted Jun 14, 2012 9:27 UTC (Thu) by NAR (subscriber, #1313) [Link]

I can't help but wonder how "not stealthy" this plane could be, with its 1.5 meter wingspan, if it's painted sky blue and flies 200 meters above ground...

Not stealthy?

Posted Jun 14, 2012 10:42 UTC (Thu) by armijn (subscriber, #3653) [Link]

That's the reason why they are painting it red. After his talk Shane turned on the engine and I can tell you it is too noisy to be stealthy ;-)

Not stealthy?

Posted Jun 18, 2012 21:36 UTC (Mon) by aray (subscriber, #85004) [Link]

you'd be surprised how quiet these things are 800ft up

Copyright © 2012, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds