By Jake Edge
February 20, 2013
The Linux Foundation's Rudolf Streif introduced one of the morning keynotes
at the 2013 Android
Builders Summit (ABS) by noting that
androids in space have a long history—at least in science fiction like
Star Wars. He was introducing Dr. Mark Micire of the US National Aeronautics and Space Administration
(NASA) Ames Research Center, who recently led a project that put the Android
operating system into space in the form of an "intelligent space robot"
that currently inhabits the International Space Station (ISS). Micire
brought the tale of how that came about to the first day of ABS on February
18 in San Francisco.
He started off by expressing amazement at what the community has done with
Android that takes it far beyond its mobile phone roots. He has several
different versions of his talk, but when he looked at the talk descriptions
for ABS, he quickly realized that the "geeky version" would be right for
the audience. A video provided the high-level view of the project,
ranging from the liftoff of the last space shuttle, which carried the first version of the robots to the ISS, to a description of
using a Nexus S smartphone to communicate with and control the robot. The idea is to have
a robot
available to both the crew and the ground-based operations staff to
take over some of the menial tasks that astronauts currently have to perform.
Some history
The spherical light saber trainer seen on the Millennium Falcon in Star
Wars was the
inspiration for several different space robot projects over the years,
Micire said. That includes the "personal satellite assistant" (PSA) which
was developed by the NASA Ames Research Center. It had a display
screen, two-way audio, a camera, and useful tools like a flashlight in a
roughly spherical package. Similarly, the Johnson Space Center created the AERCam, which could fly around the shuttle in space to take photographs and video of the spacecraft. The AERCam actually flew on the shuttle in 1997, but both projects were eventually canceled.
Micire's project evolved from a senior project at MIT, which
created roughly spherical satellite simulators to be used for experimenting with
synchronized satellite maneuvers. The algorithms for those kinds of maneuvers still need to be developed, but testing new algorithms with actual satellites is expensive. The MIT SPHERES (Synchronized Position Hold, Engage, Reorient
Experimental Satellites) project used volleyball-sized robots that could be
flown inside the ISS to test these algorithms.
The SPHERES robots have a tank of carbon dioxide to use as propellant, much
like a paintball gun. In fact, when they need refilling, Micire has sometimes
taken them to Sports Authority (a US sporting goods store) to the
puzzlement of the clerks there. The CO2 is routed to thrusters
that can move the robot in three dimensions.
Each SPHERES robot is run by a Texas Instruments DSP that is "a decade old at this point". There is a battery pack to run the CPU and some
ultrasonic receivers that are used for calculating position. That pack uses
standard AA batteries, he said, because lithium-ion and other battery types can
explode in worst-case scenarios, which makes it difficult to get them
aboard a spacecraft. It is easy to "fly AA batteries", though, so lots of
things on the ISS run using them.
Since the cost of getting mass to low earth orbit is high, he said that he
doesn't even want to contemplate the
amount being
spent on resupplying AA batteries to the ISS.
The robot also has an infrared transmitter that sends a pulse to the controller of the ultrasonic beacons installed in an experimental lab area of the ISS. The controller responds to the IR pulse by sending several ultrasonic pulses at a known rate. The receivers on the SPHERES pick those signals up; using the known locations of the transmitters and the speed of sound, the robot can then triangulate its position within the experimental zone, which is a cubical area six feet on a side. Micire
showed video of the SPHERES in action on the ISS. He played the video at
4-6x normal speed so that the movement wasn't glacial; NASA safety
engineers prefer not to have high-speed
maneuvering via CO2 jets inside spacecraft.
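To make that scheme concrete, here is a rough sketch, in Java, of the kind of calculation involved: time-of-flight measurements from beacons at known positions are converted to distances via the speed of sound, and the resulting sphere equations are linearized and solved for position. The beacon coordinates and timings are invented for illustration; the talk did not detail the flight software's actual math.

    // Multilateration sketch: estimate a 3D position from time-of-flight
    // measurements to four ultrasonic beacons at known positions.
    // Coordinates and timings are illustrative, not the actual ISS layout.
    public class Multilateration {
        static final double SPEED_OF_SOUND = 343.0; // m/s in cabin air, roughly

        // Subtracting the sphere equation |x - p[0]|^2 = d[0]^2 from each of
        // the other three turns the problem into a 3x3 linear system.
        static double[] locate(double[][] p, double[] d) {
            double[][] a = new double[3][3];
            double[] b = new double[3];
            for (int i = 1; i <= 3; i++) {
                for (int j = 0; j < 3; j++)
                    a[i - 1][j] = 2.0 * (p[i][j] - p[0][j]);
                b[i - 1] = d[0] * d[0] - d[i] * d[i]
                         + dot(p[i], p[i]) - dot(p[0], p[0]);
            }
            return solve3x3(a, b);
        }

        static double dot(double[] u, double[] v) {
            return u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
        }

        // Gaussian elimination with partial pivoting; enough for 3x3.
        static double[] solve3x3(double[][] a, double[] b) {
            for (int col = 0; col < 3; col++) {
                int pivot = col;
                for (int r = col + 1; r < 3; r++)
                    if (Math.abs(a[r][col]) > Math.abs(a[pivot][col])) pivot = r;
                double[] tmpRow = a[col]; a[col] = a[pivot]; a[pivot] = tmpRow;
                double tmp = b[col]; b[col] = b[pivot]; b[pivot] = tmp;
                for (int r = col + 1; r < 3; r++) {
                    double f = a[r][col] / a[col][col];
                    for (int c = col; c < 3; c++) a[r][c] -= f * a[col][c];
                    b[r] -= f * b[col];
                }
            }
            double[] x = new double[3];
            for (int r = 2; r >= 0; r--) {
                double s = b[r];
                for (int c = r + 1; c < 3; c++) s -= a[r][c] * x[c];
                x[r] = s / a[r][r];
            }
            return x;
        }

        public static void main(String[] args) {
            // Four beacons at corners of a six-foot (~1.83 m) cube (made up).
            double[][] beacons = {
                {0, 0, 0}, {1.83, 0, 0}, {0, 1.83, 0}, {0, 0, 1.83}
            };
            // Times-of-flight chosen so the answer comes out near (0.9, 0.6, 1.2).
            double[] tof = {0.00471, 0.00476, 0.00566, 0.00365}; // seconds
            double[] dist = new double[4];
            for (int i = 0; i < 4; i++) dist[i] = tof[i] * SPEED_OF_SOUND;
            double[] pos = locate(beacons, dist);
            System.out.printf("estimated position: (%.2f, %.2f, %.2f) m%n",
                    pos[0], pos[1], pos[2]);
        }
    }

With exactly four beacons the system is fully determined; more transmitters would turn locate() into an overdetermined least-squares problem, which is presumably closer to what the real system solves.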
The NASA Human Exploration and Telerobotics (HET) project that Micire runs
wanted to create robots that could handle a number of different tasks in
space that are currently done by astronauts. The idea is to provide both
the crew on the station and the team on the ground with a useful tool.
Right now, if there is an indicator light on a particular panel in the
station and the ground crew wants to know its state, they have to ask a
crew member to go look. But a robot could be flown over to the panel and
relay video back to the ground, for example.
The HET team was faced with the classic decision of either rolling its own
controller for the Smart SPHERES or buying something "commercial off the
shelf" (COTS). The team didn't have a strong opinion about which choice
was better,
but sat down to list their requirements. Those requirements included
sensors like a gyroscope, camera, accelerometer, and so on, in a package with a
reasonably powerful CPU and a fair amount of memory and storage. While
Micire was
worriedly
thinking "where are we going to find such a device?", he and the team were
all checking their email on their smartphones. It suddenly became obvious
where to find the device needed, he said with a chuckle. Even NASA can't
outrun the pace of the mobile phone industry in terms of miniaturization and
power consumption, he said.
Flight barriers
There are a lot of barriers to getting
a device "space rated" so that it can fly on the ISS (or other
spacecraft). The engineers at NASA are concerned about safety requirements; anything that could potentially "deorbit the station" is of particular concern. HET wanted to go from a concept to flight in
roughly a year; "that's insane", Micire said, as it normally requires 2-3 years
from concept to flight because of safety and other requirements.
But using a mobile phone helped speed the process. Right about the time
a platform was needed, he heard about the Nexus S ("bless the internet!")
being released. It had just what was needed, so he and a colleague "camped out"
in line at the Mountain View Best Buy to get numbers 11 and 12 of the 13
that were delivered to that store.
The first thing they did to these
popular and hard-to-get new phones was to tear them apart to remove the
ability to transmit in the cellular bands. For flight safety, there must
be a hardware mechanism that turns off the ability to transmit. Removing the
driver from the kernel was not sufficient for the safety engineers, so a
hardware solution was needed. They decided to remove the transmit chip from
the board, but it was
a ball-grid-array (BGA) part, so they heated one of the boards to try to do
so. The first attempt resulted in an "epic fail" that ruined the phone,
but the attempt on the second board was successful. Now, pulling that chip
is the first
thing done to new phones to get around that "airplane mode problem".
The next problem they faced was the batteries. As he mentioned earlier,
lithium-ion is problematic for space; it takes two years to get those kinds
of batteries certified. Instead they used a "space certified" AA battery
holder, adding a diode to fool the battery controller on the
phone. Micire said that he did a bit of "redneck engineering" to test the
performance of the AA batteries over time: he taped the phone to his laptop
and pointed its camera at voltage
and current meters hooked up to the battery pack. The phone ran a
time-lapse photo application, and he
transcribed the data from that video into a spreadsheet. He found that the
phone will
run well for seven hours using six AA batteries.
In the micro-gravity environment of the ISS, broken glass is a serious
problem. It can "become an inhalant", for example. Something had to be
done about the display glass so that breaking it would not result in glass
fragments. Micire thought he had the perfect solution by putting acrylic
tape over the display, but it turns out that tape is flammable, so it was
deemed unsuitable. In the
end, Teflon tape fit the bill. He showed some graphic photographic
evidence of what was done to a phone "in the interests of science" to prove
to NASA safety engineers that a broken screen would not cause a hazard.
The phone interfaces to the SPHERES over a USB serial connection because
the TI DSP
doesn't support anything else. The phone and battery holder are then
essentially taped to the side of the robot.
The team had "no time for software", Micire said, but "Cellbots saved our lunch" with a data
logging app for Android. In order to test the Nexus S sensors in space,
they needed a way to log the sensor data while the Smart SPHERES were
operating. It turns out that asking Samsung what its accelerometer does in
micro-gravity is not very fruitful ("we don't know, you're from NASA").
Sampling every sensor at high frequency and recording the data would allow
them to figure out which sensors worked and which didn't.
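The Cellbots code itself wasn't shown in the talk; what follows is only a minimal sketch, assuming a stock Android activity, of the basic pattern such a logger uses: register a listener on every available sensor at the fastest rate and append timestamped readings to a file. The file name and CSV format are made up for the example.

    // Minimal sensor-logging sketch (not the actual Cellbots app): subscribe
    // to every sensor at the fastest rate and append timestamped readings to
    // a CSV file for later analysis. File name and format are invented.
    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import java.io.FileWriter;
    import java.io.IOException;

    public class SensorLoggerActivity extends Activity
            implements SensorEventListener {
        private SensorManager sensorManager;
        private FileWriter log;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            try {
                log = new FileWriter(getExternalFilesDir(null) + "/sensors.csv");
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
            // Subscribe to every sensor the device reports.
            for (Sensor s : sensorManager.getSensorList(Sensor.TYPE_ALL)) {
                sensorManager.registerListener(this, s,
                        SensorManager.SENSOR_DELAY_FASTEST);
            }
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // event.timestamp is in nanoseconds; values[] length varies by type.
            StringBuilder line = new StringBuilder();
            line.append(event.timestamp).append(',')
                .append(event.sensor.getName());
            for (float v : event.values) line.append(',').append(v);
            line.append('\n');
            try {
                log.write(line.toString());
            } catch (IOException e) {
                // A dropped sample is acceptable for a diagnostic logger.
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        @Override
        protected void onDestroy() {
            sensorManager.unregisterListener(this);
            try { log.close(); } catch (IOException ignored) { }
            super.onDestroy();
        }
    }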
For any part that is used in an aircraft or spacecraft, a "certificate of
conformance" is required. That certificate comes from the supplier and
asserts that the part complies with the requirements. It's fairly easy to
get that from most suppliers, Micire said, but Best Buy is not in that
habit. In a bit of "social hacking", they showed up at the store five
minutes before closing time, cornered a very busy manager, and asked for a signature on a piece of paper that said "a Nexus S is a Nexus S". After a puzzled look, and with another store employee clamoring for attention, the manager simply signed the certificate.
It turns out that all of the computers on the ISS run Windows XP SP3,
which means there is no driver to talk to the Nexus S. Since it would take 2-3
years to get a driver certified to be installed on those machines, another
solution had to be found. They ended up writing an app that would kick the
phone's USB into mass storage mode prior to the cable being plugged into the
computer. Because Windows XP has a driver for a USB mass storage device,
it could be used to communicate with the Nexus S.
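The talk didn't describe the protocol layered on top, but the appeal of the trick is that, once the phone enumerates as a mass storage device, the XP side needs nothing beyond ordinary file I/O. A hypothetical host-side sketch, with the drive letter and file name as pure assumptions:

    // Host-side sketch for the XP machine: poll a file on the phone's mass
    // storage volume and print new contents. The path is an assumption for
    // illustration; the real ground software is not described in the talk.
    import java.io.BufferedReader;
    import java.io.File;
    import java.io.FileReader;
    import java.io.IOException;

    public class MassStoragePoller {
        public static void main(String[] args) throws InterruptedException {
            File dataFile = new File("E:\\spheres\\sensors.csv"); // assumed
            long lastLength = 0;
            while (true) {
                if (dataFile.exists() && dataFile.length() != lastLength) {
                    lastLength = dataFile.length();
                    // Reread from the start; a real tool would remember its
                    // offset and read only the newly appended lines.
                    try (BufferedReader r =
                            new BufferedReader(new FileReader(dataFile))) {
                        String line;
                        while ((line = r.readLine()) != null)
                            System.out.println(line);
                    } catch (IOException e) {
                        // The volume can vanish if the phone reclaims it.
                    }
                }
                Thread.sleep(1000); // poll once per second
            }
        }
    }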
Testing
The first test units were launched on the final shuttle mission, and Micire
showed
video of the Smart SPHERES in action on the ISS. The light level was
rather low in the video because the fluorescent lights were turned down to
reduce jamming of the ultrasonic beacons. That was actually useful as it proved that
the camera produced reasonable data even in low-light situations. The
sensors on the phone (gyroscope, magnetometer, ...) worked well, as shown
in his graphs. The gravity sensor showed near-zero gravity, which must mean that it was broken, he joked; that is, of course, the proper reading in a micro-gravity environment.
There are "lots of tubes" between the ISS and ground-based networks, so the
latency can be rather large. They were still able to do video transmission in
real time from the Smart SPHERES to the ground during the initial tests,
which was a bit of a surprise. After that test, the mission director
pulled the team aside; at first Micire was a little worried they were in
trouble, but it turned out that the director wanted to suggest adding Skype
so he could have a "free-flying robot that I can chase astronauts with".
In December 2012, another experiment was run. Once again, sped-up video
was shown of the robot navigating to a control panel to send video of its
state to controllers on the ground. Those controllers can do minor
adjustments to the orientation of the robot (and its camera) by panning from
side to side. There is no ability to navigate the robot in real time from
the ground due to latency and potential loss-of-signal issues.
Other experiments are planned for this year and next, including having the
robot handle filming an interview with one of the astronauts. Currently
when a class of schoolchildren or other group has the opportunity to
interview the crew in space, two astronauts are required: one for the
interview and one to hold the camera. Since the Nexus S gives them "face
recognition for free", the robot could keep the camera focused on the crew
member being interviewed, which would free up the other crew member.
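The "for free" part presumably refers to the face detection that the Android framework has shipped since its early releases. A sketch, assuming camera frames arrive as bitmaps, of using android.media.FaceDetector to measure how far off-center the interviewee is; the pan-command plumbing around it is hypothetical:

    // Face-tracking sketch using the framework's android.media.FaceDetector
    // (present on the Nexus S). How the offset drives the robot is made up.
    import android.graphics.Bitmap;
    import android.graphics.PointF;
    import android.media.FaceDetector;

    public class FaceTracker {
        // Returns the horizontal offset (pixels) of the detected face from
        // the frame center, or 0 if none is found. FaceDetector requires an
        // RGB_565 bitmap whose width is even.
        public static float horizontalOffset(Bitmap frame) {
            FaceDetector detector = new FaceDetector(
                    frame.getWidth(), frame.getHeight(), 1);
            FaceDetector.Face[] faces = new FaceDetector.Face[1];
            if (detector.findFaces(frame, faces) == 0) return 0f;
            PointF mid = new PointF();
            faces[0].getMidPoint(mid);
            // Positive means the face is right of center, so the robot
            // would yaw right by a proportional amount.
            return mid.x - frame.getWidth() / 2f;
        }
    }

Feeding that offset into the robot's attitude control would keep the subject framed without tying up a second astronaut behind the camera.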
Micire's talk was an excellent example of what can happen when a device
maker doesn't lock down its device. It seems likely that no one at
Google or Samsung considered the possibility of the Nexus S being used to
control space robots when they built that phone. But because they didn't
lock it down, someone else did consider it—and then went out and actually
made it happen.
[ Thanks to the Linux Foundation for assisting with travel costs to San Francisco for ABS. ]