Tizen is intended to serve as a base Linux distribution for
consumer electronics products, from phones to automobile dash systems,
all built and sold by different manufacturers—yet offering a
consistent set of application-level APIs. Consequently, one of
the problems the project clearly needs to address is assessing the
compliance of products marketed as Tizen offerings. At the 2013 Tizen
Developer Conference in San Francisco, several sessions examined
the compliance program and the testing process involved.
First, Intel's Bob Spencer addressed the goals and
scope of the compliance program. Perhaps the biggest distinction he
made was that only hardware products would be subject to any
compliance testing; applications will not need to be submitted for
tests. Applications will still need to conform to packaging
and security guidelines, but, as he put it, "acceptance into the Tizen
app store means success" on that front. The project is working on a
tool to flag build and packaging errors for app developers, but it is
not ready for release.
As for hardware, Spencer said, companies developing a Tizen-based
product will not need to send in physical devices. Instead, they will
need to send the results of the compliance test suite to the Tizen
Association, along with a "branding request" for approval of the use
of the Tizen trademark.
In broad strokes, he said, the compliance model consists of a set
of specifications and the test suite that checks a device against
them. The specification distinguishes between requirements (i.e.,
"MUST" statements) and recommendations ("SHOULD" statements). There
is a common core specification that applies to all Tizen devices, plus
separate "profile" specifications that address particular device
categories. Each includes a combination of hardware and system software
features. To earn the compliance seal of approval, a device must
pass both the common specification and at least one device-class
profile.
This provision might allow a device to qualify both as a
tablet (which falls under the "mobile" profile) and
as a "convertible" (which, sadly, is not part of the automotive profile,
but rather the "clamshell" profile also used for laptops).
Interestingly enough, devices will not be allowed to qualify for Tizen
compliance by meeting only the common core specification. At present,
the common core specification and the "mobile" profile have been published
for the Tizen 2.1 release as a "public draft." Spencer said that the
final 2.1 specification is expected by the end of June.
The draft
[PDF] does not separate out the common core and mobile profile
requirements into distinct sections, which the site says will be done
only once there are multiple device profiles published. On the
compliance discussion list, Mats Wichmann said
that this was due to the need to get a mobile profile specification out
the door.
Spencer provided an overview of the specification in his session.
He described the hardware requirements as being designed for
flexibility, supporting low-end feature phones up to high-end
smartphones, tablets from simple e-readers on up, and perhaps even
watches (which, incidentally, marked the first mention of Tizen-powered
watches I have encountered). The list includes at least 512MB of RAM,
1GB of storage, at least one audio output, some form of Internet
connectivity (which SHOULD be wireless), display resolution of at least
480×320, USB 2.0, and touch-screen support (which MUST support
single-touch, but SHOULD support multi-touch).
There is considerably more flexibility regarding the vast
assortment of sensors and radios found in phones today; the
specification indicates that things like GPS, Near-Field
Communications (NFC), and accelerometers are all optional, but that if
a device provides any of them, it must implement the associated APIs.
At the moment, the draft requires supporting both Tizen's HTML5
APIs and its native APIs; Spencer said there were internal discussions
underway as to whether there should be separate "web-only" and
"web-plus-native" profile options. In addition to the application
APIs, the software side of the specification requires that devices use
either the ARM or x86 architecture, defines application packaging and
management behavior, lists the required multimedia codecs, specifies
SDK and development-tool compliance, and mandates implementation of the
"AppControl" application-control interface (which defines a set of
basic cross-application operations, such as opening and displaying a
file).
The requirements are a bit more stringent in one area: web
runtimes. A device must provide both a standard web browser and a web
runtime for HTML5 applications. In addition, both must be built from
the official Tizen reference implementations (which are based on
WebKit), and must not alter the exposed behavior implemented
upstream. The browser and web runtime must also report a specific
user agent string matching the reference platform and version
information.
Testing, testing, testing
Immediately after Spencer finished his overview of the compliance
specification, Samsung's Hojun Jaygarl and Intel's Cathy Shen spoke
about the Tizen Compliance Test (TCT) used to assess devices. TCT is
designed to verify that the version of Tizen running on a product
conforms to the specifications, they said, although the project
expects the Tizen reference code to be ported to each device rather
than implemented from scratch. Consequently, the TCT tests focus on
the features that ensure a consistent
application development environment and a consistent customer
experience, but allow manufacturers to differentiate the user
experience (UX).
The TCT battery of tests includes both automated and manual tests,
they explained. The manual tests cover features that require either
interoperation with other devices (such as pairing with another device
over WiFi Direct) or human interaction (such as responding to button
presses). The automated tests are unit tests addressing the mandatory
hardware and software requirements of the specification, and
compliance with any of the optional features the vendor chooses to
implement.
TCT splits the native API and Web APIs into separate categories
(although, again, both are currently required for any device to
pass). The native TCT involves a native app called FtApp that
executes individual tests on the device in question. The tests
themselves are built on the GTest framework
developed by Google. Tests are loaded into FtApp from a PC connected
to the device via the Smart
Development Bridge (SDB) tool in the Tizen SDK. There is also a
GUI tool for the host PC to monitor test progress and generate the
reports necessary for submission. The "native" tests cover the native
application APIs, plus application control, conformance to security
privileges, and the hardware features.
The web TCT can use the GUI tool to oversee the process, but
there is a command line utility as well. This test suite involves
loading an embedded web server onto the device, since it tests the
compliance of the device's web runtime with the various Web APIs
(including those coming from the W3C and the supplementary APIs
defined by Tizen). It also tests the device's web runtime for
compliance with package management, security, and privacy
requirements, and can run tests on the device's hardware capabilities.
These are not always completely automated; for example, a human may be
needed to verify that the screen rotates correctly when the device is
turned sideways. Finally, there is a tool called TCT-behavior that tests
interactive UI elements; it, too, requires a person to operate the
device.
The web TCT currently covers more than 10,000 individual
tests, while the native TCT incorporates more than 13,000. Shen and
Jaygarl said the automated tests take three to four hours to complete,
depending on the device. The manual tests add about one more hour.
Reports generated by the test manager are fairly simple; they list the
pass/fail result for each test case, the elapsed time, the completion
ratio (if applicable), and a link to a more detailed log for each case.
The test management tool is an Eclipse plugin, designed for use with
the Tizen SDK.
During the Q&A at the end of the session, the all-important
question of source code availability was raised by the audience. Shen
and Jaygarl said that they expected to release the TCT test tools by
the end of June. Currently, they are still working on optimizing the
manual test cases—although it also probably goes without saying
that the TCT can hardly be expected before the final release of the
specification it is intended to test.
With more than 23,000 test cases, compliance with Tizen 2.1 will
hardly be a rubber-stamp exercise, though requiring vendors to port the
reference code ought to take much of the guesswork out of the
process. Jaygarl and Shen also commented that developers will be able
to write their own test cases in GTest format and run them using the
official TCT tools, so when the toolset arrives it may offer something
to application developers as well as system vendors.
Compliance with a specification is not necessarily of interest to
everyone building an embedded Linux system, nor even to everyone
building a system based on Tizen. The program is designed to meet the
needs of hardware manufacturers, after all, who already have other
regulatory and development tests built into their product cycle. It
will be interesting to see how the Tizen Compliance Program evolves to
handle the non-mobile-device profiles in the future, but even if that
takes a while, it could be amusing to run the tests against the first
batches of commercially available Tizen phones, which are reported to
arrive this year.
[The author wishes to thank the Linux Foundation for travel assistance to Tizen Dev Con.]