
Impressions from the 12th Realtime Linux Workshop in Nairobi

November 19, 2010

This article was contributed by Thomas Gleixner

A rather small crowd of researchers, kernel developers, and industry experts found its way to the 12th Realtime Linux Workshop (RTLWS), hosted at Strathmore University in Nairobi, Kenya. The small showing was not a big surprise, but it also did not make the workshop any less interesting.

After eleven workshops in Europe (Vienna, Milano, Valencia, Lille, Linz, Dresden), America (Orlando, Boston, Guadalajara), and Asia (Singapore, Lanzhou), the organization committee of the Realtime Linux Workshop decided that it was time to go to Africa. The main reason was the numerous authors who had submitted papers in previous years but were unable to attend the workshop due to visa problems. Others simply could not attend such events due to financial constraints. So, in order to give these interested folks the opportunity to attend and to push the African FLOSS community, and of course especially the FLOSS realtime community, Nairobi was chosen to be the first African city to host the Realtime Linux Workshop.

Kenya falls into the category of countries which seem to be completely disorganized, but very effective on the spontaneous side at the same time. As a realtime person you need to deal with very relaxed deadlines, gratuitous resource reservations and less-than-strict overall constraints, but it's always a good experience for folks from the milestone- and roadmap-driven hemisphere to be reminded that life actually goes on very well if you sit back, relax, take your time and just wait to see how things unfold.

Some of the workshop organizers arrived a few days before the conference and had adjusted enough to the local way of life that they were not taken by surprise when many of the people registered for the conference did not show up while, at the same time, unregistered attendees filled in.

Day 1

The opening session, scheduled for 9 AM on Monday, started on time at 9:40, which met the already-adjusted deadline constraints perfectly well. Dr. Joseph Sevilla and deputy vice-chancellor Dr. Izael Pereira from Strathmore University and Nicholas McGuire from OSADL's Realtime Linux working group welcomed the participants. Peter Okech, the leader of the Nairobi organization team, introduced the logistics.

Without further ado, Paul McKenney introduced us to the question of whether realtime applications require multicore systems. In Paul's unmistakable way he led us through a maze of questions; only the expected quiz was missing. According to Paul, realtime systems face the same challenges as any other parallel programming problem. Parallelizing a given computation does not necessarily guarantee that things will go faster. Depending on the size of the work set, the way you split up the data set, and the overhead caused by synchronization and interprocess communication, this might actually leave you very frustrated, as the outcome can be significantly slower than the original, serialized approach. Paul gave the unsurprising advice that you definitely should avoid the pain and suffering of parallelizing your application if your existing serialized approach does the job already.

If you are in the unlucky position that you need to speed up your computation by parallelization, you have to be prepared to analyze the ways to split up your data set, choose one of those ways, split up your code accordingly, and figure out what happens. Your mileage may vary and you might have to lather, rinse and repeat more than once.

So that leaves you on your own, but at least there is one aspect of the problem which can be quantified. The required speedup and the number of cores available allow you to calculate the ratio between the work to be done and the communications overhead. A basic result is that you need at least N+1 cores to achieve a speedup of N, but as the number of cores increases, the ratio of communications overhead to work goes up nonlinearly, which means you have less time for work due to synchronization and communication. Larger jobs are more suitable than small ones, but, even then, it depends on the type of computation and on the ability to split up the data set in the first place. Parallelization, both within and outside of the realtime space, still seems to be an unlimited source of unsolved problems and headaches.
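
As an illustration of that kind of arithmetic, here is a small sketch based on a deliberately simplified model (an equal split of the work plus a fixed per-core synchronization cost; the model is an assumption for illustration, not taken from Paul's slides). It shows why any nonzero overhead keeps the achievable speedup strictly below the number of cores:

/*
 * Simplified model: each of N cores gets an equal share of the serial
 * work W plus a fixed synchronization/communication cost C.  The
 * achievable speedup is then S = W / (W/N + C) = N / (1 + N*C/W), so a
 * speedup of S always needs more than S cores once C is nonzero.  In
 * reality the per-core cost itself tends to grow with the core count,
 * which makes things worse than this table suggests.
 *
 * Build: gcc -o speedup speedup.c
 */
#include <stdio.h>

static double speedup(int cores, double overhead_ratio)
{
	/* overhead_ratio is C/W: per-core sync cost over total serial work */
	return cores / (1.0 + cores * overhead_ratio);
}

int main(void)
{
	const double ratios[] = { 0.0, 0.01, 0.05, 0.10 };
	size_t i;
	int n;

	printf("cores");
	for (i = 0; i < sizeof(ratios) / sizeof(ratios[0]); i++)
		printf("  C/W=%.2f", ratios[i]);
	printf("\n");

	for (n = 1; n <= 64; n *= 2) {
		printf("%5d", n);
		for (i = 0; i < sizeof(ratios) / sizeof(ratios[0]); i++)
			printf("  %8.2f", speedup(n, ratios[i]));
		printf("\n");
	}
	return 0;
}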

Paul left it to me to confuse the audience further with an introduction to the realtime preemption patch. Now, admittedly, the realtime preemption patch is a complex piece of software and not likely to fall into the category of realtime systems whose correctness can be verified with a mathematical proof. Carsten Emde's followup talk looked at the alternative solution of monitoring such systems over a long period of time to reach a high level of confidence in their correctness. There are various methods available in the kernel's tracing infrastructure to monitor wakeup latencies; some of them have low-enough impact to allow long-term monitoring even on production systems. Carsten explained in depth OSADL's efforts with the realtime QA farm. The long-term testing effort in the QA farm has improved the quality of the preempt-rt patches significantly and gives us good insight into their behaviour across different hardware platforms and architectures.
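
For the curious, the heart of such a wakeup-latency measurement can be sketched in a few lines of C. This is a toy in the spirit of cyclictest, not the tooling OSADL actually deploys in the QA farm; it sleeps until an absolute deadline at a realtime priority and records how late the wakeup actually was:

/*
 * Minimal cyclictest-style wakeup-latency probe (illustration only).
 * Build: gcc -o wakeup wakeup.c   (older glibc needs -lrt)
 * Run as root to get the SCHED_FIFO priority.
 */
#include <stdio.h>
#include <time.h>
#include <sched.h>

#define NSEC_PER_SEC	1000000000LL
#define INTERVAL_NS	1000000LL	/* wake up every millisecond */

static long long ts_to_ns(const struct timespec *ts)
{
	return (long long)ts->tv_sec * NSEC_PER_SEC + ts->tv_nsec;
}

int main(void)
{
	struct sched_param sp = { .sched_priority = 80 };
	struct timespec next, now;
	long long max_lat = 0;
	int i;

	/* Realtime priority needs root or CAP_SYS_NICE; fall back quietly. */
	if (sched_setscheduler(0, SCHED_FIFO, &sp))
		perror("sched_setscheduler (continuing unprioritized)");

	clock_gettime(CLOCK_MONOTONIC, &next);
	for (i = 0; i < 10000; i++) {
		next.tv_nsec += INTERVAL_NS;
		while (next.tv_nsec >= NSEC_PER_SEC) {
			next.tv_nsec -= NSEC_PER_SEC;
			next.tv_sec++;
		}
		/* Sleep until the absolute deadline, then see how late we are. */
		clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
		clock_gettime(CLOCK_MONOTONIC, &now);

		long long lat = ts_to_ns(&now) - ts_to_ns(&next);
		if (lat > max_lat)
			max_lat = lat;
	}
	printf("max wakeup latency: %lld ns\n", max_lat);
	return 0;
}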

On the more academic side, the realtime researchers from the ReTiS Lab at the Scuola Superiore Sant'Anna in Pisa, Italy, looked at even more complex systems in their talk titled "Effective Realtime computing on Linux". Their main focus is on non-priority-based scheduling algorithms and their possible applications. One of the interesting aspects they looked at is resource and bandwidth guarantees for virtual machines. This is not really a realtime issue, but the base technology and scheduling theory behind it emerge from the realtime camp and might prove the usefulness of non-priority-based scheduling algorithms beyond the obvious application fields in the realtime computing space.
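
To give a flavor of what such non-priority-based scheduling looks like from user space, here is a sketch of a task requesting a CPU-bandwidth reservation with the SCHED_DEADLINE policy that grew out of the ReTiS work. Be aware that the sched_setattr() interface shown here is the one that was merged into mainline kernels years later; at the time of the workshop the code lived in out-of-tree patches with a different API, so treat this purely as an illustration of the concept:

/*
 * Ask the kernel for 2ms of CPU time every 10ms, then do periodic work
 * within that reservation.  Illustration of the concept only; requires
 * a kernel with SCHED_DEADLINE support.
 *
 * Build: gcc -o dl dl.c
 */
#define _GNU_SOURCE
#include <stdio.h>
#include <stdint.h>
#include <unistd.h>
#include <sys/syscall.h>

/* Defined here because glibc historically shipped no declaration. */
struct sched_attr {
	uint32_t size;
	uint32_t sched_policy;
	uint64_t sched_flags;
	int32_t  sched_nice;
	uint32_t sched_priority;
	/* The three parameters that matter for SCHED_DEADLINE: */
	uint64_t sched_runtime;		/* budget per period, in ns */
	uint64_t sched_deadline;	/* relative deadline, in ns */
	uint64_t sched_period;		/* reservation period, in ns */
};

#ifndef SCHED_DEADLINE
#define SCHED_DEADLINE 6
#endif

int main(void)
{
	struct sched_attr attr = {
		.size		= sizeof(attr),
		.sched_policy	= SCHED_DEADLINE,
		.sched_runtime	=  2 * 1000 * 1000,	/*  2 ms of CPU... */
		.sched_deadline	= 10 * 1000 * 1000,	/* ...every 10 ms  */
		.sched_period	= 10 * 1000 * 1000,
	};

	/* No stable glibc wrapper; invoke the raw syscall on this thread. */
	if (syscall(SYS_sched_setattr, 0, &attr, 0)) {
		perror("sched_setattr");
		return 1;
	}

	/* The periodic realtime work would go here. */
	for (;;)
		pause();
}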

One of the most impressive talks on day one was the presentation of a "Distributed embedded platform" by Arnold Bett from the University of Nairobi. Arnold described an effort driven by physicists and engineers to build an extremely low-cost platform applicable to a broad range of essential needs in Kenya's households and industry. Based on a $1 Z80 microcontroller, configurable and controllable by the simplest PC running Linux, they built appliances for solar electricity, LED-based room lighting, and simple automation tasks in buildings and on shop floors. All tools and technology around the basic control platform are based on open source technology, and both the hardware and the firmware of the platform are going to be available under a non-restrictive license. The hardware platform itself is designed to be manufactured in a very cost-effective way that does not require huge investments from local people.

Day 2

The second day was spent with hands-on seminars about git, tracing, powerlink, rt-preempt and deadline scheduling. All sessions were attended by conference attendees and students from the local universities. In addition to the official RTLWS seminars, Nicholas McGuire gave seminars with the topics "filesystem from scratch", "application software management", "kernel build", and "packaging and customizing Debian" before and after the workshop at the University of Nairobi.

Such hands-on seminars have been held alongside most of the RTLWS workshops. From experience we know that it is often the initial resistance that stops the introduction of technologies. Proprietary solutions are presented as "easy to use", as solving problems without the need to manage the complexity of technology and without investing in the engineering capabilities of the people providing these solutions. This is, and always has been, an illusion or, worse, a way of creating continued dependency. People can only profit from technology when they take control of it in all aspects and when they gain the ability to express their problems and their solutions in terms of these technological capabilities. For this to happen it is not sufficient to know how to use technology; it is necessary that they understand the technology and are able to manage the complexity involved. That includes mastering the task of learning and teaching technology, not "product usage". That is the intention of these hands-on seminars, and, while we have been using GNU/Linux as our vehicle to introduce core technologies, the principles go far beyond that.

Day 3

The last day featured a followup talk by Peter Okech to last year's surprising topic of inherent randomness. It was fun to see new, interesting ways of exploiting the non-deterministic behavior of today's CPUs. Maybe we can get at least a seed generator for the entropy pool out of this work in the not-so-distant future.
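
The sketch below only illustrates the general idea and is not Peter's actual method: time many runs of an identical piece of work and fold the least significant bits of the measured durations into a candidate seed. Whether, and how much, real entropy hides in those bits is exactly the research question:

/*
 * Timing jitter of nominally identical work as a candidate entropy
 * source (illustration only, not a vetted RNG).
 *
 * Build: gcc -o jitter jitter.c   (older glibc needs -lrt)
 */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

static uint64_t now_ns(void)
{
	struct timespec ts;

	clock_gettime(CLOCK_MONOTONIC, &ts);
	return (uint64_t)ts.tv_sec * 1000000000ULL + ts.tv_nsec;
}

int main(void)
{
	volatile uint64_t sink = 0;
	uint64_t seed = 0;
	int i, j;

	for (i = 0; i < 64; i++) {
		uint64_t start = now_ns();

		for (j = 0; j < 1000; j++)	/* identical work every round */
			sink += j;

		uint64_t delta = now_ns() - start;

		/* Keep only the least significant bit of each duration. */
		seed = (seed << 1) | (delta & 1);
	}
	printf("candidate seed: %016llx\n", (unsigned long long)seed);
	return 0;
}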

The afternoon session was filled with an interesting panel discussion about "Open Innovation in Africa". Open Innovation is, according to Carsten Emde, a term summing up initiatives from open source to open standards with the goal of sharing non-differentiating know-how to develop common base technologies. He believes that open innovation - not only in the software area - is the best answer to the technological challenges of today and the future. Spending the collective brain power on collaborative efforts is far more worthwhile than reinventing the wheel in different and incompatible shapes and sizes all over the place.

Kamau Gachigi, Director of FabLab at the University of Nairobi, introduced the collaborative innovation efforts of FabLab. FabLabs provide access to modern technology for innovation. They began as an outreach project from MIT's Center for Bits and Atoms (CBA). While CBA works on multi-million dollar projects for next-generation fabrication technologies, FabLabs aim to provide equipment and materials in the low-digit-dollars range to gain access to state-of-the-art and innovative next-generation technologies. FabLabs have spread out from MIT all over the world, including to India and Africa, and provide a broad range of benefits from technological empowerment, technical training, localized problem solving, and high-tech business incubation to grass-roots research. Kamau showed the impressive technology work at FabLabs which is done with a very restricted budget based on collaborative efforts. FabLabs are open innovation at its best.

Alex Gakuru, Chair of the ICT Consumers Association of Kenya, provided deep insight into the challenges of promoting open source solutions in Kenya. One of the examples he provided was the Kenyan state program to provide students with access to affordable laptops, on whose committee he served. Alex found that it was impossible to get reasonable quotes for Linux-based machines for various reasons, ranging from the uninformed nature of committee members, through the still not-entirely-resolved corruption problem, to the massive bullying by the usual-suspect international technology corporations which want to secure their influence and grab hold of these new emerging markets. He resigned in frustration from the committee after unfruitful attempts to make progress on this matter. He is convinced that Kenya could have saved a huge amount of money if there had been a serious will to fight the mostly lobbying-driven choice of going with the "established" (i.e. best-marketed) solution. His resignation from this particular project did not break his enthusiasm for, and deep concern about, consumer rights, equal opportunities, and open and fair access to new technologies for all citizens.

Evans Ikua, FOSS Certification Manager at FOSSFA (the Free and Open Source Software Foundation for Africa, Kenya), reported on his efforts to provide capacity building for FOSS small and medium enterprises in Africa. His main concern is to enable fair competition based on technical competence and to prevent Africa from being overrun by companies which use their huge financial backing to buy themselves into the local markets.

Evans's concerns were pretty much confirmed by Joseph Sevilla, Senior Lecturer at Strathmore University, who complained about the lack of "the Open Source/Linux company" that could compete with the commercial offerings of the big players. His resolution of the problem - to just give up - raised more than a few eyebrows among the panelists and the audience, though.

After the introductory talks, a lively discussion about how to apply and promote the idea of open innovation in Africa emerged, but, of course, we did not find the philosopher's stone that would bring us to a conclusive resolution. The panelists agreed that many of the technologies available in Africa have come in from the outside; sometimes they fit local needs, and in other cases they simply don't. Enabling local people not only to use but to design, develop, maintain, and spread their own creative solutions to their specific problems is a key issue in developing countries. To facilitate this, they need not only access to technical solutions, but full and unrestricted control of the technological resources with which to build those solutions. Taking full control of technology is the prerequisite to deploying it effectively in the specific context and, as the presentations showed us, Africa has its own set of challenges, many of which we simply would never have thought of. Open innovation is a key to unleashing this creative potential.

Conclusions

Right after the closing session a young Kenyan researcher pulled me aside to show me a project he has been working on for quite some time. Coincidentally, this project falls into the open innovation space as well. Arthur Siro, a physicist with a strong computer science background, got tired of the fact that there is not enough material and equipment for students to get hands-on experience with interesting technology. Academic budgets are limited all over the world, but especially in a place like Kenya. At some point he noticed that an off-the-shelf PC contains hardware which could be used for both learning and conducting research experiments. The most interesting component is the sound card. So he started working on feeding signals into the sound card, sampling them, and feeding the samples through analytic computations like fast Fourier transforms. The results can be fed to a graphic application or made available, via a simple parallel port, to external hardware. The framework is purely based on existing FOSS components and allows students to dive into this interesting technology with the cheapest PC hardware they can get their hands on. His plans go further, but he'll explain them himself soon when his project goes public.
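
Since the framework is not yet public, the following sketch merely illustrates the general idea using standard FOSS pieces (alsa-lib plus a naive DFT); all of the names and parameters here are illustrative and certainly differ from Arthur's actual code. It captures one block of samples from the sound card and prints a magnitude spectrum:

/*
 * Sound card as a poor man's data-acquisition device (illustration only):
 * grab one block of samples from the default capture device and compute
 * a naive DFT.  A real tool would use FFTW or similar instead of the
 * O(N^2) loop below.
 *
 * Build (assumes alsa-lib is installed): gcc -o spectrum spectrum.c -lasound -lm
 */
#include <stdio.h>
#include <math.h>
#include <alsa/asoundlib.h>

#define RATE	44100
#define N	1024		/* samples per analysis block */

int main(void)
{
	short samples[N];
	snd_pcm_t *pcm;
	int k, n;

	if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_CAPTURE, 0) < 0 ||
	    snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
			       SND_PCM_ACCESS_RW_INTERLEAVED,
			       1, RATE, 1, 500000) < 0) {
		fprintf(stderr, "cannot set up capture device\n");
		return 1;
	}

	if (snd_pcm_readi(pcm, samples, N) != N) {
		fprintf(stderr, "short read from sound card\n");
		return 1;
	}

	/* Naive DFT: one magnitude per frequency bin up to Nyquist. */
	for (k = 0; k < N / 2; k++) {
		double re = 0.0, im = 0.0;

		for (n = 0; n < N; n++) {
			double phi = 2.0 * M_PI * k * n / N;

			re += samples[n] * cos(phi);
			im -= samples[n] * sin(phi);
		}
		printf("%8.1f Hz  %12.1f\n",
		       (double)k * RATE / N, sqrt(re * re + im * im));
	}

	snd_pcm_close(pcm);
	return 0;
}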

My personal conclusion from this interesting time in Nairobi is that we really need to look out for the people who are doing the grunt work in those countries and give them whatever help we can. One thing is sure: part of this help will be to just go back there in the near future and show them that we really care. In hindsight we should have made more of an effort upfront to reach out to the various groups and individuals interested in open source and open innovation, but hindsight is always easier than foresight. At least we know how to do better next time.

On behalf of the participants and the OSADL RTLWS working group I want to say thanks again to the Nairobi organization team led by Peter Okech for setting up the conference, taking care of transportation and tours to Nairobi National Park, and guiding us safely around. Lastly, we would like to encourage those readers of LWN.net who are involved in organizing workshops and conferences to think about bringing their events to Africa as well, in order to give the developers and students there the chance to participate in the community as they deserve.

(The proceedings of the 12th RTLWS are available as a tarball of PDF files).



Good old days of COCOM list?

Posted Nov 19, 2010 22:19 UTC (Fri) by NAR (subscriber, #1313) [Link]

I'm not old enough to remember the days when computers (and software) couldn't be exported to the former Warsaw Pact countries, but I know that this list provided an incentive for invention: because computers couldn't be imported from the West, they had to be built in the East. Even if these were mostly clones of Western designs, it was necessary to develop the skills to build them. The Cold War had another fortunate consequence: Western companies weren't able to go after those violating their copyrights and other intellectual property.

Now Africa is not bound by such restrictions; both free software developers and proprietary software lobbyists can travel there.

Good old days of COCOM list?

Posted Nov 23, 2010 1:18 UTC (Tue) by skissane (subscriber, #38675) [Link]

I think the Soviet IT industry really suffered due to the obsession with stealing Western designs. Rather than building their own native architectures and systems, the focus was on having KGB spies infiltrate Western companies, on smuggling Western systems through third countries and reverse-engineering them, etc... When your focus is on copying someone else, you will never do better than them; the best you can hope for, if you copy them perfectly, is to be their equal. If they had put the same effort into innovation, they might have come up with advances ahead of what the West had. But they had to do what their political masters told them to...

Good old days of COCOM list?

Posted Nov 24, 2010 22:33 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link]

It was even worse. The USSR had its own hardware and software, some of it quite advanced (the BESM series, for example), and even unique systems like the ternary Setun' computers.

But then the USSR decided to license IBM's technologies. It all went downhill from there.

Impressions from the 12th Realtime Linux Workshop in Nairobi

Posted Nov 20, 2010 18:53 UTC (Sat) by fotoba (subscriber, #61150) [Link]

I think there were many more papers in the proceedings.
As well, my papers were accepted via abstracts, but I was not able to come to Nairobi. From what I know of Dresden's RTLWS11, I think RTLWS12 was great.

Maybe the only problem is that Dr. Emde has not yet been able to read my e-mails to him.

I hope that I will be able to come to Kansas next year.

Impressions from the 12th Realtime Linux Workshop in Nairobi

Posted Nov 20, 2010 23:12 UTC (Sat) by tpo (subscriber, #25713) [Link]

I searched around the net but could not find either the paper by Arnold Bett on that "Distributed embedded platform" or any other substantial info about it anywhere.

Maybe it's a lack of search-engine stamina on my part: are the papers/presentations available anywhere?
*t

Impressions from the 12th Realtime Linux Workshop in Nairobi

Posted Nov 22, 2010 9:44 UTC (Mon) by ppisa (subscriber, #67307) [Link]

I second that. I would like to learn about the communication and control technology used and compare it with our open source effort, based in 1995 on the 8051 and today on LPC21xx and 17xx cores and RS-485 uLan networking. There could be chances for some technology/inspiration exchange in both directions.

Impressions from the 12th Realtime Linux Workshop in Nairobi

Posted Nov 22, 2010 15:03 UTC (Mon) by andip (guest, #46212) [Link]

Hi!

You can't find it because there was no paper from Bett and his colleagues. They presented their project on short notice, without publishing a paper. All papers that are available are included in the above tarball.

regards,
andi

Impressions from the 12th Realtime Linux Workshop in Nairobi

Posted Nov 21, 2010 13:35 UTC (Sun) by kleptog (subscriber, #1183) [Link]

So there has been some research on non-determinism in systems these days. I tried a while ago to find determinism in the timer interrupt and couldn't find any, but it seems the non-determinism is even greater than I imagined. Using this as entropy seems like an excellent idea.

Impressions from the 12th Realtime Linux Workshop in Nairobi

Posted Nov 21, 2010 22:12 UTC (Sun) by dlang (✭ supporter ✭, #313) [Link]

Remember that non-determinism does not necessarily give you very much entropy. Something may be non-deterministic but almost always produce the same result on a particular system.

Copyright © 2010, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds