
LWN.net Weekly Edition for August 25, 2016

25 Years of Linux — so far

By Jonathan Corbet
August 24, 2016
On August 25, 1991, an obscure student in Finland named Linus Benedict Torvalds posted a message to the comp.os.minix Usenet newsgroup saying that he was working on a free operating system as a project to learn about the x86 architecture. He cannot possibly have known that he was launching a project that would change the computing industry in fundamental ways. Twenty-five years later, it is fair to say that none of us foresaw where Linux would go — a lesson that should be taken to heart when trying to imagine where it might go from here.

At the time of the announcement, Linux was vaporware; the first source release wouldn't come for another month. It wasn't even named "Linux"; we can all be happy that the original name ("Freax") didn't stick. When the code did come out, it was a mere 10,000 lines long; the kernel community now adds that much code over the course of about three days. There was no network stack, only Finnish keyboards were supported, many basic system calls were absent, and Linus didn't think it would ever be possible to port the kernel to a non-x86 architecture. It was, in other words, a toy system, not something that seemed poised to take over the world.

Some context

The computing industry in 1991 looked a little different than it does now. A whole set of Unix-based vendors had succeeded in displacing much of the minicomputer market but, in the process, they had turned Unix into numerous incompatible proprietary systems, each of which had its own problems and none of which could be fixed by its users. Unix, in moving down from minicomputers, had become much more widespread, but it also lost the code-sharing culture that had helped to make it Unix in the first place. The consequences of the Unix wars were already being felt, and we were being told by the trade press that the upcoming Windows NT release would be the end of Unix altogether. Unix vendors were developing NT-based systems, and the industry was being prepared for a Microsoft-only future.

Meanwhile, the GNU project had been underway for the better part of a decade. Impressive progress had been made on GCC and a whole set of low-level command-line utilities, but Richard Stallman's vision of an entirely free operating system remained unrealized and, in many minds, unattainable. We could put the GNU utilities on our proprietary Unix workstations and use them to build other free components — notably the X Window System — but we couldn't get away from that proprietary base. 32-Bit x86-based computers were becoming available at reasonable prices, but the Unix systems available on them were just as proprietary as the rest; there appeared to be little hope of a freely available BSD system at that time.

Linux jumped into this void with a kernel that was designed for 32-bit processors, a free license, and the ability to make use of the user-level free software that was already out there. Most importantly, Linux had a maintainer who was happy to take significant changes from others, and the Internet had become widespread enough to enable the creation of a large (for the time) development community. Suddenly, we had our free system that anybody could improve, and many people did. Before long, the gaps in Linux started to be filled.

Over the following years amazing things happened. Proprietary Unix did indeed die off as expected, but Microsoft's takeover of the rest of the computing industry did not quite go as planned. An industry that was doing its best to go completely closed was forced (after years of mocking and derision) to adopt a more open development model. Those of us who worked on Linux — the many thousands who worked at all levels, not just on the kernel — have changed the world in a huge and mostly positive way.

Forward to the present

A quarter of a century later, many things look very much the same. Linus is still running the project and many of the developers who contributed in the early days are still actively involved. We still have a free kernel that can serve as the base for a completely free operating system. Richard Stallman is still pushing for all software to be free. Much code is still developed by posting patches to mailing lists, much to the dismay of the younger GitHub generation. But a lot has changed over those years as well.

Linux in the early days was a decidedly noncommercial undertaking; few people made any sort of a reasonable living from it until the mid-to-late 1990s. It was a hobby, a way to have a reasonable operating system on commodity hardware, and a way to retain control over our computing environment. Some saw Linux as a weapon to use in the fight against "evil" companies like Microsoft but, for most of the community, it is probably fair to say that those companies weren't the enemy; instead, they were simply irrelevant. They were not offering a system that we wanted, so we were building our own instead.

The entry of corporations into Linux development was viewed with a fair amount of concern and trepidation in the early days. The early hiring of Alan Cox by Red Hat had users worried (needlessly) about his ability to continue contributing to the kernel in the ways he thought best. Linus actively avoided working for Linux-oriented companies. As the corporate world started to take note of our noncommercial system, there were a lot of fears that it would be co-opted and its spirit would be lost.

But, without companies, Linux would not be what it is now. We depended on them early on to create and support distributions for us. The community was singularly unsuccessful at creating a proper web browser for Linux until the collapse of Netscape jump-started the development of the tool we now call Firefox. Corporate support for scalability work (making the kernel perform on "large" four-processor systems, for example) was key to having a kernel that performs well on today's consumer-level devices. A community that did not attract (and welcome) corporate participation would not have created the system that we are running now.

We have managed to avoid many of the worst-case outcomes from heavy corporate participation so far. Rent-seeking efforts like the SCO lawsuits have been defeated. We have not gotten off for free on the patent front, but neither have we suffered the outright disaster that many feared. Companies have managed to drive some projects into the ground, but the freedom to fork a mismanaged project has often come to the rescue. In general, a lot of the outcomes that people feared have not come to be.

Sometimes, though, it can be hard to avoid feeling that the companies have taken over and that, perhaps, some of the spirit has indeed been lost. The bulk of free-software development is now done on somebody's payroll; some software is well supported indeed, but other projects that have been unable to find a corporate benefactor languish. As we have seen with projects like OpenSSL or GnuPG, it's not just the obscure projects that fall by the wayside; important infrastructure can also go without support. Changing Linux may not require corporate permission but, often, it seems to require corporate interest and funding.

Linux has done well indeed from the involvement of companies; they have taken us far beyond the apparent limits on what purely voluntary developers can do. Still, it is hard, sometimes, to avoid feeling that the free-software development model, meant to change the world and assure our freedom, has mostly become a tool for companies to cast off some of their development and support costs and undercut their competitors' revenue streams. That is almost certainly not something we could have avoided, but, without care, it could take a lot of the spark out of the free-software community.

The next 25 years

Back in 1991, it would have been difficult indeed to look forward and envision the world we live in today. Any attempts to describe the world of 2041 will be equally vain. All we can do is think about where we would like to be and try to get there.

Corporate participation in free-software development isn't going away, or, at least, so we must hope. But we have to try not to sell out to it entirely. A crucial piece of that is not allowing any single company to control any important project on its own. Developers who work on independent projects tend to think of themselves as affiliated with the project first, and their employer second; that results in a strong incentive to avoid compromising the project's goals in favor of what today's employer wants. Single-company projects are never really under the community's control; independent projects can be.

We need to think about what we want from our licensing. Copyleft has been an important part of how our base of free software was developed, but there are many who are saying that copyleft is dying, and they may be right. Even in projects that are covered by copyleft licenses, companies (which tend to own the bulk of the copyrights now) have been markedly resistant to enforcing those licenses. If the GPL is routinely ignored, it might as well be a permissive license. The experience of the last few decades shows that a lot of great free software can be developed under permissive licenses, and perhaps that is the future. But we should not wander blindly into that future without an open-eyed discussion.

Linux owns much of the computing world at this point, but the continued dominance of Linux should not be taken for granted. A nearly useless Linux kernel grew to the point that it pushed aside established competition; similarly, the toy system we laugh at today might just supersede Linux in the coming years. If that system is free software and truly better, then perhaps its success will be for the best, but there is no guarantee of either. If we want a future full of free software, we will have to earn it, just as we have earned what we have now.

And, most of all, we need to keep in mind why we embarked on this project in the first place, and why we're still doing it 25 years later. If developing Linux is just another job, it will certainly provide employment for a while but it will end up being no more than just another job. If, instead, it is a project to build superior technology that is free in every sense and fully under our control, then it can remain a project that can change the world for the better. We have built something great; with work, focus, and passion we can build something greater yet. It may well be that the first quarter of a century was just the beginning.


Designing mass-transit support for GNOME Maps

By Nathan Willis
August 24, 2016

At GUADEC 2016 in Karlsruhe, Germany, Andreas Nilsson explained the methodology he employed in designing a new feature for the GNOME Maps application: support for routing trips through public transportation networks. Planning a trip by mass transit, as it turns out, differs significantly from planning a route for walking, cycling, or driving.

The problem space

The transit-routing project began with bug number 764107, Nilsson said. GNOME Maps supports route planning for travel by foot, by bicycle, and by car, but that left out a large number of possible users. Nilsson was intrigued by the idea of working on the problem, he said, because he grew up in a small village in Sweden that, essentially, had no public transportation. There was a bus out of town that left three times a day, but that was it. Now that he lives in a major metropolitan area (Gothenburg), there are mass-transit lines everywhere.

So he drew up some initial implementation ideas, presuming that he could employ his standard process: consider the use case, mock up some designs, then code it. Then, however, he had a conversation about mass transit with his girlfriend (who is from Rio de Janeiro), and quickly discovered that the two of them had wildly different expectations about how a mass-transit planner should operate.

He then began to look for research on how mass transit is used, only to discover that there was nothing useful available at the level he needed—namely, anything revealing how people plan their trips. After a few more conversations, he decided that the only way to move forward was to conduct his own end-user research, and interview a variety of people about route-planning and mass-transit usage.

User research and testing

There is still no standardized approach for conducting such user research within free-software projects, so Nilsson developed his own. Starting with family members, friends, and co-workers, he conducted a range of interviews over the course of several weeks.

In addition to the basics of planning trips, each interview included questions about other transportation systems (e.g., whether or not the person owns a car and, thus, has a mix of transport options available), what existing services and mobile apps the person uses, and whether the person prefers certain transit methods over others.

As it turned out, the answers not only covered the expected ground but also revealed considerations Nilsson had not anticipated. For example, he had planned to have a "prefer this transit method" option, but one interviewee indicated that she planned her trips with (in a sense) a negative preference: she tries to avoid train lines whenever possible, because they give her motion sickness.

Nilsson took the interview results, developed "user personas" (fictional profiles representing typical users), and proceeded to develop the UI mock-ups as originally intended. An audience member asked why the user-persona step was necessary, since many designers do not use it. Nilsson replied that he finds personas helpful for keeping his own opinions from unduly influencing whether or not a feature makes it into the eventual code: "It's harder to say 'I don't like this' and 'we don't need this feature'."

The transit-routing feature has since been implemented in GNOME Maps based on Nilsson's work, mainly by Marcus Lundblad. Significant testing has followed, particularly where the wording and layout of directions are concerned. The feature should be available in the next stable GNOME Maps release.

Lessons learned

Behind the scenes, the transit feature uses OpenTripPlanner to compute routes. OpenTripPlanner is a free-software routing service that works from publicly available transit data published in the General Transit Feed Specification (GTFS) format designed by Google, on top of an OpenStreetMap base map layer.
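For the curious, here is a rough, hypothetical sketch (in Python) of what asking an OpenTripPlanner instance for itineraries can look like. It is not GNOME Maps code; the endpoint URL, request parameters, and response fields are assumptions based on the OpenTripPlanner 1.x HTTP API and will vary by version and deployment.

    # Hypothetical example: query a locally running OpenTripPlanner
    # instance for transit itineraries.  The URL, parameters, and JSON
    # fields below are illustrative of the OTP 1.x API, not of GNOME Maps.
    import requests

    OTP_PLAN = "http://localhost:8080/otp/routers/default/plan"
    params = {
        "fromPlace": "57.7089,11.9746",   # origin as "lat,lon" (Gothenburg)
        "toPlace": "57.6969,11.9865",     # destination as "lat,lon"
        "mode": "TRANSIT,WALK",           # allowed travel modes
        "numItineraries": 3,
    }

    resp = requests.get(OTP_PLAN, params=params, timeout=10)
    resp.raise_for_status()
    for itinerary in resp.json().get("plan", {}).get("itineraries", []):
        legs = " -> ".join(leg["mode"] for leg in itinerary["legs"])
        print(f"{itinerary['duration'] // 60} min: {legs}")

A per-mode preference, such as the interviewee above who avoids trains, would presumably translate into restricting or weighting the mode parameter of such a query.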

Any transit system that releases GTFS data is supported, and the information in the database is exactly as detailed (and as fresh) as the available GTFS data. Another audience member asked whether the system distinguishes between separate transit networks operating in the same area (such as trains and trams in the same city, or Tokyo's multiple independent subway services). Nilsson replied that such information should be distinguished within the GTFS data, so GNOME Maps will use it automatically.
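That distinction lives in the feed data itself: each record in a feed's routes.txt carries an agency reference and a numeric route_type (0 is tram, 1 is subway, 2 is rail, 3 is bus, per the GTFS reference). As a minimal, hypothetical illustration (again, not GNOME Maps code, and the file path is made up), a consumer could separate the networks like this:

    # Hypothetical example: group the routes in a GTFS feed by operating
    # agency and transport mode, the same information that lets a router
    # tell co-located networks apart.
    import csv
    from collections import defaultdict

    # route_type codes from the GTFS reference (a subset).
    MODES = {0: "tram", 1: "subway", 2: "rail", 3: "bus"}

    networks = defaultdict(list)
    with open("feed/routes.txt", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            mode = MODES.get(int(row["route_type"]), "other")
            agency = row.get("agency_id") or "default agency"
            networks[(agency, mode)].append(
                row.get("route_short_name") or row["route_id"])

    for (agency, mode), names in sorted(networks.items()):
        print(f"{agency} ({mode}): {', '.join(sorted(names))}")

A Tokyo-style feed with several independent operators would then simply show up as several agency groups.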

A lengthy question-and-answer period took up the remainder of the session, much of it focused on how GNOME can better employ user research when developing applications. Nilsson told one audience member that crafting the question set was not easy; he started by looking at other transit-planning implementations and asking "why this?" about many of the design choices.

Allan Day asked what he had learned about conducting user interviews. Nilsson replied that it is important not to talk too much, for several reasons. First, talking too much can inadvertently steer the interviewee's responses. Second, whenever there is an "awkward pause," most interviewees will naturally start talking more themselves, and the more they talk, the more they reveal about what they are thinking. Day added that he hopes GNOME can build up a guidebook for developers to use when conducting user research and interviews; Nilsson added that he thinks the project will get better at the process as it keeps conducting research.

There were also a few questions about privacy and other GNOME Maps features. One audience member expressed concern about Google Maps's feature of marking locations as "home" or "work;" Nilsson replied that he has not implemented any such feature in GNOME Maps. Someone else asked whether the route planner showed pricing information, since that can be important when planning a trip. Nilsson responded that the idea came up in the interview process, but it has not yet been incorporated into the application. It could be tricky to implement in a reliable manner, given the volatility of prices.

[The author would like to thank the GNOME Foundation for travel assistance to attend GUADEC 2016.]


Page editor: Jonathan Corbet

Inside this week's LWN.net Weekly Edition

  • Security: A different sort of "Fake Linus Torvalds"; New vulnerabilities in firewalld, glibc, gnupg, kernel, ...
  • Kernel: Restartable sequences; Btrfs; Network filtering for control groups; MMIO operations.
  • Distributions: Bringing OSTree to real-world desktops.
  • Development: GNOME updates from GUADEC; KDE Applications 16.08; Introducing OpenStreetView; Mozilla rebranding; ...
  • Announcements: Gilles Chanteperdrix; Event calendars.

Copyright © 2016, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds