The Journal Register Company (JRC) owns newspapers around the United States, and like many print media companies is looking to adapt its business and news-gathering models for the Internet era. On July 4th, 18 of JRC's papers published print and online editions using a "free" and "open" workflow, in homage to the Independence Day holiday. Dubbed the Ben Franklin Project, the effort combined crowdsourcing, interactive news gathering, and free software.
JRC's Vice President of Content Jon Cooper provided an overview
of the processes used by the various papers, including letting citizens
suggest story ideas and improvements, posting story budgets, incorporating
interactive online content, and using social media tools to gather news,
not just to publicize it. That dimension focused mostly on engaging the
local community with the newsroom staff, in a sense opening the process of
producing the news. Of more interest to LWN readers, perhaps, was the
decision to only use a free toolchain to produce the final product.
Slashdot picked up a blog post from one of the papers, The Saratogian, which outlined the use of the desktop publishing tool Scribus, the SeaShore image editor (a Mac OS X raster editor based originally on GIMP code), WordPress, and Google Docs. The story submitter and several commenters seemed to come away from the post with the impression that the newspaper found the software not-ready-for-prime-time, latching on to a quote four paragraphs in that said: "The proprietary software is designed to be efficient, reliable and relatively fast for the task of producing a daily newspaper. The free substitutes, not so much."
Scribus and other free software applications
When you look at the reports of all of the participating papers and JRC itself, however, it is clear that the above comment is not to be taken too seriously. All eighteen participating papers published their Scribus-built editions on time, and with positive results.
To be sure, a few encountered trouble along the way. The Delaware
County Times live-blogged
page layout, noting at one point that the program was crashing whenever the
editor attempted to import a particular image, and mentioning the time
involved in finding specific fonts, but the staff ultimately finished the
issue. The Saratogian noted that the most time-consuming process
was reproducing the paper's page templates in Scribus. But the New
Haven Register, Oneida
Dispatch, the Daily Local News, the other papers, and JRC management reported that the experiment was a success.
The papers spent a month prior to Ben Franklin Day training
staff on Scribus, both with the official documentation and with third-party resources.
Editor Jack Kramer of the New Haven Register said that the staff
adapted to Scribus "pretty quickly," but that, although the
documentation was important, what proved more important were the in-house
training and support groups that the paper formed, which worked together and "perfected the program usage."
Karl Sickafus of the Daily Local called Scribus "arguably, the single most valuable find of the Ben Franklin project" and wrote that his staff even went so far as to write custom scripts to import content from the paper's database directly into Scribus. He then speculated that such a system could easily replace the proprietary ad tracking, advertising, and editorial systems JRC uses today.
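Sickafus doesn't describe the scripts themselves, but Scribus embeds a Python interpreter (the "Scripter"), so a database-to-page import could look roughly like the sketch below. Everything here is hypothetical: the story data, the layout coordinates, and the formatting are invented for illustration; only `createText()` and `setText()` are actual Scripter calls, and such a script would be run from Scribus's Script menu rather than standalone.

```python
# Hypothetical sketch of a database-to-Scribus import script.
# The "scribus" module exists only inside Scribus's embedded Python
# interpreter; the story data and frame geometry below are invented.

def format_story(headline, byline, body):
    """Join database fields into the text that will fill one frame."""
    return "%s\n%s\n\n%s" % (headline.upper(), byline, body)

def place_stories(stories):
    """Create one text frame per story on the current Scribus page."""
    import scribus  # provided by Scribus's Scripter, not installable via pip
    y = 20
    for story in stories:
        # createText(x, y, width, height) returns the new frame's name
        frame = scribus.createText(20, y, 180, 60)
        scribus.setText(format_story(*story), frame)
        y += 70

if __name__ == "__main__":
    # Stand-in for a real database query (e.g. against the paper's CMS)
    stories = [
        ("Local parade draws record crowd", "By A. Reporter",
         "Hundreds turned out downtown on Sunday..."),
    ]
    try:
        place_stories(stories)
    except ImportError:
        # Running outside Scribus; just show the formatted text
        print(format_story(*stories[0]))
```

A production version would presumably also apply paragraph styles and flow overset text into linked frames, but even this much shows why Sickafus saw the Scripter as a path toward replacing JRC's proprietary editorial systems.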
Kramer and several of the other editors mentioned using both GIMP and
SeaShore for image editing, though Kramer noted that his photo editor was
not "totally satisfied" with SeaShore. All of the
participating papers published their content to a mirror site running
WordPress in addition to their regular web sites. In an interesting
footnote, the Slashdot debate veered into an argument over the oft-cited
issue that the name "The Gimp" (though the project uses "GIMP" these days) is off-putting or offensive, and drives potential users away before they even try it. When asked whether anyone at the New Haven Register was bothered by the name, Kramer replied simply "not at all."
Google Docs and other missing pieces
Some of the papers (such as The Morning Sun) used OpenOffice.org for story writing, but Google Docs was widespread. Predictably, in the Slashdot story and in several of the comment threads on the individual newspaper sites, free software advocates took issue with the decision to use Google Docs in a "free software" experiment, noting accurately that the tool is not open source or free software.
Indeed, several of the papers blur the line between "free to use" and "software freedom," a mistake certainly not limited to this particular field. An anonymous New Haven Register staffer provided more background on that decision in a comment on the paper's blog, saying "We ordinarily write our stories in a content-management system that costs money for us to use, and its purpose is to manage content for the print edition only — not our website."
Because Google Docs was used to replace a content management system (CMS), its critical feature in this case was presumably collaborative editing; the lack of that capability in the other open source tools led to its adoption. That includes WordPress, which the papers used to publish their online editions: although multi-user editing is possible in WordPress, the newsrooms evidently found it lacking. The newsroom-oriented CMS Campsite may simply have been overlooked.
It is also interesting to note which other free-to-use proprietary applications were selected; this is real-world feedback that the open source community should take note of. Most of the papers used existing social networks like Facebook and Twitter to solicit feedback from the local community. Almost all used video, but chose proprietary video editing and hosting tools. Finally, though the papers used SeaShore to edit photographs, it appears that none used a free raw conversion tool — perhaps because, as the choice of SeaShore suggests, the photo staffs are equipped with Mac OS X, for which the free raw converters do not provide regular builds.
News for tomorrow
Having read all of the available accounts of open source's performance on Ben Franklin Day (some papers have yet to publish results online), it seems clear that Scribus and the other applications performed well. What happens next is the challenge. Sickafus observed:
We can literally do EVERYTHING we do using nothing but freeware. We just proved that. But, what are we going to do with that newly generated energy? Are we going to go back to doing the "things we do" the same way we are accustomed to, while reminiscing of the long forgotten Ben Franklin project? That would make absolutely no sense what so ever.
He advocates redirecting the money that would be spent on proprietary content management and ad systems toward adapting open source solutions. Reporter Ron Nurwisah recommends essentially the same thing, looking at the cost of dozens of licenses for Adobe products. Neither position would surprise long-time free software advocates; still, it is refreshing to witness an industry realize the potential of open source software.
JRC has kept its Ben Franklin Project
WordPress site active since July 4th, posting discussions on where the
company needs to go next, and exploring other open source applications. In
addition, Cooper recently wrote to the Scribus mailing list to
initiate a dialog with the development team about what the papers had learned.
There is still no official announcement from any of the papers about the
permanent addition of open source to their newsrooms. Based on
the results so far, though, an announcement like that may not be all that far off.
[ Our thanks to Jay R. Ashworth for pointing us in the direction of this story. ]
We just wrapped up the Ohio LinuxFest call for presentations, so pitching presentations is on my mind. Regional, volunteer-run conferences are not only a good way for people without a travel budget to see some big names in open source, they're also a way for first-time or inexperienced speakers to hone their presentation skills. Regional conferences also provide an excellent forum to educate users about your favorite project or topic.
But competition for speaking slots can be fierce. Established
conferences like SCALE, Ohio LinuxFest, and others receive many more
proposals than available slots. For example, the 2010 call for
presentations for Ohio LinuxFest received about 120 proposals for fewer than
30 speaking slots. Some of the talks for regional conferences will quickly
go to experienced speakers who have presented at the show before and/or
have established a name for themselves as topic experts and competent
presenters. But most presentation committees for community shows also try
to select local talent who are new to presenting, and those slots will go to the speakers with the best proposals.
Pitching your Proposal
"Submit early and often" is a good rule of thumb. Speakers should
develop at least two presentation ideas, and submit them as early as
possible — certainly well before the deadline. It's usually a bad
idea to wait until the last minute to submit a proposal. In some cases, the
committee takes note of which talks come in early. Even if that isn't the
case, waiting until right before the deadline usually means that the
proposal you submit will not be of the same quality as a proposal developed
over the space of a few days. Submitting multiple talks boosts your chances
if the committee likes you as a speaker, but doesn't like one of your
topics or has to choose between you and another speaker who submitted a similar topic.
Presentation titles should be descriptive, short, and ideally something that will grab the attention of the presentation committee. A title like "Improving Kernel Contributions" is good, but "Anatomy of a Failure" is likely to garner more attention.
No matter how good the presentation title, the abstract has to live up
to it, provide the committee with enough information to understand what the
talk is about, and convince them that you'll put in the preparation
time. There's little confidence that a prospective speaker who submits a
one-sentence proposal is going to put in the time necessary to deliver a
really excellent talk. And that is the goal of any good presentation
committee: filling the schedule with talks that are not merely adequate,
but ones that are excellent.
An excellent abstract explains what the topic is, the scope of the talk, and what the audience will learn during the presentation. The last is particularly important. Many submissions merely describe the topic, but the best submissions explain what the audience will gain from attending the talk.
Most calls for presentations will also seek a biography. Writing a biography is often an unpleasant exercise, but it's the best opportunity to show the presentation committee that you are actually the right person to present a given topic. Or not. If two people submit an "Introduction to Fedora" presentation, one of them being Paul Frields, guess who's most likely to be awarded the slot? This is another reason to think hard about your topic, pitch more than one idea, and try to choose something unique.
Feel free to contact the chair of the presentation committee if they're listed on the Web site and you need clarification on what the committee is looking for.
Preparing the Talk
Don't make the mistake of starting with slides when preparing a
presentation. In fact, you probably shouldn't be at the computer at all
when preparing the first draft of your presentation. Instead of firing up
OpenOffice.org Impress or creating slides using
LaTeX's Beamer class, grab a stack of index cards and head off to somewhere quiet.
For most presentations, you should have an introduction that describes what the audience will learn and the key ideas or information you want to impart. For a standard 60-minute slot, you should focus on no more than three to five major ideas.
Once you've mocked up your talk on paper, then it's time to start preparing slides. Use whatever tool you're most comfortable with but avoid writing out the entire talk on your slides. Your audience is not there to read your presentation, they're there to watch and hear you. Many presenters tend to cram slides with information and then stand in front of the audience reading the slides. This is a sure sign of a talk that will put the audience to sleep. If your audience can glean as much information from the slides as your actual presentation, you're not doing your job as a presenter.
With some exceptions, such as highly technical talks that require code examples or other supporting text, your slides should be uncluttered and only contain the basic ideas of what you're saying. Slides exist to support the speaker, not the other way around.
Also, a word of advice about technical problems: come prepared to give
your talk with no slides at all. If your laptop fails, the projector dies,
or any other audio/visual catastrophe occurs, you should still be able to
deliver the talk. The best bet is to have index cards or a bulleted list
of ideas that you can use to refresh your memory if the slides are
unavailable for some reason. If, by the day of your presentation, you're not ready to present sans slides, you have not prepared adequately.
This is a lesson I learned the hard way at FOSDEM. Giving a talk on
openSUSE, I was too used to referring to the slides and not comfortable
giving the talk extemporaneously — despite knowing the topic
cold. The projector and sound system developed a major glitch, so I was
forced to work with no slides. I gave a substandard talk because I didn't
have adequate paper backup and hadn't practiced the talk enough. Max
Spevack followed my talk and had the same A/V problems that I had, but was able to do a fine talk using his notes. That was the last time I walked into a room to present without being fully prepared to give a talk with no slides at all.
The more practice you have, the better. Don't simply read through slides at the computer. Stand up and practice delivering your presentation out loud, preferably while looking at a mirror. If you have a willing audience, practice in front of them. Focus on making eye contact, smiling, and speaking at a slower pace than usual.
Remember that people are far more likely to remember the overall tone of the talk and general topics than details. If you're hoping to impart every detail of developing with Django in 60 minutes, you'll probably fail. Emphasize a few key concepts that you'd want attendees to remember, and focus on delivering an enjoyable presentation.
Timing is extremely important. Practice the timing so that you have enough material to fit the time, minus time for questions and answers, and no more. A common mistake is to show up with far more material than is possible to present in the time allotted. This not only robs the audience of the chance to ask questions, but also means you probably won't do a good job of presenting all of the material.
The other reason to practice extensively is to calm your nerves. Most people find public speaking unsettling to some degree, but knowing the presentation (and topic) backwards and forwards goes a long way towards making public speaking more comfortable. See Scott Berkun's Confessions of a Public Speaker for some excellent anecdotes and rationale why people find public speaking so uncomfortable, as well as expansive advice on improving speaking skills.
It's important to remember that most audiences are friendly and wish to see a good talk. They do not expect you to be a great entertainer or to exhibit unnatural brilliance. Focus on making eye contact, speak slowly, and pause from time to time to gather your thoughts. Don't be afraid of a moment or two of silence.
Do be conversational and vary your tone; try to engage the
audience. Don't be afraid to ask questions, as it's a good idea to try to get the audience to focus on you rather than the slides or (worse) their cell phones and laptops.
Giving a good presentation is not rocket science, or even kernel development. It is, however, a fair amount of work. Expect to spend at least ten to fifteen hours developing and practicing a good presentation. If it's your first, you might want to put in even more time.
This might sound like a lot of work for an hour in front of an audience, but it's well worth it. You'll find that you learn a great deal about a topic, even one you're an expert in, by preparing to deliver a presentation.
One of the best references for Linux and UNIX system administrators over
the years has been the "Handbooks" (either Linux Administration
Handbook (LAH) or UNIX System Administration Handbook (USAH) at various
points). But the last edition was published in 2000 (as USAH), and included
information on then-current Red Hat Linux 6.2 and FreeBSD 3.4. A new, updated version,
UNIX and Linux System Administration Handbook, Fourth Edition (ULSAH), is
due out any day now, and the principal authors,
Evi Nemeth, Garth Snyder, Trent R. Hein, and Ben Whaley, agreed to answer
some questions for LWN readers. Below are their answers on the book, the
impact of Linux,
the future for UNIX and Linux, and more.
Could you all please introduce yourselves? What are you working on when
you're not writing system administration books?
Ben: I wear a bunch of different hats as an engineer at AppliedTrust in
Boulder. In addition to consulting on architecture and operations in
UNIX and Linux environments, I think a lot about next generation
technologies. Virtualization (in its myriad forms), application
security, and the adoption of open source are all of interest. I also
have an interest in the history of computing generally, and of UNIX and
Linux in particular. Outside of computing, I enjoy the republic of
Boulder as much as possible.
Garth: I was one of the original authors of the UNIX book lo these many
years ago, but I've spent time on a variety of projects since then. Over
time, I've actually done more software development than administration.
Evi: I'm Evi Nemeth, former Computer Science faculty at the University
of Colorado. I'm now out of the UNIX/Linux/CS world altogether; I
retired, bought a sailboat, took off and am currently in French
Polynesia, halfway across the Pacific en route to New Zealand. I'm
working on making my boat single-hander-friendly for when I run out of
crew. I do have email on the boat via my ham radio; it's like uucp at
about 100 characters per minute.
Trent: I'm a scientist at heart, and I love understanding all the
layers of a system. When I'm not writing, I'm deep in the trenches
working on IT infrastructure issues that range from low-level technical
issues to policy and management.
The new version of the Linux Administration Handbook is on its way. When
can we expect it, and what are you most excited about in this edition?
Ben: We anticipate a mid to late July shelf date. There is loads of new
material in this combined UNIX and Linux edition, including new chapters
on virtualization, scripting, and green IT practices. I'm very pleased
with the new set of cartoons and cover art. Also, I'm thrilled to be
included as a new author after contributing to the 2nd edition of the Linux book.
Garth: Many of the existing chapters have had near-complete rewrites as
well. You might think that long-standing, basic technologies like disk
storage and email would be relatively stable, but that's not true at
all. The way that most sites manage these resources has changed
completely just in the last five years or so.
Evi: LAH and USAH have merged back together. It's due out sometime soon
(July 2010). I'm most excited that it's finally done; I left the boat in
Panama and came back to Colorado for a year to work full time on the
book — it's a huge amount of work.
Trent: It better be in bookstores in the next 2 weeks!! I'm super
excited about the new Green IT section — it presents opportunities for
sysadmins to make a huge difference to our planet.
What led to the decision to combine the UNIX and Linux versions of the book?
Ben: Linux is recognized as an enterprise-grade operating system that
has proven its capabilities throughout the explosion of computing in the
last twenty years. It has significant momentum behind it, much more so
than other leading UNIX variants. One could argue that UNIX lives on but
looks to Linux as its leader. It has enough in common with traditional
UNIX that it makes sense to cover it all in one place.
Garth: One factor that's helped make it possible to reintegrate is the
dramatic shakeout in the UNIX market. There simply aren't as many major
versions of Unix around as there were ten years ago. We've seen a
similar consolidation in the Linux arena as well, with most activity
consolidating around a few major distributions.
Evi: Personally, I'm not sure we should ever have separated UNIX and
Linux into 2 books. We thought that conditional text (if Linux, blah,
blah, else ...) would make it easy to manage both books with minimal
effort. Not true.
We also feel that this is probably the last edition that will be on
paper and maybe the last edition period, so if we are doing one last
book, let's cover all the systems we can. I'm sorry we didn't manage to
include MacOS too.
Trent: Combining the books makes sense because system administrators
manage both UNIX and Linux systems ... organizations shouldn't be
separating duties for these platforms since they're so similar.
A lot of old-time Unix folks were taken by surprise when Linux hit the
scene. You've seen a lot of Unix variants come and go; what, do you
think, accounts for the way that Linux has been able to displace Unix in
so many settings?
Evi: I think the fact that Linux ran on PCs instead of the special
vendor-specific hardware of most UNIX systems gave Linux a leg up.
University students ran Linux on their PCs, graduated, and went to work
in corporate America (Europe, etc.). Linux also embraced (or tolerated)
Windows more than the UNIX vendors and found ways for the systems to interoperate.
Trent: The community. It's a lot easier to support Linux these days
because if there's any issue, there are hundreds or thousands of people
to turn to on the 'net for help. It's a lot easier than sitting on hold
with a vendor's call center for hours.
What do you think is the future of "true" Unix?
Ben: In the near term I believe that UNIX environments will continue as
they have, serving as venerable, tenured systems with proven stability
and some powerful capabilities. Most of the UNIX systems I work with run
heavy databases or specific enterprise applications. OpenSolaris is an
impressive system with advanced features that don't exist elsewhere. In
the longer term, it seems to me that Linux will displace UNIX. All the
major vendors contribute to Linux's development (as LWN's own data
suggests) and even promote standardization.
Garth: Traditional Unix vendors don't have the resources to compete
against Linux on every front, so they've had to pick their battles and
concentrate on distinguishing themselves through enterprise features
such as database or filesystem performance. However, the number of
domains in which it's possible to demonstrate a proprietary advantage
over Linux is continuously shrinking. I don't see any reason for the
current trends to change, so I'm pessimistic about the future of traditional Unix.
Evi: UNIXes that run on PC hardware will survive in niches — FreeBSD in
embedded systems, OpenBSD in security-conscious spots, OpenSolaris if
Oracle leaves it alone, etc., but systems on proprietary hardware (AIX,
HP-UX, Solaris) will continue to decline in market share. Note that each
of these three vendors is hedging their bets with a Linux product or
with a Linux that runs on their hardware.
Trent: There's such a large installed base that I think Unix is going to
be around for a very long, long time. But, I can tell you that almost
all the new infrastructure I build is on Linux.
Is the success of Linux a good thing? Or would we be better off now if
some version of Unix had established itself more strongly in the market?
Ben: I was introduced to Linux before UNIX, and I suspect that I
wouldn't be on the same path that I'm on today without it. In fact, I
grew up from adolescence using Linux and today I look at closed systems
as a broken, legacy model of development. I love the creativity and
community that open systems offer. To me it's a great thing that has
kept UNIX alive.
Garth: In some ways, this is like asking, "The rise of humanity: good or
bad?" Linux has warts, but so do the alternatives. It's not the solution
so much as the context in which other things happen.
Outside of a thousand minor differences, there really isn't a dime's
worth of difference between the various Unix-like operating systems.
They are all basically the same. (Well, except perhaps for AIX. Whatever
else one might say about it, it's definitely not the same.) Fundamental,
game-changing advances — such as Solaris's ZFS filesystem — are few and far between.
Evi: Yes, the success of Linux is a very good thing. Each open source
"Unix" has strengths. Linux is quick to see a cool idea in another
system, re-engineer it, and voila, it's part of Linux too.
Trent: Linux is great!! In a way, Linux is really UNIX for the people.
We needed that. It's forced the entire industry to be more open, which
I'm not sure a more established open-source UNIX would have achieved
because it's inspired a shift in mindset.
If you could command the Linux development community to do one thing
that would make life easier for system administrators, what would it be?
Ben: Settle on a few more standards, such as log formats, command line
arguments, configuration file syntax, and extensibility. As always, the
more documentation, the better. Active communities where developers and
users interact are extremely valuable.
Garth: Unix and Linux developers have traditionally designed software to
be as flexible and configurable as possible. A good example of this
approach is the original sendmail, which knew practically nothing about
email addressing or transport until you manually programmed that
knowledge into its configuration file. Many common systems still take
that general approach: here's the tool kit, build whatever you like.
This approach has its advantages, but it also imposes a cognitive burden
on everyone downstream of the developers. I'd like to see more focus on
simplicity and predictability in Linux distributions and less concern
about hypothetical scenarios and edge cases. A Linux server may be a lot
more complicated than an iPhone, but it surely doesn't need to be 1000
times more complicated, as it currently is.
Unfortunately, it's especially hard to promote simplicity and clarity in
the open source world. It's easy to accept patches that add incremental
features but hard to remove anything or break compatibility. Reaching
design consensus on future developments is hard enough without
revisiting the messes of the past.
Evi: Standardize! Make command names, arguments, and behavior the same.
Get rid of the not-invented-here attitude.
Trent: Keep it simple. The early success of both UNIX and Linux can be
attributed to their simple, modular approach. Too often these days folks
are developing packages that are like a giant corn maze (but I won't
name any names!). We'll all get farther if the development community is
focused on simplicity and modularity.
The world is now full of Linux users and administrators who have never
touched a traditional Unix machine. What lesson from Unix do these folks
risk missing in a Linux-only world?
Garth: The Linux community has put a lot of effort into strip-mining the
UNIX systems of the past and digging out the nuggets of value that
weren't nailed down. So no, I don't think administrators unfamiliar with
traditional UNIX systems are missing too much. On the other hand, we do
seem to be stuck in about 1990 with respect to our basic idea of what an
operating system should be. Look at all the incredible developments
we've seen over the last 20 years in application and web development;
software is completely different now. I hope the triumph of UNIX and
Linux doesn't lock us into the POSIX API indefinitely.
Evi: This generation of Linuxites has used UNIX; Linux is a UNIX
system. It has commands that you type to a shell instead of driving thru
a zillion menus, it has all the important concepts of UNIX, like pipes
and input output redirection, it has man pages, ...
Trent: Process adherence. In the UNIX world of yesterday, we had giant
machines that filled an entire room, and dozens or hundreds of users
shared them. In order to achieve a reasonable service level, system
administrators had to be very process-focused, else they would impact
many users if a mistake occurred. Today, a single user may have dozens
of systems, physical and virtual. System administrators still need to
have a well-developed, well-followed set of processes to maintain them
to provide an even higher level of service.
The world is also full of energetic new developers for whom open source
has always just been part of the environment. These developers will be
creating our next generation of systems. What kind of operating systems
do you hope they will build, and what advice would you offer them?
Ben: The cloud is clearly the direction that is currently leading the
pack, and I'm anxious to see what happens next in the space. I hope that
open source development continues to thrive. It will also be interesting
to see what happens with security. People are paying a lot of attention
to it these days, and we're well positioned to fix the less secure,
trust-based models of earlier systems.
Evi: A fundamental principle of UNIX-ish operating systems is that they
provide commands that do one thing well and then the plumbing to hook
them together to get a final result. Windows-ish operating systems try
to think of everything a user might want to do and make a command or
option for it. If the Windows developer didn't think of the thing you
need to do, you are out of luck. Next generation operating systems need
the UNIXy approach. Keep things simple, take small steps, but do it well.
Trent: I hope they build operating systems that perform well. Back in
the day, we had to optimize for every last cycle, because cycles were
few. Some new OS developers have had the luxury of hardware being
abstracted from them, and it's easy to forget what really happens at
that layer. It's easy to fall into a pattern of slapping layer upon
layer, resulting in a lot of kernel and application bloat. Keep it
simple, and think about how to optimize for performance.
Anything else you would like to say to LWN's readers?
Ben: I'm a lurker on LWN and I've learned a lot from both the articles
and the comments. I appreciate the active open source community on the
site. Thanks for reading.
Garth: I'm excited about Btrfs and really looking forward to its debut
as a production system. Check it out!
Evi: I need crew. Is there anyone out there (between jobs maybe) who
isn't busy between September and November 2010? We share expenses on
consumables; I pay for boat maintenance.
Trent: I hope in UNIX and Linux System Administration Handbook (4th
Edition) we teach folks how to be good system administrators and find
answers on their own, rather than providing every possible answer for
them. System administrators need knowledge acquisition and problem
solving skills more than anything.
We would like to thank Evi, Garth, Trent, and Ben for taking the time to answer our questions.
Page editor: Jake Edge