
Leading items

Synfig in motion

By Nathan Willis
January 15, 2014

Synfig Studio is a free software 2D animation suite; we last looked at it in depth in 2010, when animator Konstantin Dmitriev was pushing its development forward as part of his anime project Morevna. Since then, Dmitriev (who was not an original Synfig developer) has taken on the task of maintaining the application. The latest release is version 0.64.1 from November 2013; it incorporates a number of functional and usability improvements that make Synfig much easier to work with, but Dmitriev and the other team members have also made big strides in funding ongoing development and in producing necessary training materials.

Across the second dimension

When first started, Synfig's user interface bears a strong enough resemblance to other image editors (such as GIMP or Inkscape) that it is not hard to bumble around and figure out many of the basics. There is a drawing canvas, a toolbox (with, for the most part, easily understood tool icons), and palettes showing image layers, tool options, undo/redo history, and the like. The essence of 2D animation in Synfig, however, is that all of these tools are also tied to a timeline.

When one selects an object on the canvas with the selection tool, a tree view of its properties appears next to the timeline. Each of those properties can be changed at any timecode of the user's choosing: its position, its color, opacity, depth within the z-order of objects, the blending method used to composite it, and so on. Animation is a matter of manipulating these properties for all of the objects on the canvas at the right time; when exporting a scene, Synfig interpolates the changes—hence, we get motion.
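That model can be sketched in a few lines of Python. This is an illustrative sketch only — the `interpolate()` helper and the data layout are hypothetical, not Synfig's actual API: each animated property carries a list of (timecode, value) pairs, and the renderer computes the in-between values on demand.

```python
def interpolate(keyframes, time):
    """Linearly interpolate a property's value at an arbitrary timecode.

    keyframes: list of (timecode, value) pairs, sorted by timecode.
    """
    # Before the first keyframe or after the last, the value is held.
    if time <= keyframes[0][0]:
        return keyframes[0][1]
    if time >= keyframes[-1][0]:
        return keyframes[-1][1]
    # Otherwise, find the surrounding pair and blend between them.
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= time <= t1:
            frac = (time - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)

# An opacity property keyframed at 0.0 at frame 0 and 1.0 at frame 24
# (a one-second fade-in at 24 frames per second):
opacity = [(0, 0.0), (24, 1.0)]
print(interpolate(opacity, 12))   # prints 0.5, halfway through the fade
```

Exporting a scene then amounts to evaluating every property of every layer at each output frame.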

[Synfig 0.64.1]

Perhaps that broad description sounds simple, but in practice, of course, using it to make animation of watchable quality is not. What separates a good animation tool from a merely mediocre one is how well it streamlines repetitive tasks, groups commonly needed features, and in other ways builds on top of the "primitive" jobs of moving and changing object properties to create tools that an animator can use easily.

On that usability front, Synfig has made considerable progress in the last few years. Some of that progress has come through steady normalization of the user interface: better icons, clearer controls, and ensuring that screen elements (buttons, fields, text labels, and so on) are consistently aligned and properly sized. But there have been other important changes as well; in particular, a 2013 coding sprint overhauled a lot of the unusual terminology found in the UI.

For instance, the control handles used in the timeline were previously referred to as "ducks"; they are now just called "handles." As the Synfig wiki points out, the term "duck" has legitimate historical origins, but nevertheless it was a practical obstacle to users trying to learn Synfig. There were also a number of UI terms which had been saddled with names from internal Synfig data structures; they have been renamed in more human-friendly fashion. Some of the other UI improvements include switching to Cairo for rendering (which considerably smooths out visualizing one's work) and documenting keyboard shortcuts in the program's tooltip pop-ups.


UI improvements certainly make it easier to get around in Synfig (whether one is new to the program entirely, or simply hunting for an infrequently used feature), but the project has made progress on important features as well. The most significant addition in 0.64.1 is the application's first implementation of "skeleton layers."

Skeletons or "bones" are a means for the animator to rig up a simple framework of joints for an animated character: handles for each articulation point, and connections linking the points as they are designed to move (e.g., hand-to-elbow-to-shoulder). The "skin" of the character is the visible image, which is attached so that it moves with the skeleton. Right now, Synfig lets animators attach a skin to a skeleton and move it around manually, but there are bigger benefits to be found in the future: with a working skeleton implementation, more advanced motion techniques like inverse kinematics become possible. At this stage, Synfig's bones implementation is still marked as "experimental," and must be switched on in the application preferences.

The other functional changes include the ability to set different default interpolation methods for different parameter types, the ability to scale groups of objects all at once, and a fully zoomable drawing canvas. The interpolation method change is especially valuable because it frees the user from having to repeatedly make the same changes to multiple objects. How a parameter is interpolated between one keyframe and the next can have a big impact on realism. For example, a car moving along the ground can move at a constant speed, but a ball bouncing up and down needs to accelerate and decelerate. Thus, the two types of motion need to have different interpolation functions. Synfig allows animating object properties other than position, of course, but the need for flexibility still applies.
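The difference between those two kinds of motion can be illustrated with a pair of easing functions — one linear, one with smooth acceleration and deceleration. These are generic easing curves for illustration, not Synfig's actual interpolation code:

```python
import math

def linear(frac):
    # Constant speed: equal distance covered each frame (the car).
    return frac

def ease_in_out(frac):
    # Smooth acceleration, then deceleration, using half a cosine
    # wave (the ball slowing near the top of its bounce).
    return (1 - math.cos(math.pi * frac)) / 2

# Position of an object moving from 0 to 100 over ten frames,
# under each interpolation method:
for frame in range(11):
    frac = frame / 10
    print(f"{frame:2d}  linear={100 * linear(frac):5.1f}  "
          f"eased={100 * ease_in_out(frac):5.1f}")
```

With the linear curve the object covers ten units every frame; with the eased curve it barely moves near the endpoints but speeds through the middle, which is what gives the motion a sense of weight.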

[Synfig training video]

There are several other changes that have more to do with the production process than with animation tools themselves. For example, users can mark an image layer to be excluded from rendering; that feature could be used to speed up render times by just exporting part of an animation, to add mock-up layers as placeholders for future content, and much more.

Another example is a set of improvements to working with keyframes. Keyframes are frames in a sequence where the animator manually makes changes to the canvas; the frames in between keyframes are what is interpolated. In older releases, keyframes could not be disabled; they could only be removed completely. The new releases let users turn each keyframe on or off at will, which can be a big boon when trying to figure out proper motion. The keyframe palette and timeline now show more information about keyframes, and Synfig now creates the first keyframe at timecode zero automatically when starting a new project (something that every user previously had to do manually).

Supporting development

In addition to the changes in the Synfig application code itself, the project has undertaken a concerted effort to fund developer time. In 2013, Dmitriev was awarded a grant from the Shuttleworth Foundation. 60% of the grant money was used for a full-time developer (in addition to Dmitriev, who continued working as his schedule allowed), while the rest went toward producing a series of Synfig training videos.

Producing training materials has been a successful strategy for other creative-suite applications like Blender and Krita, of course, both for attracting users and for providing a revenue source. But it also helps fill one of the biggest gaps in Synfig adoption. At Libre Graphics Meeting 2013 in Madrid, one may remember, animator Nina Paley expressed considerable frustration with how difficult she found Synfig to learn. In particular, she noted that she needed hands-on training to figure out Synfig, despite her experience with other animation applications. The training videos, then, would seem to address that concern quite nicely.

The first set focuses on "cutout" animation, which is the simplest method to get started with. Cutout animation involves moving static elements on screen, 2D-marionette–style. While Synfig can do much more (in fact, animating vector drawings is Synfig's strong suit), the cutout techniques do allow the viewer to learn the ins and outs of most of the application. The project is currently seeking to translate the narration into a variety of languages; no details for further installments have been announced, but additional courses have been alluded to.

[Synfig's secret menu]

Since the end of the Shuttleworth Foundation grant, Dmitriev has also run a series of one-month crowdfunding campaigns through Indiegogo, with each one underwriting a specific feature set. The amounts have varied according to the chosen feature (and all have been more modest in size than a one-month contract in Silicon Valley would entail), but the project has beat its funding goal on every campaign thus far, which is a noteworthy achievement in its own right.

Synfig still has its share of quirks, of course. Most notable is the "secret menu" that hides the majority of the function menus in the UI. To be perfectly clear, "secret menu" is not my tongue-in-cheek description; that is the actual name of the feature, and it is well-earned, since the menu can only be activated by clicking on an unlabeled triangle between two other UI elements. But the recent progress is considerable, both in usability and in expanding the feature set. The most recent work underwritten by the crowdfunding campaigns has not landed yet, but it will include some important feature enhancements like sound support and single-window mode.

In some ways, Synfig faces an intrinsically uphill battle for user attention merely because so much more emphasis is placed on 3D animation these days. But there is a lot that can be done in two dimensions, as the Synfig training materials demonstrate. Apart from traditional cartoons, a lot of the motion graphics used as illustrations in other video productions are 2D affairs. Currently, Blender is the free-software choice for this type of content (as Jakub Steiner shows), even when 3D is non-essential. At its current pace of development, though, Synfig could soon give Blender a run for its money.


Setbacks for net neutrality and for patent trolls

By Nathan Willis
January 15, 2014

On January 14, an appeals court in the United States handed down a decision that undoes some of the regulations governing net neutrality. The ruling was not a complete gutting of the regulation in question, but it could still have far-reaching implications for those who count on the existence of a flat, single-tier Internet—software developers included. The same day, the US Supreme Court declined to overturn a lower court ruling striking down the patents of a notorious patent troll. Both fights, of course, are far from over.

The neutral zone

The net neutrality decision [PDF] came from the US Court of Appeals for the District of Columbia, and was a 2–1 ruling in favor of the mobile phone carrier Verizon. Verizon had challenged the Federal Communications Commission (FCC) "Open Internet" rule, which specified that ISPs could not discriminate between different bits of data that they deliver to customers. In particular, the rules specified that ISPs cannot block, slow down, or expedite some data traffic while treating other traffic differently.

The most obvious potential abuse by discriminatory traffic handling is blocking or degrading a competitor's service. In the US it is not uncommon for a broadband ISP to also own TV networks, so the ISP side of the company would have an incentive to expedite delivery of content from the TV side, while throttling content from competing networks. But there are other possible abuses; at the Consumer Electronics Show (CES) in Las Vegas earlier in January, AT&T announced a "sponsored data" service, in which AT&T would allow software services to essentially pick up the tab for the portion of the end-user's data charges that the service consumed. In other words, Netflix could sign up, and tell AT&T customers that their Netflix video streaming would not count against their monthly data bill, while YouTube video would.

The Open Internet regulation was enacted by the FCC in 2010 and, naturally, broadband ISPs and cellular carriers have been fighting against it ever since. The new ruling strikes down a specific part of the regulation, but it is a key one: the section that prohibits blocking and traffic discrimination. Left in place are the other provisions, such as the rule that ISPs must transparently disclose their blocking and traffic-discrimination practices.

The justification for the court's decision is a tad peculiar, at least to those outside of legal circles. As Harold Feld of Public Knowledge puts it, it hinges on the distinction that the FCC makes between "Information Services" and "Telecommunications" services. Wired broadband ISPs are considered "Information Services," which puts them into a different legal box as far as the FCC's power to regulate is concerned. If wired ISPs were "Telecommunications" services, as is (for example) the phone network, then the FCC would have more authority to regulate them—all because the law views telecommunications as a protected public service.

On the other hand, that technicality also means that the FCC could, in theory, back up and reinstate the Open Internet rules by re-classifying wired ISPs as Telecommunications services. Bloomberg reports that FCC Chairman Tom Wheeler will consider appealing the ruling, and quotes him as saying that the commission will work to ensure "that these networks on which the Internet depends continue to provide a free and open platform for innovation and expression."

For the general public, most of the news coverage about the court decision has couched it in terms of ISPs charging higher fees to the online services with deep pockets: Netflix, Google, Hulu, and the like. That eventuality would probably mean costs passed down to the consumer. But the more insidious effect would be that it would be legal for ISPs to block or throttle any application or service—or, perhaps more likely, to charge every service an access fee, with a tiered pricing structure designed to maximize the ISP's bottom line. The concern for Silicon Valley is that lack of net neutrality would block innovation; but perhaps it is more accurate to say that it would tack on a layer of fees. The carriers assure customers that innovation will continue, but making innovation more expensive is detrimental too—to everyone except the entrenched players, of course.

Troll Hunt 2014

As disconcerting as the net neutrality decision is for many software developers, things were not all bad on the legal front. The Supreme Court turned down the opportunity to hear the Soverain Software "shopping cart patent" case on appeal. Lower courts had upheld a challenge to the patent from online retailer Newegg.

The shopping cart patents (there are actually two) were patents 5715314 and 5909492, which the non-practicing entity Soverain had successfully used to extract millions of dollars from various online merchants. Newegg challenged the patents back in 2010; it won the case in 2011; now that the Supreme Court has declined to hear an appeal, the case is closed. And not only is the case against Newegg at an end, but so are Soverain's cases against an array of other retailers (all of which, fortunately, were filed in the same patent-friendly district).

Of course, the Soverain shopping cart patents were clearly invalid—after all, they were invalidated in part by a 1984 CompuServe ad. But not every software patent is as clear-cut; plenty of them do seem to describe something novel and original. While it is good to see the court system catching on to the biggest offenders—non-practicing entities that do nothing but issue lawsuits and extort settlements—there is a lot of ground to cover before the software patent mess is cleaned up for good.


The Humanitarian OpenStreetMap Team

By Jonathan Corbet
January 10, 2014

The OpenStreetMap (OSM) project can be thought of as the "Wikipedia of maps," Kate Chapman said in her keynote talk. It has the goal of mapping almost anything observable worldwide; it's a task that is never fully complete. OpenStreetMap has found a wide range of uses, but Kate was there to talk about one of the most interesting: providing free maps to support humanitarian and disaster response efforts. As the executive director of the Humanitarian OpenStreetMap Team (HOT), Kate has taken a leading role in this use of OSM data.

The idea behind HOT is to make use of both open source software and open data to prepare for and respond to disasters anywhere in the world. Supporting economic development is also an important goal behind HOT's work. HOT was first discussed in 2005, and first activated in 2009 to support relief efforts in Gaza which had been hampered by a lack of good maps. In that case, HOT was able to purchase satellite imagery that was then used to produce the mapping data needed by workers on the ground.

An "activation" in HOT terminology is a response from the team due to some sort of crisis or disaster. Consider, for example, the January 12, 2010 earthquake in Haiti; that event drew what Kate described as a "very organic" response from the OSM project. A contributor in Japan traced current street information for Port-au-Prince from satellite imagery provided by Yahoo for this purpose. Within two weeks, the rough map had acquired a high level of detail and was in widespread use throughout the city. For many of the responders working in Haiti, the OSM data became the preferred source of mapping information.

The response in Haiti had been complicated by the fact that the country's geographic data had been lost in the earthquake, as had many of the people who knew where the backups were. OSM was able to fill in by providing maps quickly — maps that reflected the post-earthquake state of the city. But a project like HOT is not going to be able to maintain those maps indefinitely; such an effort is better suited to locals than to a worldwide project. The next step, then, was clearly to help the Haitians learn to keep the maps current themselves. So, in March of 2010, HOT members started traveling to Haiti and running training sessions on how to create and update OSM maps of the area. Five trips were made in total, and HOT was formalized as a non-profit organization in August of that year.

Getting ahead of the game

Post-disaster mapping can be useful, but it would be even more useful to have detailed maps in place before something goes wrong. Realizing this, the HOT project decided in 2011 to get a project going in Indonesia, a country that is exposed to a variety of natural disasters. HOT joined up with the InaSAFE project, which is developing free disaster-management utilities, and ran a series of workshops in the country.

[Kate Chapman]

One of the early workshops was in the province of West Nusa Tenggara. Residents there had been working on various mapping efforts, but the results took the form of pictures created in tools like Corel Draw. HOT members went and mapped a couple of villages properly; that generated a great deal of excitement in the province, which decided to support the effort. Before too long, West Nusa Tenggara became the best-mapped province in the country.

Work then moved to Jakarta, a city of 10 million people (in a region with twice that many) made up of 267 separate urban villages. Jakarta is highly prone to flooding and would benefit from proper mapping. HOT invited each urban village head to participate in the project; a number of university students also took part. Then, in January of 2013, a terrible series of floods hit Jakarta. The OSM data was in place by then; it was used to quickly create detailed maps of the flooded areas and coordinate the response. This was the first time, Kate said, that mapping data created ahead of time had been used in a response effort.

HOT is now working on training people to maintain maps in the area. Mappers in Jakarta are making great use of Field Papers, a utility that can print out OSM data with QR codes indicating its location. Workers can walk through the area and write their notes on the printed map; the QR code then makes it easy to scan the result back into the map database. In other words, people can update the maps without needing to have a GPS receiver. The resulting maps can then be used to identify the areas that are most prone to disastrous flooding. Disaster-response plans can then be made before those floods happen.

Other projects

Indonesia is currently the largest effort being run by HOT, but it is far from the only one. The EUROSHA project is supported by the European Union; it funds volunteers on six-month deployments in Africa to do training and increase preparedness. The Open Cities Project, run in cooperation with the World Bank, is working toward the production of open maps for 100 cities in Asia that are prone to disaster. This work is being done in Bangladesh and Sri Lanka initially, and is mostly focused on training. There is a project in Senegal, supported by France and the World Bank, to create an active OSM community. HOT is also working with groups doing mapping in the tsunami-struck areas of Japan.

In the United States, the Civil Air Patrol operates a fleet of small aircraft to take aerial photographs of the country, with a focus on disaster areas. The result is a massive pile of imagery, so big that it often goes unused because there is no easy way to find the useful information. Here, working with a modified version of MapMill, HOT members put together a system allowing volunteers to sort through the imagery and identify pictures of disaster-struck areas. It was a quick effort, tossed onto GitHub and forgotten about — until Hurricane Sandy hit New York. 6000 people then used the software to help sort out the images and create a map of where the worst damage was.

There is a project with the American Red Cross to digitize satellite imagery and create maps of areas where disasters might strike. One result was a map of urban areas in Uganda containing lots of thatched-roof huts. Such areas are prone to wildfires and can benefit from advance planning and mitigation efforts.

When Typhoon Haiyan was seen to be heading toward the Philippines, HOT volunteers started doing detailed mapping a couple of days ahead of landfall on November 7. They were able to map 10,000 buildings in Tacloban, which was in the path of the storm, and tag 25% of them with their function — useful data for assessing damage and coordinating the response. What has proved harder is getting post-storm photos and data. Some satellite imagery was released by the US Department of State, but more would be useful.

Response to Haiyan was helped by the OSM task manager system, which coordinates the work that needs to be done. Volunteers can get information about the areas needing mapping work and pick up tasks to complete. In this case, 1679 people contributed to the response to the disaster, making nearly 5 million map changes. It was, Kate said, an incredible volunteer effort.

Getting involved

Those who want to help can find the current task list in the OSM task manager. HOT's software work is all freely licensed and can be found on GitHub; needless to say, patches are welcome. Those wanting to get more involved can join the technical working group calls, held every other Monday at 5PM UTC. There are also positions — some paid — for those who want to get involved on the ground. There is a wide variety of tasks to be performed; HOT is, for example, trying to set up a series of Python classes in Jakarta. There are lots of ways to become a part of this effort, Kate said in conclusion; see this page for information on how to do that.

[Your editor would like to thank for funding his travel to Perth].


Page editor: Jonathan Corbet

Copyright © 2014, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds