KS2011: Afternoon topics
Posted Oct 26, 2011 21:44 UTC (Wed) by mathstuf (subscriber, #69389)
Personally, I'm more partial to CMake than other build systems for C/C++. Some languages have it better: Python has setuptools or distribute, and Haskell has cabal. These are built with the language in mind and work fine until other things are needed (e.g., asciidoc/xmlto for man pages, doxygen for documentation, and non-standard testing suites (especially where language bindings are concerned and you can't build/use the bound language without hacks)). Other languages still have it hard, such as Java with Maven or whatever the XML horror du jour is.
What I *really* want is a build system that generates Makefiles (or Ninja files, or Visual Studio projects, or XCode projects, or what have you) (CMake), defaults to out of source builds (CMake and whatever autotools magic glibc has that no one bothers to copy), has a declarative syntax (cabal), and has no need to ship generated files (CMake, raw Makefiles).
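As a concrete illustration of the "generates Makefiles or Ninja files or IDE projects" point: with CMake the same project description drives any of those backends. A hypothetical minimal project (names are made up for the example):

```cmake
# CMakeLists.txt -- hypothetical minimal project
cmake_minimum_required(VERSION 3.10)
project(demo C)
add_executable(demo main.c)
```

From a separate build directory, `cmake -G Ninja ..` emits Ninja files, `cmake -G "Unix Makefiles" ..` emits Makefiles, and the Visual Studio/Xcode generators emit IDE projects, with all generated files landing in the build directory rather than the source tree.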
I have hacks for CMake to handle LaTeX builds (including double and triple+bibtex passes) with out-of-source builds, symlinking my dotfiles, generating cross-referenced doxygen, and more, but I think a build system that supports something more akin to make's generator rules (something like Prolog/Mercury maybe?) would be nicer to work with (CMake's escaping and argument parsing is less than ideal to manage with complicated things). Implicit know-how of supporting system copies and bundled libraries with automatic switches (which can be disabled if there are patches which make in-the-wild copies not work) would be wonderful as well. CMake's external_project_add gets close, but still has some rough edges (such as needing manual switches for system copies support).
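For readers unfamiliar with the "generator rules" being referred to: make's pattern rules describe how to derive any file of one type from the matching file of another, and make chains them backward from the requested target. A minimal sketch (the LaTeX rule is simplified; a real one would re-run pdflatex for cross-references, as discussed above):

```make
# Pattern ("generator") rules: one rule covers every .o/.c pair.
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

# Simplified: a real rule needs multiple passes plus bibtex.
%.pdf: %.tex
	pdflatex $<

doc.pdf: doc.tex refs.bib
```

A Prolog/Mercury-style system would generalize this: rules as relations between file patterns, with the build planned by logical inference rather than make's single-stem matching.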
Posted Oct 27, 2011 19:52 UTC (Thu) by pbonzini (subscriber, #60935)
mkdir build; cd build; ../configure && make
Posted Oct 27, 2011 20:11 UTC (Thu) by mathstuf (subscriber, #69389)
Out-of-source *build* is probably bad wording. More precisely: .gitignore should be empty, and git status (and the equivalent in the other VCSes) should also come up empty, starting from a tree with no generated files.
Posted Oct 29, 2011 19:05 UTC (Sat) by fuhchee (subscriber, #40059)
The cure for that is to deliberately commit the autoconf/automake-generated files.
Posted Oct 31, 2011 20:46 UTC (Mon) by mathstuf (subscriber, #69389)
Which, IMO, is worse than having them appear in the source tree after generation. The autoconf/automake *generated* files belong in the build directory.
Posted Oct 31, 2011 20:55 UTC (Mon) by fuhchee (subscriber, #40059)
If you say so. :-)
Posted Nov 1, 2011 14:30 UTC (Tue) by nix (subscriber, #2304)
It's certainly not true if your source directory is a release tarball (or other release medium). Autoconf et al should have been run for you by that point, and the result tested. That way end users don't need anything but a shell and ordinary build tools to run the build. (This is one area where cmake falls down: all the builders need a copy of it.)
Posted Nov 1, 2011 14:41 UTC (Tue) by fuhchee (subscriber, #40059)
In practice, if people are pragmatic, it's fine.
Developers can regenerate the files at will with any version that works.
In the case of version control branch merge problems, regenerate them again.
Posted Oct 28, 2011 18:51 UTC (Fri) by anatolik (subscriber, #73797)
I really like the idea of the Tup build system http://github.com/gittup which stores the graph in a local sqlite database and reparses/rebuilds the graph only when files change - this makes iterative development much more pleasant. Another cool feature is automatic dependency discovery - under the hood it uses FUSE (and FUSE-like) libraries for that (this works on Linux, Mac OS X, and Windows). And the third feature that I like is the "monitor" - an inotify-based daemon that updates the dependency graph in the background while you change files in your editor.
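For context, tup's build description looks quite different from a Makefile. A sketch of its rule syntax (as I recall it from the tup documentation; `%f`/`%o`/`%B` are input, output, and basename placeholders):

```
# Tupfile sketch: each ":" rule maps inputs through a command to outputs.
# tup records the resulting graph in .tup/db (the sqlite database) and
# uses FUSE to observe which files each command actually reads.
: foreach *.c |> gcc -c %f -o %o |> %B.o
: *.o |> gcc %f -o %o |> hello
```

Because tup watches what the commands really open, header dependencies are discovered automatically instead of being declared by hand.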
I ran some experiments with my project (100K) and found that a null build takes ~1.6 sec without the monitor and 0.09 sec with it. A null build with my gmake-based build system on the same project takes 42 secs (it parses the makefiles, builds the dependency graph, and scans files for modification, but does not run any commands).
Posted Oct 28, 2011 20:13 UTC (Fri) by mathstuf (subscriber, #69389)
It doesn't support make's -B or -n flags, though. The percentage output is nice. However, it doesn't seem to support out-of-source builds (I sort of hacked bootstrap.sh to bootstrap successfully, but there's no further support for getting a full build working). The code base looks clean, so maybe I can put together a patch and convince the maintainer to accept out-of-source builds as an option.
Posted Oct 28, 2011 20:38 UTC (Fri) by anatolik (subscriber, #73797)
> make -n
> it doesn't seem to support out-of-source builds
AFAIK the tup author is in favor of adding it. It is better to contact the mailing list, as I am not sure about his plans.
Oh yeah, I remembered a fourth thing that I like in tup - "buffered output". Output from commands is always printed atomically. No more interleaved output! Interleaved output is especially annoying when you have an error in a widely used header file - it makes the error messages really difficult to read.
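The mechanism behind "buffered output" is simple enough to sketch (this is an illustration of the technique, not tup's actual code): each parallel job's output is captured in full and only then written to the terminal in one piece.

```python
# Sketch of atomic ("buffered") output for parallel build jobs:
# capture each command's output completely, then print it in one write,
# so lines from concurrent jobs can never interleave on the terminal.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_buffered(cmd):
    # Capture stdout+stderr instead of letting the job write directly
    # to the shared terminal while it runs.
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.stdout + result.stderr

cmds = ["echo job-a line-1; echo job-a line-2",
        "echo job-b line-1; echo job-b line-2"]

with ThreadPoolExecutor() as pool:
    # pool.map preserves order; print() of a whole string is one write.
    for output in pool.map(run_buffered, cmds):
        print(output, end="")
```

With direct (unbuffered) writes, `job-b line-1` could appear between the two `job-a` lines; with capture-then-print it cannot.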
Posted Oct 28, 2011 21:00 UTC (Fri) by mathstuf (subscriber, #69389)
So if I update a system library or header, it will relink the relevant parts? That's usually the corner case I've run into.
> AFAIK the tup author is in favor of adding it. It is better to contact the mailing list, as I am not sure about his plans.
> Oh yeah, I remembered a fourth thing that I like in tup - "buffered output". Output from commands is always printed atomically. No more interleaved output!
I usually do `make -kj8` followed by a plain `make` to get readable errors, so this should help with that. I rarely do parallel builds from vim, however (<F8> is bound to "build", autodetecting CMake, make, pdflatex, cabal, rpmbuild, and a few others based on the current file), so interleaved output never really bothered me there.
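The kind of autodetection that <F8> binding does can be sketched in shell (the file-to-tool mapping here is a guess at the idea, not the actual vim config):

```shell
# Hypothetical sketch: pick a build command from the files present
# in the current directory, most specific build system first.
detect_build() {
    if [ -f CMakeLists.txt ]; then
        echo "cmake --build build"
    elif [ -f Makefile ]; then
        echo "make"
    elif ls ./*.cabal >/dev/null 2>&1; then
        echo "cabal build"
    elif ls ./*.spec >/dev/null 2>&1; then
        echo "rpmbuild"
    else
        echo "no build system detected" >&2
        return 1
    fi
}
```

The vim side is then just binding a key to run the detected command through `:make` or a terminal.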
Posted Oct 27, 2011 0:06 UTC (Thu) by mhelsley (subscriber, #11324)
The alternative is probably something vaguely like static analysis of the code. Static analysis is notoriously complicated, though, and often produces a flood of false positives.
So my guess is we'll still have humans involved in dependency generation and maintenance for quite some time -- even with use of tools like strace :).
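A toy example makes the false-positive problem concrete. The crudest static dependency analysis for C just scans for #include lines, and immediately over-reports (this is an illustrative sketch, not any real tool):

```python
# Naive static dependency analysis: scan C source text for quoted
# #include lines. Built-in false positive: an #include inside a
# disabled #if block is still reported, because nothing evaluates
# the preprocessor conditions.
import re

INCLUDE_RE = re.compile(r'^\s*#\s*include\s*"([^"]+)"', re.M)

def guess_deps(source_text):
    # Every quoted include in the text, dead code included.
    return INCLUDE_RE.findall(source_text)

src = '''
#include "config.h"
#if 0
#include "never_used.h"
#endif
'''
print(guess_deps(src))  # reports both headers, one falsely
```

A tracing approach (strace, or tup's FUSE layer) sidesteps this by recording only the headers the compiler actually opened - at the cost of needing a real build to observe.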
Posted Oct 27, 2011 2:28 UTC (Thu) by nlhepler (subscriber, #54047)
As for standardizing on a build system, I have mixed feelings about using automake. It's a PITA; even the presenters admit this. All the problems with regenerating configure, re-running autoconf, incompatibilities between versions, etc. make it an absolute pain to use. Extending something like waf or fabricate to perform all the tests that are needed (is libA around? what about libB? etc.) and to build a monolithic C function to grab platform-specific information seems like a much less painful approach. Also, fabricate is a single Python file that can easily be shipped with your package -- not the best approach, but it could give something like a make.py a fallback if fabricate isn't available system-wide.
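The "is libA around?" style of check can be done in a few lines of plain Python, which is roughly what makes the scripted-configure approach attractive (a sketch of the idea; fabricate itself does not provide this helper):

```python
# Hypothetical configure-style check in pure Python: ask the system
# linker machinery whether each shared library can be located.
from ctypes.util import find_library

def check_libs(names):
    # Map each library name to the shared object found, or None if
    # the library is not available on this system.
    return {name: find_library(name) for name in names}

print(check_libs(["c", "m", "no_such_lib"]))
```

A real replacement would also want compile-and-link test programs for feature checks, which is where the "monolithic C function" for platform-specific information would come in.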
Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds