
Integrated issue tracking with Ikiwiki (LinuxWorld)

Joey Hess looks at Ikiwiki. "Ikiwiki is a wiki engine with a twist. It's best described by the term "wiki compiler". Just as a typical software project consists of source code that is stored in revision control and compiled with make and gcc, an ikiwiki-based wiki is stored as human editable source in a revision control system, and built into HTML using ikiwiki."
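As a very rough sketch of the "wiki compiler" idea (this is not ikiwiki itself, which is written in Perl; the directory names and the third-party markdown renderer are assumptions for illustration):

    # Sketch only: walk a tree of human-editable source pages kept in
    # revision control and emit a parallel tree of static HTML, much as
    # make and gcc turn source code into binaries.
    import pathlib
    import markdown   # third-party renderer, assumed to be installed

    SRC = pathlib.Path("wiki-src")      # working copy of the wiki source
    DEST = pathlib.Path("public_html")  # static tree the web server exports

    def compile_wiki() -> None:
        for page in SRC.rglob("*.mdwn"):
            body = markdown.markdown(page.read_text(encoding="utf-8"))
            out = DEST / page.relative_to(SRC).with_suffix(".html")
            out.parent.mkdir(parents=True, exist_ok=True)
            out.write_text("<html><body>%s</body></html>" % body,
                           encoding="utf-8")

    if __name__ == "__main__":
        compile_wiki()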


Integrated issue tracking with Ikiwiki (LinuxWorld)

Posted Apr 9, 2007 1:47 UTC (Mon) by jengelh (guest, #33263) (5 responses)

Now explain what makes it different from all the other wiki implementations on the web. Take MediaWiki (what Wikipedia runs), for example: to me, it also compiles semi-markup (call it human-readable) into HTML. Or did I miss something?

Integrated issue tracking with Ikiwiki (LinuxWorld)

Posted Apr 9, 2007 7:33 UTC (Mon) by Per_Bothner (subscriber, #7375) (4 responses)

Or did I miss something?

Yes. Most Wiki engines are more like interpreters: The HTTP request is handled by a Wiki engine, which at least conceptually translates Wiki markup to HTML on the fly when the request is received.

IkiWiki compiles the Wiki markup to HTML as a "batch job" which means you can serve up the Wiki as static web pages. If you enable Wiki editing (which is optional), then you basically set up a simple CGI "trigger" to update edited pages when a page is saved. But viewing pages involves just viewing static pages, without invoking any Wiki engine. Hence IkiWiki can support much higher traffic than other engines.
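A sketch of that "compile on write, serve statically on read" split (the layout and the markdown renderer are assumptions, not ikiwiki's actual CGI code): the only dynamic piece is the save path, and it re-renders just the edited page, while reads never run any wiki code at all.

    # Sketch: the web server serves public_html/ directly; only saving a
    # page runs any code, and that code rebuilds just the edited page.
    import pathlib
    import markdown   # third-party renderer, assumed to be installed

    SRC = pathlib.Path("wiki-src")
    DEST = pathlib.Path("public_html")

    def save_page(name: str, new_text: str) -> None:
        """Called from the edit CGI when a page is saved."""
        src = SRC / (name + ".mdwn")
        src.write_text(new_text, encoding="utf-8")

        out = DEST / (name + ".html")
        out.parent.mkdir(parents=True, exist_ok=True)
        out.write_text("<html><body>%s</body></html>"
                       % markdown.markdown(new_text), encoding="utf-8")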

This approach makes a lot of sense to me. Replication, load balancing, and backup also seem to be handled more flexibly when you separate presentation from editing and use SVN (or similar) as your "database".

Integrated issue tracking with Ikiwiki (LinuxWorld)

Posted Apr 9, 2007 9:23 UTC (Mon) by jengelh (guest, #33263) (2 responses)

IkiWiki compiles the Wiki markup to HTML as a "batch job" which means you can serve up the Wiki as static web pages. If you enable Wiki editing (which is optional), then you basically set up a simple CGI "trigger" to update edited pages when a page is saved. But viewing pages involves just viewing static pages, without invoking any Wiki engine. Hence IkiWiki can support much higher traffic than other engines.

While it may be true that MediaWiki generates HTML on the fly, Wikipedia's use of accelerator squids should mitigate that. After editing a page I have seen the old version keep being displayed until I forced a refresh (Shift-Reload in the browser), which suggests the caching already works in a sort of ikiwiki style (compare: Wikipedia contacts squid and serves a cached page; ikiwiki contacts Apache and serves pre-generated static HTML). Whether Wikipedia would perform better with what it runs now or with ikiwiki remains speculation at this point.

Integrated issue tracking with Ikiwiki (LinuxWorld)

Posted Apr 9, 2007 12:57 UTC (Mon) by k8to (guest, #15413) (1 response)

Not to disagree with what you said, which is certainly accurate, but at least for smaller operations, not needing a caching server is a significant advantage. Squid is certainly useful, but it is an extra piece of touchy software to configure and to watch for problems. It seems useful to have the problem solved mostly on the origin server.

Integrated issue tracking with Ikiwiki (LinuxWorld)

Posted Apr 9, 2007 14:41 UTC (Mon) by dmarti (subscriber, #11625)

The other big win from Ikiwiki is that it handles two people trying to edit the same page much more smoothly, since it uses your revision control system for that. From the article:

"Ikiwiki uses your revision control system to track changes and handle tasks such as rolling back changes and merging edits. Because it takes advantage of revision control, there are no annoying warnings about other people editing a file, or finding yourself locked out of a file because someone else started editing it and left. Instead, the other person's changes will be automatically merged with yours when you commit."

And when you check out your project code, you also get a copy of your project wiki and bug tracker that you can use offline. The article has more on how you can make one commit that both fixes a bug and flags the bug report as done; revert that commit and the bug reappears.
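A sketch of that workflow, assuming the article's convention that a bug is a wiki page under bugs/ closed by tagging it "done" (the directive syntax and svn usage here are illustrative assumptions):

    # Sketch: close the bug report in the same commit as the code fix,
    # so reverting that one commit also reopens the bug.
    import pathlib
    import subprocess

    WIKI = pathlib.Path("wiki-src")

    def fix_and_close(bug_name: str, message: str, changed_files: list) -> None:
        bug_page = WIKI / "bugs" / (bug_name + ".mdwn")
        with bug_page.open("a", encoding="utf-8") as f:
            f.write("\n[[tag done]]\n")   # tag directive syntax assumed
        subprocess.run(["svn", "commit", "-m", message,
                        str(bug_page)] + list(changed_files), check=True)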

Integrated issue tracking with Ikiwiki (LinuxWorld)

Posted Apr 10, 2007 10:50 UTC (Tue) by epa (subscriber, #39769)

The biggest advantage of generating static pages is that it is much less likely to cause security holes. I like TWiki, for example, but it has a huge mass of Perl code running for every page fetch and fiddling around on disk. (And yes, it did have exploitable holes and probably still does.) If all that's visible to the world is a static tree of files, it's harder to attack.

Of course, if you want to give the public write access, then you still need some dynamic code to handle edits. But Ikiwiki is nice for the case where you just want to edit the pages yourself (after presenting a username and password to the web server), or to restrict editing to a small group of people.


Copyright © 2007, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds