
Evolution of shells in Linux (developerWorks)

This developerWorks article looks at the history of Unix shells. "Ken Thompson (of Bell Labs) developed the first shell for UNIX called the V6 shell in 1971. Similar to its predecessor in Multics, this shell (/bin/sh) was an independent user program that executed outside of the kernel. Concepts like globbing (pattern matching for parameter expansion, such as *.txt) were implemented in a separate utility called glob, as was the if command to evaluate conditional expressions. This separation kept the shell small, at under 900 lines of C source".

Odd choice for a title ...

Posted Dec 7, 2011 23:22 UTC (Wed) by JoeBuck (subscriber, #2330) [Link] (5 responses)

... considering that all of the major shells described predate Linux, which did not exist until 1991. There's nothing "in Linux" about the topic.

Odd choice for a title ...

Posted Dec 8, 2011 0:37 UTC (Thu) by theophrastus (guest, #80847) [Link] (4 responses)

so you're suggesting that there ought to be a linsh that only understands an unpublished binary format based on guile?

Odd choice for a title ...

Posted Dec 8, 2011 1:48 UTC (Thu) by JoeBuck (subscriber, #2330) [Link]

Of course not. Rather, Linux is just a kernel, and the Posix world is much older and bigger. Linux was initially built so the pre-existing GNU and BSD userland utilities could run on cheap hardware; the shells came first.

Odd choice for a title ...

Posted Dec 8, 2011 14:51 UTC (Thu) by felixfix (subscriber, #242) [Link] (2 responses)

He's only saying there's a mismatch between the title and text. Usually the fix is to rewrite the title, not the text.

Odd choice for a title ...

Posted Dec 8, 2011 19:19 UTC (Thu) by jzbiciak (guest, #5246) [Link] (1 responses)

Yeah. A better title would have been "A history of the shells that are available in Linux."

Odd choice for a title ...

Posted Dec 9, 2011 12:17 UTC (Fri) by nix (subscriber, #2304) [Link]

"A history of a few of the shells that are available in Linux", but not zsh.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 0:37 UTC (Thu) by lkundrak (subscriber, #43452) [Link] (4 responses)

One of the interesting things about early UNIX shells (Thompson's, up to the Sixth Edition) was that exit and goto were not built-ins, but simple programs that seeked the file descriptor open on the running script appropriately: either to the beginning, proceeding to a given label, or to the end. Also, Bourne's Seventh Edition shell is almost universally considered to be some of the most horrible C code ever written. From its mac.h source file:
#define LOCAL   static
#define PROC    extern
#define TYPE    typedef
#define STRUCT  TYPE struct
#define UNION   TYPE union
#define REG     register
#define IF      if(
#define THEN    ){
#define ELSE    } else {
#define ELIF    } else if (
#define FI      ;}
#define BEGIN   {
#define END     }
#define SWITCH  switch(
#define IN      ){
#define ENDSW   }
#define FOR     for(
#define WHILE   while(
#define DO      ){
#define OD      ;}
#define REP     do{
#define PER     }while(
#define DONE    );
#define LOOP    for(;;){
#define POOL    }
#define SKIP    ;
#define DIV     /
#define REM     %
#define NEQ     ^
#define ANDF    &&
#define ORF     ||
#define TRUE    (-1)
#define FALSE   0
Thankfully, many of the macros, such as DIV, are not actually used anywhere. This is what the code actually looks like:
VOID    fault(sig)
        REG INT         sig;
{
        REG INT         flag;

        signal(sig,fault);
        IF sig==MEMF
        THEN    IF setbrk(brkincr) == -1
                THEN    error(nospace);
                FI
The code quoted above also shows another rather creative technique used by the shell: it is the signal handler for memory faults, which simply enlarges the heap with setbrk() when an out-of-bounds access is signalled by the kernel. Needless to say, there are not many calls to malloc() in the source...

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 5:04 UTC (Thu) by tshow (subscriber, #6411) [Link] (1 responses)

Ok, the pair of LOOP and POOL has to win some sort of award for being spectacularly awful. Was this done on a bet?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 14:35 UTC (Thu) by nix (subscriber, #2304) [Link]

He liked Algol. A lot. Hence BOURNEGOL.

I have never seen another program written in BOURNEGOL, which is a real pity: the more programming becomes a matter of manipulating something deeply disgusting like that, the fewer people will want to be programmers and the higher our paycheques go :P

(similarly, I propose rotating knives and Great Cthulhu in job interviews.)

*removes tongue from cheek*

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 6:28 UTC (Thu) by BrucePerens (guest, #2510) [Link]

Oh yes, Steve Bourne's MACRO ALGOL.

I remember being recruited for a company where they were pushing that we would get to work with Steve Bourne! Now, I have never met him and don't know anything about him except for how much I hated that MACRO ALGOL when I was trying to fix something in v7 shell. I don't think I explained to the recruiter why the prospect of working with him had me nonplussed. But I'm sure he heard about it eventually.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 14:33 UTC (Thu) by nix (subscriber, #2304) [Link]

> Needless to say, there are not many calls to malloc() in the source

This would be because it predates malloc(). (Remember your K&R first edition, with no mention of malloc() but discussion of an alloc() memory allocator that you might wish to write. And that was 1978...)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 5:47 UTC (Thu) by imgx64 (guest, #78590) [Link] (17 responses)

> Shells are like editors: Everyone has a favorite and vehemently defends that choice (and tells you why you should switch).

I'm glad this is less true these days; most people just use Bash because it's the default on their distribution and never think about it. I wasted too much time thinking about editors and trying different ones until I settled on Vim. I do not want to repeat that for shells.

Although to be honest, I did contemplate ZSH at one point, and I liked many of its features. The ones that stand out the most are the autocomplete (for example, autocomplete for umount only shows mounted devices), and ZLE (the line editor, I wish Bash/Readline could edit multi-line history as easily).

But then, the good is the enemy of the perfect (apologies to Voltaire), so I decided that Bash is good enough, and that switching and learning all these new features is not worth my time.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 6:47 UTC (Thu) by lindahl (guest, #15266) [Link] (12 responses)

The nice thing about individual nice features is that other people steal them. Autocomplete great in zsh? Here is that feature in bash:

complete -A hostname ssh ping traceroute mtr

It doesn't seem to have an 'action' for mounted filesystems.

Still, I have to finish by mentioning that you should definitely switch to emacs :-)
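Since bash lacks a ready-made "mounted filesystems" action, one can be sketched with a custom completion function. This is an illustration, not part of any package; the function name is mine, and it assumes a Linux /proc/mounts:

```shell
# Offer only currently mounted mount points when completing umount.
_umount_mounted_points() {
    local cur=${COMP_WORDS[COMP_CWORD]}
    # Field 2 of /proc/mounts is the mount point; compgen filters by prefix.
    COMPREPLY=( $(compgen -W "$(awk '{print $2}' /proc/mounts)" -- "$cur") )
}
complete -F _umount_mounted_points umount
```

Mount points containing spaces would need extra quoting care; this is the minimal version.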

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 7:24 UTC (Thu) by wahern (subscriber, #37304) [Link] (5 responses)

When I'm in bash or ksh every third command seems to be pwd. I'll use bash when bash learns RPROMPT. The right-hand prompt is the greatest thing since sliced bread. No approximations, please.
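For reference, the zsh feature being asked for is a one-line setting (zsh only; %~ is zsh's prompt escape for the current directory with ~ abbreviation):

```shell
# zsh: show the current directory at the right-hand edge of the line.
# The right prompt vanishes automatically when typed text reaches it.
RPROMPT='%~'
```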

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 8:42 UTC (Thu) by thedevil (guest, #32913) [Link] (2 responses)

You know about PS1 and the magic characters you can use in it, right?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 9:31 UTC (Thu) by HelloWorld (guest, #56129) [Link] (1 responses)

What part about "right-hand prompt" and "no approximations" did you not understand?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 18:33 UTC (Thu) by clump (subscriber, #27801) [Link]

> What part about "right-hand prompt" and "no approximations" did you not understand?

Not helpful. There are many ways to have the prompt echo the current working directory; using a right-hand prompt in zsh is one of them. No need to attack people for mentioning that.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:26 UTC (Thu) by imgx64 (guest, #78590) [Link]

You could use a newline in PS1 and have the current directory above the prompt. It's not the same as RPROMPT, but I suppose it solves the problem.

But then again, I don't think you'll really "use bash when bash learns RPROMPT", simply because there is no reason to use Bash if you're happy with Zsh[1].

[1] I don't want to start a shell wars thread, but I'm curious if Bash has any advantages over Zsh other than ubiquity and the user being familiar with Readline (has a customized .inputrc for example).
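As a sketch, the newline workaround mentioned above looks like this in bash (\w expands to the working directory, \$ to # for root and $ otherwise):

```shell
# bash: print the working directory on its own line above the prompt.
PS1='\w\n\$ '
```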

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 14:42 UTC (Thu) by nix (subscriber, #2304) [Link]

Can't use zsh then. It's got several different types of approximate matching. :)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 14:40 UTC (Thu) by nix (subscriber, #2304) [Link] (5 responses)

I'm reasonably certain that the zsh autocompletion system cannot be emulated by bash to any real degree. It's the single most overdesigned autocompletion system I've ever heard of, knocking the socks off bash and even Emacs. Half of it is written in byte-compiled zsh script, and a description of it occupies half the zsh manual, over 200 pages!

(But it is seriously awesome despite the ridiculous overdesign.)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 14:50 UTC (Thu) by nye (subscriber, #51576) [Link] (1 responses)

>I'm reasonably certain that the zsh autocompletion system cannot be emulated by bash to any real degree

Can you give an example of the kind of autocompletion that can be done in zsh but not in bash?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 16:44 UTC (Thu) by nix (subscriber, #2304) [Link]

You can ask for 'fuzzy completion of each element of a directory separately, displaying completion output in one of several types of menu, grouped by type of entry (e.g. option -- automatically determined from --help output -- versus filename versus directory versus USENET group name versus file descriptor number versus a million other things), with the appearance of each type independently changeable, not duplicating autocompletion entries in a single command if and only if the command is rm, doing spelling correction on directories matching this glob but only for this subset of commands' if you like. And that's just one example I happen to be using.

bash has nothing remotely comparable.

It is *crazy* flexible, so flexible that there is an autoloaded 'compinit' function just so that normal mortals stand a chance of configuring its *default* setup. (This is the 'zshcompsys' completion system, btw, not the 'zshcompctl' system, which is akin to bash's, and is obsolete.)
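For the curious, a minimal taste of the configuration style being described (zsh only; the zstyle patterns here are illustrative examples, not a recommended setup):

```shell
# Load and initialise the new-style completion system (compsys).
autoload -Uz compinit && compinit
# Present matches in a selectable menu.
zstyle ':completion:*' menu select
# Case-insensitive matching, plus partial matching at ._- word breaks.
zstyle ':completion:*' matcher-list 'm:{a-z}={A-Z}' 'r:|[._-]=* r:|=*'
# For rm only, don't offer names already present on the command line.
zstyle ':completion:*:*:rm:*' ignore-line yes
```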

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 15:48 UTC (Thu) by joey (guest, #328) [Link] (1 responses)

The autoloading of completions on demand is the only reason I still use zsh. When I configured bash to load all completions, starting a shell took enough time to be annoying.

Evolution of shells in Linux (developerWorks)

Posted Dec 14, 2011 0:01 UTC (Wed) by tertium (guest, #56169) [Link]

Then you might find this useful: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=467231 (disclaimer: I'm the original reporter). The script still works fine for me; I only added "declare -p bash4 2>/dev/null" when I upgraded to bash 4.
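The idea in that bug report, loading each command's completion only on first use, can be sketched with bash's default completer (bash ≥ 4.1; the function name is mine, and the completions directory is the usual bash-completion location, which may differ):

```shell
# Called by readline when no completion is registered for command $1.
# Returning 124 tells readline to retry completion, now that the
# just-sourced file has (hopefully) registered one.
_lazy_completion_loader() {
    . "/usr/share/bash-completion/completions/$1" 2>/dev/null && return 124
}
complete -D -F _lazy_completion_loader
```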

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 23:34 UTC (Thu) by nevyn (guest, #33129) [Link]

Can bash do the auto-complete cycling thing that zsh does?

That's probably at least 50% of the reason I still use zsh, the other 90% being that I configured zsh like 16 years ago and am happy to not have to configure bash when "yum install zsh" works instead :).
RPROMPT is nice too though ;).

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 6:56 UTC (Thu) by geuder (subscriber, #62854) [Link] (1 responses)

> The ones that stand out the most are the autocomplete (for example,
> autocomplete for umount only shows mounted devices),

Autocomplete behaviour is not really built in; or, more exactly, the built-in default behaviour is not suitable for many commands, e.g. umount. Bash does offer an API for everybody (i.e. every command) to get customized autocomplete behaviour.

I have always wondered why so many commands do not install suitable autocompletion scripts for bash. But hey, it's open source. Maybe we should comment less and code more... Contributing those scripts might be more laborious than writing them, though: they don't belong to bash itself, so you have to work with every command's upstream.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 14:43 UTC (Thu) by nix (subscriber, #2304) [Link]

Or just contribute them to the bash-completion project's contrib directory. It installs autocompletion scripts for a lot of unrelated projects already...

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 8:28 UTC (Thu) by rvfh (guest, #31018) [Link]

> for example, autocomplete for umount only shows mounted devices

Ubuntu does that for me :-) Just checked on 10.04 at work.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:13 UTC (Thu) by fb (guest, #53265) [Link]

For some reason, I could never get over bash.

At multiple moments in the 90s and early 00s, I got so annoyed with tcsh (or with the lack of it pre-installed in many places where I had no su powers) that I tried to switch to bash. Every time I got so annoyed that I remained with tcsh as my main shell.

IIRC bash had hard limits on what I could do in a prompt (short of running an embedded command), and at the time it had no completion.

After learning about ZSH, I lived happily ever after. (Note that I learned about it from a friend talking about a shell in which I could play Tetris.)

But you are right about 'good enough': there is a time in life when you can afford to play around with different shells, editors, etc.; later, the cost and benefit of migration just don't make sense.

[...]

FWIW, ZSH 4.3.13 was released yesterday.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 6:47 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (156 responses)

Who cares? Bash and zsh (with an occasional busybox on embedded systems) are what people have stagnated on.

Microsoft does a wonderful job with PowerShell; it's getting better than anything Unix has ever had.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 9:10 UTC (Thu) by tuxmania (guest, #70024) [Link] (68 responses)

I hope you are joking, because there is nothing new or novel in PowerShell at all. It's convoluted compared to most other shells because MS has gone out of their way not to use old but very functional concepts like pipes.

I have used PS a fair bit and the thing that stands out is how much work has been put into being different for the sake of being different, not better.

If Microsoft had instead ported bash onto Windows, they would have an army of good scripters. Now, the MS fanboys I know just sneer at PS and discard it as "that crap from Linux", and the people who would know how to handle it are put off by its stupidity. It's a fail-fail situation, sadly.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 9:36 UTC (Thu) by HelloWorld (guest, #56129) [Link] (67 responses)

> I hope you are joking because there is nothing new or novel in Powershell at all.
Yeah right, except that it's unlike all Bourne-style shells out there. Its pipes exchange structured data, not just an unstructured byte stream. This is about as revolutionary as it gets as far as shells are concerned.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 9:54 UTC (Thu) by tuxmania (guest, #70024) [Link] (66 responses)

Well, in my ten years of shell scripting, I have never ever stumbled upon a problem that could not be solved easily. If the answer is structured data, your question is fundamentally different from your customers' questions.

Many things are much harder to do in PowerShell despite this "structured data" holy grail you talk about. Why? Because third-party support for PowerShell is abysmal, and even Microsoft tends to avoid it like the plague in many places, especially where there is a GPO for roughly the same thing somewhere, making it next to impossible to do what you really need to do.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:13 UTC (Thu) by HelloWorld (guest, #56129) [Link] (65 responses)

> Well, in my ten years of shell scripting, i have never ever stumbled upon a problem that could not be solved easily.
You must be a pretty unheeding person then. For example, the sort(1) utility is almost useless because pipes are just byte streams. Try sorting the output of, say, "aptitude show '~i'" by version. You can't, because "aptitude show" doesn't output one package per line (as that would be too long) and even if it did, you can't tell sort how to compare (i. e. by version instead of lexicographically).
Sorry, but UNIX shells are simply broken by design. The fact that they can be used to do useful things occasionally doesn't change that.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:43 UTC (Thu) by mpr22 (subscriber, #60784) [Link] (12 responses)

Um, you're not citing a deficiency in the shell. You're citing deficiencies in the programs made available to it.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:51 UTC (Thu) by HelloWorld (guest, #56129) [Link] (11 responses)

Yes in the sense that this problem can't be overcome by modifying the shell itself. No in the sense that the lack of some structured data format beyond byte streams is clearly a flaw in the original design of UNIX and its shell.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 18:09 UTC (Thu) by niner (subscriber, #26151) [Link] (10 responses)

You could also argue in the other direction: aptitude simply does not adhere to the convention set by the shell. The shell and scripting tools want line-based data, while aptitude prints multi-line records. In such a situation, PS would not help. The analogous PS failure would be a program stuffing all its output into a single string field despite PS mandating structured output.

Having other conventions does not help magically. It's adherence to such conventions that makes a system work.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:29 UTC (Thu) by HelloWorld (guest, #56129) [Link] (9 responses)

> You could also argue in the other direction. aptitude simply does not adhere to the convention set by the shell. The shell and scripting tools want line based data while aptitude prints multi line records.
Even if aptitude printed one line per package (which, given the amount of information to be printed, would result in unreadable garbage), it still wouldn't work. There's no convention about which character encoding to use (which is why pipes really aren't text streams but merely byte streams), there's no convention about the separator character, and there are no escaping rules; it's all just a huge clusterfuck.

> Having other conventions does not help magically. It's adherence to such conventions that makes a system work.
It's not about other conventions, it's about having conventions *at all*, and UNIX doesn't. Tools like cut(1) and awk(1) allow you to set the input record separator precisely because of that: there is no convention.

By the way, there's another, related problem in Unix shell design: the shell does the globbing, but it doesn't interpret the arguments, as that is left to the program being executed. But the program has no way to know whether an argument was produced by globbing or not, which gives you trouble when you have files named --foo around. It's all so broken...

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 2:30 UTC (Fri) by vonbrand (subscriber, #4458) [Link] (8 responses)

> It's not about other conventions, it's about having conventions *at all*, and UNIX doesn't. Tools like cut(1) and awk(1) allow you to set the input record separator precisely because of that: there is no convention.

Right. Because you never have to sort stuff separated by anything but TAB. BTW, you can sort version-wise by sorting numerically and separating on '.'
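An illustration of that, assuming GNU coreutils' sort, which has a version comparator (-V, available since coreutils 7.0) that handles dotted numbers directly:

```shell
# Version sort compares numeric components numerically, so 1.10 > 1.9.
printf '1.10\n1.2\n1.9\n' | sort -V
# 1.2
# 1.9
# 1.10
```

Paired with a line-per-package producer such as `dpkg-query -W -f '${Version} ${Package}\n'`, a `| sort -V` then sorts installed packages by version without any structured pipes.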

> By the way, there's another, related problem in Unix shell design: the shell does the globbing, but it doesn't interpret the arguments, as that is left to the program being executed. But the program has no way to know whether an argument was produced by globbing or not, which gives you trouble when you have files named --foo around. It's all so broken...

That is exactly the reason why many programs interpret "--" as "end of switches, rest is arguments". Besides, I have had the dubious pleasure of working with systems where each program was in charge of interpreting globs, and not too surprisingly each one did it its own way (or not at all), to all-around confusion.
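Concretely, in a throwaway directory (the ./ prefix works even with tools that don't honour --):

```shell
cd "$(mktemp -d)"
touch -- --foo    # '--' ends option parsing, so --foo is taken as a filename
ls ./--foo        # a ./ prefix can never be mistaken for an option
rm -- --foo
```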

This is a command interpreter for interactive use we are talking about here, for crying out loud. That you can do a surprising amount of programming in it is a welcome bonus, so you don't have to grab a full-fledged programming language all the time (in many cases you wouldn't bother, and would do it by hand or just give up). And most of the "problems" mentioned are just that the programs involved aren't built for easy chaining (can't ask for one-line records, hard-to-parse output, ...), not some "Unix failure" per se.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 8:56 UTC (Fri) by HelloWorld (guest, #56129) [Link] (7 responses)

> That is exactly the reason why many programs interpret "--" as "end of switches, rest is arguments."
I know, but nobody ever thinks of using that in practice, which makes it useless in practice. Not to mention that many tools don't support it at all.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 10:24 UTC (Fri) by nicku (guest, #777) [Link] (6 responses)

>> That is exactly the reason why many programs interpret "--" as "end of switches, rest is arguments."
> I know, but nobody ever thinks of using that in practice, making it useless in practice.

I don't understand why you say this. I and my workmates use this whenever required.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 10:29 UTC (Fri) by HelloWorld (guest, #56129) [Link] (5 responses)

> I don't understand why you say this.
Because that's the way that it is.

> I and my workmates use this whenever required.
Then you're obviously more attentive than the average shell programmer.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 10:33 UTC (Fri) by dlang (guest, #313) [Link]

How large a study did you make to determine that the 'average shell programmer' doesn't know this?

And how do you define 'shell programmer'?

I would say that people who make their living on a team working with the shell (sysadmin types) have a pretty good chance of knowing this. People who are tinkering and learning alone are less likely to have learned it, but such people are far more likely to have gaps of all kinds in their knowledge.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:10 UTC (Fri) by mpr22 (subscriber, #60784) [Link] (3 responses)

I think it's safe to say that for any widely-used programming language, there is a sizeable group you can more or less reasonably describe as being "average $LANGUAGE programmers" who are not very good at adhering to best practice. So, saying "the average $LANGUAGE programmer doesn't do $BEST_PRACTICE_ITEM" seems likely to be true, obvious, and unenlightening.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:32 UTC (Fri) by HelloWorld (guest, #56129) [Link] (2 responses)

The question is what consequences to draw from it. And the answer is obvious: solve the problem some other way so that the best practice isn't needed any longer. What this boils down to is that shell programming should be avoided wherever possible.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 13:06 UTC (Fri) by dlang (guest, #313) [Link] (1 responses)

by that logic, no language should ever be used for programming

people keep trying to produce things that can be extended to do things that weren't initially programmed in via config files of various kinds, but eventually every one of these config files grows into (or adopts) some sort of programming language.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 13:23 UTC (Fri) by HelloWorld (guest, #56129) [Link]

> by that logic, no languange should ever be used for programming
Well, this is not a binary thing. The thing is that in order to write reliable shell scripts, you need to jump through hoops *all the time* (i.e. every time you use a glob, every time you use sed in a locale other than the one you tested your script with, etc.) in order to stop bad things from happening.
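Two of the glob-related hoops being alluded to, as a bash sketch (nullglob is a bash/zsh option, not POSIX sh):

```shell
# Without nullglob, an unmatched *.txt is passed through literally,
# so the loop body would run once with f='./*.txt'.
shopt -s nullglob
# The ./ prefix keeps a file named --foo.txt from being parsed as an option
# by whatever command eventually receives it.
count=0
for f in ./*.txt; do
    count=$((count + 1))
done
echo "$count .txt files here"
```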

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:53 UTC (Thu) by lkundrak (subscriber, #43452) [Link] (1 responses)

Please, show me how to sort output of aptitude show '~i' with powershell.

Thank you.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 11:03 UTC (Thu) by HelloWorld (guest, #56129) [Link]

Oh my, what a poor trolling attempt.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 13:45 UTC (Thu) by cortana (subscriber, #24596) [Link] (30 responses)

I'll bite. The 'sort' utility wants to sort line-based output. The output of aptitude is not line-based. It uses the Debian RFC822-like format. The right thing to do would be to find a tool that sorts this output based on a particular field. You could argue that there's a hole here that could be filled by some kind of sort-dctrl tool.

...

I wrote that paragraph before I noticed that I already have a sort-dctrl program on my system; it's part of dctrl-tools. Unfortunately it doesn't actually work with aptitude's output; aptitude doesn't output deb822 data... it doesn't use a . to represent blank lines in its fields. :)

Anyway, I actually agree that PowerShell is pretty cool. I find it utterly unusable, however, because it's still welded to the utterly dreadful excuse for a terminal emulator that is conhost.exe. In 2011, essential features such as multiple tabs, run-based selection and the ability to click URLs are entirely absent, and the features that do exist, such as resizing windows, block based selection and changing the font/colours are buried behind a UI that seems to be actively designed to get in my way, making me want to nuke the computer and install Debian rather than be forced to use them for even one minute longer.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 14:54 UTC (Thu) by HelloWorld (guest, #56129) [Link] (27 responses)

> The 'sort' utility wants to sort line-based output.
Yes, and that is precisely what makes it useless. And actually, it can't even sort lines properly; try sorting, say, the output of ps by ppid or something. Yeah, it's doable, but it's messy, and it needn't be.

> I wrote that paragraph before I noticed that I already have a sort-dctrl program on my system; it's part of dctrl-tools. Unfortunately it doesn't actually work with aptitude's output; aptitude doesn't output deb822 data... it doesn't use a . to represent blank lines in its fields. :)
That pretty much proves the point. Now we already have two sorting utilities that are useless because they can't sort what I want them to. Contrast with PowerShell's Sort-Object, which makes it trivial to sort processes listed by Get-Process.

> Anyway, I actually agree that PowerShell is pretty cool. I find it utterly unusable, however, because it's still welded to the utterly dreadful excuse for a terminal emulator that is conhost.exe.
The terminal emulator can easily be replaced, for example with this one:
http://sourceforge.net/projects/console/
On the other hand, a fundamental problem like the lack of some kind of structured data exchange format can't be solved that easily.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 15:35 UTC (Thu) by cortana (subscriber, #24596) [Link] (10 responses)

You're complaining that the sort utility is useless because it can't sort every possible thing you would ever want it to sort. I'm saying that is a misunderstanding of what sort is supposed to do: it only sorts line-based data. For example, "ps -o ppid,pid | sort -n" seems to work quite nicely. The sort-dctrl example is unfortunate, but not terribly surprising; the output of aptitude is, after all, designed for human consumption. 'grep-status -F installed | sort-dctrl -k version:v' happily tells me that fonts-sil-gentium is the package installed with the highest version number (20081126:1.02-12, in an amazing misunderstanding of the purpose of the epoch part of the Debian version numbering scheme...)

I'm not saying that PowerShell isn't terribly clever and all that, but I don't see every tool I use on a daily basis being rewritten to output its data in some standard structured format, and consume data in that format. As a result, actually using Powershell is annoying because it comes with some nice built-in stuff, but not enough third-party programs bother to make use of it. At best you end up with a bunch of scripts that parse the structured output of other programs and turn them into whatever it is that PowerShell uses internally so that Sort-Object & co. will deal with them. At that point you may as well just use a programming language that provides a decent selection of data structures in the first place (e.g., python, perl), with a module that gives you the data you want in such a structure.

Finally, and to digress, thanks for the link to Console2. Everyone who uses Windows should have it installed. Although it makes using the command line on Windows a much less frustrating experience, it doesn't do all of what I want and sadly introduces its own quirks into the mix.

It's funny you say that the lack of a standard structured data format is a fundamental problem with Unix; after many years of trying various conhost replacements, I've come to the conclusion that there is in fact something terribly wrong within some fundamental layer of Windows itself that makes it impossible to implement a decent terminal emulator. If that were not the case then surely someone would have done so by now; the awfulness of the command prompt on Windows is not so much an itch that needs scratching as a foot-long needle through the eye that inflicts a constant searing pain upon all users of the platform.

Whereas I think the problem with a standard for structured data is that there are already plenty of standards to choose from: line-based character-separated or deb822 if you merely want a flat list of data with attributes; JSON or XML if you want something that can represent a tree of data, to name but a few. The power of Unix's use of unstructured data is that you are able to pick and choose which of these formats, plus many others, you use when gluing together other bits of software.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 16:07 UTC (Thu) by halla (subscriber, #14185) [Link]

I always thought that the Windows console was intentionally painful to make people stop using it and learn to love the gui.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 16:47 UTC (Thu) by nix (subscriber, #2304) [Link]

> the awfulness of the command prompt on Windows is not so much an itch that needs scratching as a foot-long needle through the eye that inflicts a constant searing pain upon all users of the platform

Now that's a QoTW candidate if ever I've seen one.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:16 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (4 responses)

<flamebait mode>
Heh. I have COMPLETELY opposite experience. Console on Linux sucks and can't be fixed by any means short of going back in time and shooting K&R before they start writing the first line of Unix code.

Standard _console_ _emulator_ on Windows definitely sucks, that's true. But the console layer itself is quite robust and easy to use.

For example, there is no freaking way in Unix consoles to detect if a modifier key is pressed. Then there's the fact that Unix consoles are in reality a result of adding features to something that initially had been a line printer output.

As a result, making applications with a good text UI in the console is close to impossible. That's why Midnight Commander is such a buggy POS. In fact, Unix consoles are so bad that tmux/screen had to implement console emulators working on top of console emulators, because it's impossible to reliably track the state of a console.
</flamebait mode>

I really really love one good Windows console application - FAR Manager. It's far ahead of everything else on the planet for console-style work.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:19 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (3 responses)

> But the console layer itself is quite robust and easy to use.

Are you on the payroll, or are you just completely unacquainted with history and actual programming?

The Windows console layer has no pseudoterminals. That's why there are a dozen remote-command-invocation facilities in Windows, and it's why none of them works right. It's why Console2 fails badly when talking to some programs (e.g., cmd): Windows programs are forced to use pipes instead of consoles, so programs run under a console other than conhost itself don't even know they're being used interactively, resulting in various horrible issues the Unix world fixed 20 years ago.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:33 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (2 responses)

No. I'm just trying to port FAR Manager to Linux without going insane. Alas, sanity is fleeing fast so I'll soon need some of Linus' meds.

>The Windows console layer has no pseudoterminals.
Which is good, because pseudoterminals are a braindead idea. It's essentially a channel to transmit bytecode which is executed by the terminal emulator to draw text.

It would have been OK, but terminal emulators suffer from terminal schizophrenia. They try to act as if they are line printers while at the same time trying to act as if they are driving a VT100 console. Which fails badly.

In Windows, consoles are just consoles, i.e. a screen buffer on which you can draw text, and that's basically all. You can do whatever you want with it.

Instead of pseudoterms you can create child consoles (including invisible ones) and use them to start other programs. I'm not sure why console2 doesn't do it.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:44 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (1 responses)

> No. I'm just trying to port FAR Manager to Linux without going insane.

Consider using the VTE widget from GNOME: http://developer.gnome.org/vte/0.30/

> It's essentially a channel to transmit bytecode which is executed by the terminal emulator to draw text.

Yes, the unix terminal system uses in-band signaling. In-band signaling has advantages and disadvantages with respect to out-of-band signaling, but in the case of the tty layer, the advantages outweigh the disadvantages, which have all been overcome. The chief advantage is being able to use a terminal-agnostic channel (like a TCP connection) over which a terminal program and a terminal emulator can communicate. The Windows world has no equivalent because its console uses opaque out-of-band signaling.

> Which fails badly.

No, the terminal system actually works very, very well in practice.

> Instead of pseudoterms you can create child consoles (including invisible ones) and use them to start other programs. I'm not sure why console2 doesn't do it.

The reason console2 doesn't use the technique you describe is that it doesn't work. Yes, you can create "child consoles" and associate processes with them, but there's no "master side" interface to these child consoles: there is no way to extract text from them, to see what your child programs are attempting to output and output it yourself in a controlled manner. Yes, you can scrape the console, but the scraping process has inherent race conditions. (And scraping is messy besides: you're complaining about terminal control codes?)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:54 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

>Consider using the VTE widget from GNOME: http://developer.gnome.org/vte/0.30/

I want it to work in plain text consoles (over SSH). There are more than enough GUI file managers. Right now I'm trying to layer it over tmux, and I'm slowly making progress.

>Yes, the unix terminal system uses in-band signaling. In-band signaling has advantages and disadvantages with respect to out-of-band signaling, but in the case of the tty layer, the advantages outweigh the disadvantages, which have all been overcome.

I don't mind the idea of transmitting bytecode for drawing. I just don't like the way it works right now. It's messy, ugly, buggy and so on.

Have you ever tried to read the code of terminal emulators? Nightmares of escape sequences still plague my dreams :)

>The reason console2 doesn't use the technique you describe is that it doesn't work. Yes, you can create "child consoles" and associate processes with them, but there's no "master side" interface to these child consoles: there is no way to extract text from them, to see what your child programs are attempting to output and output it yourself in a controlled manner. Yes, you can scrape the console, but the scraping process has inherent race conditions. (And scraping is messy besides: you're complaining about terminal control codes?)

You can associate a child console with a parent, which would make it visible. But yes, I guess it's one weak spot. As far as I remember, conman (console manager) worked around it by intercepting console layer calls by injecting a DLL into the address space of child processes.

If I were to design a console layer right now, I'd have chosen the Windows model with an explicitly defined marshalling protocol.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:50 UTC (Thu) by HelloWorld (guest, #56129) [Link] (2 responses)

> You're complaining that the sort utility is useless because it can't sort every possible thing you would ever want it to sort. I'm saying that is a misunderstanding of what sort is supposed to do. It only sorts line-based data.
Yes, and the only reason it's so limited is that the Unix shell paradigm doesn't allow anything better. Which was the point all along. Is that so hard to understand?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:15 UTC (Thu) by ebiederm (subscriber, #35028) [Link] (1 responses)

Worse is better

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:23 UTC (Fri) by nix (subscriber, #2304) [Link]

Except when worse is worse. Case in point that has been raised before here: the Windows Command Prompt. It's worse than xterm, the Linux console, and screen(1). It's also... worse.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 23:13 UTC (Thu) by GhePeU (subscriber, #56133) [Link] (15 responses)

> Yes, and that is precisely what makes it useless. And actually, it can't even sort lines properly; try sorting, say, the output of ps by ppid or something. Yeah, it's doable, but it's messy, and it needn't be.

I don't find 'ps -Ao user,pid,ppid,time,command | sort -k3n' particularly messy.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 9:06 UTC (Fri) by HelloWorld (guest, #56129) [Link] (14 responses)

OK, I admit I didn't know about the -k switch. Which pretty much proves the point: If you want to work with untyped byte streams, you have to learn new switches for every program that you want to work on some specific field of a line. That isn't necessary if there's a standard mechanism for structured data.
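
A minimal illustration of what `-k` buys you, with `printf` standing in for `ps` output (the field values here are made up):

```shell
# Three ps-like lines: user, pid, ppid.
# -k3n sorts numerically on the third whitespace-separated field (the ppid).
printf '%s\n' 'root 1 0' 'alice 200 150' 'bob 120 42' | sort -k3n
# prints:
#   root 1 0
#   bob 120 42
#   alice 200 150
```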

Sorry, but this is wrong. Dead wrong.

Posted Dec 9, 2011 9:28 UTC (Fri) by khim (subscriber, #9252) [Link] (13 responses)

> That isn't necessary if there's a standard mechanism for structured data.

Not if your programs continue to produce "untyped byte streams".

PowerShell is a classic example of a well-known problem.

With bash you need to know how to handle the mess which existing programs produce and consume. With PowerShell you still need to know that (see the many threads above about how to handle it; it's as ugly as in bash, if not uglier), but you also have this nice, user-friendly way which does not work all that often, yet is presented as the solution "for the future". Just like all the other "grand unification schemes" before and after, it'll fail, so to say that PowerShell is better than bash is crazy talk: it's strictly worse, and this is very easy to prove...

PowerShell must die - but I'm content because it'll do just that anyway. It'll take a long time, but now that the madness around the JVM and CLR is slowly dying, because people understand that they were duped, it's only a matter of time.

It'll be interesting to see whether Microsoft survives the destruction of its .NET dream or dies off with it too. I'd be delighted to see that, but I don't hold my breath: Microsoft still looks way too resilient for that. Albeit lately it does so many stupid things that a die-off is actually likely. We'll see.

Sorry, but this is wrong. Dead wrong.

Posted Dec 10, 2011 2:52 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link] (12 responses)

Nope, sorry. You don't know what you're talking about.

PowerShell _is_ a unified language which interoperates with all other .NET languages. It does allow transferring structured data in a unified way. UNIX people should sometimes leave their self-righteousness at the door and actually check what other people are doing.

PowerShell is definitely more powerful than bash or zsh because IT IS BUILT ON A GOOD FOUNDATION. It's that simple.

Bash knows nothing but byte streams. That's nice - for 1973. You can construct pipes in bash and then try to use them to do something. But:
1) There is no static typing or checks. Everything is dynamic which combines nicely with...
2) No real way to unit-test your scripts.
3) No central documentation or help system. And no, 'man' isn't it. Scripts and utilities have all kinds of argument formats without any real convention. Which leads to...
4) No discoverability. Autocompletion is not sanely possible in bash or zsh.
5) Abysmal tooling. The only real tool is just a text editor. Continuous integration? Unit testing? Static analysis? Code coverage analysis? Who needs them, not us!

Keep dreaming...

Posted Dec 10, 2011 6:18 UTC (Sat) by khim (subscriber, #9252) [Link] (4 responses)

PowerShell _is_ a unified language which interoperates with all other .NET languages.

In other words: it's LISP Machines all over again. In a hypothetical world where everyone is using .NET it may even be a good thing. In the real world, where the .NET rage is in decline and it's now clear that .NET will never take over the world... not so much.

UNIX people should sometimes leave their self-righteousness at the door and actually check what other people are doing.

UNIX people looked on from the sidelines when the first coming of LISP machines came and went. They'll do that a second time (and a third time, if that ever happens); they are patient guys. They'll pick some good ideas from the whole sad story, as usual, but the whole "let's redo everything from scratch" thing is just not something UNIX people plan to ever do.

PowerShell is definitely more powerful than bash or zsh because IT IS BUILT ON A GOOD FOUNDATION.

It's more powerful but it's more complex, too. You may as well say that the emacs prompt is more powerful than bash - and you'll be right. But it only helps you if you want to live in the "Emacs OS". When you need to interact with the real world it's more complex than bash or zsh, because it needs to do what bash/zsh are doing and it needs to do it in a way which makes it possible to easily do things in its own world.

Sure, it's possible to use emacs as a shell replacement - but few people do. PowerShell is the same way. But unlike Emacs it's just a shell wannabe, nothing else, so I'm not sure it'll survive when all the rage about this generation of LISP Machines dies off.

Bash knows nothing but byte streams. That's nice - for 1973.

Long list of arguments skipped. Just note how everything you've cited was solved in "the first coming" (on LISP Machines), too.

The difference is in scale: the first time, tens of millions were burned, thus only thousands of machines were produced and the whole saga took about ten years on the sidelines. The second time, the architecture astronauts had billions, so the whole story took longer and the rage was raised higher. The result is the same (one company was burned to a crisp; the second is stagnating, but is more-or-less ready to throw all these baubles out), it just took longer.

When the hype dies off, PowerShell will either die with .NET or be used in some limited settings, while bash will live on as a mainstream tool (albeit maybe with a split personality, if Apple continues to fear GPLv3).

P.S. Note that I'm not saying .NET will die off totally: the "first coming" left Emacs behind, and I'm pretty sure the "second coming" will leave a few IDEs behind, too. Apparently that's the niche where all your vaunted advances are really helpful.

Keep dreaming...

Posted Dec 10, 2011 7:18 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link] (3 responses)

Yeah, yeah. .NET is just a rehash of old LISP machines, everything was already invented in 1969, Lennart is an agent of Microsoft, and we kids should get off your lawn.

Nothing has ever come close to PowerShell in functionality and usability for system shells. And no, old elisp shells or Rexx scripts ARE NOT the answer. They don't offer the two most powerful features of PowerShell: introspection and structured pipelines.

>It's more powerful but it's more complex, too. You may as well say that the emacs prompt is more powerful than bash - and you'll be right. But it only helps you if you want to live in the "Emacs OS". When you need to interact with the real world it's more complex than bash or zsh, because it needs to do what bash/zsh are doing and it needs to do it in a way which makes it possible to easily do things in its own world.

Sorry, but PowerShell can do ANYTHING bash can do. Without exceptions. Some things might be more clumsy than in bash - because PowerShell is not bash and things are done differently there.

And PowerShell even beats bash at its own game! PowerShell integration with git is much nicer than in bash: https://github.com/dahlbyk/posh-git - and it achieves similar functionality in a fraction of the lines of code. A few screenshots can be seen here: http://markembling.info/2010/03/git-powershell-revisited

It does not matter...

Posted Dec 10, 2011 8:41 UTC (Sat) by khim (subscriber, #9252) [Link] (2 responses)

Nothing had ever came close to PowerShell in functionality and usability for system shells. And no, old elisp shells or Rexx scripts ARE NOT the answer. They don't offer the two most powerful features of PowerShell: introspection and structured pipelines.

They do offer introspection and as for "structured pipelines"... this is only important if you want to pretend you are replacing "shell proper".

Sorry, but PowerShell can do ANYTHING bash can do. Without exceptions.

So what? Any Turing-complete language with support for fork/exec (or CreateProcess on Windows) can do the same.

Some things might be more clumsy than in bash - because PowerShell is not bash and things are done differently there.

And that's the point: PowerShell cannot replace bash, so bash will survive while PowerShell dies (or is at least relegated to niche work).

If you don't like the LISP Machines analogy then I have another for you: the Space Shuttle. This extremely expensive program was invented to replace old rockets: Atlas, Delta, Saturn, etc. Just like PowerShell it promised the world and delivered very little¹. After all the hype and all the expenses it was only able to replace a few of them - the ones which were forcibly killed to "open the road for the future". Today the Space Shuttle is history and in its place is a huge glaring void - while the people who were sensible enough to continue to use "old school" tools (Delta, Proton, Soyuz, etc) are still around and doing what the Space Shuttle was supposed to do.

PowerShell integration with git is much nicer than in bash: https://github.com/dahlbyk/posh-git - and achieves the similar functionality in a fraction of lines of code.

It does not matter how many lines of code it requires. It's the same story as with the Space Shuttle: sunk costs. When people try to justify the Space Shuttle craze they compare the development costs of the Saturn V² and the Space Shuttle - but this is completely wrong: it was not possible to get the money spent on the Saturn V back, thus you need to compare the ongoing costs of the Saturn V with the new development costs of the Space Shuttle - and there is no comparison.

CMD/bash is here and it's not going away, so programs will need to support it anyhow. This means all additional developments (things like cmdlets) should be counted as extra.

¹) Well, one nice property of this stupidity was the fact that they managed to convince the USSR to waste a lot of resources on the mirror project - and if that was the goal then the project can be considered a success.

²) Why Saturn? Well, the alternative to the Space Shuttle idiocy was a continuation of the Saturn program, so it's only natural to compare the Space Shuttle to Saturn.

It does not matter...

Posted Dec 10, 2011 13:15 UTC (Sat) by dlang (guest, #313) [Link]

the shuttle was also an attempt to 'greenify' space travel, by making things reusable.

It does not matter...

Posted Dec 11, 2011 1:03 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link]

>They do offer introspection and as for "structured pipelines"... this is only important if you want to pretend you are replacing "shell proper".

Please show me how I can tab-complete SQL queries in any elisp shell. I'd like to be able to at least complete all table names in all contexts.

PowerShell is freakishly powerful in that regard - I already use it instead of the psql shell to work with Postgres databases. Mostly because autocompletion in posh, based on simple introspection, is miles ahead of what's present in psql.

And that says something.

>If you don't like LISP Machines analogue then I have another for you: Space Shuttle

Yup. It's funny, because bash is very much like the Space Shuttle: it's old, it's expensive to use (bash hell scripts are not easy to write), it's prone to crashes, etc. The only reason it's used is the sunk costs of tons of hell scripts.

Sorry, but this is wrong. Dead wrong.

Posted Dec 11, 2011 0:11 UTC (Sun) by nix (subscriber, #2304) [Link] (6 responses)

> 3) No central documentation or help system. And no, 'man' isn't it.

So... the central Unix documentation system is not acceptable because... you say so? I'd rather use man than HTML Help with its abysmal searching, insulting baby talk, and horrendous security holes.
> Scripts and utilities have all kinds of argument formats without any real convention.

Which would be why POSIX, uh, standardized them more than ten years ago, and imposed rules which virtually all standard tools follow. A couple of holes exist, mostly for backward compatibility or ease-of-use's sake (e.g. tail and head's - and +-based arguments) and a couple are just deeply unusual (e.g. find and dd), but most are pretty consistent, and the ones with weird user interfaces get comprehended in the end by frequent use. I'd prefer Lisp everywhere, but, face it, it's not gonna happen.
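
For concreteness, the convention in question is what the shell's own `getopts` built-in implements; a minimal sketch (the option letters and names here are invented for illustration):

```shell
# POSIX-style option parsing: -v is a flag, -o takes an argument,
# and anything after the options is positional.
parse() {
    verbose=0 out=""
    OPTIND=1                       # reset so the function is re-entrant
    while getopts 'vo:' opt; do
        case $opt in
            v) verbose=1 ;;
            o) out=$OPTARG ;;
            *) return 1 ;;
        esac
    done
    shift $((OPTIND - 1))
    printf 'verbose=%s out=%s rest=%s\n' "$verbose" "$out" "$*"
}
parse -v -o log.txt a b            # prints: verbose=1 out=log.txt rest=a b
```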
> 4) No discoverability. Autocompletion is not sanely possible in bash or zsh.

You carefully named the two shells which have extremely extensive autocompletion, in zsh's case shipped with the package. I don't know what you're trying to do, but you're doing such a good job arguing against yourself that I don't see why I need bother. (Yes, it's not 100% automatable without a bit of per-tool scripting. The workload is minimal compared to writing the tools. Perhaps in an ideal world it could be completely automated, but that just shifts the burden from writing the autocompletion code to writing some sort of reflective description of the system. Big deal.)

Sorry, but this is wrong. Dead wrong.

Posted Dec 11, 2011 5:36 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link] (3 responses)

>So... the central Unix documentation system is not acceptable because... you say so?
Nope. Man's shortcomings are well known. It's basically simple indexed document storage, poorly structured and without built-in search.

PowerShell allows embedding documentation directly into objects and also structuring it by parameters, methods, etc.

And there's a nice "-online" switch that takes you directly to the TechNet article associated with the tool, with Q&A and other additional functionality.

Can you make the scripts in your home directory have their documentation automatically included in, and made searchable through, the central help system?

>I'd rather use man than HTML Help with its abysmal searching, insulting baby talk, and horrendous security holes.

You're in luck because PowerShell doesn't use HTML Help :)

>Which would be why POSIX, uh, standardized them more than ten years ago, and imposed rules which virtually all standard tools follow.

Yeah, sure.
> tar czf myfile myfile.tgz
Damn, I just deleted 'myfile' with doubly-archived myfile.tgz

> dd if=... of=...
Very standard.

Then there are the --argument=blah and "--argument blah" forms, which don't always both work. And then there are short forms which sometimes require the argument to be written immediately after, without intervening spaces.

In PowerShell _everything_ is standardized _and_ autocompletable. I can do things like "my-command -p<tab>" and get the list of parameters (with descriptions and default values!) starting with 'p'. This is sort of possible in bash/zsh with cooperating tools, but in PowerShell it's all completely automatic.

And since PowerShell is a static language with type inference, it'll warn me if I write stuff like this "destroy-hard-drive -in hello -seconds" (because 'hello' is not an integer).

>You carefully named the two shells which have extremely extensive autocompletion, in zsh's case shipped with the package

I know perfectly well how bash/zsh autocompletion works.

>Yes, it's not 100% automatable without a bit of per-tool scripting. The workload is minimal compared to writing the tools. Perhaps in an ideal world it could be completely automated, but that just shifts the burden from writing the autocompletion code to writing some sort of reflective description of the system. Big deal.

And in PowerShell I get autocompletion basically for free. And it's good. It's VERY good. I'm actually using PowerShell instead of Postgres's psql because PowerShell is much more powerful.

I can autocomplete table names while writing SQL queries. In command line.

This is not really possible in bash/zsh, because text-matching games can only take you so far. Even the autocompletion for scp in bash is already straining things.

I wish people would sometimes go out and see what's happening outside the Linux/Unix world. It's no wonder that the most popular Linux distribution is actually barely a Unix system.

Sorry, but this is wrong. Dead wrong.

Posted Dec 11, 2011 11:07 UTC (Sun) by nix (subscriber, #2304) [Link] (1 responses)

I'll agree that man is not a very nice documentation system, but it *is* one, so your claim that it does not exist is vacuous on its face.
> PowerShell allows embedding documentation directly into objects and also structuring it by parameters, methods, etc.

Ah. Like POD, doxygen, and similar systems, all of which can generate manpage output. i.e., dead heat here.
> And there's a nice "-online" switch that takes you directly to the TechNet article associated with the tool, with Q&A and other additional functionality.

That's great -- if and only if Microsoft wrote the tool. Only useful in a software monoculture.
> Can you make the scripts in your home directory have their documentation automatically included in, and made searchable through, the central help system?

Yes. Learn about apropos databases and MANPATH. Manpages can be stored absolutely anywhere (as can info pages).

Most of the rest of what you say is a combination of ignorance of what the Unix tools you discuss can actually do, and complaints that things are not acceptable because they're not just like PowerShell does them. We get that you like it, but we've all been through this parochial 'the newest system I just saw is the answer to everyone's prayers' phase, and, y'know? It's always wrong. There are limitations there: you're just not seeing them.

Sorry, but this is wrong. Dead wrong.

Posted Dec 11, 2011 11:50 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link]

>I'll agree that man is not a very nice documentation system, but it *is* one, so your claim that it does not exist is vacuous on its face.

I'm claiming that it's not a GOOD system.

>Ah. Like POD, doxygen, and similar systems, all of which can generate manpage output. i.e., dead heat here.

So... how do I write documentation for my bash script (with all its options), generate a man page, and link it into the central system, all without doing anything more than simply declaring the options?

And no, doxygen won't help you - it doesn't support bash (I actually tried to find an automatic documentation system for bash some time last year - there was none).

>That's great -- if and only if Microsoft wrote the tool. Only useful in a software monoculture.

That's easily adapted if a tool's authors provide their own URLs (which they can do in PowerShell - VMware has its own doc system, for example).

>Yes. Learn about apropos databases and MANPATH. Manpages can be stored absolutely anywhere (as can info pages).

I know perfectly well how most Linux tools work. I know that one can maintain one's own local man databases. And I also know perfectly well that almost nobody does, mostly because it's complicated and error-prone.

>Most of the rest of what you say is a combination of ignorance of what the Unix tools you discuss can actually do, and complaints that things are not acceptable because they're not just like PowerShell does them.

I know most of the standard Unix tools. I've built my own custom distributions from scratch (the first time without the benefit of the LFS book) and support a network of embedded devices. I've been using Linux on my desktops since the '90s and can recollect all the steps that have been taken to make Linux at least usable on the desktop.

A lot of these steps involved bashing at least some old Unix-heads with spiked hammers: udev, HAL, KMS, dbus, to name a few. Oh, and the whole 'Android' thingie. Now the same thing is repeating with pulseaudio, systemd and journald.

>We get that you like it, but we've all been through this parochial 'the newest system I just saw is the answer to everyone's prayers' phase, and, y'know? It's always wrong. There are limitations there: you're just not seeing them.

Some things do solve all the (existing) problems. Because they are designed to solve them.

PowerShell is one such example. It's designed to be a better shell than text-based shells and it excels at it. It's not yet as polished as bash/zsh but it's getting better with each new release.

Of course, PowerShell has limitations and a set of new problems, but so do bash/zsh. And the limitations of bash/zsh are MUCH more constricting.

Sorry, but this is wrong. Dead wrong.

Posted Dec 16, 2011 5:11 UTC (Fri) by tom.prince (guest, #70680) [Link]

>>Yes, it's not 100% automatable without a bit of per-tool scripting. The workload is minimal compared to writing the tools. Perhaps in an ideal world it could be completely automated, but that just shifts the burden from writing the autocompletion code to writing some sort of reflective description of the system. Big deal.

>And in PowerShell I get autocompletion basically for free. And it's good. It's VERY good. I'm actually using PowerShell instead of Postgres's psql because PowerShell is much more powerful.

You only get this for free if your software is designed to work with PowerShell; otherwise you are in the same boat as with zsh/bash.

There is at least one framework (twisted-python) that automatically generates zsh completions from the description of options.

Sorry, but this is wrong. Dead wrong.

Posted Dec 11, 2011 16:21 UTC (Sun) by anselm (subscriber, #2796) [Link] (1 responses)

> Which would be why POSIX, uh, standardized them more than ten years ago, and imposed rules which virtually all standard tools follow.

Some of the most basic aspects were indeed standardised by POSIX, but that doesn't detract from the fact that, say, the option to specify a field delimiter is »-d« for cut(1), »-t« for sort(1), »-F« for awk(1), and so on. POSIX basically codified the wild hodgepodge that existed at the time.

Now the meaning of common options (as opposed to basic option syntax) would have been something actually worth standardising, but of course it would have rendered 20 years' worth of shell scripts virtually useless, so it didn't happen.

Sorry, but this is wrong. Dead wrong.

Posted Dec 11, 2011 17:03 UTC (Sun) by raven667 (subscriber, #5198) [Link]

I also think that standardization wasn't an unmitigated positive, because it solidified and made rigid 20 years' worth of crufty, work-in-progress system software and didn't allow for any forward progress in cleaning up the mess. GNU helped a lot, but only now are we going back, re-thinking the system from the beginning and making it super awesome. Plan9 did much of the same work but failed because it wasn't a gradual, in-place migration path.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 0:06 UTC (Fri) by grg (guest, #76756) [Link] (1 responses)

> essential features such as multiple tabs, run-based selection and the ability to click URLs are entirely absent

I use xterm and reject all other X-based terminal emulators. It's small, fast, and has enough features for me. I don't need multiple tabs or the ability to click URLs, and I don't know what the hell run-based selection is supposed to be - these don't seem particularly essential.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 2:44 UTC (Fri) by cortana (subscriber, #24596) [Link]

Run-based selection, as opposed to block-based selection. In Windows you can only select by block. This drives me up the wall whenever I have to use Windows!

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 4:26 UTC (Fri) by jthill (subscriber, #56558) [Link] (18 responses)

> Try sorting the output of, say, "aptitude show '~i'" by version.

It's a one-liner.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 5:09 UTC (Fri) by HelloWorld (guest, #56129) [Link] (17 responses)

Depending on the length of the line, everything is.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 5:26 UTC (Fri) by jthill (subscriber, #56558) [Link] (16 responses)

>You can't.

It's a perfectly reasonable one-liner.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 8:53 UTC (Fri) by HelloWorld (guest, #56129) [Link] (15 responses)

Go ahead, show me.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:29 UTC (Fri) by jthill (subscriber, #56558) [Link] (14 responses)

This question belongs on linuxquestions.org or some such, not here.

> OK, I admit I didn't know about the -k switch

But before you take it there, you might want to spend more than the few seconds you've apparently spent on the attempt.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:39 UTC (Fri) by HelloWorld (guest, #56129) [Link] (13 responses)

Dude, the -k switch doesn't help at all in this case, as "aptitude show" prints multi-line records. It works for ps as that prints one record per line.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 21:27 UTC (Fri) by jthill (subscriber, #56558) [Link] (12 responses)

Take it to Linux - Newbie. This isn't the forum for spoon-feeding.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 21:40 UTC (Fri) by HelloWorld (guest, #56129) [Link] (1 responses)

Dude, do you even realize that you _completely_ miss the point? The point isn't that I don't know how to do it, the point is that it's harder to do than it should be.

Well, I probably shouldn't be wasting my time with a bloody stupid troll like you.

Enough

Posted Dec 9, 2011 22:26 UTC (Fri) by corbet (editor, #1) [Link]

OK, folks (plural), that's enough. I'm sure there's a nice site for name-calling over --------> there somewhere, but please try not to do it here.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 22:10 UTC (Fri) by HelloWorld (guest, #56129) [Link] (9 responses)

Oh, by the way, the fact that you still haven't shown how to do this with a reasonable one-liner makes me think that you can't.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 23:10 UTC (Fri) by jthill (subscriber, #56558) [Link] (8 responses)

sed -nr '/^Package:/!{H;/^Version:/{G;h};$!d};x;s/\n/\x0/gp'|sort -V|sed 's/[^\x0]*\x0//;s/\x0/\n/g'

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 23:33 UTC (Fri) by HelloWorld (guest, #56129) [Link] (7 responses)

We clearly have a different idea about what a reasonable one-liner is. And it doesn't work when another locale is set.

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 2:19 UTC (Sat) by nybble41 (subscriber, #55106) [Link] (6 responses)

aptitude search -F %p -O version '~i' | xargs aptitude show

Evolution of shells in Linux (developerWorks)

Posted Dec 11, 2011 10:39 UTC (Sun) by HelloWorld (guest, #56129) [Link] (5 responses)

You're missing the point. The point is that it's not possible to use the sort(1) tool to actually sort stuff. The fact that aptitude offers the -O switch for sorting is a symptom of that problem, as it wouldn't be necessary to implement sorting in aptitude if Unix shells sucked less.

Evolution of shells in Linux (developerWorks)

Posted Dec 11, 2011 22:48 UTC (Sun) by nybble41 (subscriber, #55106) [Link] (4 responses)

It's not like PowerShell automatically adds support for sorting arbitrary data to an existing sort utility, either. Microsoft also had to write a new sort utility to handle structured data, which in turn relies on API support from the applications supplying that data. You could get the same effect by standardizing the labeled-groups-of-lines format output by 'aptitude show' and writing a sort utility specialized on that format (a 10-minute job in any major scripting language). Or you could just use one of the many portable interactive scripting language shells, like python or irb or scsh, which provide the same services for their respective bindings that PowerShell provides for .NET.

An aptitude-equivalent under PowerShell would need to export the ability to compare package objects by version for any third-party sort utility to work with the data anyway, which is 90% of the way toward just implementing the -O option internally.

"Structured data" is nearly always the first thing anyone tries when implementing a new system. A number of older systems don't support anything *but* structured data for disk files, for example--I've worked on operating systems where even plain text files were stored, one row per line, as tables in a database. The fact that all modern systems actually treat disk files and most inter-process communications as plain byte streams should tell you something. Standardized data *formats*, encoded as plain byte streams, are more flexible and resilient than standardized APIs, as used by PowerShell.
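The "10-minute job" claimed above can be sketched directly in shell. This is a rough illustration only, not the poster's actual code: GNU awk/sort/tr are assumed, the sample records are made up, and version strings containing colons (Debian epochs like 1:2.0) would defeat the colon-based sort key.

```shell
# Flatten each blank-line-separated record onto one line (\001 stands in
# for the record's internal newlines), version-sort, then unflatten.
printf 'Package: b\nVersion: 1.10\n\nPackage: a\nVersion: 1.2\n' |
awk -v RS='' '{ gsub(/\n/, "\001"); print }' |  # paragraph mode: one record per line
sort -t: -k3,3 -V |                             # field 3 is the Version value
tr '\001' '\n'
```

Under GNU version ordering the record with Version 1.2 sorts before the one with 1.10, which a plain lexical sort would get backwards.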

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 5:58 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link] (3 responses)

>It's not like PowerShell automatically adds support for sorting arbitrary data to an existing sort utility, either.

But it does. You can easily sort any data on any field, even using your own custom comparator (try that with 'sort'!), using only one utility. Which had to be written once, just as the 'sort' utility in Unix was.

>You could get the same effect by standardizing the labeled-groups-of-lines format output by 'aptitude show' and writing a sort utility specialized on that format (a 10-minute job in any major scripting language). Or you could just use one of the many portable interactive scripting language shells, like python or irb or scsh, which provide the same services for their respective bindings that PowerShell provides for .NET.

Well, of course you can achieve the same functionality as in PowerShell by replicating what PowerShell does (namely, by introducing structured data). The point is - no one is really doing it in the Unix world.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 6:25 UTC (Mon) by nybble41 (subscriber, #55106) [Link] (2 responses)

>> It's not like PowerShell automatically adds support for sorting arbitrary data to an existing sort utility, either.

> But it does. You can easily sort any data on any field even using your own custom comparator (try that with 'sort'!), using only one utility. Which had to be written, just as 'sort' utility in Unix.

The point was that the 'sort' utility used in PowerShell only really adds anything over the standard Unix sort command if the source application includes bindings for .NET.[1] A similar utility could be written for Unix with the same features, if anyone was actually interested. It's not exactly a difficult problem, even including fancy features like custom comparators. The critical part is a runtime-specific binding for the source application. If aptitude had bindings for Python, Ruby, or Scheme you could do exactly the same thing from these shells under Unix today. It wouldn't require any more work than implementing the .NET bindings necessary to make aptitude work with PowerShell (assuming they worked on the same platform to begin with). In fact, if aptitude had a DBUS interface, as many other applications do, it would automatically work with all these Unix shells and more.

> Well, of course you can achieve the same functionality as in PowerShell by replicating what PowerShell does (namely, by introducing structured data). The point is - no one is really doing it in the Unix world.

For good reason, as I said--such systems have never won out in the past, so why should .NET and PowerShell be any different? Anyway, my suggestion wasn't to replicate what PowerShell does, it was to implement a record-oriented sort utility within the existing Unix byte-stream framework, using a standard data encoding rather than opaque "structured data" in a particular runtime. Historically, forcing everything to work within a single complex runtime or RPC model has not been a winning strategy. Simple, portable, reliable interfaces are best.

----

[1] Purely out of curiosity, how does this kind of sort command work if the data is coming from a file, without the .NET APIs, rather than directly from an application? Is it even possible to get the same effect as redirecting the output of a command to a file before sorting it, or must the source and sort command be directly coupled? It seems like the extra features must get lost in translation, since only the textual representation will be saved on disk.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 8:59 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

>The point was that the 'sort' utility used in PowerShell only really adds anything over the standard Unix sort command if the source application includes bindings for .NET.[1] A similar utility could be written for Unix with the same features, if anyone was actually interested.

The problem is, you have to write a special-case utility, while PowerShell provides a generic mechanism for this. It provides a robust platform that other systems can use.

And some features of sort are quite difficult to replicate in bash, for example custom comparators (I believe the only way to do this is to create string ordinals suitable for the 'sort' utility).

You've mentioned D-BUS - it's actually quite similar to PowerShell in a lot of regards. It provided a platform that other applications could easily use and adapt (instead of inventing their own ad-hoc remoting). And so applications quickly adopted it. The same is happening with PowerShell in Windows.

>For good reason, as I said--such systems have never won out in the past, why should .NET and PowerShell be any different?

Well, it is different because it's already adopted in the Windows world and it's clear that PowerShell is here to stay.

And I think the main reason for adoption is that the time has come for it. All the ingredients for PowerShell were finally in place - even twelve years ago it would have been impossible without the common environment that .NET provides (PowerShell based on COM? Shudder).

The same story with DBUS - there were quite a few failed attempts to build a common desktop RPC system in Unix: DCE RPC, DCOP, CORBA in GNOME, and probably quite a few others I've forgotten. Yet DBUS has clearly won and is now ubiquitous. We even have projects for kernel-accelerated DBUS.

>[1] Purely out of curiosity, how does this kind of sort command work if the data is coming from a file, without the .NET APIs, rather than directly from an application?

That depends on what you're trying to do. You surely can dump objects' content in XML/JSON/binary but it's rarely useful in practice. More often you'd use the FT command (Format-Table) to create a textual representation that fits your purposes. For objects like files or lists of files you can dump them as simple filenames. PowerShell is quite flexible in that regard.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 18:59 UTC (Mon) by raven667 (subscriber, #5198) [Link]

For good reason, as I said--such systems have never won out in the past, why should .NET and PowerShell be any different?

What is this "winning" you are concerned about? PowerShell is already the preferred standard command shell and scripting environment on MS Windows servers and is not really relevant outside that space. UNIX shells like bash are really only relevant on UNIX systems, they have only marginal presence in the Windows space, so there is really no context where "winning" makes sense as they are not directly comparable.

The idea that the shell can be both built out of and expose a full featured programming language and library is not new as you point out but is well demonstrated by PowerShell and not a bad idea.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:23 UTC (Thu) by fb (guest, #53265) [Link]

> Microsoft does a wonderful job with PowerShell, it's getting better than anything Unix ever has had.

I wouldn't know. I think Cygwin does a wonderful job by letting me do the minimal Windows scripting when necessary without having to learn anything new.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 10:38 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (44 responses)

> Microsoft does a wonderful job with PowerShell
PS> whoami arg1 arg2 arg3
ERROR: Invalid argument/option - 'arg1'.
Type "WHOAMI /?" for usage.

PS> $foo = "foo\abc def\";
PS> whoami $foo bar qux args args
ERROR: Invalid argument/option - 'foo\abc def" bar qux args args'.
Type "WHOAMI /?" for usage.
Yeah, you know what? I'll consider using PowerShell the day the PowerShell team figures out how the hell to properly quote command line arguments. It ain't hard.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 11:48 UTC (Thu) by bjartur (guest, #67801) [Link] (2 responses)

Not that the Bourne shell POSIX sh is based on got it right, either. That's why we have rc, after all.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:46 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (1 responses)

sh allows you to pass *any* string as an argument if you're sufficiently careful and follow a few straightforward rules. PowerShell hopelessly mangles arguments containing double-quotes and backslashes, and there's no workaround less complex than just calling CreateProcess yourself.

I find it very difficult to care about PowerShell's fancy object piping when basic arguments don't work right.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:35 UTC (Thu) by HelloWorld (guest, #56129) [Link]

Unlike the lack of structured data in pipes, this is something that can be fixed in the shell itself, i.e. without modifying all programs/cmdlets.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 18:58 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (31 responses)

You're in luck, then.

Quoting in PowerShell is _way_ ahead of bash and other text-based shells. Writing correct bash code that works with quoting is an exercise in pulling your hair, while in PowerShell it's completely natural.

See here for overview: http://www.techotopia.com/index.php/Windows_PowerShell_1....

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:38 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (30 responses)

No it isn't. What's natural is passing $foo as an argument word, not mangling it by blindly surrounding it with double-quotes while ignoring its contents. (No, writing "$foo" doesn't help. In fact, it has no effect.)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:54 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (29 responses)

And that leads to all kinds of problems with filenames containing spaces or special characters.

Like:
> touch '-a'
>touch: missing file operand
>Try `touch --help' for more information.
WTF.

> touch -- -a
Ok

> rm *
> rm: invalid option -- 'a'
WTF^2???

Dumb text expansion is one of the main problems of shells. It's so bad that suid functionality for scripts was removed in the '80s because writing secure scripts is just not possible.

In PowerShell it's all natural:
>function touch {set-content -Path ($args[0]) -Value ([String]::Empty) }
>touch -test

No errors, no problems, everything works.

If you REALLY want to reinterpret your string - you can do this explicitly.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:59 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (21 responses)

"touch -- -a" works fine. You're being disingenuous.

You're ignoring the very serious deficiency in PowerShell I identified, a deficiency which makes it practically unusable for me and many others (consider the problems passing SQL queries to programs).

No, don't come back with an example of a cmdlet that properly accepts $foo. The problem is with external programs, which makes the issue worse, not better.

No, not everything is a cmdlet, nor will everything be a cmdlet in our lifetimes.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:16 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (14 responses)

??
You can unpack your string into argument list - by adding ONE character to variable. Is that hard?

And working with SQL queries in PowerShell is WAY better than anything Linux has because one can use statically typed LINQ. I can actually tab-complete table names while writing a query from a freaking command line.

For example: http://bartdesmet.net/blogs/bart/archive/2008/06/07/linq-...

External legacy programs are just that - legacy. They can be easily wrapped but quite often it's easier to replace them completely.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:20 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (13 responses)

> You can unpack your string into argument list - by adding ONE character to variable.

Show me.

> External legacy programs are just that - legacy.

No, external programs are essential. If you persist in this assertion, and claim PowerShell isn't broken because you don't really need the broken feature, we can't have a discussion. If I'm going to write a stand-alone program that doesn't act like a shell, there are many languages better-suited to the task than PowerShell is.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:07 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (12 responses)

> Show me.

It's called 'splatting' and is done by prepending '@' to variable name.

Now can you show me how I can tab-complete table names in Bash while I'm editing a query?

> No, external programs are essential. If you persist in this assertion, and claim PowerShell isn't broken because you don't really need the broken feature, we can't have a discussion. If I'm going to write a stand-alone program that doesn't act like a shell, there are many languages better-suited to the task than PowerShell is.

Sure, a lot of legacy is essential and PowerShell can easily work with legacy text-based programs. No big deal.

But that doesn't make it any less legacy in the Windows world.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:11 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (11 responses)

You're supplying generic rebuttals to points I didn't make and ignoring the real issues I actually raised.

> It's called 'splatting' and is done by prepending '@' to variable name.

Splatting doesn't address the issue I raised.

> PowerShell can easily work with legacy text-based programs

I demonstrated that PowerShell cannot reliably pass arguments to external programs. That's "broken", not "easy".

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:18 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (10 responses)

And you're not supplying anything.

>I demonstrated that PowerShell cannot reliably pass arguments to external programs. That's "broken", not "easy".

Let's see:
>PS C:\Users\cyberax> $c = "cyberax@sdmain"
>PS C:\Users\cyberax> ssh $c
>Linux sdmain 2.6.39-bpo.2-amd64 #1 SMP Tue Jul 26 10:35:23 UTC 2011 x86_64
>
>The programs included with the Debian GNU/Linux system are free software;
>the exact distribution terms for each program are described in the
>individual files in /usr/share/doc/*/copyright.

>Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
>permitted by applicable law.
>Last login: Thu Dec 8 17:51:16 2011 from 74.101.247.211
>cyberax@sdmain:~$

Looks like it works just fine. Now describe your problem in detail, without knee-jerk reactions.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:21 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (9 responses)

> Now describe your problem in detail, without knee-jerk reactions.

You must have missed the thrust of my original post:

https://lwn.net/Articles/471133/

It's not possible to pass arbitrary strings from PowerShell to external programs.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:41 UTC (Thu) by mathstuf (subscriber, #69389) [Link] (8 responses)

To support *arbitrary* [NUL-terminated] strings in shell, you have to, AFAIK, do the following (to avoid arbitrary execution, get the exact backslash count, and more):

> for x in $( seq 1 ${#var} ); do
> echo -n "${var[$x]}"
> done

Granted, most of the time you don't need this (who puts 7 backslashes in a row? Well...tr might make sense with that many backslashes as an argument), but it needs to be copied verbatim every time unless you want to do another level of escaping passing it to a function.

Of course, if there's a shorter way, I'd be thrilled to know it :) .

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:51 UTC (Thu) by quotemstr (subscriber, #45331) [Link]

You don't need to do any of that.

printf '%s' "$bar" is sufficient. The value of bar isn't expanded or interpreted.
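A quick self-contained check of that claim (the test string here is arbitrary): the shell expands a double-quoted variable exactly once, and printf '%s' copies the result byte for byte, so backslashes, glob characters and command-substitution-looking text all survive.

```shell
# Single quotes make every character literal in the assignment; the
# double-quoted expansion is then passed through printf unmodified.
bar='7 backslashes: \\\\\\\ plus * and $(this is not executed)'
printf '%s\n' "$bar"
```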

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 9:39 UTC (Fri) by HelloWorld (guest, #56129) [Link] (6 responses)

That code doesn't make sense. You take the length of the first string in the array named "var", and then, for x from 1 to said length print the xth element of the array.

If all you wanted to do is print the elements of an array, you can do that with echo "${var[@]}". Well, unless something in ${var[@]} starts with a dash...

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 18:07 UTC (Fri) by mathstuf (subscriber, #69389) [Link] (5 responses)

It's a string, not an array (indexing strings works here in bash). The idea is to avoid creating escape sequences that may appear when var is blindly expanded. Of course, the printf command given is much more succinct and cleaner.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 19:31 UTC (Fri) by HelloWorld (guest, #56129) [Link] (4 responses)

> It's a string, not an array (indexing strings works here in bash).
It certainly doesn't work with the syntax you've shown, as that is the syntax for array subscripting.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:09 UTC (Fri) by nybble41 (subscriber, #55106) [Link] (3 responses)

Technically it *will* work, but only because the value of the variable is output in the first iteration of the loop, and the following iterations have no effect (${var[1..N]} are empty). I.e., aside from the obvious inefficiency, the entire loop is equivalent to:

echo -n "$var"

To work as intended the variable reference would need to use substring syntax, "${var:$x:1}", rather than the array syntax "${var[$x]}". However, it is sufficient to place the variable in double-quotes, which (in bash) causes the value to be output verbatim, with no further quoting, expansion, or splitting.
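The difference is easy to see with a scalar variable; a small demonstration (run through bash -c below, since the ${var[$x]} form is a bashism):

```shell
# On a plain (non-array) bash variable, index 0 yields the whole value,
# any higher index yields the empty string, and substring syntax
# ${var:offset:length} addresses individual characters.
bash -c 'var=hello
         echo "${var[0]}"     # the whole value
         echo "${var[1]}"     # empty: a scalar acts as a one-element array
         echo "${var:1:1}"    # the second character'
```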

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:13 UTC (Fri) by HelloWorld (guest, #56129) [Link] (2 responses)

> Technically it *will* work, but only because the value of the variable is output in the first iteration of the loop,
No, the loop starts at 1 instead of 0.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:25 UTC (Fri) by nybble41 (subscriber, #55106) [Link] (1 responses)

Hm. You're right, which is strange since both arrays and characters are zero-indexed (except for $@). Perhaps that loop was written for a different shell entirely? It's certainly not necessary in reasonably modern implementations (i.e. within the last decade) of bash.

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:34 UTC (Sat) by mathstuf (subscriber, #69389) [Link]

Nope, it was a typo. Somehow I missed that it just printed something on the first iteration, and nothing on the rest... Not sure how I forgot about printf either.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:41 UTC (Thu) by HelloWorld (guest, #56129) [Link] (5 responses)

> "touch -- -a" works fine.
Yeah, if you think of using the special -- argument. 99% of all shell script authors don't, and they shouldn't need to.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 7:33 UTC (Fri) by ekj (guest, #1524) [Link] (4 responses)

Well, you do need two different syntaxes for:

ls -l

and

ls -- -l

You don't need to use -- for the purpose, but you need to do *something* since both are valid and reasonable commands, but the two have distinct meaning.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 8:52 UTC (Fri) by HelloWorld (guest, #56129) [Link] (3 responses)

> You don't need to use -- for the purpose, but you need to do *something*
Well yes, if you actually want to specifically list a file called -l, you need to type something other than -l, such as ./-l. But that is the simple, obvious case that will be caught when you first try to run your script. It's another thing if a filename such as -l is generated by globbing. The issue here is that an application should know whether an argument was generated by globbing or not, so that it can treat an argument such as -l as a positional parameter instead of an option if it was generated by a glob pattern. But it can't, since globbing is done by the shell, and the information about which parameters were globbed and which weren't is lost.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:19 UTC (Fri) by nix (subscriber, #2304) [Link] (1 responses)

This feature was actually added to bash at one point: it could export the offsets of globbed arguments to child processes in an environment variable. But it was quietly removed years ago, so presumably there were problems with it.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:38 UTC (Fri) by HelloWorld (guest, #56129) [Link]

That's rather interesting, actually. I guess it was removed because there are corner cases left. For example, something like
touch -- --harmful-flag
foo=(*)
foobar "${foo[@]}"
would likely still not be caught. Of course, it would be possible to treat variable expansions as positional arguments as well, but that would probably break lots of scripts. Incrementally building a list of flags in a shell variable is a common idiom, after all.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 16:30 UTC (Fri) by nybble41 (subscriber, #55106) [Link]

> The issue here is that an application should know whether an argument was generated by globbing or not, so that it can treat an argument such as -l as a positional parameter instead of an option if it was generated by a glob pattern.

There's a standard solution to this: instead of "ls *.c", write "ls ./*.c", which has the same effect, and yet has no chance of accidentally expanding to an option rather than the expected filename.

Or, for any program which has a standard getopt-style command-line parser, just use "--" before any glob patterns.
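Both workarounds can be sketched in a scratch directory (GNU rm assumed for the exact option-parsing behaviour; this is an illustration, not a recommendation to test with real data):

```shell
dir=$(mktemp -d) && cd "$dir"
touch -- -a file.txt     # create a file whose name looks like an option

rm * 2>/dev/null         # glob expands "-a" first; rm rejects it as an
                         # unknown option and removes nothing

rm ./*                   # "./-a" cannot be mistaken for an option,
                         # so both files are removed
cd / && rmdir "$dir"     # rmdir succeeds: the directory is now empty
```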

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 2:46 UTC (Fri) by vonbrand (subscriber, #4458) [Link] (1 responses)

No, that isn't the reason. Running scripts SUID is inherently racy: something could interrupt the interpreter after going SUID and before running the script proper, or switch the script underneath.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 3:29 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

Both issues can be fixed (in fact, Perl with suidperl does this just fine).

You see, even Perl is better than shell scripts.

Evolution of shells in Linux (developerWorks)

Posted Dec 11, 2011 23:59 UTC (Sun) by chuckles (guest, #41964) [Link] (4 responses)

or just use:
touch ./-a

rm ./-a

In PowerShell it's all natural:
>function touch {set-content -Path ($args[0]) -Value ([String]::Empty) }
>touch -test

lol. 'all natural' sorry that made me laugh.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 6:00 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link] (3 responses)

Well, try to write a 'touch' utility in Bash (assume that there's no 'touch' utility present).

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 16:40 UTC (Mon) by jimparis (guest, #38647) [Link] (2 responses)

That's just about the most trivial thing you can do:
>filename

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 17:07 UTC (Mon) by mpr22 (subscriber, #60784) [Link] (1 responses)

That only works if the file doesn't exist. Otherwise, it replaces your file with an empty file.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 17:42 UTC (Mon) by jimparis (guest, #38647) [Link]

I know, that matches the behavior of the PowerShell code that Cyberax posted (as far as I can tell). Not really "touch" but "make this file empty".

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:38 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (8 responses)

Without PowerShell:

>C:\>whoami "foo\abc def\"
>ERROR: Invalid argument/option - 'foo\abc def"'.
>Type "WHOAMI /?" for usage.

That's just the way the Windows console behaves. Arguments are not split if they are quoted - it's different from Linux, but it's consistent.

You can, of course, work around it by splitting your parameter list manually, as you have to do in the regular Windows console.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:48 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (7 responses)

> That's just the way Windows console behaves. Arguments are not splitted if they are quoted - it's different from Linux, but it's consistent.

No, PowerShell is not being consistent with cmd here. PowerShell in fact tries to quote arguments (so they will be properly decoded and sent to [w]main), but fails in some glaring cases. Here is a case that *does* work:

PS> $foo="hello world"
PS> whoami $foo bar
ERROR: Invalid argument/option - 'hello world'.
Type "WHOAMI /?" for usage.

Here, you see that whoami (which prints its first argument, and so is useful as a testing tool) prints *just* "hello world", not "hello world bar".

Please, learn how the Windows command line argument system works before attempting to defend it.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:09 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (5 responses)

You can do it like this: 'whoami @($c -split " ")'. If you need this often then define a function.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:26 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (4 responses)

> whoami @($c -split " ")

No. Splatting does _NOTHING_ to address the quoting issue I raised. It does not allow you to pass arbitrary argument words to external programs. Your technique actually makes the problem _WORSE_ because you lose the identity of individual arguments, which CAN contain spaces and which you DO need to quote. You've utterly failed to even comprehend my basic point. This conversation is over.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:28 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (2 responses)

Can you show me what EXACTLY you are trying to do? Paste a piece of code here, including the parts which receive parameters.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:25 UTC (Fri) by nix (subscriber, #2304) [Link] (1 responses)

quotemstr is trying to show you faults in the Windows command-line parsing system by using whoami as a tool to indicate the contents of the first argument.

You are treating it as if he wants to run 'whoami' but doesn't know how.

When a finger points at the moon...

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:00 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

I understand that 'whoami' is used as an example.

I finally understood what the author wants and gave an answer: "@($var -split ' ')". It does the braindead thing that was requested.

Why braindead? Because good PowerShell scripts would just use an _array_ and 'splat' it when required, instead of constructing shell arguments as raw strings.

And no, this hack with splitting string into arguments is definitely not required to work with legacy code.

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:01 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

Try it. It does EXACTLY what you've requested - passes a string as a number of arguments, split by spaces.

Evolution of shells in Linux (developerWorks)

Posted Dec 16, 2011 3:33 UTC (Fri) by useerup (guest, #81854) [Link]

You err because you want PowerShell to act exactly the way POSIX shells do. PowerShell is actually a bit more structured and it may require you to do things in a slightly different way. The advantage is that many of the error-prone constructs in bash are avoided.

For your example try this:

PS>$foo="/groups","/user"
PS>whoami $foo

And then try
PS>whoami $foo[0]
PS>whoami $foo[1]

PowerShell actually integrates quite nicely with external programs; it does so even retaining (not re-interpreting) arguments.

In your example you told PS to pass a string containing "arg1 arg2" to an external program - which it did. You assume that the entire commandline is turned into text and re-interpreted (the way of POSIX shells).

As you can see above, PowerShell understands arrays and will readily pass the array items as discrete arguments. It's not harder - just a little different and a lot more robust and avoids the risk of injection vulnerabilities.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 15:01 UTC (Thu) by ccchips (subscriber, #3222) [Link] (27 responses)

No, it isn't. It's different. It's sort of a polyglot (some Perl, some Python, some bash, object pipelines instead of text pipelines).

I think they have some good ideas, but it has many deficiencies; for example, you have to go "around the block" to get byte-stream-oriented input redirection; it doesn't integrate well with *existing* tools that use text-based output. Cmdlet B may not be dependent on the byte-stream output format of cmdlet A, but it *is* dependent on cmdlet A's object definitions. Their object model is still very weak; for instance, try to use their own tools to get the last logon date for an Active Directory user.

They have problems with directory management. They don't have a working "du" type tool that produces objects as output, and the workarounds they do have barf if the paths are too long. They pride themselves on ease-of-discovery, but try to discover all those .net classes without going on the Web to look up the objects and their meanings.

On the other hand, when the object model does work, it's very easy to get things done, and you can count on data formats between pipeline elements.

I enjoy working with Powershell, but I also enjoy working with Bash, Perl, Python, and several other tools. I'd like to see a working Powershell implementation in the *nix world, or if nothing else, some interest in object-oriented (or mixed) pipelines.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 15:20 UTC (Thu) by felixfix (subscriber, #242) [Link] (2 responses)

Every once in a while I come across some program that spits out several lines per record, and wish there were some way to use it in a pipeline. What it needs is some way to group lines into single lines and break them apart again, and two new pipeline utilities come to mind. The collector would group input lines based on command line criteria, such as a count or a pattern, and print them as a single line formatted per more command line args. The splitter would split each input line into several lines based on similar command line args.

Then programs which print several lines per record could be used with pipelines. It wouldn't be perfect, but all it would take is the two utilities, which I am, alas, too lazy and/or uninspired to write. The problem comes up too seldom to take the time to write it, and I have always found simple work arounds.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 0:32 UTC (Fri) by wahern (subscriber, #37304) [Link]

You mean cut(1), paste(1), and join(1)?

They can be rather awkward to use, but I believe they do what you want.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 5:03 UTC (Fri) by jthill (subscriber, #56558) [Link]

sed does exactly what you want. For instance, to accumulate aptitude show's output one package per line: sed -nr '/^Package:/!{H;$!d};x;s/\n/\x0/gp'
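The same collector idea can be written as a self-contained sketch with awk, if sed's hold space feels opaque. The sample input below is made up; each record's lines are joined with tabs so the stream becomes one record per line:

```shell
# Made-up multi-line-per-record input, two records
input='Package: foo
Size: 10
Package: bar
Size: 20'

# Collector: start a new output line at each "Package:", fold the rest in with tabs
printf '%s\n' "$input" |
  awk '/^Package:/ { if (NR > 1) print rec; rec = $0; next }
       { rec = rec "\t" $0 }
       END { print rec }'
```

The output is one line per record, ready for grep, sort, or cut.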

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 17:05 UTC (Thu) by ccchips (subscriber, #3222) [Link] (8 responses)

I just encountered another serious (in my opinion) deficiency in Powershell.

I have a "cron job" (scheduled task) and I want the job to send *all* the output to a text file, *including* all error messages. The only way I could track down an error in my script was to run it from the console and read the error message (in red) from the screen.

Not good.

This is just one example where a potentially very powerful shell fails in critical situations. Object model or no, the Powershell team should *pay attention* to the needs of system administrators everywhere, not just the point-and-click/read the screen types.

And there's no good reason why migrating Bash users should have to go around the block several times to do things that have been extremely simple for years:

foo <bar.txt >result.txt

vs.

get-content....blablabla....out-file....

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:02 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (7 responses)

>I have a "cron job" (scheduled task) and I want the job to send *all* the output to a text file, *including* all error messages. The only way I could track down an error in my script was to run it from the console and read the error message (in red) from the screen.

Uhm. PowerShell supports standard text redirection. What exactly do you need?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:41 UTC (Thu) by ccchips (subscriber, #3222) [Link] (2 responses)

I need the equivalent of:

foo >bar 2>&1

to have *every* text line from standard output and *every* error line that would show up in red on the screen, as the result of the above command.
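For reference, this is the behaviour being asked for, shown as a bash sketch (`foo` here is a stand-in function that writes to both streams):

```shell
# Stand-in for a command that writes to both stdout and stderr
foo() { echo "normal output"; echo "error output" >&2; }

# Send stdout to the file, then point stderr at the same destination
foo > bar.txt 2>&1

# bar.txt now holds both lines; nothing reaches the screen
cat bar.txt
```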

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:58 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

You'd be surprised, but the equivalent of "foo >bar 2>&1" in PowerShell is:

> foo >bar 2>&1

Yep. It works just as in Unix.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:59 UTC (Thu) by ccchips (subscriber, #3222) [Link]

Ok, well I just tried "2>&1" on that job and it worked. Stupid me. I was looking for instructions rather than experimenting.

However, I did find a different problem; things go really wrong if you try something like this:

dir | get-content

because "get-content" takes a list of objects from the input pipe that are interpreted as file names.

Still, my first problem is solved, I have egg on my face, and so it goes. I never said I didn't like Powershell--I enjoy working with all of the common shells (except maybe MS-DOS Command prompt) and would like to see Powershell be more easily integrated with existing tools.

I suspect there is a way to get "dir | get-content" to act more like UNIX shells would, but we could argue all day about the merits.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:59 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (3 responses)

> Uhm. PowerShell supports standard text redirection.

What do you think this PowerShell pipeline puts into foo.txt?

echo hello > foo.txt

If you guessed ASCII "hello" or "hello\n" or "hello\r\n", you'd be wrong. Horribly wrong.

PowerShell converts everything to UTF-16-LE, prepends a BOM, and puts the result in foo.txt. This behavior is almost too horrible for words. PowerShell even mangles the output of *native* programs, e.g.

ipconfig > ipconfig-out.txt

In cmd, ipconfig-out.txt contains ASCII. When the same pipeline is run in PowerShell, it's UTF-16-LE-with-BOM. This encoding is more verbose, less commonly supported, and (because of the BOM) far less friendly to splicing and concatenation. This behavior is just awful.

As a lemma, PowerShell will also mangle the output of programs that produce non-text output: consider

ps2pdf < foo.ps > foo.pdf

PowerShell will take each "line" of ps2pdf output, convert it to UTF-16 as best it can, then dump the result in foo.pdf. The result looks like a PDF sent through a meat grinder.

(Don't tell me you're supposed to use some cmdlet for doing the conversion: the point of a shell is to work with existing programs. If I wanted to use a library, I'd use a different programming language.)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:14 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (2 responses)

That's the way Windows works. Sorry.

You can change default encoding to UTF-8 with "$OutputEncoding = New-Object -typename System.Text.UTF8Encoding". Binary mode is more tricky, you have to add "-encoding byte" everywhere.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:27 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (1 responses)

> That's the way Windows works. Sorry.

Thus my original point: I'll consider using PowerShell only after the critical issues I've identified are fixed. Until then, Unix shells (even running under Cygwin) are superior by far.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:51 UTC (Thu) by tialaramex (subscriber, #21167) [Link]

Windows exists in a parallel universe where at some moment soon a great wizard will appear and banish everything outside the Basic Multilingual Plane so that they can go back to UCS-2 where things were simpler (yet conveniently incompatible with stodgy old Unix).

People who are never going to be happy about LLP64, UTF-16, or various other arbitrary yet defensible differences from Unix will never be comfortable on Windows. This isn't even a religious war, it's some fundamental cultural difference just like the relationship between directions and time.

For me "bringing forward" a meeting will always make it sooner, and UTF-8 will always be the more intuitive and sensible encoding. So no Windows for me.

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:06 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link] (14 responses)

>Their object model is still very weak; for instance, try to use their own tools to get the last logon date for an Active Directory user.
Easy:
>Get-AdUser -Filter * | Format-Table LastLogon,UserPrincipalName

You'll get a list of last login timestamps and users names.

You can argue that PowerShell does not yet cover all possible functionality, but it's not a deficiency of the idea itself. Just a sign of relative youth.

Relative youth?

Posted Dec 10, 2011 6:45 UTC (Sat) by khim (subscriber, #9252) [Link] (4 responses)

You can argue that PowerShell does not yet cover all possible functionality, but it's not a deficiency of the idea itself. Just a sign of relative youth.

Nope. It's an Achilles' heel. PowerShell is half a decade old already, so you can not say it's all that young. It's just not a shell replacement; it belongs to a long list of "universal glue languages" (which were rarely all that universal). Besides LISP on the LISP Machines and Oberon on Oberon OS (the only examples where "universal languages" were almost fully universal indeed), this list includes things like REXX, AppleScript, VBScript, etc. They are good in their "area of expertise", but please don't try to mix them with shells - because they are not shells despite the hype.

Why? Because they assume programs will offer specialized interfaces just for that one flavor or scripting. But developers of a lot of programs just don't care enough to do that! They may provide some kind of command line switches and/or offer some textual output (because it's easy), but why should they bother to offer all these other things? This is not what they are paid for!

Some programs don't include any scripting support at all (in this case even bash can do nothing), but more often than not they do include some scripting - because their authors need it for development purposes. But it's as minimal as possible, because it's a side-show at best. And most such schemes and languages are horrible when they need to interact with programs which don't include nice-structured-interface-of-the-year. PowerShell is not an exception.

Relative youth?

Posted Dec 10, 2011 7:28 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link] (3 responses)

>Nope. It's an Achilles' heel. PowerShell is half-decade old already, so you can not say it's all that young. It's just not a shell replacement, it belongs to a long list of "universal glue languages" (which were rarely all that universal).

No. It IS the shell replacement. It also is a glue language.

>Why? Because they assume programs will offer specialized interfaces just for that one flavor or scripting. But developers of a lot of programs just don't care enough to do that! They may provide some kind of command line switches and/or offer some textual output (because it's easy),

Sure. And you can work with these programs just fine. It's not going to be as natural as working with native cmdlets, but it's good enough to work with legacy stuff.

For example, PowerShell's git integration is nicer than in bash/zsh and takes only a fraction of code for the similar functionality. See for yourself: https://github.com/dahlbyk/posh-git

>but why should they bother to offer all these other things? This is not what they are paid for!

Because writing a cmdlet is like 10 times easier than writing a command-line utility! It saves time! Especially if you are using .NET (which most large vendors already do at least for some functionality).

And it's already happening - all good software vendors for server-side apps on Windows already expose functionality using PowerShell. Like: VMWare, Amazon Cloud, MSSQL, etc.

It's apples to oranges...

Posted Dec 10, 2011 8:57 UTC (Sat) by khim (subscriber, #9252) [Link] (2 responses)

Because writing a cmdlet is like 10 times easier than writing a command-line utility!

Only if your program uses .NET - which is not always the case. In other cases you first need to write the command-line utility anyway and in addition you need to write the cmdlet. If you write the command-line utility anyway then why bother with a cmdlet at all?

Especially if you are using .NET (which most large vendors already do at least for some functionality).

Not especially. Only. If you don't use .NET then it's easier to write a command-line utility - and while large vendors are prone to misallocation of resources, even Microsoft is ready to scale back this abomination, as Windows 8 shows. But it'll probably be 10 more years till it's finally dropped...

And it's already happening - all good software vendors for server-side apps on Windows already expose functionality using PowerShell.

s/good/buzzword-compliant/

PowerShell is an obvious waste of resources, but it's not a zero-sum game: the most sensible solutions are not the ones which sell well, thus sometimes you need to invest in the most buzzword-compliant approach. But it only makes sense if this approach slays the buzzword. When it's no longer the case, support dries up quite fast. As the world becomes less Windows-centric and Windows becomes less .NET-centric, it makes less and less sense to spend resources on PowerShell.

People will probably not drop already written things (albeit it may happen later), but new developments... I doubt it.

It's apples to oranges...

Posted Dec 11, 2011 0:07 UTC (Sun) by raven667 (subscriber, #5198) [Link]

Ok, we get it, you don't like MS, .NET/C#, or PowerShell. The idea that MS or the industry in general is moving away from VM-style languages is just laughable, as is the idea that .NET is an "abomination". We can certainly have a discussion on the merits, but this isn't one.

It's apples to oranges...

Posted Dec 11, 2011 0:26 UTC (Sun) by Cyberax (✭ supporter ✭, #52523) [Link]

>Only if your program uses .NET - which is not always the case. In other cases you first need to write the command-line utility anyway and in addition you need to write the cmdlet. If you write command-line utility anyway then why bother with cmdlet at all?

Nope. VMWare doesn't use .NET but its PowerShell management interface is top-notch. git doesn't use PowerShell but posh-git beats bash_completion in usability.

Have you actually managed something on new Windows Server platforms? Or do you think that Windows Server is still used only for file-servers?

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 16:44 UTC (Mon) by ccchips (subscriber, #3222) [Link] (8 responses)

LastLogon: 129681773131629678
LastLogonTimestamp: 129681773131629678

Quest Software fixed this in their product, which is, fortunately "free beer."

Somehow, I think we've been trolled. Smoke signals?

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 17:27 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link] (7 responses)

Uhm. That's a timestamp in awful ActiveDirectory format. No need for third-party software.

http://msdn.microsoft.com/en-us/library/windows/desktop/m...

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 18:05 UTC (Mon) by ccchips (subscriber, #3222) [Link] (6 responses)

Ah, yes - so I hand the above link to a new system administrator who has to use Powershell, ask him to give me a list of Last Logon Dates for anyone who hasn't logged on in 30 days, and.....then what?

This issue should have been addressed *before* Microsoft released the Active Directory tools for Powershell, not after, and not by requiring some kind of conversion kluge.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 18:31 UTC (Mon) by ccchips (subscriber, #3222) [Link] (5 responses)

Oh, and I did do some research into this. It is possible to do, but I don't find the solution to be any indication of how great Powershell is compared to the other UNIX shells. Yeah, there's a system function to convert this, but again, we're supposed to be making it easier for administrators to do their work, not requiring them to do research into date/time conversions from a "standard" tool.

It is good, as far as I'm concerned, that Windows system administrators now have a decent shell to work with in their day-to-day activities, and I laud Microsoft for this. In my shop, it has been hard to get administrators to script *anything* before Powershell, as they were not willing to use Perl, Awk, Python, etc. And Powershell does have some interesting innovations. But it still strikes me as odd how a discussion of UNIX shells and their evolution turned into an argument about how great Powershell is compared to all the other methods people have used to perform system administration activities.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 18:48 UTC (Mon) by raven667 (subscriber, #5198) [Link]

But it still strikes me as odd how a discussion of UNIX shells and their evolution turned into an argument about how great Powershell is compared to all the other methods people have used to perform system administration activities.

Well, it didn't start out that way, there was just one "PowerShell is neat" post and in response a flurry of negative responses were written which lead to a lively discussion.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 21:08 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link] (3 responses)

That's mostly a problem of the tool itself. AD is just braindead in many respects. We just have to live with it.

Samba4 on Linux has the same behavior, btw.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 21:34 UTC (Mon) by ccchips (subscriber, #3222) [Link] (2 responses)

Interesting.

Does Samba4 or related utility have the fix?

I was a bit surprised to learn about the [System.DateTime]::FromFileTime() function in Powershell. I did a lot of research into this previously and didn't see that.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 21:41 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link] (1 responses)

No. Why should there be a fix?

LastLogon is a valid timestamp, just in a very braindead format (100-nanosecond increments since 1601).

Evolution of shells in Linux (developerWorks)

Posted Dec 13, 2011 16:20 UTC (Tue) by ccchips (subscriber, #3222) [Link]

There should be a fix because people need to use that information, not dig around on the Internet, figure out how braindead the timestamp is, and come up with a scheme to read it in human-readable form. Quest Software fixed it by providing a conversion method.

I was asking about fixing *accessibility to the information.* Which is far more important to me than what shell handles what arguments, and how the pipelines work.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:02 UTC (Thu) by cmccabe (guest, #60281) [Link] (12 responses)

The big deal with PowerShell, of course, is the use of objects and types rather than byte streams.

Linux has had that for decades. You can get that in Linux by opening up just about any scripting language and running the interpreter. Try the Ruby shell, the Python shell, or even the Scheme shell (scsh).

What would really be nice is some kind of standard data interchange format that could be shared between applications. For a while, people were pushing XML for that use, but XML is ugly as sin and overly complex. I think JSON is a better choice.

When I was working on Ceph, we implemented a --json flag for most of the tools that let them output JSON. Then you could manipulate it to your heart's content in the scripting language of your choice.

If we had a few more tools that let non-programmers manipulate JSON with the command line, and a --json option on popular programs, I think we could get most of the benefits of PowerShell on Linux.
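As a sketch of what that could look like with no dedicated JSON utilities installed, python3's stdlib can stand in as the filter. The `--json`-style output below is invented for illustration:

```shell
# Invented example of a tool's --json output, filtered with stdlib python3:
# print the names of entries whose "up" field is true
echo '[{"name":"osd.0","up":true},{"name":"osd.1","up":false}]' |
  python3 -c '
import json, sys
for d in json.load(sys.stdin):
    if d["up"]:
        print(d["name"])
'
```

A small set of such filters, packaged as proper commands, would make `--json` output composable in ordinary pipelines.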

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:10 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (7 responses)

Not really. PowerShell allows bidirectional communication while Unix pipes are traditionally unidirectional. Also, having all commands executing in the same address space is quite useful for a lot of stuff - I can pass gigabytes of image data without having it copied through pipes.

But yes, internally PowerShell is just a REPL console for a statically typed language with good introspection capabilities.

It's just cleanly implemented and with a lot of functionality. There were several attempts to do it in Unix but they died for a lack of infrastructure.

You can work with byte streams in PowerShell but it just feels unnatural after working with typed and structured objects.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:06 UTC (Thu) by cmccabe (guest, #60281) [Link] (3 responses)

> But yes, internally PowerShell is just a REPL console for a statically
> typed language with good introspection capabilities.
>
> It's just cleanly implemented and with a lot of functionality. There were
> several attempts to do it in Unix but they died for a lack of
> infrastructure.

Well, Perl, Python and Ruby don't seem dead to me. Yes, it would be nice if they supported static typing, but that's a different conversation.

> PowerShell allows bidirectional communication while Unix pipes
> are traditionally unidirectional. Also, having all commands executing in
> the same address space is quite useful for a lot of stuff - I can pass
> gigabytes of image data without having it copied through pipes.

Solaris had (has?) bidirectional pipes. Linux never implemented that, and it's probably a good thing on the whole.

The thing that I think you are missing is that a good shell needs to be designed to be useful to system administrators, not to programmers. Good programmers may be able to debug the race conditions, misconfigurations, and so forth that can result in bidirectional communication between modules. But system administrators will find the extra complexity to be a huge burden.

The genius of UNIX was that it tore down the wall between system administrators and programmers. This meant allowing visibility into the guts of the system. It meant that sysadmins could automate common tasks. Does PowerShell allow users to do that, or is it just another shrine built to another proprietary Microsoft programming framework?

There have been so many of those over the years-- OLE, COM, DCOM, ActiveX, etc. Ironically, the number of dead proprietary Microsoft programming frameworks is almost greater than the number of living open source ones!

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:24 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (2 responses)

>Well, Perl, Python and Ruby don't seem dead to me. Yes, it would be nice if they supported static typing, but that's a different conversation.

I'm talking about _shell_ in Python/Perl. I'm aware only of http://code.google.com/p/hotwire-shell/ which is kinda still alive, but not very active.

Also, static typing is essential for PowerShell because it allows full automatic introspection. So I can tab-complete LINQ queries without any additional hackery like in bash_completion.d

>The thing that I think you are missing is that a good shell needs to be designed to be useful to system administrators, not to programmers. Good programmers may be able to debug the race conditions, misconfigurations, and so forth that can result in bidirectional communication between modules. But system administrators will find the extra complexity to be a huge burden.

Good Windows sysadmins _love_ PowerShell, because it makes a lot of jobs easier. Also, a lot of companies start building their tools around it. VMWare has _very_ nice management interface for PowerShell, for example.

>The genius of UNIX was that it tore down the wall between system administrators and programmers. This meant allowing visibility into the guts of the system. It meant that sysadmins could automate common tasks. Does PowerShell allow users to do that, or is it just another shrine built to another proprietary Microsoft programming framework?

PowerShell is really a logical extension of Unix ideology, so it has all the advantages of Unix.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 2:31 UTC (Fri) by ccchips (subscriber, #3222) [Link] (1 responses)

I think I'll believe that when I can do:

[get a list of all computers that have tape drives] | [get-tape-label]

from Powershell against any of the many proprietary backup programs Windows users are faced with....

Will companies like CA, Seagate, GFI, and others rally 'round this object model and help us get our work done *without* using their stupid GUI applications?

I have my doubts, but go 'head, keep on truckin'.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 3:15 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

You're in luck.

We use http://www.veeam.com/ for VMWare backups. Works like a charm.

I have no idea how it works with tapes (I've not seen one for more than 10 years), but it can certainly list the targets for backups with a simple command.

So now please show me how to do this in Bash. For DriveXML working inside a VMWare virtual machine.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:20 UTC (Fri) by ccchips (subscriber, #3222) [Link] (2 responses)

> PowerShell allows bidirectional communication while Unix pipes are traditionally unidirectional.

Can you give us some examples or links to some examples of this bidirectional communication and its advantages/disadvantages? Only thing I have found so far is that you can use a .net class to set up a named pipe.

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:08 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

Bidirectional pipes are rarely used in PowerShell - it's a minor feature. More exactly, you can provide your own implementation for a pipe. Including pipes that use pigeon mail or encode data as smoke signals.

Evolution of shells in Linux (developerWorks)

Posted Dec 16, 2011 3:56 UTC (Fri) by useerup (guest, #81854) [Link]

I believe he may be referring to the fact that PowerShell not only allows objects to be passed through the pipes, but also allows you/the script to actually *interact* with the objects, i.e. calling methods.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 17:35 UTC (Fri) by jsanders (subscriber, #69784) [Link] (3 responses)

I wonder whether it would be useful to develop a set of json unix-like shell-like utilities, e.g. jls, jcp. I can't find anything like this available. Perhaps you could do something like "jls *.ls | jselect size" to print out the sizes of *.ls. Perhaps it would make a useful halfway house between python/perl and shell scripting.
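A throwaway version of such a `jselect` can be sketched in a few lines (the tool names and the one-object-per-line convention are hypothetical, per the comment above):

```shell
# Hypothetical jselect: print one field from a stream of JSON objects,
# one object per line
jselect() {
  python3 -c '
import json, sys
for line in sys.stdin:
    print(json.loads(line)[sys.argv[1]])
' "$1"
}

# Stand-in for what "jls *.ls" might emit
printf '%s\n' '{"name":"a.ls","size":120}' '{"name":"b.ls","size":64}' |
  jselect size
```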

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 1:16 UTC (Sat) by cmccabe (guest, #60281) [Link] (2 responses)

I actually really do think this would be handy. You could have something like jsort, juniq, jcat, etc. Also you would want something kind of like xquery that could pull an element(s) out of a JSON object.

It might make it possible to do some interesting and readable one-liners.

Evolution of shells in Linux (developerWorks)

Posted Dec 11, 2011 11:22 UTC (Sun) by HelloWorld (guest, #56129) [Link] (1 responses)

There was a project like that actually:
http://lwn.net/Articles/443473/
I haven't heard of it since though, and its github repo hasn't seen any activity since June.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 23:33 UTC (Mon) by cmccabe (guest, #60281) [Link]

TermKit is (was?) an extremely ambitious project basically involving a redesign from scratch of the whole terminal system. It seems like they wanted it to be GUI-based and mostly coded in HTML5.

What I am thinking of is just a set of utilities to manipulate JSON easily-- nothing more, nothing less.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 7:32 UTC (Thu) by ncm (guest, #165) [Link] (8 responses)

I only have one question about shells: Why does line-editing on a command pulled from the history, in bash, destructively edit the history? Do they all do that, going all the way back to ksh, or is this a bash abomination?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 8:33 UTC (Thu) by geuder (subscriber, #62854) [Link] (5 responses)

> line-editing on a command pulled from the history, in bash, destructively edit the history

Annoyance #1 with bash, I agree. I wonder whether there is an option to control that. Read the man page once, but I might have overlooked it.

Annoyance #2: If you press Ctrl-C and get your prompt somewhere other than the beginning of the line, then scroll back in history and edit, the outcome is not WYSIWYG but a mess.
This is 2011, not 1970, so it should be possible for the program to make sure it knows what the line on the screen looks like. I guess technically this is a GNU readline issue, so the fix needs to go there. Isn't history editing also provided by GNU readline? So the same might hold for Annoyance #1.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 11:57 UTC (Thu) by bjartur (guest, #67801) [Link] (1 responses)

I don't think there are many willing to step up and fix readline, after patching every mainstream VT100 emulator out there. The paradigm of redirecting debugging information, output and input echo to a single character stream is plain outdated.

Does anyone own any of the TeleTypes that might break if anything is changed? I, for one, have never seen one, and call for a rewrite. Keep stdin, stdout and stderr - but throw out the overmultiplexing and terminal emulation.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 13:28 UTC (Fri) by mpr22 (subscriber, #60784) [Link]

I'm not clear what your final paragraph is calling for a rewrite of. Could you clarify?

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 13:01 UTC (Thu) by grawity (subscriber, #80596) [Link] (2 responses)

> Annoyance #1 with bash, I agree. I wonder whether there is an option to control that. Read the man page once, but I might have overlooked it.

In ~/.inputrc, "set revert-all-at-newline on"

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 0:46 UTC (Fri) by ncm (guest, #165) [Link] (1 responses)

WOW

This made my day. Thank you, grawity. After decades of suffering, I'm finally happy with bash.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:28 UTC (Fri) by nix (subscriber, #2304) [Link]

You might want to read the readline info book sometime to avoid some more decades of suffering :)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 12:07 UTC (Thu) by nye (subscriber, #51576) [Link]

>Why does line-editing on a command pulled from the history, in bash, destructively edit the history?

I generally like this behaviour because it means you can go up or down in history (for reference) while editing a command line, without losing your edits in progress.

I guess you already know this, but as a workaround there's always alt+r, for 'undo all changes to this line'.

Also, there's alt+# for 'comment out this line and add it to my history', which is very useful when you've been editing a line in your history, decided you don't want to execute it now, don't want your history destructively edited, but might want to come back to it later.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 15:06 UTC (Thu) by felixfix (subscriber, #242) [Link]

It only modifies the history line if (and only if) you don't hit ENTER before leaving that line.

Try this:

echo "miss daisy"

Hit up arrow, add ' crazy' after daisy, and hit ENTER.

Use up arrow to look at history -- both lines are intact.

Pick one, edit it, then use up or down arrow to move away, then come back, and you will find your edit intact. Hit ENTER.

Use up arrow again, and you will find the modified line, not the original line.

As near as I can guess, it only does this because the alternative is that if you edited a line and moved away, perhaps to look at a different line to remember what was typed, then moved back to finish your edit, your edit would have been discarded. I think that would be just as annoying as corrupting history, just at different times.

I can't say I either like it or despise it; it's a "feature" I take advantage of, sometimes happy to have it, sometimes annoyed at not having a true, reliable history of what was actually executed.


Copyright © 2011, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds