Evolution of shells in Linux (developerWorks)
Ken Thompson (of Bell Labs) developed the first shell for UNIX, the V6 shell, in 1971. Similar to its predecessor in Multics, this shell (/bin/sh) was an independent user program that executed outside of the kernel. Concepts like globbing (pattern matching for parameter expansion, such as *.txt) were implemented in a separate utility called glob, as was the if command to evaluate conditional expressions. This separation kept the shell small, at under 900 lines of C source.
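The division of labor described above can be sketched in a few lines: a hypothetical stand-alone expander in the spirit of the V6 glob utility, which expands the patterns itself and then replaces its own process with the target command. This is illustrative only, not the historical code, and the names are made up.

```python
# A sketch of the V6-era division of labor: a standalone "glob" program
# expanded wildcard patterns itself and then executed the target command,
# so the shell proper needed no globbing code. Illustrative only.
import glob as globmod
import os
import sys

def expand(patterns):
    """Expand each pattern; pass non-matching patterns through unchanged."""
    expanded = []
    for pat in patterns:
        matches = sorted(globmod.glob(pat))
        expanded.extend(matches if matches else [pat])
    return expanded

def run_with_glob(argv):
    cmd, patterns = argv[0], argv[1:]
    # Replace this process with the command, as the V6 glob utility did
    os.execvp(cmd, [cmd] + expand(patterns))

if __name__ == "__main__":
    run_with_glob(sys.argv[1:])
```

Invoked as `myglob ls *.txt`, the wrapper, not the shell, decides what `*.txt` means, which is exactly why each V6-era program could stay small.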
Posted Dec 7, 2011 23:22 UTC (Wed)
by JoeBuck (subscriber, #2330)
[Link] (5 responses)
Posted Dec 8, 2011 0:37 UTC (Thu)
by theophrastus (guest, #80847)
[Link] (4 responses)
Posted Dec 8, 2011 1:48 UTC (Thu)
by JoeBuck (subscriber, #2330)
[Link]
Posted Dec 8, 2011 14:51 UTC (Thu)
by felixfix (subscriber, #242)
[Link] (2 responses)
Posted Dec 8, 2011 19:19 UTC (Thu)
by jzbiciak (guest, #5246)
[Link] (1 responses)
Posted Dec 9, 2011 12:17 UTC (Fri)
by nix (subscriber, #2304)
[Link]
Posted Dec 8, 2011 0:37 UTC (Thu)
by lkundrak (subscriber, #43452)
[Link] (4 responses)
Posted Dec 8, 2011 5:04 UTC (Thu)
by tshow (subscriber, #6411)
[Link] (1 responses)
Posted Dec 8, 2011 14:35 UTC (Thu)
by nix (subscriber, #2304)
[Link]
I have never seen another program written in BOURNEGOL, which is a real pity: the more programming becomes a matter of manipulating something deeply disgusting like that, the fewer people will want to be programmers and the higher our paycheques go :P
(similarly, I propose rotating knives and Great Cthulhu in job interviews.)
*removes tongue from cheek*
Posted Dec 8, 2011 6:28 UTC (Thu)
by BrucePerens (guest, #2510)
[Link]
I remember being recruited for a company where they were pushing that we would get to work with Steve Bourne! Now, I have never met him and don't know anything about him except for how much I hated that MACRO ALGOL when I was trying to fix something in v7 shell. I don't think I explained to the recruiter why the prospect of working with him had me nonplussed. But I'm sure he heard about it eventually.
Posted Dec 8, 2011 14:33 UTC (Thu)
by nix (subscriber, #2304)
[Link]
Posted Dec 8, 2011 5:47 UTC (Thu)
by imgx64 (guest, #78590)
[Link] (17 responses)
I'm glad this is less true these days; most people just use Bash because it's the default on their distribution and never think about it. I wasted too much time thinking about editors and trying different ones until I settled on Vim. I do not want to repeat that for shells.
Although to be honest, I did contemplate ZSH at one point, and I liked many of its features. The ones that stand out the most are the autocomplete (for example, autocomplete for umount only shows mounted devices), and ZLE (the line editor, I wish Bash/Readline could edit multi-line history as easily).
But then, the good is the enemy of the perfect (apologies to Voltaire), so I decided that Bash is good enough, and that switching and learning all these new features is not worth my time.
Posted Dec 8, 2011 6:47 UTC (Thu)
by lindahl (guest, #15266)
[Link] (12 responses)
complete -A hostname ssh ping traceroute mtr
It doesn't seem to have an 'action' for mounted filesystems.
Still, I have to finish by mentioning that you should definitely switch to emacs :-)
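For the umount case, bash's programmable completion does let you script the missing "action" by hand. A minimal sketch (the function name `_umount_mounts` is made up here) that offers only currently mounted mount points, read from /proc/mounts:

```shell
# Hypothetical completion helper for umount: offer only currently
# mounted mount points as candidates, taken from /proc/mounts.
_umount_mounts() {
    local cur mounts
    cur=${COMP_WORDS[COMP_CWORD]}
    # The second field of each /proc/mounts line is the mount point
    mounts=$(awk '{print $2}' /proc/mounts)
    COMPREPLY=( $(compgen -W "$mounts" -- "$cur") )
}
complete -F _umount_mounts umount
```

With this sourced into a bash session, `umount /<tab>` offers only mounted filesystems, roughly what zsh ships out of the box.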
Posted Dec 8, 2011 7:24 UTC (Thu)
by wahern (subscriber, #37304)
[Link] (5 responses)
Posted Dec 8, 2011 8:42 UTC (Thu)
by thedevil (guest, #32913)
[Link] (2 responses)
Posted Dec 8, 2011 9:31 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (1 responses)
Posted Dec 8, 2011 18:33 UTC (Thu)
by clump (subscriber, #27801)
[Link]
Posted Dec 8, 2011 10:26 UTC (Thu)
by imgx64 (guest, #78590)
[Link]
But then again, I don't think you'll really "use bash when bash learns RPROMPT", simply because there is no reason to use Bash if you're happy with Zsh[1].
[1] I don't want to start a shell wars thread, but I'm curious if Bash has any advantages over Zsh other than ubiquity and the user being familiar with Readline (has a customized .inputrc for example).
Posted Dec 8, 2011 14:42 UTC (Thu)
by nix (subscriber, #2304)
[Link]
Posted Dec 8, 2011 14:40 UTC (Thu)
by nix (subscriber, #2304)
[Link] (5 responses)
(But it is seriously awesome despite the ridiculous overdesign.)
Posted Dec 8, 2011 14:50 UTC (Thu)
by nye (subscriber, #51576)
[Link] (1 responses)
Can you give an example of the kind of autocompletion that can be done in zsh but not in bash?
Posted Dec 8, 2011 16:44 UTC (Thu)
by nix (subscriber, #2304)
[Link]
bash has nothing remotely comparable.
It is *crazy* flexible, so flexible that there is an autoloaded 'compinit' function just so that normal mortals stand a chance of configuring its *default* setup. (This is the 'zshcompsys' completion system, btw, not the 'zshcompctl' system, which is akin to bash's, and is obsolete.)
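For reference, the newer completion system described above is switched on with a couple of lines in ~/.zshrc; the zstyle lines are optional illustrative settings, not requirements:

```shell
# Enable the newer zsh completion system (zshcompsys, not the obsolete
# zshcompctl interface)
autoload -Uz compinit
compinit

# Optional zstyle tweaks; compinit works without them
zstyle ':completion:*' menu select                  # navigable menu of matches
zstyle ':completion:*' matcher-list 'm:{a-z}={A-Z}' # case-insensitive matching
```

Everything beyond those two autoload/compinit lines is configuration of the default setup, which is where the flexibility (and the complexity) lives.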
Posted Dec 8, 2011 15:48 UTC (Thu)
by joey (guest, #328)
[Link] (1 responses)
Posted Dec 14, 2011 0:01 UTC (Wed)
by tertium (guest, #56169)
[Link]
Posted Dec 8, 2011 23:34 UTC (Thu)
by nevyn (guest, #33129)
[Link]
That's probably at least 50% of the reason I still use zsh, the other 90% being that I configured zsh like 16 years ago and am happy to not have to configure bash when "yum install zsh" works instead :).
Posted Dec 8, 2011 6:56 UTC (Thu)
by geuder (subscriber, #62854)
[Link] (1 responses)
autocomplete behaviour is not really built in; more exactly, the built-in default behaviour is not suitable for many commands, e.g. umount. Bash does offer an API for everybody (i.e. every command) to get customized autocomplete behaviour.
I have always wondered why there are so many commands that do not install suitable autocompletion scripts for bash. But hey, it's open source. Maybe we should comment less and code more... Contributing those scripts might be more laborious than writing them, though: they don't just belong to bash, you have to work with every command's upstream.
Posted Dec 8, 2011 14:43 UTC (Thu)
by nix (subscriber, #2304)
[Link]
Posted Dec 8, 2011 8:28 UTC (Thu)
by rvfh (guest, #31018)
[Link]
Ubuntu does that for me :-) Just checked on 10.04 at work.
Posted Dec 8, 2011 10:13 UTC (Thu)
by fb (guest, #53265)
[Link]
At multiple moments in the 90s and early 00s, I got so annoyed with tcsh (or with it not being pre-installed in many places where I had no su powers) that I tried to switch to bash. Every time I got so annoyed with bash that I went back to tcsh as my main shell.
IIRC bash had hard limitations at what I could do in a prompt (short of running an embedded command), and at the time it had no completion.
After learning about ZSH, I lived happy ever after. (Note that I learned about it from a friend talking about a shell at which I could play Tetris.)
But you are right about 'good enough', there is a time in life where you can afford to be playing around with different shells, editors etc; later the cost and benefit of migration just don't make sense.
[...]
FWIW, ZSH 4.3.13 was released yesterday.
Posted Dec 8, 2011 6:47 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (156 responses)
Microsoft does a wonderful job with PowerShell; it's getting better than anything Unix has ever had.
Posted Dec 8, 2011 9:10 UTC (Thu)
by tuxmania (guest, #70024)
[Link] (68 responses)
I have used PS a fair bit, and the thing that stands out is how much work has been put into being different for the sake of being different, not better.
If Microsoft had instead ported bash to Windows, they would have an army of good scripters. Now, the MS fanboys I know just sneer at PS and discard it as "that crap from Linux", and the people who would know how to handle it are put off by its stupidity. It's a fail-fail situation, sadly.
Posted Dec 8, 2011 9:36 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (67 responses)
Posted Dec 8, 2011 9:54 UTC (Thu)
by tuxmania (guest, #70024)
[Link] (66 responses)
Many things are much harder to do in PowerShell despite this "structured data" holy grail you talk about. Why? Because third-party support for PowerShell is abysmal, and even Microsoft tends to avoid it like the plague in many places, especially where there is a GPO for roughly the same thing somewhere, making it next to impossible to do what you really need to do.
Posted Dec 8, 2011 10:13 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (65 responses)
Posted Dec 8, 2011 10:43 UTC (Thu)
by mpr22 (subscriber, #60784)
[Link] (12 responses)
Posted Dec 8, 2011 10:51 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (11 responses)
Posted Dec 8, 2011 18:09 UTC (Thu)
by niner (subscriber, #26151)
[Link] (10 responses)
Having other conventions does not help magically. It's adherence to such conventions that makes a system work.
Posted Dec 8, 2011 20:29 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (9 responses)
> Having other conventions does not help magically. It's adherence to such conventions that makes a system work.
By the way, there's another, related problem in Unix shell design: the shell does the globbing, but it doesn't interpret the arguments, as that is left to the program being executed. But the program has no way to know whether an argument was produced by globbing or not, which gives you trouble when you have files named --foo around. It's all so broken...
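The problem described here is easy to reproduce; a quick sketch in a throwaway directory:

```shell
# A file named "--foo" is expanded by the shell like any other name,
# and the invoked program cannot tell it came from a glob.
cd "$(mktemp -d)"
touch -- --foo bar

# ls parses the expanded "--foo" as an (unknown) option and errors out:
ls * 2>/dev/null || echo "ls choked on --foo"

# The "--" convention ends option parsing, so both files are listed:
ls -- *
```

The `--` convention is the standard workaround, but it only helps when the caller remembers to use it, which is the point being made.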
Posted Dec 9, 2011 2:30 UTC (Fri)
by vonbrand (subscriber, #4458)
[Link] (8 responses)
Right. Because you never have to sort stuff separated by anything but TAB. BTW, you can sort version-wise by sorting numerically and separating on '.'.
That is exactly the reason why many programs interpret "--" as "end of switches; the rest is arguments". Besides, I have had the dubious pleasure of working with systems where each program was in charge of interpreting globs, and not too surprisingly each one did it its own way (or not at all), to all-around confusion.
This is a command interpreter for interactive use we are talking about, for crying out loud. That you can do a surprising amount of programming in it is a welcome bonus, so you don't have to grab a full-fledged programming language all the time (in many cases you wouldn't bother, and would do it by hand or just give up). And most of the "problems" mentioned are just that the programs involved aren't built for easy chaining (can't ask for one-line records, hard-to-parse output, ...), not some "Unix failure" per se.
Posted Dec 9, 2011 8:56 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (7 responses)
Posted Dec 9, 2011 10:24 UTC (Fri)
by nicku (guest, #777)
[Link] (6 responses)
Posted Dec 9, 2011 10:29 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (5 responses)
> I and my workmates use this whenever required.
Posted Dec 9, 2011 10:33 UTC (Fri)
by dlang (guest, #313)
[Link]
And how do you define 'shell programmer'?
I would say that people who make their living on a team working with the shell (sysadmin types) have a pretty good chance of knowing this. People who are tinkering and learning alone are less likely to have picked it up, but such people are far more likely to have gaps of all kinds in their knowledge.
Posted Dec 9, 2011 12:10 UTC (Fri)
by mpr22 (subscriber, #60784)
[Link] (3 responses)
Posted Dec 9, 2011 12:32 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (2 responses)
Posted Dec 9, 2011 13:06 UTC (Fri)
by dlang (guest, #313)
[Link] (1 responses)
People keep trying to produce things that can be extended, via config files of various kinds, to do things that weren't initially programmed in, but eventually every one of these config files grows into (or adopts) some sort of programming language.
Posted Dec 9, 2011 13:23 UTC (Fri)
by HelloWorld (guest, #56129)
[Link]
Posted Dec 8, 2011 10:53 UTC (Thu)
by lkundrak (subscriber, #43452)
[Link] (1 responses)
Thank you.
Posted Dec 8, 2011 11:03 UTC (Thu)
by HelloWorld (guest, #56129)
[Link]
Posted Dec 8, 2011 13:45 UTC (Thu)
by cortana (subscriber, #24596)
[Link] (30 responses)
...
I wrote that paragraph before I noticed that I already have a sort-dctrl program on my system; it's part of dctrl-tools. Unfortunately it doesn't actually work with aptitude's output; aptitude doesn't output deb822 data... it doesn't use a . to represent blank lines in its fields. :)
Anyway, I actually agree that PowerShell is pretty cool. I find it utterly unusable, however, because it's still welded to the utterly dreadful excuse for a terminal emulator that is conhost.exe. In 2011, essential features such as multiple tabs, run-based selection and the ability to click URLs are entirely absent, and the features that do exist, such as resizing windows, block-based selection and changing the font/colours, are buried behind a UI that seems actively designed to get in my way, making me want to nuke the computer and install Debian rather than be forced to use it for even one minute longer.
Posted Dec 8, 2011 14:54 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (27 responses)
> I wrote that paragraph before I noticed that I already have a sort-dctrl program on my system; it's part of dctrl-tools. Unfortunately it doesn't actually work with aptitude's output; aptitude doesn't output deb822 data... it doesn't use a . to represent blank lines in its fields. :)
> Anyway, I actually agree that PowerShell is pretty cool. I find it utterly unusable, however, because it's still welded to the utterly dreadful excuse for a terminal emulator that is conhost.exe.
Posted Dec 8, 2011 15:35 UTC (Thu)
by cortana (subscriber, #24596)
[Link] (10 responses)
I'm not saying that PowerShell isn't terribly clever and all that, but I don't see every tool I use on a daily basis being rewritten to output its data in some standard structured format, and consume data in that format. As a result, actually using Powershell is annoying because it comes with some nice built-in stuff, but not enough third-party programs bother to make use of it. At best you end up with a bunch of scripts that parse the structured output of other programs and turn them into whatever it is that PowerShell uses internally so that Sort-Object & co. will deal with them. At that point you may as well just use a programming language that provides a decent selection of data structures in the first place (e.g., python, perl), with a module that gives you the data you want in such a structure.
Finally, and to digress, thanks for the link to Console2. Everyone who uses Windows should have it installed. Although it makes using the command line on Windows a much less frustrating experience, it doesn't do all of what I want and sadly introduces its own quirks into the mix.
It's funny you say that the lack of a standard structured data format is a fundamental problem with Unix; after many years of trying various conhost replacements, I've come to the conclusion that there is in fact something terribly wrong within some fundamental layer of Windows itself that makes it impossible to implement a decent terminal emulator. If that were not the case then surely someone would have done so by now; the awfulness of the command prompt on Windows is not so much an itch that needs scratching as a foot-long needle through the eye that inflicts a constant searing pain upon all users of the platform. Whereas I think the problem with a standard for structured data is that there are already plenty of standards to choose from: line-based character-separated or deb822 if you merely want a flat list of records with attributes; JSON or XML if you want something that can represent a tree of data, to name but a few. The power of Unix's use of unstructured data is that you can pick and choose which of these formats, plus many others, to use when gluing together other bits of software.
Posted Dec 8, 2011 16:07 UTC (Thu)
by halla (subscriber, #14185)
[Link]
Posted Dec 8, 2011 16:47 UTC (Thu)
by nix (subscriber, #2304)
[Link]
Posted Dec 8, 2011 19:16 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (4 responses)
Standard _console_ _emulator_ on Windows definitely sucks, that's true. But the console layer itself is quite robust and easy to use.
For example, there is no freaking way in Unix consoles to detect whether a modifier key is pressed. Then there's the fact that Unix consoles are in reality the result of bolting features onto something that started out as line-printer output.
As a result, making applications with a good text UI in the console is close to impossible. That's why Midnight Commander is such a buggy POS. In fact, Unix consoles are so bad that tmux/screen had to implement terminal emulators working on top of terminal emulators, because it's impossible to reliably track the state of a console.
I really really love one good Windows console application - FAR Manager. It's far ahead of everything else on the planet for console-style work.
Posted Dec 8, 2011 21:19 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (3 responses)
Are you on the payroll, or are you just completely unacquainted with history and actual programming?
The Windows console layer has no pseudoterminals. That's why there are a dozen remote-command-invocation facilities in Windows, and it's why none of them works right. It's why Console2 fails badly when talking to some programs (e.g., cmd) --- Windows programs are forced to use pipes instead of consoles, so programs run under a console other than conhost itself don't even know they're being used interactively, resulting in various horrible issues the Unix world fixed 20 years ago.
Posted Dec 8, 2011 21:33 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (2 responses)
>The Windows console layer has no pseudoterminals.
It would have been OK, but terminal emulators suffer from terminal schizophrenia. They try to act as if they are line printers while at the same time trying to act as if they are driving a VT100 console. Which fails badly.
In Windows consoles are just consoles. I.e. a screen buffer on which you can draw text - and that's basically all. You can do whatever you want with it.
Instead of pseudoterms you can create child consoles (including invisible ones) and use them to start other programs. I'm not sure why console2 doesn't do it.
Posted Dec 8, 2011 21:44 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (1 responses)
Consider using the VTE widget from GNOME: http://developer.gnome.org/vte/0.30/
> It's essentially a channel to transmit bytecode which is executed by terminal emulator to draw text.
Yes, the unix terminal system uses in-band signaling. In-band signaling has advantages and disadvantages with respect to out-of-band signaling, but in the case of the tty layer the advantages outweigh the disadvantages, which have all been overcome. The chief advantage is being able to use a terminal-agnostic channel (like a TCP connection) over which a terminal program and a terminal emulator can communicate. The Windows world has no equivalent because its console uses opaque out-of-band signaling.
> Which fails badly.
No, the terminal system actually works very, very well in practice.
> Instead of pseudoterms you can create child consoles (including invisible ones) and use them to start other programs. I'm not sure why console2 doesn't do it.
The reason console2 doesn't use the technique you describe is that it doesn't work. Yes, you can create "child consoles" and associate processes with them, but there's no "master side" interface to these child consoles: there is no way to extract text from them, to see what your child programs are attempting to output and output it yourself in a controlled manner. Yes, you can scrape the console, but the scraping process has inherent race conditions. (And scraping is messy besides: you're complaining about terminal control codes?)
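The in-band signaling discussed above is easy to see from a shell: the control codes are ordinary bytes in the output stream, which is why they survive pipes, files, and ssh connections unchanged and are interpreted only at the receiving terminal emulator.

```shell
# In-band signaling in miniature: the color change travels inside the
# same byte stream as the text, so it works over any terminal-agnostic
# channel (a pipe, a file, an ssh connection).
printf '\033[1;31mred and bold\033[0m back to plain\n'

# The control codes are ordinary bytes; od shows the ESC (033) characters:
printf '\033[1;31mx\033[0m\n' | od -c | head -n 2
```

The same bytes scraped out of a Windows console screen buffer would carry no such information, which is the asymmetry being argued about here.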
Posted Dec 8, 2011 21:54 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link]
I want it to work in plain-text consoles (over SSH). There are more than enough GUI file managers. Right now I'm trying to layer it over tmux, and I'm slowly making progress.
>Yes, the unix terminal system uses in-band signaling. In-band signaling has advantages and disadvantages with respect to out-of-band signaling, but in the case of the tty layer, the advantages outweigh the disadvantages, which have all been overcome.
I don't mind the idea of transmitting bytecode for drawing. I just don't like the way it works right now. It's messy, ugly, buggy and so on.
Have you ever tried to read the code of terminal emulators? Nightmares of escape sequences still plague my dreams :)
>The reason console2 doesn't use the technique you describe is that it doesn't work. Yes, you can create "child consoles" and associate processes with them, but there's no "master side" interface to these child consoles: there is no way to extract text from them, to see what your child programs are attempting to output and output it yourself in a controlled manner. Yes, you can scrape the console, but the scraping process has inherent race conditions. (And scraping is messy besides: you're complaining about terminal control codes?)
You can associate a child console with a parent, which would make it visible. But yes, I guess it's one weak spot. As far as I remember, conman (console manager) worked around it by intercepting console-layer calls, injecting a DLL into the address space of child processes.
If I were to design a console layer right now, I'd have chosen the Windows model with an explicitly defined marshalling protocol.
Posted Dec 8, 2011 20:50 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (2 responses)
Posted Dec 8, 2011 22:15 UTC (Thu)
by ebiederm (subscriber, #35028)
[Link] (1 responses)
Posted Dec 9, 2011 12:23 UTC (Fri)
by nix (subscriber, #2304)
[Link]
Posted Dec 8, 2011 23:13 UTC (Thu)
by GhePeU (subscriber, #56133)
[Link] (15 responses)
I don't find 'ps -Ao user,pid,ppid,time,command | sort -k3n' particularly messy.
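For comparison, here is a minimal sketch of the structured-data style being argued about, in Python: ps-style columns (hard-coded sample output, so the example is self-contained) are parsed into dicts and sorted on a typed field rather than a positional text key.

```python
# Sketch of the "structured data" alternative to 'ps ... | sort -k3n':
# parse the columns once into dicts, then sort on a typed field.
sample = """\
USER       PID  PPID     TIME COMMAND
root         1     0 00:00:01 /sbin/init
root       212     1 00:00:00 /usr/sbin/sshd -D
alice      950   212 00:00:00 sshd: alice@pts/0
alice      951   950 00:00:00 -bash
"""

def parse_ps(text):
    lines = text.splitlines()
    header = lines[0].split()
    for line in lines[1:]:
        # Split on whitespace, but keep COMMAND (the last column) intact
        fields = line.split(None, len(header) - 1)
        yield dict(zip(header, fields))

for r in sorted(parse_ps(sample), key=lambda r: int(r["PPID"])):
    print(r["PID"], r["PPID"], r["COMMAND"])
```

The pipeline version sorts column 3 as text all the way through; the dict version names the field and gives it a type, which is essentially what Sort-Object buys PowerShell users.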
Posted Dec 9, 2011 9:06 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (14 responses)
Posted Dec 9, 2011 9:28 UTC (Fri)
by khim (subscriber, #9252)
[Link] (13 responses)
Not if your programs continue to produce "untyped byte streams". PowerShell is a classic example of a well-known problem. With bash you need to know how to handle the mess which existing programs produce and consume. With PowerShell you still need to know that (see lots of threads above about how to handle it; it's as ugly as in bash, if not uglier), but you also have this nice, user-friendly way which does not work all that often, yet is presented as the solution "for the future". Just like all the other "grand unification schemes" before and after it, it'll fail, so to say that PowerShell is better than bash is crazytalk: it's strictly worse, and this is very easy to prove...
PowerShell must die, but I'm content, because it'll do just that anyway. It'll take a long time, but now, when the madness around the JVM and the CLR is slowly dying because people understand that they were duped, it's only a matter of time. It'll be interesting to see whether Microsoft will survive the destruction of its .NET dream or die off with it too. I'd be delighted to see that, but I don't hold my breath: Microsoft still looks way too resilient for that. Albeit lately it does so many stupid things that a die-off is actually likely. We'll see.
Posted Dec 10, 2011 2:52 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
[Link] (12 responses)
PowerShell _is_ a unified language which interoperates with all other .NET languages. It does allow transferring structured data in a unified way. UNIX people should sometimes leave their self-righteousness at the door and actually check what other people are doing.
PowerShell is definitely more powerful than bash or zsh because IT IS BUILT ON A GOOD FOUNDATION. It's that simple.
Bash knows nothing but byte streams. That's nice - for 1973. You can construct pipes in bash and then try to use them to do something. But:
Posted Dec 10, 2011 6:18 UTC (Sat)
by khim (subscriber, #9252)
[Link] (4 responses)
In other words: it's LISP Machines all over again. In a hypothetical world where everyone is using .NET it may even be a good thing. In the real world, where the .NET rage is on the decline and it's now clear that .NET will not ever take over the world... not so much. UNIX people watched from the sidelines when the first coming of LISP Machines came and went. They'll do that a second time (and a third time, if that ever happens); they are patient guys. They'll pick up some good ideas from the whole sad story, as usual, but the whole "let's redo everything from scratch" thing is just not what UNIX people plan to ever do.
It's more powerful, but it's more complex, too. You may as well say that the emacs prompt is more powerful than bash, and you'll be right. But it only helps you if you want to live in "Emacs OS". When you need to interact with the real world it's more complex than bash or zsh, because it needs to do what bash/zsh are doing, and it needs to do it in a way which makes it easy to do things in its own world. Sure, it's possible to use emacs as a shell replacement, but few people do. PowerShell is the same way, but unlike Emacs it's just a shell wannabe, nothing else, so I'm not sure it'll survive when all the rage about this generation of LISP Machines dies off.
Long list of arguments skipped. Just note how everything you've cited was solved in "the first coming" (on LISP Machines), too. The difference is in scale: the first time, tens of millions were burned, so only thousands of machines were produced and the whole saga took about ten years on the sidelines. The second time, the architecture astronauts had billions, so the whole story took longer and the rage was raised higher. The result is the same (one company was burned to a crisp; the second one is stagnating, but is more or less ready to throw all these baubles out), it just took longer.
When the hype dies off, PowerShell will either die with .NET or be used in some limited settings, while bash will live on as a mainstream tool (albeit maybe with a split personality, if Apple continues to fear GPLv3). P.S. Note that I'm not saying .NET will die off totally: the "first coming" left Emacs behind; I'm pretty sure the "second coming" will leave a few IDEs behind, too. Apparently that's the niche where all your vaunted advances are really helpful.
Posted Dec 10, 2011 7:18 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
[Link] (3 responses)
Nothing has ever come close to PowerShell in functionality and usability for system shells. And no, old elisp shells or Rexx scripts ARE NOT the answer. They don't offer the two most powerful features of PowerShell: introspection and structured pipelines.
>It's more powerful but it's more complex, too. You may as well say that emacs prompt is more powerful then bash - and you'll be right. But it only helps you if you want to want in "Emacs OS". When you need to interact with a real world it's more complex then bash or zsh because it needs to do what bash/zsh are doing and it needs to do it in a way which makes it possible to easily do things in it's own world.
Sorry, but PowerShell can do ANYTHING bash can do. Without exceptions. Some things might be clumsier than in bash, because PowerShell is not bash and things are done differently there.
And PowerShell even beats bash at its own game! PowerShell's integration with git is much nicer than bash's: https://github.com/dahlbyk/posh-git - and it achieves similar functionality in a fraction of the lines of code. A few screenshots can be seen here: http://markembling.info/2010/03/git-powershell-revisited
Posted Dec 10, 2011 8:41 UTC (Sat)
by khim (subscriber, #9252)
[Link] (2 responses)
They do offer introspection, and as for "structured pipelines"... that is only important if you want to pretend you are replacing the "shell proper". So what? Any Turing-complete language with support for fork/exec (or CreateProcess on Windows) can do the same. And that's the point: PowerShell can not replace bash, so bash will survive while PowerShell will die (or at least be relegated to niche work).
If you don't like the LISP Machines analogy, then I have another one for you: the Space Shuttle. This extremely expensive program was invented to replace the old rockets: Atlas, Delta, Saturn, etc. Just like PowerShell, it promised the world and delivered very little¹. After all the hype and all the expense it was only able to replace a few of them, the ones which were forcibly killed to "open the road for the future". Today the Space Shuttle is history and in its place is a huge glaring void, while the people who were sensible enough to continue using "old school" tools (Delta, Proton, Soyuz, etc.) are still around and doing what the Space Shuttle was supposed to do.
It does not matter how many lines of code it requires. The same story as with the Space Shuttle: sunk costs. When people try to justify the Space Shuttle craze they compare the development costs of the Saturn V² and the Space Shuttle, but this is completely wrong: it was not possible to get the money spent on the Saturn V back, thus you need to compare the ongoing costs of the Saturn V with the new development costs of the Space Shuttle, and there is no comparison. CMD/bash is here, it's not going away, so programs will need to support it anyhow. This means all additional developments (things like cmdlets) should be counted as extra.
¹) Well, one nice property of this stupidity was the fact that they managed to convince the USSR to waste a lot of resources on the mirror project, and if this was the goal then the project can be considered a success.
²) Why Saturn? Well, the alternative to the Space Shuttle idiocy was continuation of the Saturn program, so it's only natural to compare the Space Shuttle to Saturn.
Posted Dec 10, 2011 13:15 UTC (Sat)
by dlang (guest, #313)
[Link]
Posted Dec 11, 2011 1:03 UTC (Sun)
by Cyberax (✭ supporter ✭, #52523)
[Link]
Please show me how I can tab-complete SQL queries in any elisp shell. I'd like to be able to at least complete all table names in all contexts.
PowerShell is freakingly powerful in that regard - I already use it instead of psql shell to work with Postgres databases. Mostly because autocompletion in posh based on simple introspection is miles ahead of what's present in psql.
And that says something.
>If you don't like LISP Machines analogue then I have another for you: Space Shuttle
Yup. It's funny, because bash is very much like the Space Shuttle: it's old, it's expensive to use (bash hell scripts are not easy to write), it's prone to crashes, etc. The only reason it's still used is the sunk cost of tons of hell scripts.
Posted Dec 11, 2011 0:11 UTC (Sun)
by nix (subscriber, #2304)
[Link] (6 responses)
Posted Dec 11, 2011 5:36 UTC (Sun)
by Cyberax (✭ supporter ✭, #52523)
[Link] (3 responses)
PowerShell allows embedding documentation directly into objects, and also structuring it by parameters, methods, etc.
And there's a nice "-online" switch that takes you directly to the TechNet article associated with the tool, with Q&A and other additional functionality.
Can you make the scripts in your home directory have their documentation automatically included and made searchable in the central help system?
>I'd rather use man than HTML Help with its abysmal searching, insulting baby talk, and horrendous security holes.
You're in luck because PowerShell doesn't use HTML Help :)
>Which would be why POSIX, uh, standardized them more than ten years ago, and imposed rules which virtually all standard tools follow.
Yeah, sure.
> dd if=... of=...
Then there are the --argument=blah and "--argument blah" forms, which don't always both work. And then there are short forms which sometimes require the argument to be written immediately after them, without intervening spaces.
In PowerShell _everything_ is standardized _and_ autocompletable. I can do things like "my-command -p<tab>" and get the list of parameters (with descriptions and default values!) starting with 'p'. This is sort of possible in bash/zsh with cooperating tools, but in PowerShell it's all completely automatic.
And since PowerShell is a static language with type inference, it'll warn me if I write stuff like this "destroy-hard-drive -in hello -seconds" (because 'hello' is not an integer).
>You carefully named the two shells which have extremely extensive autocompletion, in zsh's case shipped with the package
I know perfectly well how bash/zsh autocompletion work.
>Yes, it's not 100% automatable without a bit of per-tool scripting. The workload is minimal compared to writing the tools. Perhaps in an ideal world it could be completely automated, but that just shifts the burden from writing the autocompletion code to writing some sort of reflective description of the system. Big deal.
And in PowerShell I get autocompletion basically for free. And it's good. It's VERY good. I'm actually using PowerShell instead of Postgres's psql because PowerShell is much more powerful.
I can autocomplete table names while writing SQL queries. In command line.
This is not really possible in bash/zsh because text matching games only can lead you so far. Even autocompletion for scp in bash is already straining things.
I wish people would sometimes go out and see what's happening outside of the Linux/Unix world. It's no wonder that the most popular Linux distribution is actually barely a Unix system.
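For context, the per-tool scripting that bash completion relies on is fairly small. A minimal sketch, assuming a hypothetical `mytool` command with three subcommands (`mytool` and `_mytool_complete` are made-up names for illustration):

```shell
# Minimal per-tool completion in bash for a hypothetical 'mytool' command.
# compgen filters a word list by the prefix currently being completed.
_mytool_complete() {
    local cur=${COMP_WORDS[COMP_CWORD]}
    COMPREPLY=( $(compgen -W "start stop status" -- "$cur") )
}
complete -F _mytool_complete mytool

# The filtering step can be exercised non-interactively:
compgen -W "start stop status" -- sta    # matches: start, status
```

Real completion scripts (as shipped with the bash-completion package or with zsh) are far more elaborate, but this is the shape of the per-tool work being discussed.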
Posted Dec 11, 2011 11:07 UTC (Sun)
by nix (subscriber, #2304)
[Link] (1 responses)
Most of the rest of what you say is a combination of ignorance of what the Unix tools you discuss can actually do, and complaints that things are not acceptable because they're not just like PowerShell does them. We get that you like it, but we've all been through this parochial 'the newest system I just saw is the answer to everyone's prayers' phase, and, y'know? It's always wrong. There are limitations there: you're just not seeing them.
Posted Dec 11, 2011 11:50 UTC (Sun)
by Cyberax (✭ supporter ✭, #52523)
[Link]
I'm claiming that it's not a GOOD system.
>Ah. Like POD, doxygen, and similar systems, all of which can generate manpage output. i.e., dead heat here.
So... how do I write documentation for my bash script (with all its options), generate a man page, and link it into the central system, all without doing anything more than simply declaring the options?
And no, doxygen won't help you - it doesn't support bash (I actually tried to find an automatic documentation system for bash some time last year - there was none).
>That's great -- if and only if Microsoft wrote the tool. Only useful in a software monoculture.
That's easily adapted if tools' authors provide their own URLs (which they can do in PowerShell - VMWare has its own doc system, for example).
>Yes. Learn about apropos databases and MANPATH. Manpages can be stored absolutely anywhere (as can info pages).
I know perfectly well how most Linux tools work. I know that one can maintain one's own local man databases. And I also know perfectly well that almost nobody does, mostly because it's complicated and error-prone.
>Most of the rest of what you say is a combination of ignorance of what the Unix tools you discuss can actually do, and complaints that things are not acceptable because they're not just like PowerShell does them.
I know most of the standard Unix tools. I've built my own custom distributions from scratch (the first time without the benefit of the LFS book) and I support a network of embedded devices. I've been using Linux on my desktops since the '90s and can recollect all the steps that were taken to make Linux at least usable on the desktop.
A lot of these steps involved bashing at least some old Unix-heads with spiked hammers: udev, HAL, KMS, dbus to name a few. Oh, and the whole 'Android' thingie. Now the same thing repeats with pulseaudio, systemd and journald.
>We get that you like it, but we've all been through this parochial 'the newest system I just saw is the answer to everyone's prayers' phase, and, y'know? It's always wrong. There are limitations there: you're just not seeing them.
Some things do solve all the (existing) problems. Because they are designed to solve them.
PowerShell is one such example. It's designed to be a better shell than text-based shells and it excels at it. It's not yet as polished as bash/zsh but it's getting better with each new release.
Of course, PowerShell has limitations and a set of new problems, but so does bash/zsh. And limitations of bash/zsh are MUCH more constricting.
Posted Dec 16, 2011 5:11 UTC (Fri)
by tom.prince (guest, #70680)
[Link]
>And in PowerShell I get autocompletion basically for free. And it's good. It's VERY good. I'm actually using PowerShell instead of Postgres's psql because PowerShell is much more powerful.
You only get this for free if your software is designed to work with PowerShell; otherwise you are in the same boat as with zsh/bash.
There is at least one framework (twisted-python) that automatically generates zsh completions from the description of options.
Posted Dec 11, 2011 16:21 UTC (Sun)
by anselm (subscriber, #2796)
[Link] (1 responses)
Some of the most basic aspects were indeed standardised by POSIX, but that doesn't detract from the fact that, say, the option to specify a field delimiter is »-d« for cut(1), »-t« for sort(1), »-F« for awk(1), and so on. POSIX basically codified the wild hodgepodge that existed at the time.
Now the meaning of common options (as opposed to basic option syntax) would have been something actually worth standardising, but of course it would have rendered 20 years' worth of shell scripts virtually useless, so it didn't happen.
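The hodgepodge is easy to demonstrate: the same "extract the second field of a comma-separated line" job needs a different delimiter flag for each tool.

```shell
# The same field-extraction job, three different delimiter flags:
echo 'alpha,beta,gamma' | cut -d, -f2            # cut spells it -d
echo 'alpha,beta,gamma' | awk -F, '{print $2}'   # awk spells it -F
printf 'x,2\ny,1\n' | sort -t, -k2,2             # sort spells it -t
```

The first two pipelines both print "beta"; the last sorts the lines by the second comma-separated field.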
Posted Dec 11, 2011 17:03 UTC (Sun)
by raven667 (subscriber, #5198)
[Link]
Posted Dec 9, 2011 0:06 UTC (Fri)
by grg (guest, #76756)
[Link] (1 responses)
Posted Dec 9, 2011 2:44 UTC (Fri)
by cortana (subscriber, #24596)
[Link]
Posted Dec 9, 2011 4:26 UTC (Fri)
by jthill (subscriber, #56558)
[Link] (18 responses)
It's a one-liner.
Posted Dec 9, 2011 5:09 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (17 responses)
Posted Dec 9, 2011 5:26 UTC (Fri)
by jthill (subscriber, #56558)
[Link] (16 responses)
It's a perfectly reasonable one-liner.
Posted Dec 9, 2011 8:53 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (15 responses)
Posted Dec 9, 2011 20:29 UTC (Fri)
by jthill (subscriber, #56558)
[Link] (14 responses)
> OK, I admit I didn't know about the -k switch
But before you take it there, you might want to spend more than the few seconds you've apparently spent on the attempt.
Posted Dec 9, 2011 20:39 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (13 responses)
Posted Dec 9, 2011 21:27 UTC (Fri)
by jthill (subscriber, #56558)
[Link] (12 responses)
Posted Dec 9, 2011 21:40 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (1 responses)
Well, I probably shouldn't be wasting my time with a bloody stupid troll like you.
Posted Dec 9, 2011 22:26 UTC (Fri)
by corbet (editor, #1)
[Link]
Posted Dec 9, 2011 22:10 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (9 responses)
Posted Dec 9, 2011 23:10 UTC (Fri)
by jthill (subscriber, #56558)
[Link] (8 responses)
Posted Dec 9, 2011 23:33 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (7 responses)
Posted Dec 10, 2011 2:19 UTC (Sat)
by nybble41 (subscriber, #55106)
[Link] (6 responses)
Posted Dec 11, 2011 10:39 UTC (Sun)
by HelloWorld (guest, #56129)
[Link] (5 responses)
Posted Dec 11, 2011 22:48 UTC (Sun)
by nybble41 (subscriber, #55106)
[Link] (4 responses)
An aptitude-equivalent under PowerShell would need to export the ability to compare package objects by version for any third-party sort utility to work with the data anyway, which is 90% of the way toward just implementing the -O option internally.
"Structured data" is nearly always the first thing anyone tries when implementing a new system. A number of older systems don't support anything *but* structured data for disk files, for example--I've worked on operating systems where even plain text files were stored, one row per line, as tables in a database. The fact that all modern systems actually treat disk files and most inter-process communications as plain byte streams should tell you something. Standardized data *formats*, encoded as plain byte streams, are more flexible and resilient than standardized APIs, as used by PowerShell.
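As an aside, for the specific case of version ordering, GNU sort (assuming GNU coreutils) already provides a byte-stream comparator via -V, which covers much of the package-version use case without any object model:

```shell
# GNU sort's -V compares version strings segment by segment,
# so 1.10 sorts after 1.9 (a plain lexical sort would put 1.10 first):
printf '1.10\n1.2\n1.9\n' | sort -V
```

This prints 1.2, 1.9, 1.10 in that order.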
Posted Dec 12, 2011 5:58 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link] (3 responses)
But it does. You can easily sort any data on any field, even using your own custom comparator (try that with 'sort'!), using only one utility. Which had to be written, just as the 'sort' utility in Unix was.
>You could get the same effect by standardizing the labeled-groups-of-lines format output by 'aptitude show' and writing a sort utility specialized on that format (a 10-minute job in any major scripting language). Or you could just use one of the many portable interactive scripting language shells, like python or irb or scsh, which provide the same services for their respective bindings that PowerShell provides for .NET.
Well, of course you can achieve the same functionality as in PowerShell by replicating what PowerShell does (namely, by introducing structured data). The point is - no one is really doing it in the Unix world.
Posted Dec 12, 2011 6:25 UTC (Mon)
by nybble41 (subscriber, #55106)
[Link] (2 responses)
> But it does. You can easily sort any data on any field even using your own custom comparator (try that with 'sort'!), using only one utility. Which had to be written, just as 'sort' utility in Unix.
The point was that the 'sort' utility used in PowerShell only really adds anything over the standard Unix sort command if the source application includes bindings for .NET.[1] A similar utility could be written for Unix with the same features, if anyone were actually interested. It's not exactly a difficult problem, even including fancy features like custom comparators. The critical part is a runtime-specific binding for the source application. If aptitude had bindings for Python, Ruby, or Scheme, you could do exactly the same thing from those shells under Unix today. It wouldn't require any more work than implementing the .NET bindings necessary to make aptitude work with PowerShell (assuming they worked on the same platform to begin with). In fact, if aptitude had a DBUS interface, as many other applications do, it would automatically work with all these Unix shells and more.
> Well, of course you can achieve the same functionality as in PowerShell by replicating what PowerShell does (namely, by introducing structured data). The point is - no one is really doing it in the Unix world.
For good reason, as I said--such systems have never won out in the past, so why should .NET and PowerShell be any different? Anyway, my suggestion wasn't to replicate what PowerShell does; it was to implement a record-oriented sort utility within the existing Unix byte-stream framework, using a standard data encoding rather than opaque "structured data" in a particular runtime. Historically, forcing everything to work within a single complex runtime or RPC model has not been a winning strategy. Simple, portable, reliable interfaces are best.
----
[1] Purely out of curiosity, how does this kind of sort command work if the data is coming from a file, without the .NET APIs, rather than directly from an application? Is it even possible to get the same effect as redirecting the output of a command to a file before sorting it, or must the source and sort command be directly coupled? It seems like the extra features must get lost in translation, since only the textual representation will be saved on disk.
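The record-oriented sort suggested above can be sketched with standard tools alone. This sketch assumes blank-line-separated "Key: value" records (aptitude-show style) and that no field contains the '|' placeholder character:

```shell
# Record-oriented sort in the byte-stream world: flatten each record to
# one line, sort on the chosen key, then unflatten.
cat > records.txt <<'EOF'
Package: zsh
Priority: optional

Package: bash
Priority: required
EOF

# Sort records by the Package: field (field 2 when split on ':').
awk -v RS='' '{ gsub(/\n/, "|"); print }' records.txt \
  | sort -t: -k2 \
  | awk '{ gsub(/\|/, "\n"); print; print "" }'
```

The output is the same two records, with the "Package: bash" record first. A production version would of course pick an escaping scheme rather than a forbidden placeholder character.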
Posted Dec 12, 2011 8:59 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link]
The problem is, you have to write a special-case utility, while PowerShell provides a generic mechanism for this. It provides a robust platform that other systems can use.
And some features of sort are quite difficult to replicate in bash, for example custom comparators (I believe the only way to do this is to create string ordinals suitable for the 'sort' utility).
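The "string ordinals" workaround described here is the classic decorate-sort-undecorate pattern. A minimal sketch that emulates a custom comparator (sorting lines by their length) with plain sort:

```shell
# Decorate-sort-undecorate: prepend a sortable key (here: line length),
# sort numerically on it, then strip the key again.
printf 'pear\nfig\nbanana\n' \
  | awk '{ print length($0) "\t" $0 }' \
  | sort -n \
  | cut -f2-
```

This prints fig, pear, banana, i.e. the lines ordered from shortest to longest.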
You've mentioned D-BUS - it's actually quite similar to PowerShell in a lot of regards. It provided a platform that other applications could easily use and adapt (instead of inventing their own ad-hoc remoting), and so applications quickly adopted it. The same is happening with PowerShell on Windows.
>For good reason, as I said--such systems have never won out in the past, why should .NET and PowerShell be any different?
Well, it is different because it's already adopted in the Windows world and it's clear that PowerShell is here to stay.
And I think the main reason for adoption is that the time has come for it. All the ingredients for PowerShell were finally in place - even twelve years ago it would have been impossible without the common environment that .NET provides (PowerShell based on COM? Shudder).
The same story with DBUS - there were quite a few failed attempts to build a common desktop RPC system in Unix: DCE RPC, DCOP, CORBA in GNOME, and probably quite a few others I've forgotten. Yet DBUS has clearly won and is now ubiquitous. We even have projects for kernel-accelerated DBUS.
>[1] Purely out of curiosity, how does this kind of sort command work if the data is coming from a file, without the .NET APIs, rather than directly from an application?
That depends on what you're trying to do. You surely can dump objects' contents as XML/JSON/binary, but it's rarely useful in practice. More often you'd use the FT command (Format-Table) to create a textual representation that fits your purposes. For objects like files or lists of files you can dump them as simple filenames. PowerShell is quite flexible in that regard.
Posted Dec 12, 2011 18:59 UTC (Mon)
by raven667 (subscriber, #5198)
[Link]
What is this "winning" you are concerned about? PowerShell is already the preferred standard command shell and scripting environment on MS Windows servers and is not really relevant outside that space. UNIX shells like bash are really only relevant on UNIX systems, they have only marginal presence in the Windows space, so there is really no context where "winning" makes sense as they are not directly comparable.
The idea that the shell can be both built out of and expose a full-featured programming language and library is not new, as you point out, but it is well demonstrated by PowerShell and is not a bad idea.
Posted Dec 8, 2011 10:23 UTC (Thu)
by fb (guest, #53265)
[Link]
I wouldn't know. I think Cygwin does a wonderful job by letting me do the minimal Windows scripting when necessary without having to learn anything new.
Posted Dec 8, 2011 10:38 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (44 responses)
Posted Dec 8, 2011 11:48 UTC (Thu)
by bjartur (guest, #67801)
[Link] (2 responses)
Posted Dec 8, 2011 19:46 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (1 responses)
I find it very difficult to care about PowerShell's fancy object piping when basic arguments don't work right.
Posted Dec 8, 2011 20:35 UTC (Thu)
by HelloWorld (guest, #56129)
[Link]
Posted Dec 8, 2011 18:58 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (31 responses)
Quoting in PowerShell is _way_ ahead of bash and other text-based shells. Writing correct bash code that handles quoting is an exercise in pulling your hair out, while in PowerShell it's completely natural.
See here for overview: http://www.techotopia.com/index.php/Windows_PowerShell_1....
Posted Dec 8, 2011 19:38 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (30 responses)
Posted Dec 8, 2011 19:54 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (29 responses)
Like:
> touch -- -a
> rm *
Dumb text expansion is one of the main problems of shells. It's so bad that suid functionality for scripts was removed in the '80s, because writing secure scripts is just not possible.
In PowerShell it's all natural:
No errors, no problems, everything works.
If you REALLY want to reinterpret your string - you can do this explicitly.
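The touch/rm pitfall above is easy to reproduce, and '--' (end of options) is the standard guard. A sketch run in a scratch directory:

```shell
# Reproduce: a glob-expanded file name starting with '-' is parsed as an
# option by rm, so the command fails and nothing is removed.
dir=$(mktemp -d)
cd "$dir"
touch -- -a b

rm * 2>/dev/null || echo "rm refused: the glob expanded to 'rm -a b'"
ls                        # both files are still there

# Defuse: '--' marks the end of options, so '-a' is a plain file name.
rm -- -a b
```

The same '--' convention is honored by every getopt-style tool, which is precisely the standardization being argued about here.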
Posted Dec 8, 2011 19:59 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (21 responses)
You're ignoring the very serious deficiency in PowerShell I identified, a deficiency which makes it practically unusable for me and many others (consider the problems passing SQL queries to programs).
No, don't come back with an example of a cmdlet that properly accepts $foo. The problem is with external programs, which makes the issue worse, not better.
No, not everything is a cmdlet, nor will everything be a cmdlet in our lifetimes.
Posted Dec 8, 2011 20:16 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (14 responses)
And working with SQL queries in PowerShell is WAY better than anything Linux has because one can use statically typed LINQ. I can actually tab-complete table names while writing a query from a freaking command line.
For example: http://bartdesmet.net/blogs/bart/archive/2008/06/07/linq-...
External legacy programs are just that - legacy. They can be easily wrapped but quite often it's easier to replace them completely.
Posted Dec 8, 2011 20:20 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (13 responses)
Show me.
> External legacy programs are just that - legacy.
No, external programs are essential. If you persist in this assertion, and claim PowerShell isn't broken because you don't really need the broken feature, we can't have a discussion. If I'm going to write a stand-alone program that doesn't act like a shell, there are many languages better-suited to the task than PowerShell is.
Posted Dec 8, 2011 21:07 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (12 responses)
It's called 'splatting' and is done by prepending '@' to variable name.
Now can you show me how I can tab-complete table names in Bash while I'm editing a query?
> No, external programs are essential. If you persist in this assertion, and claim PowerShell isn't broken because you don't really need the broken feature, we can't have a discussion. If I'm going to write a stand-alone program that doesn't act like a shell, there are many languages better-suited to the task than PowerShell is.
Sure, a lot of legacy is essential and PowerShell can easily work with legacy text-based programs. No big deal.
But that doesn't make it any less legacy in the Windows world.
Posted Dec 8, 2011 21:11 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (11 responses)
> It's called 'splatting' and is done by prepending '@' to variable name.
Splatting doesn't address the issue I raised.
> PowerShell can easily work with legacy text-based programs
I demonstrated that PowerShell cannot reliably pass arguments to external programs. That's "broken", not "easy".
Posted Dec 8, 2011 21:18 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (10 responses)
>I demonstrated that PowerShell cannot reliably pass arguments to external programs. That's "broken", not "easy".
Let's see:
>Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
Looks like it works just fine. Now describe your problem in detail, without knee-jerk reactions.
Posted Dec 8, 2011 21:21 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (9 responses)
You must have missed the thrust of my original post:
https://lwn.net/Articles/471133/
It's not possible to pass arbitrary strings from PowerShell to external programs.
Posted Dec 8, 2011 22:41 UTC (Thu)
by mathstuf (subscriber, #69389)
[Link] (8 responses)
> for x in $( seq 1 ${#var} ); do
Granted, most of the time you don't need this (who puts 7 backslashes in a row? Well... tr might make sense with that many backslashes as an argument), but it needs to be copied verbatim every time unless you want to do another level of escaping when passing it to a function.
Of course, if there's a shorter way, I'd be thrilled to know it :) .
Posted Dec 8, 2011 22:51 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link]
printf '%s' "$bar" is sufficient. The value of bar isn't expanded or interpreted.
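A quick check of this: the format string is fixed at '%s', so leading dashes and backslash sequences in the value pass through untouched.

```shell
# The format string is fixed; the value is substituted verbatim, so
# neither the leading '-n' nor the backslashes are interpreted.
bar='-n a\\b\tc'
printf '%s\n' "$bar"     # prints exactly: -n a\\b\tc
```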
Posted Dec 9, 2011 9:39 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (6 responses)
If all you wanted to do is print the elements of an array, you can do that with echo "${var[@]}". Well, unless something in ${var[@]} starts with a dash...
Posted Dec 9, 2011 18:07 UTC (Fri)
by mathstuf (subscriber, #69389)
[Link] (5 responses)
Posted Dec 9, 2011 19:31 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (4 responses)
Posted Dec 9, 2011 20:09 UTC (Fri)
by nybble41 (subscriber, #55106)
[Link] (3 responses)
echo -n "$var"
To work as intended the variable reference would need to use substring syntax, "${var:$x:1}", rather than the array syntax "${var[$x]}". However, it is sufficient to place the variable in double-quotes, which (in bash) causes the value to be output verbatim, with no further quoting, expansion, or splitting.
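A minimal sketch of the corrected loop, using bash substring syntax to walk a string one character at a time:

```shell
# "${var:i:1}" is substring syntax (one character at offset i);
# "${var[i]}" would be array indexing, which is wrong for a plain string.
var='a\b'
i=0
while [ "$i" -lt "${#var}" ]; do
    printf '[%s]' "${var:$i:1}"
    i=$((i + 1))
done
printf '\n'              # prints: [a][\][b]
```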
Posted Dec 9, 2011 20:13 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (2 responses)
Posted Dec 9, 2011 20:25 UTC (Fri)
by nybble41 (subscriber, #55106)
[Link] (1 responses)
Posted Dec 10, 2011 3:34 UTC (Sat)
by mathstuf (subscriber, #69389)
[Link]
Posted Dec 8, 2011 20:41 UTC (Thu)
by HelloWorld (guest, #56129)
[Link] (5 responses)
Posted Dec 9, 2011 7:33 UTC (Fri)
by ekj (guest, #1524)
[Link] (4 responses)
ls -l
and
ls -- -l
You don't need to use -- for the purpose, but you need to do *something*, since both are valid and reasonable commands and the two have distinct meanings.
Posted Dec 9, 2011 8:52 UTC (Fri)
by HelloWorld (guest, #56129)
[Link] (3 responses)
Posted Dec 9, 2011 12:19 UTC (Fri)
by nix (subscriber, #2304)
[Link] (1 responses)
Posted Dec 9, 2011 12:38 UTC (Fri)
by HelloWorld (guest, #56129)
[Link]
Posted Dec 9, 2011 16:30 UTC (Fri)
by nybble41 (subscriber, #55106)
[Link]
There's a standard solution to this: instead of "ls *.c", write "ls ./*.c", which has the same effect, and yet has no chance of accidentally expanding to an option rather than the expected filename.
Or, for any program which has a standard getopt-style command-line parser, just use "--" before any glob patterns.
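Both guards can be verified in a scratch directory (the file names here are made up for the demonstration):

```shell
# A scratch directory with an option-lookalike file name:
dir=$(mktemp -d)
cd "$dir"
touch -- -l.c regular.c

ls *.c 2>/dev/null || echo "bare glob failed: -l.c was parsed as options"
ls ./*.c                 # the ./ prefix keeps it a plain file name
ls -- *.c                # or end option parsing explicitly with --
```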
Posted Dec 9, 2011 2:46 UTC (Fri)
by vonbrand (subscriber, #4458)
[Link] (1 responses)
No, that isn't the reason. Running scripts SUID is inherently racy: something could interrupt the interpreter after it goes SUID and before it runs the script proper, or switch the script out underneath it.
Posted Dec 9, 2011 3:29 UTC (Fri)
by Cyberax (✭ supporter ✭, #52523)
[Link]
You see, even Perl is better than shell scripts.
Posted Dec 11, 2011 23:59 UTC (Sun)
by chuckles (guest, #41964)
[Link] (4 responses)
rm ./-a
In PowerShell it's all natural:
lol. 'all natural' sorry that made me laugh.
Posted Dec 12, 2011 6:00 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link] (3 responses)
Posted Dec 12, 2011 16:40 UTC (Mon)
by jimparis (guest, #38647)
[Link] (2 responses)
Posted Dec 12, 2011 17:07 UTC (Mon)
by mpr22 (subscriber, #60784)
[Link] (1 responses)
Posted Dec 12, 2011 17:42 UTC (Mon)
by jimparis (guest, #38647)
[Link]
Posted Dec 8, 2011 21:38 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (8 responses)
>C:\>whoami "foo\abc def\"
That's just the way the Windows console behaves. Arguments are not split if they are quoted - it's different from Linux, but it's consistent.
You can, of course, work around it by splitting your parameter list manually, as you have to do in the regular Windows console.
Posted Dec 8, 2011 21:48 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (7 responses)
No, PowerShell is not being consistent with cmd here. PowerShell in fact tries to quote arguments (so they will be properly decoded and sent to [w]main), but fails in some glaring cases. Here is a case that *does* work:
PS> $foo="hello world"
Here, you see that whoami (which prints its first argument, and so is useful as a testing tool) prints *just* "hello world", not "hello world bar".
Please, learn how the Windows command line argument system works before attempting to defend it.
Posted Dec 8, 2011 22:09 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (5 responses)
Posted Dec 8, 2011 22:26 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (4 responses)
No. Splatting does _NOTHING_ to address the quoting issue I raised. It does not allow you to pass arbitrary argument words to external programs. Your technique actually makes the problem _WORSE_, because you lose the identity of individual arguments, which CAN contain spaces and which you DO need to quote. You've utterly failed to even comprehend my basic point. This conversation is over.
Posted Dec 8, 2011 22:28 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (2 responses)
Posted Dec 9, 2011 12:25 UTC (Fri)
by nix (subscriber, #2304)
[Link] (1 responses)
You are treating it as if he wants to run 'whoami' but doesn't know how.
When a finger points at the moon...
Posted Dec 10, 2011 3:00 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
[Link]
I finally understood what the author wants and gave an answer: "@($var -split ' ')". It does the braindead thing that was requested.
Why braindead? Because good PowerShell scripts would just use an _array_ and 'splat' it when required, instead of constructing shell arguments as raw strings.
And no, this hack with splitting string into arguments is definitely not required to work with legacy code.
Posted Dec 10, 2011 3:01 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
[Link]
Posted Dec 16, 2011 3:33 UTC (Fri)
by useerup (guest, #81854)
[Link]
For your example try this:
PS>$foo="/groups","/user"
And then try
PowerShell actually integrates quite nicely with external programs; it does so even retaining (not re-interpreting) arguments.
In your example you told PS to pass a string containing "arg1 arg2" to an external program - which it did. You assume that the entire commandline is turned into text and re-interpreted (the way of POSIX shells).
As you can see above, PowerShell understands arrays and will readily pass the array items as discrete arguments. It's not harder - just a little different and a lot more robust and avoids the risk of injection vulnerabilities.
Posted Dec 8, 2011 15:01 UTC (Thu)
by ccchips (subscriber, #3222)
[Link] (27 responses)
I think they have some good ideas, but it has many deficiencies; for example, you have to go "around the block" to get byte-stream-oriented input redirection, and it doesn't integrate well with *existing* tools that use text-based output. Cmdlet B may not be dependent on the byte-stream output format of cmdlet A, but it *is* dependent on cmdlet A's object definitions. Their object model is still very weak; for instance, try to use their own tools to get the last logon date for an Active Directory user.
They have problems with directory management. They don't have a working "du"-type tool that produces objects as output, and the workarounds they do have barf if the paths are too long. They pride themselves on ease of discovery, but try to discover all those .NET classes without going on the Web to look up the objects and their meanings.
On the other hand, when the object model does work, it's very easy to get things done, and you can count on data formats between pipeline elements.
I enjoy working with Powershell, but I also enjoy working with Bash, Perl, Python, and several other tools. I'd like to see a working Powershell implementation in the *nix world, or if nothing else, some interest in object-oriented (or mixed) pipelines.
Posted Dec 8, 2011 15:20 UTC (Thu)
by felixfix (subscriber, #242)
[Link] (2 responses)
Then programs which print several lines per record could be used with pipelines. It wouldn't be perfect, but all it would take is the two utilities, which I am, alas, too lazy and/or uninspired to write. The problem comes up too seldom to take the time to write it, and I have always found simple work arounds.
Posted Dec 9, 2011 0:32 UTC (Fri)
by wahern (subscriber, #37304)
[Link]
They can be rather awkward to use, but I believe do what you want.
Posted Dec 9, 2011 5:03 UTC (Fri)
by jthill (subscriber, #56558)
[Link]
Posted Dec 8, 2011 17:05 UTC (Thu)
by ccchips (subscriber, #3222)
[Link] (8 responses)
I have a "cron job" (scheduled task) and I want the job to send *all* the output to a text file, *including* all error messages. The only way I could track down an error in my script was to run it from the console and read the error message (in red) from the screen.
Not good.
This is just one example where a potentially very powerful shell fails in critical situations. Object model or no, the Powershell team should *pay attention* to the needs of system administrators everywhere, not just the point-and-click/read the screen types.
And there's no good reason why migrating Bash users should have to go around the block several times to do things that have been extremely simple for years:
foo <bar.txt >result.txt
vs.
get-content....blablabla....out-file....
Posted Dec 8, 2011 19:02 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (7 responses)
Uhm. PowerShell supports standard text redirection. What exactly do you need?
Posted Dec 8, 2011 20:41 UTC (Thu)
by ccchips (subscriber, #3222)
[Link] (2 responses)
foo >bar 2>&1
to have *every* text line from standard output and *every* error line that would show up in red on the screen, as the result of the above command.
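For the record, that redirection does exactly this in a POSIX shell; note that the order matters ('2>&1' must come after '>bar' so stderr is duplicated onto the already-redirected stdout):

```shell
# Send both stdout and stderr of a job to one log file:
{ echo "normal output"; echo "error output" >&2; } > bar.log 2>&1
cat bar.log              # contains both lines
```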
Posted Dec 8, 2011 20:58 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link]
> foo >bar 2>&1
Yep. It works just as in Unix.
Posted Dec 8, 2011 20:59 UTC (Thu)
by ccchips (subscriber, #3222)
[Link]
However, I did find a different problem; things go really wrong if you try something like this:
dir | get-content
because "get-content" takes a list of objects from the input pipe that are interpreted as file names.
Still, my first problem is solved, I have egg on my face, and so it goes. I never said I didn't like Powershell--I enjoy working with all of the common shells (except maybe MS-DOS Command prompt) and would like to see Powershell be more easily integrated with existing tools.
I suspect there is a way to get "dir | get-content" to act more like UNIX shells would, but we could argue all day about the merits.
Posted Dec 8, 2011 21:59 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (3 responses)
What do you think this PowerShell pipeline puts into foo.txt?
echo hello > foo.txt
If you guessed ASCII "hello" or "hello\n" or "hello\r\n", you'd be wrong. Horribly wrong.
PowerShell converts everything to UTF-16-LE, prepends a BOM, and puts the result in foo.txt. This behavior is almost too horrible for words. PowerShell even mangles the output of *native* programs, e.g.
ipconfig > ipconfig-out.txt
In cmd, ipconfig-out.txt contains ASCII. When the same pipeline is run in PowerShell, it's UTF-16-LE-with-BOM. This encoding is more verbose, less commonly supported, and (because of the BOM) far less friendly to splicing and concatenation. This behavior is just awful.
As a corollary, PowerShell will also mangle the output of programs that produce non-text output: consider
ps2pdf < foo.ps > foo.pdf
PowerShell will take each "line" of ps2pdf output, convert it to UTF-16 as best it can, then dump the result into foo.pdf. The result looks like a PDF sent through a meat grinder.
(Don't tell me you're supposed to use some cmdlet for doing the conversion: the point of a shell is to work with existing programs. If I wanted to use a library, I'd use a different programming language.)
Posted Dec 8, 2011 22:14 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (2 responses)
You can change default encoding to UTF-8 with "$OutputEncoding = New-Object -typename System.Text.UTF8Encoding". Binary mode is more tricky, you have to add "-encoding byte" everywhere.
Posted Dec 8, 2011 22:27 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (1 responses)
Thus my original point: I'll consider using PowerShell only after the critical issues I've identified are fixed. Until then, Unix shells (even running under Cygwin) are superior by far.
Posted Dec 8, 2011 22:51 UTC (Thu)
by tialaramex (subscriber, #21167)
[Link]
People who are never going to be happy about LLP64, UTF-16, or various other arbitrary yet defensible differences from Unix will never be comfortable on Windows. This isn't even a religious war, it's some fundamental cultural difference just like the relationship between directions and time.
For me "bringing forward" a meeting will always make it sooner, and UTF-8 will always be the more intuitive and sensible encoding. So no Windows for me.
Posted Dec 10, 2011 3:06 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
[Link] (14 responses)
You'll get a list of last-login timestamps and user names.
You can argue that PowerShell does not yet cover all possible functionality, but that's not a deficiency of the idea itself. Just a sign of relative youth.
Posted Dec 10, 2011 6:45 UTC (Sat)
by khim (subscriber, #9252)
[Link] (4 responses)
Nope. It's an Achilles' heel. PowerShell is half a decade old already, so you cannot say it's all that young. It's just not a shell replacement; it belongs to a long list of "universal glue languages" (which were rarely all that universal). Besides LISP on the LISP Machines and Oberon on the Oberon OS (the only examples where "universal languages" were indeed almost fully universal), this list includes things like REXX, AppleScript, VBScript, etc. They are good in their "area of expertise", but please don't try to mix them up with shells - because they are not shells, despite the hype. Why? Because they assume programs will offer specialized interfaces just for that one flavor of scripting. But the developers of a lot of programs just don't care enough to do that! They may provide some kind of command-line switches and/or some textual output (because it's easy), but why should they bother to offer all these other things? That is not what they are paid for! Some programs don't include any scripting support at all (in which case even bash can do nothing), but more often than not they do include some scripting - because their authors need it for development purposes. But it's as minimal as possible, because it's a side-show at best. And most such schemes and languages are horrible when they need to interact with programs which don't include the nice-structured-interface-of-the-year. PowerShell is no exception.
Posted Dec 10, 2011 7:28 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
[Link] (3 responses)
No. It IS a shell replacement. It is also a glue language.
>Why? Because they assume programs will offer specialized interfaces just for that one flavor or scripting. But developers of a lot of programs just don't care enough to do that! They may provide some kind of command line switches and/or offer some textual output (because it's easy),
Sure. And you can work with these programs just fine. It's not going to be as natural as working with native cmdlets, but it's good enough to work with legacy stuff.
For example, PowerShell's git integration is nicer than in bash/zsh and takes only a fraction of code for the similar functionality. See for yourself: https://github.com/dahlbyk/posh-git
>but why should they bother to offer all these other things? This is not what they are paid for!
Because writing a cmdlet is like 10 times easier than writing a command-line utility! It saves time! Especially if you are using .NET (which most large vendors already do at least for some functionality).
And it's already happening - all good software vendors for server-side apps on Windows already expose functionality using PowerShell. Like: VMWare, Amazon Cloud, MSSQL, etc.
Posted Dec 10, 2011 8:57 UTC (Sat)
by khim (subscriber, #9252)
[Link] (2 responses)
Only if your program uses .NET - which is not always the case. In other cases you first need to write the command-line utility anyway, and in addition you need to write the cmdlet. If you're writing the command-line utility anyway, why bother with the cmdlet at all?
Not "especially": *only*. If you don't use .NET then it's easier to write a command-line utility - and while large vendors are prone to misallocation of resources, even Microsoft is ready to scale back this abomination, as Windows 8 shows. But it'll probably be 10 more years till it's finally dropped...
s/good/buzzword-compliant/ PowerShell is an obvious waste of resources, but it's not a zero-sum game: the most sensible solutions are not the ones which sell well, so sometimes you need to invest in the most buzzword-compliant approach. But that only makes sense while the approach stays buzzword-compliant; when that's no longer the case, support dries up quite fast. As the world becomes less Windows-centric and Windows becomes less .NET-centric, it makes less and less sense to spend resources on PowerShell. People will probably not drop things already written (albeit that may happen later), but new developments... I doubt it.
Posted Dec 11, 2011 0:07 UTC (Sun)
by raven667 (subscriber, #5198)
[Link]
Posted Dec 11, 2011 0:26 UTC (Sun)
by Cyberax (✭ supporter ✭, #52523)
[Link]
Nope. VMWare doesn't use .NET but its PowerShell management interface is top-notch. git doesn't use PowerShell but posh-git beats bash_completion in usability.
Have you actually managed something on new Windows Server platforms? Or do you think that Windows Server is still used only for file-servers?
Posted Dec 12, 2011 16:44 UTC (Mon)
by ccchips (subscriber, #3222)
[Link] (8 responses)
Quest Software fixed this in their product, which is, fortunately, "free beer".
Somehow, I think we've been trolled. Smoke signals?
Posted Dec 12, 2011 17:27 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link] (7 responses)
http://msdn.microsoft.com/en-us/library/windows/desktop/m...
Posted Dec 12, 2011 18:05 UTC (Mon)
by ccchips (subscriber, #3222)
[Link] (6 responses)
This issue should have been addressed *before* Microsoft released the Active Directory tools for PowerShell, not after, and not by requiring some kind of conversion kludge.
Posted Dec 12, 2011 18:31 UTC (Mon)
by ccchips (subscriber, #3222)
[Link] (5 responses)
It is good, as far as I'm concerned, that Windows system administrators now have a decent shell to work with in their day-to-day activities, and I laud Microsoft for this. In my shop, it had been hard to get administrators to script *anything* before PowerShell, as they were not willing to use Perl, Awk, Python, etc. And PowerShell does have some interesting innovations. But it still strikes me as odd how a discussion of UNIX shells and their evolution turned into an argument about how great PowerShell is compared to all the other methods people have used to perform system administration activities.
Posted Dec 12, 2011 18:48 UTC (Mon)
by raven667 (subscriber, #5198)
[Link]
Well, it didn't start out that way; there was just one "PowerShell is neat" post, and in response a flurry of negative responses were written, which led to a lively discussion.
Posted Dec 12, 2011 21:08 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link] (3 responses)
Samba4 on Linux has the same behavior, btw.
Posted Dec 12, 2011 21:34 UTC (Mon)
by ccchips (subscriber, #3222)
[Link] (2 responses)
Does Samba4 or related utility have the fix?
I was a bit surprised to learn about the [System.DateTime]::FromFileTime() function in PowerShell. I did a lot of research into this previously and didn't see it.
Posted Dec 12, 2011 21:41 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link] (1 responses)
LastLogon is a valid timestamp, just in a very braindead format (100-nanosecond increments since 1601).
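For reference, converting that format (a Windows FILETIME) to a Unix timestamp takes only shell arithmetic; 11644473600 is the number of seconds between 1601-01-01 and 1970-01-01, and the sample value below is chosen for illustration:

```shell
# A FILETIME counts 100-ns ticks since 1601-01-01 UTC: divide by 10^7
# for seconds, then subtract the 1601-to-1970 offset for a Unix timestamp.
filetime=116444736000000000                    # this tick count is exactly the Unix epoch
unix=$(( filetime / 10000000 - 11644473600 ))
echo "$unix"                                   # prints 0
```

The same arithmetic is what [System.DateTime]::FromFileTime() does for you on the PowerShell side.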
Posted Dec 13, 2011 16:20 UTC (Tue)
by ccchips (subscriber, #3222)
[Link]
I was asking about fixing *accessibility to the information.* Which is far more important to me than what shell handles what arguments, and how the pipelines work.
Posted Dec 8, 2011 20:02 UTC (Thu)
by cmccabe (guest, #60281)
[Link] (12 responses)
Linux has had that for decades. You can get it by opening up just about any scripting language and running the interpreter. Try the Ruby shell, the Python shell, or even the Scheme shell (scsh).
What would really be nice is some kind of standard data interchange format that could be shared between applications. For a while, people were pushing XML for that use, but XML is ugly as sin and overly complex. I think JSON is a better choice.
When I was working on Ceph, we implemented a --json flag for most of the tools that let them output JSON. Then you could manipulate it to your heart's content in the scripting language of your choice.
If we had a few more tools that let non-programmers manipulate JSON with the command line, and a --json option on popular programs, I think we could get most of the benefits of PowerShell on Linux.
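A sketch of what that might look like today, using python3 as the command-line JSON filter (the JSON document below is invented for illustration, loosely modeled on the Ceph --json idea mentioned above):

```shell
# Pretend some tool emitted --json output; pick out the OSDs that are down.
printf '%s\n' '{"osds":[{"id":0,"up":true},{"id":1,"up":false}]}' |
python3 -c '
import json, sys
for osd in json.load(sys.stdin)["osds"]:
    if not osd["up"]:
        print(osd["id"])
'                                   # prints 1
```

Any language with a JSON parser works as the filter; the point is only that the producing tool and the consuming script agree on the format.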
Posted Dec 8, 2011 20:10 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (7 responses)
But yes, internally PowerShell is just a REPL console for a statically typed language with good introspection capabilities.
It's just cleanly implemented and with a lot of functionality. There were several attempts to do it in Unix but they died for a lack of infrastructure.
You can work with byte streams in PowerShell but it just feels unnatural after working with typed and structured objects.
Posted Dec 8, 2011 21:06 UTC (Thu)
by cmccabe (guest, #60281)
[Link] (3 responses)
Well, Perl, Python and Ruby don't seem dead to me. Yes, it would be nice if they supported static typing, but that's a different conversation.
> PowerShell allows bidirectional communication while Unix pipes
Solaris had (has?) bidirectional pipes. Linux never implemented that, and it's probably a good thing on the whole.
The thing that I think you are missing is that a good shell needs to be designed to be useful to system administrators, not to programmers. Good programmers may be able to debug the race conditions, misconfigurations, and so forth that can result from bidirectional communication between modules. But system administrators will find the extra complexity to be a huge burden.
The genius of UNIX was that it tore down the wall between system administrators and programmers. This meant allowing visibility into the guts of the system. It meant that sysadmins could automate common tasks. Does PowerShell allow users to do that, or is it just another shrine built to another proprietary Microsoft programming framework?
There have been so many of those over the years-- OLE, COM, DCOM, ActiveX, etc. Ironically, the number of dead proprietary Microsoft programming frameworks is almost greater than the number of living open source ones!
Posted Dec 8, 2011 21:24 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (2 responses)
I'm talking about _shell_ in Python/Perl. I'm aware only of http://code.google.com/p/hotwire-shell/ which is kinda still alive, but not very active.
Also, static typing is essential for PowerShell because it allows full automatic introspection. So I can tab-complete LINQ queries without any additional hackery like in bash_completion.d
>The thing that I think you are missing is that a good shell needs to be designed to be useful to system administrators, not to programmers. Good programmers may be able to debug the race conditions, misconfigurations, and so forth that can result in bidirectional communication between modules. But system administrators will find the extra complexity to be a huge burden.
Good Windows sysadmins _love_ PowerShell, because it makes a lot of jobs easier. Also, a lot of companies start building their tools around it. VMWare has _very_ nice management interface for PowerShell, for example.
>The genius of UNIX was that it tore down the wall between system administrators and programmers. This meant allowing visibility into the guts of the system. It meant that sysadmins could automate common tasks. Does PowerShell allow users to do that, or is it just another shrine built to another proprietary Microsoft programming framework?
PowerShell is really a logical extension of Unix ideology, so it has all the advantages of Unix.
Posted Dec 9, 2011 2:31 UTC (Fri)
by ccchips (subscriber, #3222)
[Link] (1 responses)
[get a list of all computers that have tape drives] | [get-tape-label]
from Powershell against any of the many proprietary backup programs Windows users are faced with....
Will companies like CA, Seagate, GFI, and others rally 'round this object model and help us get our work done *without* using their stupid GUI applications?
I have my doubts, but go 'head, keep on truckin'.
Posted Dec 9, 2011 3:15 UTC (Fri)
by Cyberax (✭ supporter ✭, #52523)
[Link]
We use http://www.veeam.com/ for VMWare backups. Works like a charm.
I have no idea how it works with tapes (I've not seen one for more than 10 years), but it can certainly list the targets for backups with a simple command.
So now please show me how to do this in Bash. For DriveXML working inside a VMWare virtual machine.
Posted Dec 9, 2011 20:20 UTC (Fri)
by ccchips (subscriber, #3222)
[Link] (2 responses)
Can you give us some examples or links to some examples of this bidirectional communication and its advantages/disadvantages? Only thing I have found so far is that you can use a .net class to set up a named pipe.
Posted Dec 10, 2011 3:08 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
[Link]
Posted Dec 16, 2011 3:56 UTC (Fri)
by useerup (guest, #81854)
[Link]
Posted Dec 9, 2011 17:35 UTC (Fri)
by jsanders (subscriber, #69784)
[Link] (3 responses)
Posted Dec 10, 2011 1:16 UTC (Sat)
by cmccabe (guest, #60281)
[Link] (2 responses)
It might make it possible to do some interesting and readable one-liners.
Posted Dec 11, 2011 11:22 UTC (Sun)
by HelloWorld (guest, #56129)
[Link] (1 responses)
Posted Dec 12, 2011 23:33 UTC (Mon)
by cmccabe (guest, #60281)
[Link]
What I am thinking of is just a set of utilities to manipulate JSON easily-- nothing more, nothing less.
Posted Dec 8, 2011 7:32 UTC (Thu)
by ncm (guest, #165)
[Link] (8 responses)
Posted Dec 8, 2011 8:33 UTC (Thu)
by geuder (subscriber, #62854)
[Link] (5 responses)
Annoyance #1 with bash, I agree. I wonder whether there is an option to control that. I read the man page once, but I might have overlooked it.
Annoyance #2: If you press Ctrl-C, get your prompt not in the beginning of the line, scroll back in history and edit, the outcome is not WYSIWYG but a mess.
Posted Dec 8, 2011 11:57 UTC (Thu)
by bjartur (guest, #67801)
[Link] (1 responses)
Does anyone own any of the TeleTypes that might break if anything is changed? I, for one, have never seen one, and call for a rewrite. Keep stdin, stdout and stderr - but throw out the overmultiplexing and terminal emulation.
Posted Dec 9, 2011 13:28 UTC (Fri)
by mpr22 (subscriber, #60784)
[Link]
Posted Dec 8, 2011 13:01 UTC (Thu)
by grawity (subscriber, #80596)
[Link] (2 responses)
In ~/.inputrc, "set revert-all-at-newline on"
Posted Dec 9, 2011 0:46 UTC (Fri)
by ncm (guest, #165)
[Link] (1 responses)
This made my day. Thank you, grawity. After decades of suffering, I'm finally happy with bash.
Posted Dec 9, 2011 12:28 UTC (Fri)
by nix (subscriber, #2304)
[Link]
Posted Dec 8, 2011 12:07 UTC (Thu)
by nye (subscriber, #51576)
[Link]
I generally like this behaviour because it means you can go up or down in history (for reference) while editing a command line, without losing your edits in progress.
I guess you already know this, but as a workaround there's always alt+r, for 'undo all changes to this line'.
Also, there's alt+# for 'comment out this line and add it to my history', which is very useful when you've been editing a line in your history, decided you don't want to execute it now, don't want your history destructively edited, but might want to come back to it later.
Posted Dec 8, 2011 15:06 UTC (Thu)
by felixfix (subscriber, #242)
[Link]
Try this:
echo "miss daisy"
Hit up arrow, add ' crazy' after daisy, and hit ENTER.
Use up arrow to look at history -- both lines are intact.
Pick one, edit it, then use up or down arrow to move away, then come back, and you will find your edit intact. Hit ENTER.
Use up arrow again, and you will find the modified line, not the original line.
Near as I can guess, it only does this because the alternative is that if you edit a line and move away, perhaps to look at a different line to remember what was typed, then move back to finish your edit, your edit would have been discarded. I think this would be just as annoying as corrupting history, but at different times.
I can't say as I like it or despise it; it's a "feature" I take advantage of, sometimes happy to have it, sometimes annoyed at not having a true, reliable history of what was actually executed.
... considering that all of the major shells described predate Linux, which did not exist until 1991. There's nothing "in Linux" about the topic.
Odd choice for a title ...
Of course not. Rather, Linux is just a kernel, and the Posix world is much older and bigger. Linux was initially built so the pre-existing GNU and BSD userland utilities could run on cheap hardware; the shells came first.
One of the interesting things about early UNIX shells (Thompson's, up to the sixth edition) was that exit and goto were not built-ins, but simple programs that seeked the file descriptor open on the running script appropriately: either to the beginning, proceeding to the given label, or to the end.
Also, Bourne's seventh-edition shell is almost universally considered some of the most horrible C code ever written. From its mac.h source file:
Evolution of shells in Linux (developerWorks)
#define LOCAL static
#define PROC extern
#define TYPE typedef
#define STRUCT TYPE struct
#define UNION TYPE union
#define REG register
#define IF if(
#define THEN ){
#define ELSE } else {
#define ELIF } else if (
#define FI ;}
#define BEGIN {
#define END }
#define SWITCH switch(
#define IN ){
#define ENDSW }
#define FOR for(
#define WHILE while(
#define DO ){
#define OD ;}
#define REP do{
#define PER }while(
#define DONE );
#define LOOP for(;;){
#define POOL }
#define SKIP ;
#define DIV /
#define REM %
#define NEQ ^
#define ANDF &&
#define ORF ||
#define TRUE (-1)
#define FALSE 0
Thankfully, many of the macros, such as DIV are not actually used anywhere. This is what the code actually looks like:
VOID fault(sig)
REG INT sig;
{
REG INT flag;
signal(sig,fault);
IF sig==MEMF
THEN IF setbrk(brkincr) == -1
THEN error(nospace);
FI
Also, the code quoted shows another rather creative technique used by the shell: it's the signal handler for memory faults, which simply enlarges the heap with setbrk() when an out-of-bounds memory access is signalled by the kernel. Needless to say, there are not many calls to malloc() in the source...
Oh yes, Steve Bourne's MACRO ALGOL.
Needless to say, there are not many calls to malloc() in the source
This would be because it predates malloc(). (Remember your K&R first edition, with no mention of malloc() but discussion of an alloc() memory allocator that you might wish to write. And that was 1978...)
What part about "right-hand prompt" and "no approximations" did you not understand?
Not helpful. There are many ways to have the prompt echo the current working directory. One such way is to use a right hand prompt in zsh, there are others as well. No need to attack people for mentioning that.
RPROMPT is nice too though ;).
> autocomplete for umount only shows mounted devices),
Yeah right, except that it's unlike all Bourne-style shells out there. Its pipes exchange structured data, not just an unstructured byte stream. This is about as revolutionary as it gets as far as shells are concerned.
You must be a pretty unheeding person then. For example, the sort(1) utility is almost useless because pipes are just byte streams. Try sorting the output of, say, "aptitude show '~i'" by version. You can't, because "aptitude show" doesn't output one package per line (as that would be too long) and even if it did, you can't tell sort how to compare (i. e. by version instead of lexicographically).
Sorry, but UNIX shells are simply broken by design. The fact that they can be used to do useful things occasionally doesn't change that.
Um, you're not citing a deficiency in the shell. You're citing deficiencies in the programs made available to it.
Even if aptitude printed one line per package (which, given the amount of information to be printed, would result in unreadable garbage), it still wouldn't work. There's no convention about which character encoding to use (which is why pipes really aren't text streams but merely byte streams), there's no convention about the separator character, and there are no escaping rules; it's all just a huge clusterfuck.
It's not about other conventions, it's about having conventions *at all*, and UNIX doesn't. Tools like cut(1) and awk(1) allow you to set the input record separator precisely because of that: there is no convention.
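A concrete instance of the point: with nothing in the byte stream carrying the record layout, every tool has to be told the separator and field numbering all over again, each with its own flag:

```shell
# The same colon-separated record, with the layout re-declared per tool.
line='root:x:0:0:root:/root:/bin/bash'
echo "$line" | cut -d: -f7            # prints /bin/bash
echo "$line" | awk -F: '{print $7}'   # same field, different flag spelling
```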
By the way, there's another, related problem in Unix shell design: the shell does the globbing, but it doesn't interpret the arguments, as that is left to the program being executed. But the program has no way to know whether an argument was produced by globbing or not, which gives you trouble when you have files named --foo around. It's all so broken...
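For the record, the two conventional (if easy-to-forget) escape hatches look like this; note that not every program honors "--", which is part of the complaint:

```shell
# Create, list, and remove a file awkwardly named "-a".
touch -- -a      # "--" marks the end of options, so "-a" becomes an operand
ls ./-a          # alternatively, a path prefix keeps it from parsing as a flag
rm ./-a          # the "./" spelling works even for tools without "--" support
```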
I know, but nobody ever thinks of using that in practice, making it useless in practice. Not to mention that there are also many tools that don't support this.
I don't understand why you say this. I and my workmates use this whenever required.
> That is exactly the reason why many programs interpret "--" as "end of switches, rest is arguments."
I know, but nobody ever thinks of using that in practice, making it useless in practice.
Because that's the way that it is.
Then you're obviously more attentive than the average shell programmer.
I think it's safe to say that for any widely-used programming language, there is a sizeable group you can more or less reasonably describe as being "average $LANGUAGE programmers" who are not very good at adhering to best practice. So, saying "the average $LANGUAGE programmer doesn't do $BEST_PRACTICE_ITEM" seems likely to be true, obvious, and unenlightening.
Well, this is not a binary thing. The thing is that in order to write reliable shell scripts, you need to jump through hoops *all the time* (i.e. every time you use a glob, every time you use sed in a locale other than the one you tested your script with, etc.) in order to stop bad things from happening.
Yes, and that is precisely what makes it useless. And actually, it can't even sort lines properly; try sorting, say, the output of ps by ppid or something. Yeah, it's doable, but it's messy, and it needn't be.
That pretty much proves the point. Now we already have two sorting utilities that are useless as they can't sort what I want them to. Contrast with PowerShell's Sort-Object, which makes it trivial to sort processes listed by Get-Process.
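The shell-side version is indeed doable but leans on the caller knowing the column layout; a sketch with synthetic ps-like data (pid, ppid, command) standing in for real ps output:

```shell
# sort(1) can order records numerically on a field, but only once you know
# which whitespace-separated column holds the key (here, column 2 = ppid).
printf '%s\n' '10 1 init-child' '3 20 daemon' '7 2 worker' | sort -k2,2n
# "10 1 init-child" (key 1) sorts first, "3 20 daemon" (key 20) sorts last
```

Nothing in the stream says column 2 is a ppid, or even a number; that knowledge lives only in the pipeline's author.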
The terminal emulator can easily be replaced, for example with this one:
http://sourceforge.net/projects/console/
On the other hand, a fundamental problem like the lack of some kind of structured data exchange format can't be solved that easily.
the awfulness of the command prompt on Windows is not so much an itch that needs scratching as a foot-long needle through the eye that inflicts a constant searing pain upon all users of the platform
Now that's a QoTW candidate if ever I've seen one.
Heh. I have COMPLETELY opposite experience. Console on Linux sucks and can't be fixed by any means short of going back in time and shooting K&R before they start writing the first line of Unix code.
</flamebait mode>
Which is good. Because pseudoterminals are a braindead idea. It's essentially a channel transmitting bytecode which is executed by the terminal emulator to draw text.
Yes, and the only reason it's so limited is that the Unix shell paradigm doesn't allow anything better. Which was the point all along. Is that so hard to understand?
Sorry, but this is wrong. Dead wrong.
That isn't necessary if there's a standard mechanism for structured data.
1) There is no static typing or checks. Everything is dynamic which combines nicely with...
2) No real way to unit-test your scripts.
3) No central documentation or help system. And no, 'man' isn't it. Scripts and utilities have all kinds of argument formats without any real convention. Which leads to...
4) No discoverability. Autocompletion is not sanely possible in bash or zsh.
5) Abysmal tooling. The only real tool is just a text editor. Continuous integration? Unit testing? Static analysis? Code coverage analysis? Who needs them, not us!
Keep dreaming...
PowerShell _is_ a unified language which interoperates with all other .NET languages.
UNIX people should sometimes leave their self-righteousness at the door and actually check what other people are doing.
PowerShell is definitely more powerful than bash or zsh because IT IS BUILT ON A GOOD FOUNDATION.
Bash knows nothing but byte streams. That's nice - for 1973.
It does not matter...
Nothing has ever come close to PowerShell in functionality and usability for system shells. And no, old elisp shells or Rexx scripts ARE NOT the answer. They don't offer the two most powerful features of PowerShell: introspection and structured pipelines.
Sorry, but PowerShell can do ANYTHING bash can do. Without exceptions.
Some things might be more clumsy than in bash - because PowerShell is not bash and things are done differently there.
PowerShell integration with git is much nicer than in bash: https://github.com/dahlbyk/posh-git - and it achieves similar functionality in a fraction of the lines of code.
3) No central documentation or help system. And no, 'man' isn't it.
So... the central Unix documentation system is not acceptable because... you say so? I'd rather use man than HTML Help with its abysmal searching, insulting baby talk, and horrendous security holes.
Scripts and utilities have all kinds of argument formats without any real convention.
Which would be why POSIX, uh, standardized them more than ten years ago, and imposed rules which virtually all standard tools follow. A couple of holes exist, mostly for backward compatibility or ease-of-use's sake (e.g. tail and head's - and +-based arguments) and a couple are just deeply unusual (e.g. find and dd), but most are pretty consistent, and the ones with weird user interfaces get comprehended in the end by frequent use. I'd prefer Lisp everywhere, but, face it, it's not gonna happen.
4) No discoverability. Autocompletion is not sanely possible in bash or zsh.
You carefully named the two shells which have extremely extensive autocompletion, in zsh's case shipped with the package. I don't know what you're trying to do, but you're doing such a good job arguing against yourself that I don't see why I need bother. (Yes, it's not 100% automatable without a bit of per-tool scripting. The workload is minimal compared to writing the tools. Perhaps in an ideal world it could be completely automated, but that just shifts the burden from writing the autocompletion code to writing some sort of reflective description of the system. Big deal.)
Nope. Man's shortcomings are well known. It's basically simple indexed document storage, poorly structured and without built-in searching.
> tar czf myfile myfile.tgz
Damn, I just deleted 'myfile' with doubly-archived myfile.tgz
Very standard.
I'll agree that man is not a very nice documentation system, but it *is* one, so your claim that it does not exist is vacuous on its face.
PowerShell allows you to embed documentation directly into objects and also to structure it by parameters, methods, etc.
Ah. Like POD, doxygen, and similar systems, all of which can generate manpage output. i.e., dead heat here.
And there's a nice "-online" switch that takes you directly to the TechNet article associated with the tool, with Q&A and other additional functionality.
That's great -- if and only if Microsoft wrote the tool. Only useful in a software monoculture.
Can you make your scripts in your home directory have their documentation automatically be included and made searchable into the central help system?
Yes. Learn about apropos databases and MANPATH. Manpages can be stored absolutely anywhere (as can info pages).
essential features such as multiple tabs, run-based selection and the ability to click URLs are entirely absent
I use xterm and reject all other X-based terminal emulators. It's small, fast, and has enough features for me. I don't need multiple tabs or the ability to click URLs, and I don't know what the hell run-based selection is supposed to be - these don't seem particularly essential.
Take it to Linux - Newbie. This isn't the forum for spoon-feeding.
OK, folks (plural), that's enough. I'm sure there's a nice site for name-calling over --------> there somewhere, but please try not to do it here.
Enough
sed -nr '/^Package:/!{H;/^Version:/{G;h};$!d};x;s/\n/\x0/gp'|sort -V|sed 's/[^\x0]*\x0//;s/\x0/\n/g'
For good reason, as I said: such systems have never won out in the past, so why should .NET and PowerShell be any different?
> Microsoft does a wonderful job with PowerShell
PS> whoami arg1 arg2 arg3
ERROR: Invalid argument/option - 'arg1'.
Type "WHOAMI /?" for usage.
PS> $foo = "foo\abc def\";
PS> whoami $foo bar qux args args
ERROR: Invalid argument/option - 'foo\abc def" bar qux args args'.
Type "WHOAMI /?" for usage.
Yeah, you know what? I'll consider using PowerShell the day the PowerShell team figures out how the hell to properly quote command line arguments. It ain't hard.
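For contrast, a minimal POSIX-shell sketch of the same case (the variable contents mirror the $foo transcript above): a double-quoted expansion always reaches the command as exactly one argument, spaces and backslashes intact.

```shell
#!/bin/sh
# A double-quoted expansion is exactly one argument, no matter what
# spaces or backslashes the variable contains.
foo='foo\abc def\'
set -- "$foo" bar qux            # build an argument list from the variable
printf 'argc=%s\n' "$#"          # 3 arguments, not 5
printf 'arg1=%s\n' "$1"          # the backslashes and the space survive
```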
> touch '-a'
>touch: missing file operand
>Try `touch --help' for more information.
WTF.
Ok
> rm: invalid option -- 'a'
WTF^2???
>function touch {set-content -Path ($args[0]) -Value ([String]::Empty) }
>touch -test
You can unpack your string into an argument list by adding ONE character to the variable. Is that hard?
>PS C:\Users\cyberax> $c = "cyberax@sdmain"
>PS C:\Users\cyberax> ssh $c
>Linux sdmain 2.6.39-bpo.2-amd64 #1 SMP Tue Jul 26 10:35:23 UTC 2011 x86_64
>
>The programs included with the Debian GNU/Linux system are free software;
>the exact distribution terms for each program are described in the
>individual files in /usr/share/doc/*/copyright.
>permitted by applicable law.
>Last login: Thu Dec 8 17:51:16 2011 from 74.101.247.211
>cyberax@sdmain:~$
> echo -n "${var[$x]}"
> done
It certainly doesn't work with the syntax you've shown, as that is the syntax for array subscription.
No, the loop starts at 1 instead of 0.
Yeah, if you think of using the special -- argument. 99% of all shell script authors don't, and they shouldn't need to.
Well yes, if you actually want to specifically list a file called -l, you need to type something other than -l, such as ./-l. But that is the simple, obvious case that will be caught the first time you try to run your script. It's another thing if a filename such as -l is generated by globbing. The issue here is that an application should know whether an argument was generated by globbing or not, so that it can treat an argument such as -l as a positional parameter instead of an option if it was generated by a glob pattern. But it can't, since globbing is done by the shell, and the information about which parameters were globbed and which weren't is lost.
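The hazard described above is easy to reproduce; a small sketch, using a throwaway temporary directory:

```shell
#!/bin/sh
# Reproduce the "glob produces an option" hazard in a scratch directory.
dir=$(mktemp -d)
cd "$dir" || exit 1
touch ./-l file1 file2

ls *        # the glob expands to: -l file1 file2
            # ls takes "-l" as a flag and long-lists the other two files

ls -- *     # "--" ends option parsing: all three names are listed
ls ./*      # prefixing the glob with ./ avoids the problem entirely
```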
That's rather interesting, actually. I guess it was removed because there are corner cases left. For example, something like
touch -- --harmful-flag
foo=(*)
foobar "${foo[@]}"
would likely still not be caught. Of course, it would be possible to treat variable expansions as positional arguments as well, but that would probably break lots of scripts. Incrementally building a list of flags in a shell variable is a common idiom, after all.
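A sketch of that idiom, in bash since it uses arrays; the directory and the condition are made up for illustration:

```shell
#!/bin/bash
# Incrementally accumulating options before the final command; each
# array element stays its own word even if a flag ever contains spaces.
flags=(-l)
show_all=yes                     # made-up condition for illustration
if [ "$show_all" = yes ]; then
    flags+=(-a)
fi
ls "${flags[@]}" /tmp            # runs: ls -l -a /tmp

# The older string form relies on unquoted word splitting:
flags="-l -a"
ls $flags /tmp                   # deliberately unquoted so the string splits
```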
touch ./-a
>function touch {set-content -Path ($args[0]) -Value ([String]::Empty) }
>touch -test
That's just about the most trivial thing you can do:
>filename
That only works if the file doesn't exist. Otherwise, it replaces your file with an empty file.
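For comparison, Unix touch only updates timestamps; an existing file keeps its contents. A quick sketch:

```shell
#!/bin/sh
# Unix touch never truncates: an existing file keeps its data,
# only its access/modification times change.
f=$(mktemp)
echo "precious data" > "$f"
touch "$f"
cat "$f"            # the data is still there
rm -f "$f"
```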
>ERROR: Invalid argument/option - 'foo\abc def"'.
>Type "WHOAMI /?" for usage.
PS> whoami $foo bar
ERROR: Invalid argument/option - 'hello world'.
Type "WHOAMI /?" for usage.
PS>whoami $foo
PS>whoami $foo[0]
PS>whoami $foo[1]
sed does exactly what you want. For instance, to accumulate aptitude show's output one package per line:
sed -nr '/^Package:/!{H;$!d};x;s/\n/\x0/gp'
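A runnable sketch of that hold-space idiom (GNU sed, since \x0 in the replacement is a GNU extension), fed a made-up two-record sample rather than real aptitude output; the NULs are turned into '|' only to make the result visible:

```shell
#!/bin/sh
# Each record starts at a "Package:" header.  Non-header lines are
# appended to the hold space (H); when the next header arrives (or the
# input ends) the finished record is printed with its internal
# newlines replaced by NULs.
printf '%s\n' \
  'Package: foo' 'Version: 1.0' 'Depends: libc' \
  'Package: bar' 'Version: 2.0' \
| sed -nr '/^Package:/!{H;$!d};x;s/\n/\x0/gp' \
| tr '\0' '|'
# Output:
#   Package: foo|Version: 1.0|Depends: libc
#   Package: bar|Version: 2.0
```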
Easy:
>Get-AdUser -Filter * | Format-Table LastLogon,UserPrincipalName
Relative youth?
You can argue that PowerShell does not yet cover all possible functionality, but that's not a deficiency of the idea itself. Just a sign of relative youth.
It's apples to oranges...
Because writing a cmdlet is like 10 times easier than writing a command-line utility!
Especially if you are using .NET (which most large vendors already do at least for some functionality).
And it's already happening - all good software vendors for server-side apps on Windows already expose functionality using PowerShell.
LastLogonTimestamp: 129681773131629678
But it still strikes me as odd how a discussion of UNIX shells and their evolution turned into an argument about how great Powershell is compared to all the other methods people have used to perform system administration activities.
> typed language with good introspection capabilities.
>
> It's just cleanly implemented and with a lot of functionality. There were
> several attempts to do it in Unix but they died for a lack of
> infrastructure.
> are traditionally unidirectional. Also, having all commands executing in
> the same address space is quite useful for a lot of stuff - I can pass
> gigabytes of image data without having it copied through pipes.
http://lwn.net/Articles/443473/
I haven't heard of it since though, and its github repo hasn't seen any activity since June.
This is 2011, not 1970, so it should be possible for the program to make sure it knows what the line on the screen looks like. I guess technically this is a GNU readline issue, so the fix needs to go there. Isn't history editing also provided by GNU readline? If so, the same might hold for Annoyance #1.
I'm not clear what your final paragraph is calling for a rewrite of. Could you clarify?
WOW