
Evolution of shells in Linux (developerWorks)


Posted Dec 8, 2011 10:38 UTC (Thu) by quotemstr (subscriber, #45331)
In reply to: Evolution of shells in Linux (developerWorks) by Cyberax
Parent article: Evolution of shells in Linux (developerWorks)

> Microsoft does a wonderful job with PowerShell

PS> whoami arg1 arg2 arg3
ERROR: Invalid argument/option - 'arg1'.
Type "WHOAMI /?" for usage.

PS> $foo = "foo\abc def\";
PS> whoami $foo bar qux args args
ERROR: Invalid argument/option - 'foo\abc def" bar qux args args'.
Type "WHOAMI /?" for usage.
Yeah, you know what? I'll consider using PowerShell the day the PowerShell team figures out how the hell to properly quote command line arguments. It ain't hard.
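For contrast, a minimal POSIX sh sketch of the same experiment (show_args is a stand-in defined here, not a real utility): single-quoted backslashes and embedded spaces reach the program's argv exactly as written.

```shell
#!/bin/sh
# Stand-in for an external program: print each argument it receives,
# bracketed, so we can see exactly what landed in argv.
show_args() {
    for arg in "$@"; do
        printf '<%s>\n' "$arg"
    done
}

foo='foo\abc def\'        # backslashes and a space, single-quoted
show_args "$foo" bar qux  # three arguments, passed through verbatim
```

This prints three lines, `<foo\abc def\>`, `<bar>`, and `<qux>`: no mangling, no re-quoting.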



Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 11:48 UTC (Thu) by bjartur (guest, #67801) [Link] (2 responses)

Not that the Bourne shell, which POSIX sh is based on, got it right either. That's why we have rc, after all.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:46 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (1 response)

sh allows you to pass *any* string as an argument if you're sufficiently careful and follow a few straightforward rules. PowerShell hopelessly mangles arguments containing double-quotes and backslashes, and there's no workaround less complex than just calling CreateProcess yourself.

I find it very difficult to care about PowerShell's fancy object piping when basic arguments don't work right.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:35 UTC (Thu) by HelloWorld (guest, #56129) [Link]

Unlike the lack of structured data in pipes, this is something that can be fixed in the shell itself, i.e. without modifying all programs/cmdlets.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 18:58 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (31 responses)

You're in luck, then.

Quoting in PowerShell is _way_ ahead of bash and other text-based shells. Writing correct bash code that works with quoting is an exercise in pulling your hair out, while in PowerShell it's completely natural.

See here for overview: http://www.techotopia.com/index.php/Windows_PowerShell_1....

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:38 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (30 responses)

No, it isn't. What's natural is passing $foo as an argument word, not mangling it by blindly surrounding it with double-quotes while ignoring its contents. (No, writing "$foo" doesn't help. In fact, it has no effect.)

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:54 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (29 responses)

And that leads to all kinds of problems with filenames containing spaces or special characters.

Like:
> touch '-a'
> touch: missing file operand
> Try `touch --help' for more information.
WTF.

> touch -- -a
Ok

> rm *
> rm: invalid option -- 'a'
WTF^2???

Dumb text expansion is one of the main problems of shells. It's so bad that suid functionality for scripts was removed back in the '80s, because writing secure scripts is just not possible.

In PowerShell it's all natural:
>function touch {set-content -Path ($args[0]) -Value ([String]::Empty) }
>touch -test

No errors, no problems, everything works.

If you REALLY want to reinterpret your string - you can do this explicitly.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 19:59 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (21 responses)

"touch -- -a" works fine. You're being disingenuous.

You're ignoring the very serious deficiency in PowerShell I identified, a deficiency which makes it practically unusable for me and many others (consider the problems passing SQL queries to programs).

No, don't come back with an example of a cmdlet that properly accepts $foo. The problem is with external programs, which makes the issue worse, not better.

No, not everything is a cmdlet, nor will everything be a cmdlet in our lifetimes.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:16 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (14 responses)

??
You can unpack your string into an argument list by adding ONE character to the variable. Is that hard?

And working with SQL queries in PowerShell is WAY better than anything Linux has because one can use statically typed LINQ. I can actually tab-complete table names while writing a query from a freaking command line.

For example: http://bartdesmet.net/blogs/bart/archive/2008/06/07/linq-...

External legacy programs are just that - legacy. They can be easily wrapped but quite often it's easier to replace them completely.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:20 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (13 responses)

> You can unpack your string into argument list - by adding ONE character to variable.

Show me.

> External legacy programs are just that - legacy.

No, external programs are essential. If you persist in this assertion, and claim PowerShell isn't broken because you don't really need the broken feature, we can't have a discussion. If I'm going to write a stand-alone program that doesn't act like a shell, there are many languages better-suited to the task than PowerShell is.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:07 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (12 responses)

> Show me.

It's called 'splatting' and is done by prepending '@' to the variable name.
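For readers more familiar with the Unix side: the closest bash analogue to splatting is an array expanded as "${arr[@]}", where each element stays exactly one argument word (a minimal sketch, assuming bash; count_args is a made-up helper).

```shell
#!/bin/bash
# Each array element becomes exactly one argument word.
args=("file with spaces.txt" -l plain.txt)

count_args() { echo "$#"; }

count_args "${args[@]}"   # one argv entry per element: prints 3
count_args "${args[*]}"   # joined into a single word:  prints 1
```

The quoted "@" form preserves element boundaries even when elements contain spaces; the "*" form deliberately flattens them.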

Now can you show me how I can tab-complete table names in Bash while I'm editing a query?

> No, external programs are essential. If you persist in this assertion, and claim PowerShell isn't broken because you don't really need the broken feature, we can't have a discussion. If I'm going to write a stand-alone program that doesn't act like a shell, there are many languages better-suited to the task than PowerShell is.

Sure, a lot of legacy is essential and PowerShell can easily work with legacy text-based programs. No big deal.

But that doesn't make it any less legacy in the Windows world.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:11 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (11 responses)

You're supplying generic rebuttals to points I didn't make and ignoring the real issues I actually raised.

> It's called 'splatting' and is done by prepending '@' to variable name.

Splatting doesn't address the issue I raised.

> PowerShell can easily work with legacy text-based programs

I demonstrated that PowerShell cannot reliably pass arguments to external programs. That's "broken", not "easy".

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:18 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (10 responses)

And you're not supplying anything.

>I demonstrated that PowerShell cannot reliably pass arguments to external programs. That's "broken", not "easy".

Let's see:
>PS C:\Users\cyberax> $c = "cyberax@sdmain"
>PS C:\Users\cyberax> ssh $c
>Linux sdmain 2.6.39-bpo.2-amd64 #1 SMP Tue Jul 26 10:35:23 UTC 2011 x86_64
>
>The programs included with the Debian GNU/Linux system are free software;
>the exact distribution terms for each program are described in the
>individual files in /usr/share/doc/*/copyright.

>Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
>permitted by applicable law.
>Last login: Thu Dec 8 17:51:16 2011 from 74.101.247.211
>cyberax@sdmain:~$

Looks like it works just fine. Now describe your problem in detail, without knee-jerk reactions.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:21 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (9 responses)

> Now describe your problem in details, without knee-jerk reactions.

You must have missed the thrust of my original post:

https://lwn.net/Articles/471133/

It's not possible to pass arbitrary strings from PowerShell to external programs.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:41 UTC (Thu) by mathstuf (subscriber, #69389) [Link] (8 responses)

To support *arbitrary* [NUL-terminated] strings in shell, you have to, AFAIK, do the following (to avoid arbitrary execution, get the exact backslash count, and more):

> for x in $( seq 1 ${#var} ); do
> echo -n "${var[$x]}"
> done

Granted, most of the time you don't need this (who puts 7 backslashes in a row? Well...tr might make sense with that many backslashes as an argument), but it needs to be copied verbatim every time unless you want to do another level of escaping passing it to a function.

Of course, if there's a shorter way, I'd be thrilled to know it :) .

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:51 UTC (Thu) by quotemstr (subscriber, #45331) [Link]

You don't need to do any of that.

printf '%s' "$bar" is sufficient. The value of bar isn't expanded or interpreted.
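A minimal round-trip check (the value of bar here is an arbitrary nasty test string): printf only interprets escapes in its format string, so a %s argument comes back byte-for-byte.

```shell
#!/bin/sh
bar='-n "quoted" and 7 backslashes: \\\\\\\ end'

# %s prints the argument verbatim; only the format string is interpreted.
printf '%s\n' "$bar"
```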

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 9:39 UTC (Fri) by HelloWorld (guest, #56129) [Link] (6 responses)

That code doesn't make sense. You take the length of the first string in the array named "var", and then, for x from 1 to said length, print the xth element of the array.

If all you wanted to do is print the elements of an array, you can do that with echo "${var[@]}". Well, unless something in ${var[@]} starts with a dash...

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 18:07 UTC (Fri) by mathstuf (subscriber, #69389) [Link] (5 responses)

It's a string, not an array (indexing strings works here in bash). The idea is to avoid creating escape sequences that may appear when var is blindly expanded. Of course, the printf command given is much more succinct and cleaner.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 19:31 UTC (Fri) by HelloWorld (guest, #56129) [Link] (4 responses)

> It's a string, not an array (indexing strings works here in bash).
It certainly doesn't work with the syntax you've shown, as that is the syntax for array subscripting.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:09 UTC (Fri) by nybble41 (subscriber, #55106) [Link] (3 responses)

Technically it *will* work, but only because the value of the variable is output in the first iteration of the loop, and the following iterations have no effect (${var[1..N]} are empty). I.e., aside from the obvious inefficiency, the entire loop is equivalent to:

echo -n "$var"

To work as intended the variable reference would need to use substring syntax, "${var:$x:1}", rather than the array syntax "${var[$x]}". However, it is sufficient to place the variable in double-quotes, which (in bash) causes the value to be output verbatim, with no further quoting, expansion, or splitting.
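For completeness, here is the loop as apparently intended, using bash substring expansion with zero-based indexing (a sketch; as noted, the quoted "$var" on its own is all that's actually needed):

```shell
#!/bin/bash
var='a\b "c d" -e'

# Rebuild the string one character at a time with ${var:offset:length}.
out=''
for (( x = 0; x < ${#var}; x++ )); do
    out+=${var:$x:1}
done

printf '%s\n' "$out"   # identical to printf '%s\n' "$var"
```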

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:13 UTC (Fri) by HelloWorld (guest, #56129) [Link] (2 responses)

> Technically it *will* work, but only because the value of the variable is output in the first iteration of the loop,
No, the loop starts at 1 instead of 0.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 20:25 UTC (Fri) by nybble41 (subscriber, #55106) [Link] (1 response)

Hm. You're right, which is strange since both arrays and characters are zero-indexed (except for $@). Perhaps that loop was written for a different shell entirely? It's certainly not necessary in reasonably modern implementations (i.e. within the last decade) of bash.

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:34 UTC (Sat) by mathstuf (subscriber, #69389) [Link]

Nope, it was a typo. Somehow I missed that it just printed something on the first iteration, and nothing on the rest... Not sure how I forgot about printf, either.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 20:41 UTC (Thu) by HelloWorld (guest, #56129) [Link] (5 responses)

> "touch -- -a" works fine.
Yeah, if you think of using the special -- argument. 99% of all shell script authors don't, and they shouldn't need to.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 7:33 UTC (Fri) by ekj (guest, #1524) [Link] (4 responses)

Well, you do need two different syntaxes for:

ls -l

and

ls -- -l

You don't need to use -- for the purpose, but you need to do *something* since both are valid and reasonable commands, but the two have distinct meaning.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 8:52 UTC (Fri) by HelloWorld (guest, #56129) [Link] (3 responses)

> You don't need to use -- for the purpose, but you need to do *something*
Well yes, if you actually want to list a file called -l specifically, you need to type something other than -l, such as ./-l. But that is the simple, obvious case that will be caught the first time you run your script. It's another thing if a filename such as -l is generated by globbing. The issue here is that an application should know whether an argument was generated by globbing or not, so that it can treat an argument such as -l as a positional parameter instead of an option if it came from a glob pattern. But it can't, since globbing is done by the shell, and the information about which parameters were globbed and which weren't is lost.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:19 UTC (Fri) by nix (subscriber, #2304) [Link] (1 responses)

This feature was actually added to bash at one point: it could export the offsets of globbed arguments to child processes in an environment variable. But it was quietly removed years ago, so presumably there were problems with it.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:38 UTC (Fri) by HelloWorld (guest, #56129) [Link]

That's rather interesting, actually. I guess it was removed because there are corner cases left. For example, something like
touch -- --harmful-flag
foo=(*)
foobar "${foo[@]}"
would likely still not be caught. Of course, it would be possible to treat variable expansions as positional arguments as well, but that would probably break lots of scripts. Incrementally building a list of flags in a shell variable is a common idiom, after all.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 16:30 UTC (Fri) by nybble41 (subscriber, #55106) [Link]

> The issue here is that an application should know whether an argument was generated by globbing or not, so that it can treat an argument such as -l as a positional parameter instead of an option if it was generated by a glob pattern.

There's a standard solution to this: instead of "ls *.c", write "ls ./*.c", which has the same effect, and yet has no chance of accidentally expanding to an option rather than the expected filename.

Or, for any program which has a standard getopt-style command-line parser, just use "--" before any glob patterns.
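A quick sketch of the ./ trick in a scratch directory (mktemp -d is used only for isolation): every glob expansion starts with ./, so nothing can be mistaken for an option.

```shell
#!/bin/sh
# Scratch directory containing an awkward filename.
dir=$(mktemp -d) && cd "$dir" || exit 1
: > -l            # a file literally named -l
: > main.c

# A bare * would expand to: -l main.c  (and -l looks like an option).
# With the ./ prefix, each expansion is unambiguously a path.
for f in ./*; do
    printf '%s\n' "$f"
done
```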

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 2:46 UTC (Fri) by vonbrand (subscriber, #4458) [Link] (1 responses)

No, that isn't the reason. Running scripts SUID is inherently racy: something could interrupt the interpreter after it goes SUID and before it runs the script proper, or switch the script out underneath it.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 3:29 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

Both issues can be fixed (in fact, Perl with suidperl does this just fine).

You see, even Perl is better than shell scripts.

Evolution of shells in Linux (developerWorks)

Posted Dec 11, 2011 23:59 UTC (Sun) by chuckles (guest, #41964) [Link] (4 responses)

or just use:
touch ./-a

rm ./-a

In PowerShell it's all natural:
>function touch {set-content -Path ($args[0]) -Value ([String]::Empty) }
>touch -test

lol. 'all natural' sorry that made me laugh.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 6:00 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link] (3 responses)

Well, try to write 'touch' utility in Bash (assume that there's no 'touch' utility present).
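For what it's worth, a rough sh sketch (mytouch is a made-up name): appending nothing creates the file if absent and never truncates an existing one, though unlike the real touch it does not update the timestamp of a file that already exists.

```shell
#!/bin/sh
cd "$(mktemp -d)" || exit 1   # scratch directory, just for isolation

# Create each named file if it does not exist, without truncating it.
mytouch() {
    for f in "$@"; do
        >> "$f"    # open for append, write nothing
    done
}

mytouch ./-a "file with spaces"   # awkward names are fine: no option parsing
```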

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 16:40 UTC (Mon) by jimparis (guest, #38647) [Link] (2 responses)

That's just about the most trivial thing you can do:
>filename

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 17:07 UTC (Mon) by mpr22 (subscriber, #60784) [Link] (1 response)

That only works if the file doesn't exist. Otherwise, it replaces your file with an empty file.

Evolution of shells in Linux (developerWorks)

Posted Dec 12, 2011 17:42 UTC (Mon) by jimparis (guest, #38647) [Link]

I know, that matches the behavior of the PowerShell code that Cyberax posted (as far as I can tell). Not really "touch" but "make this file empty".

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:38 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (8 responses)

Without PowerShell:

>C:\>whoami "foo\abc def\"
>ERROR: Invalid argument/option - 'foo\abc def"'.
>Type "WHOAMI /?" for usage.

That's just the way the Windows console behaves. Arguments are not split if they are quoted - it's different from Linux, but it's consistent.

You can, of course, work around it by splitting your parameter list manually, as you have to do in the regular Windows console.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 21:48 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (7 responses)

> That's just the way the Windows console behaves. Arguments are not split if they are quoted - it's different from Linux, but it's consistent.

No, PowerShell is not being consistent with cmd here. PowerShell in fact tries to quote arguments (so they will be properly decoded and sent to [w]main), but fails in some glaring cases. Here is a case that *does* work:

PS> $foo="hello world"
PS> whoami $foo bar
ERROR: Invalid argument/option - 'hello world'.
Type "WHOAMI /?" for usage.

Here, you see that whoami (which prints its first argument, and so is useful as a testing tool) prints *just* "hello world", not "hello world bar".

Please, learn how the Windows command line argument system works before attempting to defend it.
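For contrast, in a POSIX shell the caller decides explicitly, via quoting, whether a variable stays one argument word or gets word-split (first_arg is a stand-in that prints only its first argument, playing the role of whoami above):

```shell
#!/bin/sh
# Stand-in for whoami: print only the first argument received.
first_arg() { printf '%s\n' "$1"; }

foo="hello world"
first_arg "$foo"   # quoted: one argument  -> hello world
first_arg $foo     # unquoted: word-split  -> hello
```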

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:09 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (5 responses)

You can do it like this: 'whoami @($c -split " ")'. If you need this often then define a function.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:26 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (4 responses)

> whoami @($c -split " ")

No. Splatting does _NOTHING_ to address the quoting issue I raised. It does not allow you to pass arbitrary argument words to external programs. Your technique actually makes the problem _WORSE_, because you lose the identity of individual arguments, which CAN contain spaces and which you DO need to quote. You've utterly failed to even comprehend my basic point. This conversation is over.

Evolution of shells in Linux (developerWorks)

Posted Dec 8, 2011 22:28 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link] (2 responses)

Can you show me what EXACTLY you are trying to do? Paste a piece of code here, including the parts which receive parameters.

Evolution of shells in Linux (developerWorks)

Posted Dec 9, 2011 12:25 UTC (Fri) by nix (subscriber, #2304) [Link] (1 responses)

quotemstr is trying to show you faults in the Windows command-line parsing system by using whoami as a tool to indicate the contents of the first argument.

You are treating it as if he wants to run 'whoami' but doesn't know how.

When a finger points at the moon...

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:00 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

I understand that 'whoami' is used as an example.

I finally understood what the author wants and gave an answer: "@($var -split ' ')". It does the braindead thing that was requested.

Why braindead? Because good PowerShell scripts would just use an _array_ and 'splat' it when required, instead of constructing shell arguments as raw strings.

And no, this hack with splitting string into arguments is definitely not required to work with legacy code.

Evolution of shells in Linux (developerWorks)

Posted Dec 10, 2011 3:01 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

Try it. It does EXACTLY what you've requested - passes a string as a number of arguments, split by spaces.

Evolution of shells in Linux (developerWorks)

Posted Dec 16, 2011 3:33 UTC (Fri) by useerup (guest, #81854) [Link]

You err because you want PowerShell to act exactly the way POSIX shells do. PowerShell is actually a bit more structured, and it may require you to do things in a slightly different way. The advantage is that many of the error-prone constructs in bash are avoided.

For your example try this:

PS>$foo="/groups","/user"
PS>whoami $foo

And then try
PS>whoami $foo[0]
PS>whoami $foo[1]

PowerShell actually integrates quite nicely with external programs; it does so even retaining (not re-interpreting) arguments.

In your example you told PS to pass a string containing "arg1 arg2" to an external program - which it did. You assume that the entire command line is turned into text and re-interpreted (the way of POSIX shells).

As you can see above, PowerShell understands arrays and will readily pass the array items as discrete arguments. It's not harder - just a little different and a lot more robust and avoids the risk of injection vulnerabilities.


Copyright © 2025, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds