
Shell programming

Posted Dec 1, 2012 21:55 UTC (Sat) by man_ls (guest, #15091)
Parent article: Quotes of the week

It would be interesting to learn why shell programming is still going strong. There are several contenders in the land of scripting languages for Unix systems: some of them really strong and well-behaved, like Python; others almost ubiquitous, like Perl (and also very powerful); yet others really compact, like Lua. However, shell scripting is still widely used in many systems as duct tape -- exactly what Perl was invented for, and what Python excels at. Why?

There must be some kind of major screw-up on the part of all scripting languages when shell is still the greatest common divisor among all Unix systems. I could outline my reasons for each of those mentioned above, but I will spare you them; I would, however, welcome more general explanations. Mine is: shell is the greatest common divisor by definition, since every Unix-like system must have a scripting shell language, and it must be compatible with other shells. So we are stuck with one shell or another, and all we can do is improve on the shell itself, as Bash or dash have done.



Shell programming

Posted Dec 1, 2012 22:26 UTC (Sat) by dlang (subscriber, #313) [Link]

In shell it's simple to do simple things.

Frequently you just need to run one command after another, and nothing is simpler than shell for doing this.

Really complex shell programs don't start out as complex programs; they start off as simple programs and get incrementally enhanced until they are big, complex programs. At each step of the way it's substantially less work to just tweak this one thing than it is to rewrite the entire thing in some other language.

There are also a LOT of tools available to do things for you, so you don't do them in shell, you do them by invoking a program and it does the work for you.

If most of the work you need to do can be done by these programs, your 'shell program' can outperform the equivalent program in the 'better' scripting languages.

Using these external tools is frequently simpler than programming the same thing in one of the 'better' languages.

Say you want to clear a directory before you start doing other things: in shell, rm -r dir does it. How many lines of code does it take to do it in Perl or Python?

Least common denominator is a factor, but I think it's a small one. It's an unusual server that doesn't have both Perl and Python on it, so lack of availability of the language is not an issue (you may not have the libraries you want installed; that can be a real issue).

I do a lot in shell and a lot in Perl (and a little in Python). If I am doing something quick and simple, I'll use shell; if I know it's going to be doing a lot of data manipulation, I'll use Perl.

Although I'll point out that sed, cut, and grep allow you to solve a lot of 'data manipulation' issues fairly easily, so Perl comes into play more when calculations or non-trivial logic are involved.

Yes, Perl has the system() call (and Python has the equivalent), and everything I'm saying above can be done in these languages with a series of such calls. But it's hard to say that the resulting program is any better than the shell program it replaced. I once had an ugly, 20-line shell script that I wrote turned over to the software engineering department for "official maintenance" and to bring it up to "professional standards". In three months they rewrote the 20-line shell script into a 200-line Perl script, with everything that actually did something being a system() call that invoked the shell commands from my initial script (long ;-joined lines remaining intact). But this was now a "professional" program: it had lots of comments (none of which explained what any of the calls were doing, or why), every function was less than one screen, etc. It's the best example I've ever seen of how "coding standards" can make a program hard to deal with.
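The pattern described here, a 'professional' script whose every working line just hands the job back to the shell, is easy to sketch in Python (the command below is a placeholder, not the original script):

```python
import subprocess

def run_step(cmd):
    """Thin wrapper: the 'function' does nothing but pass the real
    work back to the shell, the anti-pattern described above."""
    return subprocess.run(cmd, shell=True, check=True,
                          capture_output=True, text=True)

# Two hundred lines of this, and the program is still the original
# shell script, only harder to read.
result = run_step("echo hello")
```
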

Shell programming

Posted Dec 1, 2012 22:51 UTC (Sat) by man_ls (guest, #15091) [Link]

> you want to clear a directory before you start doing other things, in shell rm -r dir does it, how many lines of code does it take to do it in Perl or Python?
Actually it is probably one line also, but the thing is -- I don't remember which line, while rm -r is always at my fingertips. (A quick trip to Google says that shutil.rmtree should do the trick in Python).
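For the record, the call man_ls found is indeed essentially a one-liner; a sketch of the rough equivalent of rm -rf dir:

```python
import shutil

def clear_dir(path):
    """Rough Python equivalent of `rm -rf path`: remove the whole
    tree, ignoring errors such as the directory not existing."""
    shutil.rmtree(path, ignore_errors=True)
```
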
> In three months they re-wrote the 20 line shell script into a 200 line Perl script, with everything that actually did something being a system() call that invoked the shell commands that were in my initial script [...]
Sounds like a daily wtf to me.

[Pedantic corner {actually not a corner but a footnote of sorts, or more properly an addendum}: "least common denominator" is a number greater than any in the input, while "greatest common divisor" is smaller than the input. E.g. lcd(4, 6) = 12; gcd(4, 6) = 2.]

Shell programming

Posted Dec 5, 2012 5:42 UTC (Wed) by sitaram (guest, #5959) [Link]

Nice answer; hits all the right notes.

For me, the most powerful reason is pipes. For a lot of things I do, it's much easier to think in terms of pipes than pretty much anything else. This allows me to compose a solution where the parts of the solution need not be in shell at all (and are often just normal Unix tools).

My preferred language for anything serious is perl, but I wouldn't convert a pipe-based conceptual data flow to perl, or indeed any language. They're very strong when all the data manipulation is done *inside* the program, but don't offer much when you want to combine different, existing, programs to get the job done.

Shell programming

Posted Dec 6, 2012 3:49 UTC (Thu) by davidescott (guest, #58580) [Link]

Pipes and "rm -rf" are the two things I think Ruby comes closest to not sucking at, and why I think something like it would be a godsend for systems scripting.

Being able to express things like foo.collect{|x| frobnicate(x) }, and the easy expression of lambdas and other functional concepts, is really useful for a lot of data-manipulation tasks. And you get to be object-oriented and use lots of libraries as well. I wish Python were more aggressive about including these functional and meta-programming features.

The problems that Ruby faces in systems scripting are threefold:

First, shell idioms are much more complex than they seem. "rm -rf" might be aliased to "rm -rif --preserve-root --one-file-system", so when you translate that to a C-style language you get the hideous: File.remove(path, recursive=true, ignore_nonexistent_files=true, always_ask=true, override_always_ask_with_never_ask=true, preserve_root=true, one_file_system=true). Similarly, mv has a bunch of arguments that we never think of that would need to be supported in some compact way to really compete with shell. I'm not sure anyone has come up with a good way to deal with that while keeping C-style syntax.

Secondly, these shell idioms are tied to POSIX, but when people write Python/Ruby/etc. there is an expectation that it should support other OSes (i.e. Windows). What does "mv Foo.txt foo.txt" do on Windows? Is it a no-op? Does it move the file through a temporary name in order to get to lowercase? After you figure out what to do there, you then have "mv -n Foo.txt foo.txt". Again, the meaning is trivial to understand on POSIX, but on Windows an argument could be made either way.

Thirdly, the Ruby language is still a bit ill-defined, and the weak typing makes it really hard to write robust code. I don't think I ever wrote an exception-handling block that didn't have a typecast exception inside it. Sadly, because Ruby does not parse unambiguously and Matz's Ruby is the unofficial standard, the few projects to build static type analysis into Ruby have all failed.

So Ruby vs. shell in my mind comes down to: (a) shell having a really compact syntax which is not compatible with the large C-style community, (b) shell getting to punt on other OSes, which other scripting languages are expected to support with most-favored-OS status, and (c) Ruby having a lack of internal competition and typing that is a bit too weak.

So unfortunately shell still wins for any one-off project but I would not use shell for anything I plan to come back to in the future.

Shell programming

Posted Dec 6, 2012 15:24 UTC (Thu) by renox (subscriber, #23785) [Link]

Two points:

> Shell idioms are much more complex than they seem. "rm -rf" might be aliased to "rm -rif --preserve-root --one-file-system" so when you translate that to a C-style language you get the hideous: File.remove(path, recursive=true, ignore_nonexistent_files=true, always_ask=true, override_always_ask_with_never_ask=true, preserve_root=true, one_file_system=true)

This argument doesn't seem very strong: you can define another function with the customised default behaviour. And in Ruby I think you can even change the original function (monkeypatching), even though it's dangerous.

> The ruby language is still a bit ill defined, and the weak typing makes it really hard to write robust code.

As if the shell was better for these points!

Shell programming

Posted Dec 6, 2012 21:39 UTC (Thu) by davidescott (guest, #58580) [Link]

> This argument doesn't seem very strong: you can define another function with the customised default behaviour.

And you can have lots of default arguments (basically the same as your aliases). It could be very nicely defined and minimalistic:

sys.rm(filename="/home/username/trash",recursive=TRUE)

which is the same as "rm -rf /home/username/trash"; but as clean as that is, it still loses out to shell.
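That "wrapper with customised defaults" idea is easy to sketch in Python; the rm() name and its keyword arguments here are hypothetical, with the defaults playing the role of the alias:

```python
import os
import shutil

def rm(path, recursive=False, force=False):
    """Hypothetical rm() wrapper: keyword defaults stand in for
    the flags of the shell command."""
    if recursive:
        # shutil.rmtree with ignore_errors behaves roughly like -rf
        shutil.rmtree(path, ignore_errors=force)
        return
    try:
        os.remove(path)
    except FileNotFoundError:
        if not force:  # -f: missing files are not an error
            raise
```

With this, rm("/home/username/trash", recursive=True, force=True) mirrors the sys.rm() call above, for better or worse.
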

On the other hand you can have very complex command chains in shell that call out to 15 different programs such that understanding what actually happens requires reading a half-dozen man pages. In those cases a C-style language may be more maintainable.

I don't think ruby is the answer to all this, but it made at least the command chains a real pleasure to work with.

Shell programming

Posted Dec 7, 2012 1:08 UTC (Fri) by sitaram (guest, #5959) [Link]

Speaking of very complex command chains, here's something that happened yesterday...

My daughter said that 3 seems to be the most common last digit in primes. I thought it might be 7. Without even thinking too much about it, I banged this out in pretty much a "flow of thought" kind of way:

seq 1 100 | xargs -L1 factor | egrep '^([0-9]+): \1$' | grep -o '[0-9]$' | sort | uniq -c | sort -n -r

Sure you won't use this in any long term program but that's because it is **slow**, not because it is unreadable. Yes it is "calling out to 15 different programs", but they're either fairly common (grep, sort, uniq) or easy to guess what they do (seq, factor).

To be honest I've rarely used anything more than grep/egrep, cut, sort, and occasionally a line or two of sed or perl. Years later I can still understand all those scripts because these basic tools are still around and they're well known and stable.

I agree that this example is somewhat contrived for our discussion (though it was real enough for me at the time I wrote it), but the wins for shell, IMO, come from:

* the lack of any actual variables anywhere
* 'sort | uniq -c', which encapsulates what would be the second largest component of the processing if you did this in a proper language
* 'factor', although it needs to be filtered by a regex to throw out the non-primes
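For comparison, a sketch of the same computation in a 'proper language' (Python here; the function names are mine, is_prime stands in for the factor-plus-egrep filter, and Counter replaces sort | uniq -c | sort -n -r):

```python
from collections import Counter

def is_prime(n):
    """Trial division; plays the role of `factor` plus the egrep
    filter that keeps only the primes."""
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def last_digit_counts(limit):
    """Count the last digits of the primes up to `limit`,
    most common first."""
    counts = Counter(n % 10 for n in range(2, limit + 1) if is_prime(n))
    return counts.most_common()
```

For limit 100 the top entry is (3, 7): of the 25 primes below 100, seven end in 3, matching the pipeline's answer.
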

Shell programming

Posted Dec 7, 2012 4:44 UTC (Fri) by davidescott (guest, #58580) [Link]

> The wins for shell, IMO, come from:
> * the lack of any actual variables anywhere
> * 'sort | uniq -c', which encapsulates what would be the second largest component of the processing if you did this in a proper language
> * 'factor', although it needs to be filtered by a regex to throw out the non-primes

What you are describing there is a mixture of functional and template/meta-programming aspects to shell.

The problem with this code is that it is impossible to read without knowing its purpose. This may have seemed trivial to you but it is beyond cryptic to anyone who does not know what it does:
> seq 1 100 |
trivial
> xargs -L1 factor |
I guess my xargs is weak (I seldom use it); I had to look up -L1.
> egrep '^([0-9]+): \1$' |
Dependent upon the output of factor which also must be looked up (I don't use factor).
Once you know what factor outputs you have to think a bit to understand how that relates to egrep.
> grep -o '[0-9]$' |
Something about the last character... but what is -o?
> sort |
trivial
> uniq -c | sort -n -r
c,n,r aren't terrible arguments but not everyone will know them.

I think this ruby is easier to follow except for the difficulty of knowing what group_by does:

require 'mathn'
(1..100).collect{|x| x.prime? ? x : nil }.compact().group_by{|x| x%10}.map{|k,v| [v.length(),k]}.sort()

Shell programming

Posted Dec 8, 2012 19:05 UTC (Sat) by nix (subscriber, #2304) [Link]

In most cases you're trying too hard. Rather than 'looking up' what xargs -L does, just do 'xargs --help' and it tells you (not that that is much harder than 'man xargs'). Rather than agonizing over what factor does, try 'factor 16' and 'factor 17' (one obvious non-prime, one obvious prime) and it is instantly obvious.

Shell programming

Posted Dec 8, 2012 19:09 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

>cyberax@cybmac:~$ factor 16
>bash: factor: команда не найдена [command not found]
Kinda fails.

Let's try "port search factor" - again it fails. There's no 'factor' utility in Mac OS X's ports.

On the other hand, Ruby version works perfectly even with the bundled Ruby interpreter.

And THAT is the problem with shells.

Shell programming

Posted Dec 8, 2012 20:01 UTC (Sat) by man_ls (guest, #15091) [Link]

I'd say that is a problem with the Mac OS X shell: it lacks the factor command.

Shell programming

Posted Dec 8, 2012 20:04 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

And Linux's shell lacks some of the SunOS shell's tar options.

I've checked and it doesn't seem that the 'factor' command is mentioned anywhere in the POSIX spec.

Shell programming

Posted Dec 8, 2012 22:52 UTC (Sat) by man_ls (guest, #15091) [Link]

Neither does the POSIX spec mention ssh, rsync, wget, curl, less or a myriad of other commands in widespread use. Yet using them in scripts is easily forgiven; each command maintains excellent backwards compatibility on its own.

For your Mac OS X, I believe that brew install coreutils will get you a nice local copy of factor.

Shell programming

Posted Dec 8, 2012 23:26 UTC (Sat) by davidescott (guest, #58580) [Link]

What is this "less" you speak of? Maybe you are confused; the proper command is "more." Also, what is this "ssh"? Do you mean "rsh"?

GNU has done an impressive job of unifying the *nix with a set of common tools, and the fact that someone can "brew install coreutils" and get a bunch of useful binaries is a testament to the value of the work GNU has done. BUT...

That does not mean that "shell" is the best language, it just means that shell is the least common denominator for interacting with a diverse set of programs (and it has served *nix well). Text input/output, flags for program options, silence is success, integer return codes, etc... As programs mature they get split into libraries, and those libraries get external interfaces to other languages like python/ruby/etc...

Shell programming

Posted Dec 8, 2012 23:10 UTC (Sat) by davidescott (guest, #58580) [Link]

I would go further and suggest that /usr/bin/factor is not something that should be in coreutils, and that OSX/BSD is far more sensible than GNU in excluding it. It just ends up polluting /usr/bin with functionality that would make more sense in a basic calculator like "bc" (which I curse every time I start it, because it's so much easier to do basic arithmetic in Python, Ruby, R or Octave).

If you look at what else is in coreutils, factor seems completely out of place. There is not a single other tool in coreutils to do basic arithmetic. Why is there a /usr/bin/factor but not a /usr/bin/factorial, /usr/bin/exponent, /usr/bin/log, or for that matter a /usr/bin/prime or /usr/bin/primes? In fact, a good multi-function random tool (to generate random numbers or strings) would seem to be a far more useful thing to add to coreutils than "factor".

I'm really curious what considerations led to "factor" being included in coreutils; is there an init script that really needs to factor a number before it can continue?

I don't mean this as a criticism of the developers of bc or coreutils or of GNU in general. Tools like "bc" were advanced and useful in their day. Thankfully we have better alternatives for many use cases, and with 8 cores and 16GB of RAM I'm happy to waste a bit of each to get a more user-friendly tool for basic arithmetic; I really don't need "bc" or "factor" anymore.

Shell programming

Posted Dec 9, 2012 17:39 UTC (Sun) by nix (subscriber, #2304) [Link]

I suspect only Roland McGrath can tell you why factor is in coreutils. It's been there since before it had an RCS repository (1992).

Shell programming

Posted Dec 8, 2012 22:44 UTC (Sat) by davidescott (guest, #58580) [Link]

> Rather than 'looking up' what xargs -L does, just do 'xargs --help' and it tells you (not that that is much harder than 'man xargs').

In what way is "xargs --help" not "looking it up"?

> Rather than agonizing over what factor does, try 'factor 16' and 'factor 17' (one obvious prime, one obvious non-prime) and it is instantly obvious.

Again, I know what factor does (it's in the name); what I don't know is that the output format is "INPUT: N1 N2 ... Nk". So I did exactly as you described (except I used 4 and 7).

There is nothing particularly complex about looking things up like this, and I had to look some things up for Ruby as well (it's been a few years since I used it heavily). The key difference is that Ruby has functions like "numeric.prime?" which (following standard naming conventions) is a test of whether the numeric is prime, and has a standard, obvious boolean return value.

The shell variant of this was a rather convoluted test of whether the output of "factor" was "N: N", which is a rather indirect way of testing whether a number is prime. One could reasonably suspect that "factor 7" would output "7: 1 7", or that "factor 8" would output "2^3" or "2,2,2". There is no a priori reason to believe that factor should output the input number followed by the non-unit factors in increasing order (separated by ":" and " ").

That is one of the advantages of a programming language. Real datatypes make it easier to guess what a function might return (factor() should return an array of prime factors [sorting and the unit are open questions]), and the subsequent manipulation of those objects is usually sufficient to say what kind of object was returned. If my test for primality of x was:
> (x.factor().length()==1)
Then it is clear that factor() returns an array of factors not including 1. In addition the standardized syntax makes it easier to guess what the functions might be.

Now at the expense of portability I could write a program /usr/bin/prime and simplify the shell script. I might even be able to write something like ruby's group_by in shell but it stretches the meaning of "shell" to say that such a script is "standard *nix shell."
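The datatype argument can be made concrete with a sketch in Python; factor() and prime() are hypothetical names mirroring the x.factor().length()==1 test above, with the guessed conventions (factors in increasing order, with multiplicity, unit excluded):

```python
def factor(n):
    """Hypothetical factor(): prime factors of n with multiplicity,
    in increasing order, excluding the unit."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def prime(n):
    """n is prime exactly when its factorisation is n alone."""
    return len(factor(n)) == 1
```

Because factor() returns a real list, the primality test reads directly off the return type, which is the point being made.
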

Shell programming

Posted Dec 9, 2012 17:44 UTC (Sun) by nix (subscriber, #2304) [Link]

Hm. I hate to point it out, but your 'easy to read' Ruby one-liner was and remains utterly incomprehensible to me, full of thoroughly opaque punctuation, probably just as incomprehensible as the shell equivalent was to you: the presence of the occasional comprehensible word does not help much. The last time I saw that much nested foo().bar().baz().quux() was when building menus using Borland's Turbo Vision library. It is very much not intrinsically easier to follow than pipelines: at least with pipelines you know that what is being transported is text with a print syntax that you can easily examine, rather than some intricate datatype which you have to look up the various functions to figure out.

Your entire argument boils down to 'things that are familiar to me are intrinsically easier to read for everyone than things that are not familiar to me', which is not at all valid.

I'd agree that for larger programs, the shell is hardly ideal: datatyping is worthwhile -- but for quick one-liners to answer quick questions, it's exactly as valid as a bunch of other vaguely-scripty languages. Ruby is not magically better just because you know it better than the shell (and your claim that more people know Ruby than know the shell is unsupported, and to me sounds dubious in the extreme).

Shell programming

Posted Dec 9, 2012 19:01 UTC (Sun) by davidescott (guest, #58580) [Link]

> Ruby is not magically better just because you know it better than the shell

Never wrote that. In fact, I wrote something much the opposite of that.

> (and your claim that more people know Ruby than know the shell is unsupported, and to me sounds dubious in the extreme).

Never wrote that.

> but for quick one-liners to answer quick questions

Not what this discussion was about.

> Your entire argument boils down to 'things that are familiar to me are intrinsically easier to read for everyone than things that are not familiar to me', which is not at all valid.

Your entire argument seems to be based on making up stuff and attributing it to others. There seems little reason to respond.

Shell programming

Posted Dec 9, 2012 23:20 UTC (Sun) by nix (subscriber, #2304) [Link]

No, it's based on not looking upthread to see who wrote what. Feel free to interpret 'you' as 'collective you' if you wish :)

Shell programming

Posted Dec 10, 2012 0:12 UTC (Mon) by Baylink (guest, #755) [Link]

> Your entire argument boils down to 'things that are familiar to me are intrinsically easier to read for everyone than things that are not familiar to me', which is not at all valid.

Exactly.

But the point you make immediately above, which was essentially: debugging is easier because you can merely lop off the end of the pipeline and *look at the data on the terminal*, is the really important one to me.

The best language for anything is very often *the one your programmer understands best*, as that is the one in which s/he'll be most effective.

Shell programming

Posted Dec 18, 2012 12:25 UTC (Tue) by wookey (subscriber, #5501) [Link]

I think this is the nub of it.

I write stuff in shell because I know how to. After many years I've learned how to deal with many of the horrible things that go wrong, and with a couple of hours of dicking about can usually get what I wanted.

As with others, if I know there will be a lot of data manipulation then I'll use Perl instead, and I know just enough to do that. I avoid writing in Python because I don't know how, and worse, don't know how to fill in the gaps.

If I'd started writing Python 20 years ago things might be different (and I might be more productive - I like shell, but I'm not going to claim that it's a _nice_ language).

Shell programming

Posted Dec 10, 2012 0:47 UTC (Mon) by dlang (subscriber, #313) [Link]

The pipe-based approach has the wonderful property that it's trivial to truncate the 'program' at any point and see the output at that stage (with normal tools like head/less/wc/etc.).

In fact, I normally build up the monster pipe shell programs iteratively this way, one chunk at a time.

Shell programming

Posted Dec 10, 2012 2:17 UTC (Mon) by davidescott (guest, #58580) [Link]

> it's trivial to truncate the 'program' at any point and see the output at that stage

So will interactive ruby (or python for that matter).

> In fact, I normally build up the monster pipe shell programs iteratively this way, one chunk at a time.

Same way I write long ruby chains.

The only difference is that Ruby passes objects and shell passes text.

Shell programming

Posted Dec 7, 2012 9:03 UTC (Fri) by man_ls (guest, #15091) [Link]

Compactness and the use of pipes: two basic Unix principles.

By the way, it appears that the most common last digit in primes is a bit random: running your script for 100 yields 3, for 1000 it yields 7, for 10000 it is 3 again, and for 100000 it is 7 once more. Not bad for a Bash one-liner; and it is not so slow after all.

Shell programming

Posted Dec 7, 2012 10:53 UTC (Fri) by egk (subscriber, #50799) [Link]

Off topic, but anyone interested in the last digit of primes should search for "Chebychev bias", and especially the paper of Rubinstein and Sarnak on this topic. Very short summary: it's much more complicated than one might think...

And if you want to do numerical experiments on this type of questions, the right tool for the job is not a shell-scripting language. (Hint: Pari/GP).

Shell programming

Posted Dec 7, 2012 11:35 UTC (Fri) by man_ls (guest, #15091) [Link]

What is really surprising is that for a quick check, Bash can be useful even for interesting mathematical questions.

Shell programming

Posted Dec 7, 2012 12:01 UTC (Fri) by renox (subscriber, #23785) [Link]

> Compactness and the use of pipes: two basic Unix principles.

Maybe, but even though I know the shell better than Ruby, the Ruby solution is more readable than the shell one.

Shell programming

Posted Dec 7, 2012 19:55 UTC (Fri) by dlang (subscriber, #313) [Link]

For you, Ruby is the right answer, for now.

However, how much will Ruby change over time? If new people are hired (or your company is bought out, or you leave and go somewhere else), are these other people also going to be in a situation where Ruby is more familiar to them than shell?

Shell is everywhere. It's hard to administer a *nix system without dealing with shell (you can be an application programmer without dealing with shell, but not a sysadmin).

Along similar lines, it's hard to be a sysadmin and completely ignore vi. You may prefer emacs, but any system you touch is going to have vi; not all systems will have emacs.

Not all systems will have Ruby, Python, or Perl, but every system will have a shell of some sort.

Shell programming

Posted Dec 7, 2012 20:22 UTC (Fri) by bronson (subscriber, #4806) [Link]

Exactly right. Ten years ago Perl was the answer. Not anymore. (Perl5 is becoming nonessential a lot faster than I thought it would. It's almost scary...)

Scripting languages come and go, and the modern ones break their own scripts every five years. Shell (without bashisms) seems to be as close to eternal as the computer industry can manage.

Use the right tool for the job.

Shell programming

Posted Dec 7, 2012 22:02 UTC (Fri) by davidescott (guest, #58580) [Link]

> Shell (without bashisms) seems to be as close to eternal as the computer industry can manage.

Not so sure about that. If you try to run a shell script on an older SysV system, I think bashisms are the least of your problems. Consider all the GNU-isms in the commands you use every day.

Compare the Single UNIX Specification ls (its copyright is only 11 years old):
http://pubs.opengroup.org/onlinepubs/009695399/utilities/...
to gnu ls:
http://unixhelp.ed.ac.uk/CGI/man-cgi?ls

-A, --author, -b (aka --escape), --block-size, -B (--ignore-backups), --color, -D, --file-type, --format, --full-time, -G, -h, --si, --dereference-command-line-symlink-to-dir, --hide, --indicator-style, -I, -k, -N (--literal), --show-control-chars, -Q, --quoting-style, -R, -S, --sort, --time, --time-style, -T, -U, -v, -w, -X

If you think about POSIX sh as the "core language" and all the programs in /usr/bin as the "libraries" for that language, then shell has seen a very stable core language and a massive expansion in the number and variety of libraries. On the other hand languages like ruby/python/etc have evolved more evenly across the language/library split.

"Shell is universally understood" has more to do with the overwhelming success of the GNU system, and the power of open source to push out the inferior closed-source versions, than with shell programming having been completely static.

Shell programming

Posted Dec 7, 2012 22:07 UTC (Fri) by dlang (subscriber, #313) [Link]

The key is in backwards compatibility, not in lack of change.

If you take a shell script from 20 years ago, the odds of it being able to run on a system today are very high.

Perl is also fairly good about backwards compatibility, but I don't know how far back Perl 5 goes.

most other languages are far worse, and if you take a program written in them 10-15 years ago (if they were even around that long), the odds are very poor that you could run the program today without going through a porting effort that would be comparable to porting to a different language.

Shell programming

Posted Dec 8, 2012 1:35 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

Hahahahaha. LOL.

I've recently spent 5 hours rewriting old scripts from a long ago retired Sun box. Searching old manuals to understand what non-standard command line options do is such a great way to spend time. Especially when it's not possible to simply run them.

Shell programming

Posted Dec 8, 2012 2:05 UTC (Sat) by sitaram (guest, #5959) [Link]

That has nothing to do with shell per se. It's just non-open source versus open source. If they were open source you'd still be able to compile and run them.

As for non-standard options, I suspect most of the Sun utilities would be closer to BSD.

Shell programming

Posted Dec 8, 2012 2:11 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

Even if they were open source (which they are), you'd need to compile 20-year-old crufty C code, and quite likely newer GCC versions won't do the trick. Waaaaay too much work.

On the other hand, I've ported 15-year-old Python scripts without much trouble. There have been several non-backward-compatible changes in Python since then, but they are fairly minor.

Shell programming

Posted Dec 8, 2012 6:53 UTC (Sat) by mathstuf (subscriber, #69389) [Link]

Some anecdata:

For Python, one I hit today was changing "except BaseException as e:" to "except BaseException: e = sys.exc_info()[1]" to support 2.4 through 3.2 in the same script. A little annoying, and not as pretty as either canonical syntax on its own.

As for shell, we found out that BSD sed and GNU sed don't support compatible -i flags: BSD requires a suffix argument, while GNU requires that there be no space before the suffix if one is given, which BSD rejects. Using manual .bak files feels worse than either by a long shot. I think the call has been replaced by awk now.

In my experience, GNUisms tend to be harder to work around than Python incompatibilities. That's why I try to avoid them in my shell scripts. Unfortunately, sed -i is very convenient and sponge just isn't common enough, but this is slowly being trained out of my fingers when I'm in "portable" mode (I use every trick I can at the prompt, just not when writing .sh files).
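One portable escape from the sed -i incompatibility, assuming Python is available on both systems, is the standard library's fileinput module; a sketch:

```python
import fileinput

def substitute_in_place(path, old, new):
    """Portable stand-in for `sed -i 's/old/new/g' path`. With
    inplace=True, stdout is redirected back into the file being
    read, so each print() rewrites one line."""
    with fileinput.input(path, inplace=True) as f:
        for line in f:
            print(line.replace(old, new), end="")
```

This sidesteps the BSD/GNU flag divergence entirely, at the cost of pulling a second interpreter into the script.
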

Commands on an old Sun box

Posted Dec 10, 2012 7:19 UTC (Mon) by jrn (subscriber, #64214) [Link]

Have you looked at the heirloom toolset (http://heirloom.sourceforge.net/)?

Commands on an old Sun box

Posted Dec 10, 2012 14:17 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

Nope (I didn't know about it).

In any case, I was interested in a rewrite in a sane language, not just running these scripts.

Shell programming

Posted Dec 6, 2012 3:16 UTC (Thu) by davidescott (guest, #58580) [Link]

> I once had a 20 line, ugly shell script that I wrote turned over to the software engineering department for "official maintenance" and to bring it up to "professional standards". In three months they re-wrote the 20 line shell script into a 200 line Perl script, with everything that actually did something being a system() call that invoked the shell commands that were in my initial script (long ; joined lines remaining intact). But this was now a "professional" program, it had lots of comments (none of which explained what any of the calls were doing, or why), every function was less than one screen, etc. It's the best example I've ever seen of how "coding standards" can make a program hard to deal with.

That is certainly dailywtf worthy, but it's also a sign that you screwed up by not following coding standards from the outset.

You fixed a problem (yay for you) and your boss was happy. He gets to say to his peers in other departments "I've got this hot-shot programming magician who can solve all my problems, and he's all mine NAH-NAH-NAH-NAH." So your boss gives you something else and asks you to hand the old script over to "software engineering."

Software engineering looks at it and pukes. 20 lines of undocumented shell. WTF are they supposed to do with that? They complain to their boss, who knows that your boss loves you and will never let you go... so they just get ignored. Some poor schmuck is now sitting in "software engineering" with 20 lines of shell that do god knows what, but he has to bring it up to "standards", which means perl. His wife is pregnant and his home mortgage is eating 50% of his salary, so he can't risk his job trying to bring the sh*t you dropped in his lap up to standards, so he does the next best thing. He just wraps the whole thing in perl system calls. It does exactly the same thing as yours does, and he doesn't have to worry that by replacing your sed with a perl builtin he might introduce a bug through some esoteric incompatibility between perl and sed handling of unicode strings. If anything goes wrong, he can demonstrate that the output is the same as your output, so it's the same program you gave them.

So the real WTF is why you submitted 20 lines of undocumented shell script to the systems engineering team for long-term maintenance when you knew it wouldn't meet their standards in the first place. It was quick and easy for you, yes, but completely opaque to those who were actually in charge of managing it.

Shell programming

Posted Dec 6, 2012 5:42 UTC (Thu) by dlang (subscriber, #313) [Link]

This wasn't just stuff thrown over the wall; it was a system automation script. It could have been documented better, but that would not have added more than about 10 lines. It was not written to be managed by engineering any more than a normal chef/puppet/cfengine config snippet would be. After the script was written, management decided that it was incredibly useful, but they were on the swing of the pendulum that says system administrators are not allowed to write any code, only 'professional programmers' are.

Making each line (or at most every two lines) of a shell script into a separate function (and of course, each function needs a 10-line comment header giving its name, plus now there needs to be a main function to call all these other functions, etc.) does not help anything.

This same engineering department had a bug report submitted to it containing a 5-character change to another tool, for a bug discovered during a production deployment; it was found and tested by the sysadmins during the maintenance window. But the bug took 18 months to close, as the entire script was re-written (and it was originally written by the engineering team, and complied with the coding standards to start with).

These are cases of people who don't know what they are doing blindly clinging to 'standards' without any understanding of what the purposes of the standards are, and without any ability to read code and understand it.

Shell programming

Posted Dec 6, 2012 21:03 UTC (Thu) by davidescott (guest, #58580) [Link]

My statement was intended to be tongue-in-cheek, although I think it is a real problem with shell, because there are lots more people with a solid understanding of Python than with a solid understanding of the Unix command line.

Yours sounds like a really dysfunctional organization, and it is clear that management is handing the responsibility of maintaining this software to a group that is not capable of understanding what it does. Either management needs to stop doing that and let you manage what you write, or they need to enforce some kind of standard that prevents you from writing stuff that others can't understand, [or they could fire the incompetent people and hire someone who knows what they are doing ... as if].

Shell programming

Posted Dec 1, 2012 23:34 UTC (Sat) by mgb (guest, #3226) [Link]

Different languages are good for different things.

If you just need some simple control flow and pipelines, a shell script is the way to go.

Shell programming

Posted Dec 2, 2012 18:44 UTC (Sun) by bokr (subscriber, #58369) [Link]

#!/bin/bash
# wkf: look up the given keywords on Wikipedia in firefox
kwds="$(printf '%s' "$*" | tr ' ' '+')"
ffarg="http://en.wikipedia.org/wiki/Special:Search?search=$kwds"
firefox "$ffarg" &

I like being able to make things easy for myself ;-)

I prefer typing "wkf something" to Alt-Tabbing to firefox (if it's running, or grabbing the mouse and clicking a sys-tray icon if it's not), then mousing down a menu to select wikipedia (if it's set to google), then clicking focus into the search box if necessary and typing "something" there.

It's so easy to make a little utility command. If I look up something on the net more than a few times, like as not I'll put a little script for it in my ~/bin (which is on my $PATH), e.g. areagrep to look up area codes, ccgrep to look up country codes, airpgrep to look up airports and their codes, etc. Most of these work by a 2-line bash prefix calling grep on the rest of the file as a heredoc, or even grepping $0 and exiting, where the heredoc is made by pasting a table copied from some public web page (e.g. area codes from wikipedia for areagrep; they nicely made it easy to search for specific area codes by suffixing a colon to each one in the text). So

[18:48 ~/bin]$ areagrep 206:
206: State of Washington (Seattle, all of Bainbridge, Mercer, and Vashon islands, Burien, Des Moines, Lake Forest Park, Normandy Park, Sea-Tac, Shoreline, Tukwila, and some small unincorporated areas adjacent to these. Also, parts of Woodway and Edmonds)
[18:48 ~/bin]$ areagrep 212:
212: New York City (Manhattan except for Marble Hill)

Also it's nice to be able to get any result that comes from these little trivial utilities right into vim, just typing in vim, e.g., ":r!areagrep 212:"

I have a collection of thrown-together stuff, which I think would grow even faster if command-line access to the clipboard and to running programs were more standard and less arcane. E.g., for X clipboard stuff I had to get xclip and compile it, and getting text dynamically from the web may involve e.g. getting wget to impersonate firefox and filtering through an ad-hoc python script.

But bottom line:
I think there's a bright future for command line input to computers ;-)

Shell programming

Posted Dec 2, 2012 19:37 UTC (Sun) by boudewijn (subscriber, #14185) [Link]

Nice script... But I prefer to press alt-f2 and then type "wp:maillol" if I need the wikipedia article for that lemma. Or "qt:qstring" if I've forgotten once again what the correct way of converting an integer to a string is. Or "dict:espieglerie" for definitions of the word I always assumed meant "mirror", or "ggi:maillol" for images of sculptures by the master. In fact, I'm so used to it that I never keep a browser window open at all.

Shell programming

Posted Dec 10, 2012 0:20 UTC (Mon) by Baylink (guest, #755) [Link]

Sure. But his approach is much more portable across workstation OS distributions than yours is. :-)

Shell programming

Posted Dec 18, 2012 12:35 UTC (Tue) by wookey (subscriber, #5501) [Link]

I tried that and a little window opened to type in, but hitting return said:
Failed to open URI "wp:maillol"

I'm using XFCE. What were you using?

Shell programming

Posted Dec 18, 2012 13:00 UTC (Tue) by boudewijn (subscriber, #14185) [Link]

KDE of course. KDE's krunner is amazingly powerful and useful.

Shell programming

Posted Dec 14, 2012 9:53 UTC (Fri) by oak (guest, #2786) [Link]

Many embedded systems have just Busybox because Perl and Python are too large & redundant for them.

Busybox utilities conform pretty well to POSIX nowadays; there are just a few corner-case things that aren't implemented.


Copyright © 2017, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds