
Another old security problem

Posted Sep 9, 2010 18:05 UTC (Thu) by clugstj (subscriber, #4020)
Parent article: Another old security problem

Tried the exploit on my machine. The DoS aspect is bad, but it doesn't completely lock up the machine. Of course, you could run multiple instances.

What's wrong with the obvious workaround of setting a stack size limit? The exploit program then outputs a boring "execve failed" message and nothing bad happens.
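The interaction the workaround relies on can be sketched in a few lines. This is a minimal illustration (not the exploit itself), assuming Linux semantics: the kernel caps the combined size of argv and the environment at a quarter of the RLIMIT_STACK soft limit, with a separate per-string cap (MAX_ARG_STRLEN, 128 KB). Lowering the stack limit in a child before exec makes an oversized command line fail with E2BIG:

```python
import errno
import resource
import subprocess

def exec_with_stack_limit(args, stack_bytes):
    """Run /bin/true with extra args under a lowered RLIMIT_STACK.

    Returns None on success, or the OSError raised by the exec
    (E2BIG when argv + environ exceed a quarter of the stack limit).
    """
    def lower_stack():
        # Runs in the child between fork() and exec(), so only the
        # child's limit is changed.
        resource.setrlimit(resource.RLIMIT_STACK,
                           (stack_bytes, stack_bytes))

    try:
        subprocess.run(["/bin/true"] + args,
                       preexec_fn=lower_stack, check=True)
        return None
    except OSError as err:
        return err

# ~1 MB of argument data, spread over many strings so no single
# string trips the separate 128 KB per-string cap.
big_argv = ["x" * 8192] * 128

# With a 2 MB stack limit, only ~512 KB of argv space is allowed,
# so the exec fails with E2BIG.
err = exec_with_stack_limit(big_argv, 2 * 1024 * 1024)
print(err.errno == errno.E2BIG)

# A normal-sized command line is unaffected by the same limit.
print(exec_with_stack_limit(["hello"], 2 * 1024 * 1024))
```

This is the mechanism behind the "boring 'execve failed' message": the exploit's huge argument list is rejected at exec time once the stack limit is in place.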



Another old security problem

Posted Sep 9, 2010 20:30 UTC (Thu) by martinfick (subscriber, #4455) [Link]

"What's wrong with the obvious workaround of setting a stack size limit? The exploit program then outputs a boring "execve failed" message and nothing bad happens."

I suspect that the obvious problem is the limit itself. Who wants arbitrary limits that are not based on available resources and that affect potentially valid use cases (non-exploits)? As for "nothing bad happens": did you forget that the program (and now likely others that should run) didn't run? By many people's standards, that would be something bad happening. :(

valid?

Posted Sep 9, 2010 22:38 UTC (Thu) by marcH (subscriber, #57642) [Link]

> Who wants arbitrary limits that are not based on available resources and that affect potentially valid use cases (non exploits)?

Calling a multi-megabyte argv a "valid use case" is a bit of a stretch.

valid?

Posted Sep 9, 2010 23:02 UTC (Thu) by bronson (subscriber, #4806) [Link]

I totally disagree. I love no longer needing to worry about find and xargs all the time. Does a dir have tens of thousands of files? Who cares! Just glob away, it just works.

A multi-gigabyte argv would be a stretch. But megabytes? That seems pretty reasonable to me.

valid?

Posted Sep 15, 2010 0:51 UTC (Wed) by roelofs (guest, #2599) [Link]

> A multi-gigabyte argv would be a stretch. But megabytes? That seems pretty reasonable to me.

Absolutely. A minor side project of mine involves the generation of 35000 time-series images per year, each with a name of the form "fubar-XX-yyyymmdd-hhmm-UTC.png". As a same-dir glob, that works out to just over a megabyte; add a "yyyy/" directory prefix and multiple years, and you're easily into the 10MB range. Increase the time resolution by a factor of 3 to 5, and you're well on your way to 100MB. (And yes, it's very cool to watch a full-year sequence animate, particularly on a fast machine; a 5- or 10-year sequence would be even better, assuming I could hit 60fps on the decode.) Of course, at some point it becomes a database-driven custom app, but 10MB command lines are not out of the question with the trivial hack I have so far.

Greg
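Greg's estimate holds up; as a quick back-of-the-envelope sketch (assuming the 30-character name pattern he gives, and one separator byte per argument):

```python
# Back-of-the-envelope check of the argv sizes quoted above, assuming
# names of the form "fubar-XX-yyyymmdd-hhmm-UTC.png" (30 characters).
name = "fubar-XX-yyyymmdd-hhmm-UTC.png"
per_arg = len(name) + 1                 # +1 for the separator/NUL byte
per_year = 35000                        # images per year, as stated

one_year = per_year * per_arg           # same-dir glob, one year
ten_years = 10 * per_year * (len("yyyy/") + per_arg)  # "yyyy/" prefix, 10 years

print(one_year)    # 1085000 -- "just over a megabyte"
print(ten_years)   # 12600000 -- comfortably into the 10 MB range
```

Triple the time resolution and the one-year figure alone passes 3 MB, which is why 100 MB command lines are within sight for longer sequences.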

valid?

Posted Sep 9, 2010 23:04 UTC (Thu) by martinfick (subscriber, #4455) [Link]

Right, and who would ever need more than 640K of RAM? Let's leave arbitrary-limit thinking to DOS/Windows users and devs, not Linux. ;)

valid?

Posted Sep 13, 2010 19:18 UTC (Mon) by nix (subscriber, #2304) [Link]

Well, actually, stupid arbitrary limits have long been part of the Unix experience. They're part of what GNU set itself against, and I'm glad to say they haven't been part of the Linux experience heretofore. Linux is all the better for it.

valid?

Posted Sep 13, 2010 19:25 UTC (Mon) by martinfick (subscriber, #4455) [Link]

Very true. I really should have used "legacy OS" instead of "DOS/Windows" in my complaint.


Copyright © 2017, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds