
Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

W. McDonald Buck, retired CTO of the World Bank, looks at what it will take to put Linux on the corporate desktop, on OSDir. "I'm a Linux devotee. I'm offended by the rigged analyses that Microsoft has purchased in its "Get the Facts" campaign. But I think it is important that the open source community demonstrate fairly that open source software presents a better cost/benefit case than Windows. This case is not helped by resorting to the same kind of trickery and distortion of which Microsoft is guilty. I don't like to see obviously skewed analysis on Linux's behalf any more than I like to see it on Microsoft's behalf. No that's wrong. I have a greater dislike of pro-Linux trickery, because I expect better of us."


Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 1:14 UTC (Sat) by dlang (guest, #313) [Link] (1 responses)

This article is really a rant about how hard it is to buy a machine with Linux pre-installed (or with no OS at all); he claims that this makes the TCO analyses by pro-Linux people incorrect.

here is my reply to him

Re: Part I: Corporate Desktop Linux - The Hard Truth (Score: 0)
by Anonymous on Feb 04, 2005 - 04:57 PM
Yes, it is true that the tier 1 vendors make it hard (and sometimes more expensive) to buy their hardware without Windows, and I really hate this personally.

However, in a TCO analysis this really doesn't matter much.

The cost of Windows isn't in the desktop licenses; it's in the server licenses, the server software, the cost of anti-virus software (and other anti-malware software, not to mention the cost of cleanup when this software doesn't catch the latest bug before it infects you), and the difficulty of remotely administering servers (and the lower admin-to-server ratio required).

Add to this the Microsoft 'critical patch' treadmill and the forced upgrades ('Joe just got a new laptop with an upgraded version of Visio, now everyone else needs to upgrade to read the documents he sends out') and the TCO numbers look really bad for Windows.

CALs

Posted Feb 6, 2005 2:55 UTC (Sun) by miallen (guest, #10195) [Link]

however in a TCO analysis this really doesn't matter much.

the cost of Windows isn't in the desktop licenses, it's in the server licenses, the server software

Actually I think this might be a little wrong. The client licenses are the main expense. MS used to charge by the CPU and by package, with much greater emphasis on the server side, but they recently changed their licensing all around. Now it's the Client Access Licenses (CALs) that are the bulk of the expense. A lot of server software is even comp'd entirely (e.g. cold boot systems at backup locations). This is why, if you're running a Samba domain member server, you don't save as much money as you might think: you still need to buy CALs for the domain.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 1:25 UTC (Sat) by NightMonkey (subscriber, #23051) [Link] (1 responses)

"But I think it is important that the open source community demonstrate fairly that open source software presents a better cost/benefit case than Windows."

Why does the open source community have to demonstrate *anything*? If a company wants to investigate ways of controlling costs or determine ways of gaining advantage over their competition by using different software and methodologies, then let them do it! The "Open Source Community" doesn't live or die based on what corporations and other businesses choose to use as their desktop OS and applications. When they see that they can save money and/or get more out of their systems, well, then they'll switch. Until then, why care? Whether it's laying off 10,000 workers to improve the stock price, moving plants to China to break unions, or investing in new technology that doubles output, Business has never seemed to have a problem choosing the path that maximizes profit.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 18:31 UTC (Sat) by neoprene (guest, #8520) [Link]

"But I think it is important that the open source community demonstrate fairly that open source software presents a better cost/benefit case than Windows."

Well, uuumm, Linux is not a commercial product, and "the open source community" are not sales-reps, stockholders, benefactors-of-free-software-for-corporate-exploitation and don't have to do anything for anybody, especially Fortune-500 Corporations.

"The open source community" does not have a "marketing plan" to endow Corporations with free stuff. If the Corporations want to use FOSS they are free to do so. This writer, among many, doesn't get it: GNU/Linux does not fit into a commercial metaphor.

As far as leveling the playing field in the PC domain between Commercial/Corporate and TheCommonMan/Personal interests goes, GNU/Linux is "it".

The social responsibility of Corporations is not discussed much. The U.S. laws regulating corporate actions have been weakened and hollowed out. They have bought the Media, the Politicians, and the people's ability to think freely. Why anybody has to do anything for these Fat Cats is absurd; let 'em pay through the nose.

As far as the M$ concept of TCO goes: What does the "O" in TCO stand for? "Ownership"? The M$ EULA grants no ownership of anything.

What's the price of freedom and access to every sneaking line of code?

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 4:31 UTC (Sat) by bluefoxicy (guest, #25366) [Link] (28 responses)

There's at least one important notion here: pro-Linux BS is BS, and we should (and can) back ourselves up with only real claims. People have claimed Linux to be easy to use (true), to come with "everything you need" (true in some cases, false in others), to be ultrasecure (not true by default, but potentially possible), and to be nebulously "ready for the desktop."

Linux is easy to use and usually comes with enough for a business and the average user, at least. For gamers, people grasp at straws: Doom 3, Quake 3, OK. Older games like Quake 1 and 2, Duke Nukem, the original Doom, etc., sure. New hits like WoW or FF11? No, sorry. Often people are even attached to programs that they have to be eased away from.

"Ultrasecure" claims are made all the time. Linux is not immune to viruses; it's just harder for a normal user to infect the machine. Viruses that leverage local root exploits can and will spread once Linux becomes a target for many virus writers. I should probably steal the Bliss source code (a virus-on-Linux proof of concept) and one of the recent kernel root-elevation POCs and hack up a "Linux Virus Proof-of-Concept" from the two. It wouldn't be too hard: int my_attack_routine() { get_me_root(); bliss_me_around(); } Of course, with a little work a lot of this can be helped.

"Ready for the desktop" claims are also often made, but not substantiated. New hardware pulls ahead of Linux support. I have a USB WiFi adapter (Netgear WG111) and a PCMCIA WiFi card (adm8211-based D-Link DWL-650), neither of which works. Newer graphics hardware from ATi and nVidia has no OSS drivers, and the closed ones impede security work. Programs also can't yet be easily downloaded and installed right onto the system by a user, something a lot of people do.

And of course, there's some way to go to get real security, which ties into "ready for the desktop." We DO need clamav GUIs that sit in the GNOME system tray and test on demand, as well as host-based firewalls hooking into connect() for the user. Eventually there's going to be spyware and viruses and worms. Beefed-up security will be able to bounce the portion of the problem that isn't propagating through e-mail, DCC, and programs on web sites. Digital signing of binaries will help squelch virus propagation.

To substantiate many of the common pro-Linux claims, a lot of work has to go in to make Linux not only a better-designed OS in general relative to current solutions, but a much better OS in a hostile environment. This ties in with TCO, because once Linux hits the mainstream, viruses and worms will start popping up, which will severely increase the TCO just as they do on Windows. Needing an expert to choose your hardware for you, or needing to hire driver writers, will also increase the TCO.

Mr. Buck may or may not have a point with the TCO article here, but it is true that there are a lot of misconceptions about Linux going around. In both directions.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 12:58 UTC (Sat) by segphault (guest, #27468) [Link] (10 responses)

"Programs can't be easily downloaded and installed right to the system by a user yet either, something a lot of people do."

I do it all the time on my Debian system with apt-get. Even deficient RPM-based distributions are capable of doing it with apt-rpm or yum. In fact, one of the reasons I prefer Linux is the broad availability of software and the ease with which I can acquire and install it.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 13:37 UTC (Sat) by seyman (subscriber, #1172) [Link] (9 responses)

> I do it all the time on my Debian system with apt-get.

I believe the OP is referring to installing software as a user other than root.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 14:12 UTC (Sat) by zotz (guest, #26117) [Link] (8 responses)

"> I do it all the time on my Debian system with apt-get.

I believe the OP is referring to installing software as a user other than root."

oh, like sudo apt-get install?

how well can a non-admin install software on XP?
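
Incidentally, "sudo apt-get" need not mean handing out unrestricted root: sudo can delegate just the package manager. A minimal sketch of such a policy; the "pkgadmin" group name is hypothetical, invented for this example:

```shell
# /etc/sudoers fragment (always edit with visudo); "pkgadmin" is a
# hypothetical group chosen for this example. Members of pkgadmin
# may run apt-get as root, and nothing else:
#
#   %pkgadmin ALL = (root) /usr/bin/apt-get
#
# A user in that group can then install packages without a root shell:
#
#   sudo apt-get install some-package
```

The rule is shown as comments because a sudoers fragment is configuration, not a runnable script; the trade-off is that anyone who can run apt-get as root can install anything the repositories offer.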

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 10:46 UTC (Sun) by miallen (guest, #10195) [Link] (6 responses)

oh, like sudo apt-get install?

Would you really grant an average user the ability to install anything they wanted? Linux doesn't have profiles like Windows so if they install something it will affect the entire system.

how well can a non-admin install software on XP?

Actually, for most things they should be able to. Windows is actually designed specifically to support installing software as an average user. The problem is crappy developers (who almost invariably run as admin) who never bother to check that their program installs properly for a non-admin user.

This is just one of those things that is important in a corporate environment and not important at all on a personal system, and it's the weekend warriors with personal systems that do all the development, so this will probably not be fixed anytime soon.

Personally I think it would be very nice if you could actually install stuff entirely in your home directory easily. It wouldn't be terribly difficult. I believe the only real issue is getting the linker to search the libs in your home directory.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 13:03 UTC (Sun) by zotz (guest, #26117) [Link] (3 responses)

"Actually for most things they should be able to."

Yes, but in most?/many? instances they don't seem to be able to.

"...and it's the weekend warriors with personal systems that do all the development so this will probably not be fixed anytime soon."

You may very well be right on this one. I manage most of my own systems and don't run into the problem so I have never experimented too much with it or thought too deeply about it.

"Personally I think it would be very nice if you could actually install stuff entirely in your home directory easily."

I think you are supposed to be able to actually do this, so the real key for you is easily, correct?

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 22:39 UTC (Sun) by miallen (guest, #10195) [Link] (2 responses)

"Personally I think it would be very nice if you could actually install stuff entirely in your home directory easily."

I think you are supposed to be able to actually do this, so the real key for you is easily, correct?

Well, you could put all of the files somewhere in your home directory, but that doesn't mean the application will work correctly. First, this assumes the package is properly relocatable or, if you are building from source, that it supports changing the prefix. The biggest problem, however, is if the package installs libraries. You need to tell the linker that you have new libraries. I don't think this is as simple as setting LD_LIBRARY_PATH. I would be delighted to hear someone tell me I'm wrong.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 4:17 UTC (Mon) by iabervon (subscriber, #722) [Link]

I believe the main issue is actually getting packages which tend to be configured to install in a common location to actually put the files in some home directory location (i.e., user-apt-get install normal-debian-package). LD_LIBRARY_PATH is fine for getting the dynamic linker to find things; hardcoded (or compile-time constant, if you're getting binaries) paths for data files are a bigger issue, because there isn't an environment variable to change for them.
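
For a package that is well behaved about its prefix, a per-user install can already be done by hand. A sketch, assuming an autoconf-style source package; "$HOME/local" is an arbitrary location chosen for this example, not a standard:

```shell
# Build/install steps for a relocatable source package (shown as
# comments; run them in the unpacked source tree of a real package):
#
#   ./configure --prefix="$HOME/local"
#   make && make install
#
# Afterwards, point the shell and the dynamic linker at the private tree:
export PATH="$HOME/local/bin:$PATH"
export LD_LIBRARY_PATH="$HOME/local/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```

As noted above, this covers binaries and shared libraries; data-file paths compiled in at build time are the part no environment variable can fix.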

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 11, 2005 4:04 UTC (Fri) by Ross (guest, #4065) [Link]

Unlike on Windows, apps shouldn't be modifying the files in their own directories. Since normal execution doesn't require this, security practice says it should be disallowed. Thus, to prevent more malware, applications shouldn't be installed in locations where they can be directly modified by users. What would be better is a graphical package management tool that allows installs, upgrades, uninstalls, etc., but not arbitrary modification.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 9, 2005 17:41 UTC (Wed) by bobs666 (guest, #27795) [Link] (1 responses)

Remember, we are talking about "Corporate Desktop Linux".

If one uses a minimal install on the desktop, where all the corporate data is stored on a server, then giving all the users root access on their own desktops is a no-brainer. See the MIT desktop project, and the CMU one as well, from the 1980s, where the root password was written on the front of each desktop.

If you do things right, you don't need to get so uptight about desktop security.

Admins at MIT did, though, carry their own software on a floppy, to reinstall the security kernel from the trusted network when accessing servers for maintenance from unsecured desktops.

It's not that you can stop thinking about what you're doing when you're running on untrusted desktops. But painless re-securing can be done in a simple manner.

as for the linker... cc -Ldir

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 9, 2005 23:38 UTC (Wed) by miallen (guest, #10195) [Link]

If one uses a minimal install on the desktop, where all the corporate data is stored on a server, then giving all the users root access on their own desktops is a no-brainer.

I suspect this is some kind of troll, but I thought I should answer for the benefit of others who might actually take you seriously.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 18:57 UTC (Sun) by bluefoxicy (guest, #25366) [Link]

Like www.download.com: right-click Ultimate Solitaire, hit Download, double-click, and it installs. apt-get relies on Debian knowing about the package, or on the developer having an apt repository and the user adding that repository.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 19:29 UTC (Sat) by mikew (guest, #27697) [Link] (10 responses)

> I have a USB WIFI (netgear WG111) and a PCMCIA WIFI (adm8211 based DLink DWL-650), neither of which work.

The adm8211 has a GPL Linux driver, which is bundled with Ubuntu and at least one other major distro out there, IIRC. The Netgear WG111 may have drivers, but I'm not familiar with whatever chipset is on that adapter. http://aluminum.sourmilk.net/adm8211/

> Newer graphics hardware from ATi and nVidia has no OSS drivers, and the closed ones impede security work.

Of course there are OSS drivers; they're just not 3D-accelerated. (And the latest chips from ATI are currently being figured out by the people hacking on DRI.)

> We DO need clamav GUIs that sit in the Gnome system tray and test on demand, as well as host-based firewalls hooking into connect() for the user.

Hook into connect()? Netfilter isn't enough?

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 20:30 UTC (Sat) by MathFox (guest, #6104) [Link] (3 responses)

We DO need clamav GUIs that sit in the Gnome system tray and test on demand, as well as host-based firewalls hooking into connect() for the user.

Hook into connect()? Netfilter isn't enough?

We do not need virus scanning on demand and checking of outgoing connections and incoming data on Linux yet. It would be good to have the infrastructure ready in case a serious Linux threat pops up, but so far, security-conscious design has saved us. (knocks on wood)

RSBAC and (likely) SELinux already offer options to restrict network access for outgoing connections on a per-program and/or per-user basis. There is no user-friendly interface available yet, but that is a matter of demand and time.
On a similar note, clamav already provides a programmer interface for file scanning; it is likewise a matter of time until virus scanning gets integrated into mail programs, web browsers, etc., with a WIMPy* interface for it.

*WIMP: Windows, Icons, Mouse and Pointer

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 19:14 UTC (Sun) by bluefoxicy (guest, #25366) [Link] (2 responses)

YET. As soon as Linux becomes an attractive virus platform, it will be a nightmare. Don't forget that viruses don't necessarily need to be run by root to spread to the system. Security-conscious design hasn't really done much, seeing as we can do much better than we already do.

Policy for SELinux and RSBAC will never be user compatible. You're suddenly dropping "write an SELinux policy" on every programmer in the world; and every program has to come with a policy or it can't be run, AFAIK. Also, don't you need special privileges to rewrite the SELinux policy? Privileges that normal users may not have. SELinux will just deny the request anyway, rather than block until user interaction is given. I want a truly elegant solution for users' needs.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 11:10 UTC (Mon) by MathFox (guest, #6104) [Link]

As soon as Linux becomes an attractive virus platform, it will be a nightmare.
When will Linux become an attractive virus platform? My guess is that it won't. If you need a "local root exploit" to get the system binaries infected, plus a hole in an application for the initial infection, that is one bug more than you need to create havoc under Windows. Another complication is the diversity of Linux distributions. Debian is slightly different from Fedora, and it is very well possible that a virus only runs on a specific version of one distribution; think of the use of stack-protector technology.
Windows system administrators have become experienced in fighting malware, and prevention techniques are well known. Firewalls and virus scans on incoming mail are commonplace, and they protect Linux desktops just as well as their Windows counterparts. If you call the current Windows situation a nightmare, I don't think that Linux will ever get close to such a level of malware problems.

Policy for SELinux and RSBAC will never be user compatible.
We are talking about a corporate desktop here. You don't want a user to unlock the system. SELinux and RSBAC are tools that prevent damage, and you can define a "standard profile" for applications that come without a specific policy.
SE will just deny the request anyway, rather than block until user interaction is given.
You can hack the Linux code and add a hook that calls a user program with a "should I allow this interaction?" question. (RSBAC has such an option.) I think that in a corporate setting you'd better rely on your corporate firewall; but for a home user a network monitor like this would be useful.

Linux security is continuously evolving. It is hard for exploit writers to keep up.

Linux virus nightmare...

Posted Feb 7, 2005 15:48 UTC (Mon) by cdmiller (guest, #2813) [Link]

Until this actually happens, you are just spreading FUD. The claim that Linux viruses are merely waiting for the platform to become more popular has been made for about a decade now.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 19:05 UTC (Sun) by bluefoxicy (guest, #25366) [Link] (5 responses)

The adm8211 has GPL Linux drivers, which is bundled with Ubuntu and at least one other major distro out there, IIRC. The netgear WG111 may have drivers, but I'm not familiar with whatever chipset is on that adaptor. http://aluminum.sourmilk.net/adm8211/

The adm8211 drivers are experimental and incomplete, to my understanding. I've tried getting them working on PCMCIA and they just wouldn't take. The maintainer told me to run lspci, and said that all PCMCIA cards show up as PCI; I tried, but it wasn't there, dunno what's up with that. It may have changed in 2.6. cardctl could see it and insert/eject it (the green light on the card goes on and off).

Of course there are OSS drivers, they're just not 3D accelerated. (and the latest chips from ATI are currently being figured out by the people hacking DRI)

Doom3, Quake3. You need 3D and the latest cards. "Wait 5 months and we'll have a working driver for your ATi card" doesn't cut it; gamers just say "well @*#%, windows is obviously a better OS."

Hook into connect()? Netfilter isn't enough?

Host firewalls such as Norton Personal Firewall, McAfee Personal Firewall, and ZoneAlarm go above and beyond iptables by being able to make a connect() call block until the user acknowledges that the program is indeed supposed to be doing what it's doing. Both netfilter-style firewalls and host-based firewalls are an important part of a defense-in-depth solution. Of course, the host firewalls would have to be controlled by a daemon that authenticates your decisions via your password and PAM.
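
For the per-user half of this, netfilter can already filter outbound traffic by the owning account; what it cannot do is the interactive "block until the user clicks Allow" step. An illustrative rule — the "guest" account is hypothetical:

```shell
# Shown as comments: the rule needs root and a kernel with the
# netfilter owner match; "guest" is a hypothetical local account.
#
#   # refuse new outbound connections initiated by the guest account
#   iptables -A OUTPUT -m owner --uid-owner guest \
#            -m state --state NEW -j REJECT
#
# Making connect() itself pause for a desktop prompt would need a
# userspace helper layered on top of rules like this.
```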

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 19:40 UTC (Sun) by oak (guest, #2786) [Link]

You mean something like the BSD systrace?

It has been ported to Linux (quite a while ago), but it's not in the main kernel tree yet. I don't know why.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 0:12 UTC (Mon) by mikew (guest, #27697) [Link] (3 responses)

I am the maintainer, and the driver is fairly complete for 99% of the users out there. The driver will no longer be experimental once it gets into the vanilla kernel. The adm8211 is more than a pcmcia card - it is a cardbus card, meaning that from the driver's point of view, it's just a pci card. Contact me directly via email (it's inside the driver tarball) if you'd like more help in getting the card to work.

> Doom3, Quake3. You need 3D and the latest cards. "Wait 5 months and we'll have a working driver for your ATi card" doesn't cut it; gamers just say "well @*#%, windows is obviously a better OS."

Actually, most gamers who bother to try Linux for gaming just get an nVidia card. People annoyed with getting their ATI cards to work typically curse ATI, not Linux, and buy an nVidia card.

> Host firewalls such as Norton Personal Firewall, Macafee Personal Firewall, and ZoneAlarm go above and beyond iptables by being able to make a connect() call block until the user can acknowledge that the program is indeed supposed to be doing what it's doing. Both netfilter-style firewalls and host-based firewalls are an important part of a defense-in-depth solution. Of course, the host firewalls will have to be controlled by a daemon that authenticates your decicions via your password and PAM.

The user should not need to acknowledge if the program should be doing this or that. The user is not smart enough to make that decision. There is no great loss if connect() fails - a program monitoring netfilter logs can just indicate to the user that this has happened (and possibly offer to change the ruleset). However, this shouldn't be necessary if the distro already sets everything up properly, as it should.

Netfilter is often used to implement a host based firewall. I think you're trying to differentiate between a firewall implemented within the kernel, and a firewall implemented in userspace.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 1:06 UTC (Mon) by TimCunningham (guest, #10316) [Link] (1 responses)

> Actually, most gamers who bother to try linux for gaming just get a nVidia
> card. People annoyed with getting their ATI cards to work typically curse
> ATI, not Linux, and buy a nVidia card.

I find it hard to believe that people who are trying Linux would be willing to buy another expensive video card for the sake of gaming on Linux if they primarily use their computer to play games.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 1:46 UTC (Mon) by mikew (guest, #27697) [Link]

> I find it hard to believe that people who are trying Linux would be willing to buy another expensive video card for the sake of gaming on Linux if they primarily use their computer to play games.

Some do, but I wouldn't call them people who are merely trying Linux. I shouldn't have added "buy a nVidia card" right at the end of the sentence, but my main point is that gamers don't just curse Linux for the problems. Gaming on Linux is heavily biased towards nVidia right now. (though ATI is catching up)

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 5:53 UTC (Mon) by bluefoxicy (guest, #25366) [Link]

> Actually, most gamers who bother to try linux for gaming just get a nVidia card. People annoyed with getting their ATI cards to work typically curse ATI, not Linux, and buy a nVidia card.

My original complaint was that the nVidia closed drivers impede security work, and we've come full circle.

Impeding security work which is otherwise transparent except for a few cases, which can be fixed in time. In the meantime, the workaround is simply to disable the protections for the programs that break (that's what the list in pax.conf is used for); though all 3D apps using nVidia's libGL.so need the mprotect() restrictions disabled.

This is of course very useful and contributes to the ability to stop a lot of security exploits in a transparent way. Barely anything breaks at first, and of course the distributions can easily adjust for that (it's ~15 seconds of work to track down and mark a broken binary so it doesn't get the restrictions and thus doesn't break); give it 6-12 months and the vendors will start to adjust, and upstream will start fixing their code. At that point, nothing should ever visibly break for the user.
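
Marking a binary works per-file. A sketch of what it looks like with the paxctl tool; the application path is hypothetical:

```shell
# Shown as comments: paxctl rewrites the ELF header in place, so it
# needs write access to the binary; "/usr/bin/some-3d-app" is made up.
#
#   # disable only the MPROTECT restriction for this one binary,
#   # leaving every other PaX protection on it active:
#   paxctl -m /usr/bin/some-3d-app
#
#   # inspect the resulting flags:
#   paxctl -v /usr/bin/some-3d-app
```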

Of course right now I'm just focused on PaX, which you'd be most familiar with as used in GrSecurity. And yes I've considered Exec Shield; this is still several steps above and beyond. Currently I'm working out a threat model (potentially temporary link) which will explain why each part of PaX is necessary.

Anyway, to reiterate, my point was that there are outstanding issues which prevent us from actually going forward with large-scale security improvements. I'm not one of the crazies who think security needs to break the whole world; but I do believe that when a system can do better than 99% compatibility and close more than 10% (PaX: potentially 61.7%, based on 60 samples) of security holes in a real way that can't be evaded, the less than 1% that bumps into it should adjust.

It's mostly common sense: people break into houses in your area because they have hollow plywood doors, so replace your door with an oak door in a light aluminum shell. It'll never get in your way, except for those unnecessary times when you feel like busting through the door and turning it to splinters just to look cool; but it'll stop 99% of people trying to break into your house unless they make A LOT of noise (dynamite).

> The user should not need to acknowledge if the program should be doing this or that.

Alright: you implement me an NPF/MPF/ZA clone, transparent to the programmer of the network software, that works the same way, asking about stuff it doesn't know and then letting the user say YES, without actually stopping the connection and creating added annoyance for the user. This is why we need it.

It's just more common sense. Do you want the user to wake up one day and, sweet holy h---, some program running from ~/.fuzzbomb is connecting to sco.com on port 80 and sending massive amounts of junk? (Port 80 is open, BTW.) It's even named ~/.fuzzbomb/firefox-bin so that netfilter sees "firefox-bin", says "yep, matches a --cmd that's allowed", and lets it go.

I don't like the "it's not a problem now, so we'll do it later" attitude. This inevitably fails, as we've seen many times before: once enough third-party vendor software relies on certain behavior, you're locked into that behavior. Fortunately, in the case of host firewalls, we can implement them in a fashion fully transparent to the programmer, so this doesn't entirely apply here.

OpenOffice BREAKS on WinXP if two people are logged in and using it; or at least a version around 1.1 broke because "it was already being run" (by the owner, in the system tray of the owner's log-in, when Frank tried to use it from the Start menu of his own log-in, on my uncle's PC).

I've had several people who use win2000 tell me to "just run as administrator because Windows wasn't designed to be multiuser." This came from the fact that all kinds of crap breaks if you try installing stuff as administrator and then running it as other users, or as multiple users at once. This is common with some legacy software that only wants to be run once, or wants to store settings in its installation folder. This is NOT the direction we want to go.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 9:48 UTC (Sun) by khim (subscriber, #9252) [Link] (5 responses)

Linux is easy to use and usually comes with enough for a business and the average user at least. For gamers, people grasp at straws: Doom3, Quake3, ok. Older games like Quake1,2 and Duke Nukem, original Doom, etc, sure. New hits like WoW or FF11? No, sorry. Often people are even attached to programs that they have to be eased away from.

"Ready for the desktop" claims are also often made, but not substanciatable. New hardware pulls ahead of Linux. I have a USB WIFI (netgear WG111) and a PCMCIA WIFI (adm8211 based DLink DWL-650), neither of which work. Newer graphics hardware from ATi and nVidia has no OSS drivers, and the closed ones impede security work.

Scanning, scanning, scanning... Yes! You just described the situation with Mac OS X! The situation is exactly as you described it (not sure about your USB and PCMCIA cards, but there is more hardware incompatible with Mac OS than there is incompatible with Linux: you need special versions of video cards, you can only use a limited number of printers and scanners, etc.), so Mac OS X is totally not ready for the desktop! Something strange is going on: Mac OS X is sold only as a desktop OS and used by people as a desktop OS. But it's totally unready for the desktop... Hmmm...

Do not confuse "ready for the desktop" with "ready to be a drop-in replacement for Windows" - they are different things.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 19:07 UTC (Sun) by bluefoxicy (guest, #25366) [Link] (4 responses)

OS X supports binary drivers. Linux doesn't. It's a lot of work to trick Linux into using a binary driver: you write a glue driver and make the user compile that against his kernel so that the binary-only module links with it and makes a full driver. At least on OS X (as much as I hate the thing) there's some hardware that comes with OS X drivers (I've seen a few NICs with OS X drivers, but that's about it), so it's possible.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 0:35 UTC (Mon) by mikew (guest, #27697) [Link] (2 responses)

Linux supports a variety of Linux binary drivers (see ATI, nVidia, and ltmodem) and Windows drivers (via ndiswrapper and linuxant). Making it easy is the distro's job, and a number of them do make it easy by including a few binary drivers. I cannot comment on how tricky it is to write a glue layer, but users don't write glue layers, so it is not a concern. Easy distribution/installation of binary drivers is the main concern.

Binary drivers on Linux only seem hard because Linux makes GPL drivers (that are inside the tree) so much easier. Linux doesn't support anything - the distro does.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 5:56 UTC (Mon) by bluefoxicy (guest, #25366) [Link] (1 responses)

Linux doesn't support binary drivers. A .ko built for 2.6.1 won't load on 2.6.11-grsecurity. If you create a specialized binary driver interface to support a compiled object and link against it, you can glue half a module onto your kernel - but that isn't binary driver support.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 10, 2005 3:21 UTC (Thu) by melauer (guest, #2438) [Link]

> Linux doesn't support binary drivers. A .ko for 2.6.1 won't work on
> 2.6.11-grsecurity.

Well, if by "ready for the desktop" we're talking about the user's
perspective, this is an irrelevant point. What users need is good
installers. OS X might have binary drivers, but if the driver install
weren't point-and-click, it would be a problem for most users. The fact
that installing drivers on Linux involves some compiling (as compared to
stuff like registry/netinfo database editing which is equally arcane to
most users) isn't the problem. Users are going to run the installer, not
copy files from one /lib/modules subdirectory to another. Mikew said the
same thing when he stated that "Easy distribution/installation of binary
drivers is the main concern." Whether or not Linux has succeeded in this
area is a separate question.

Now, if you're talking about the developers' point of view, and how hard
it is for any given hardware vendor to write their own wrapper/binary
driver pair, then that's a good point, and could lead to a very
informative discussion (e.g. "How is Project Utopia getting along").
It's just that in the post which started this thread, it sounded like you
were talking about the user's point of view.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 10:52 UTC (Mon) by khim (subscriber, #9252) [Link]

And this is related to desktop readiness exactly... how? Just consider the whole Linux kernel as a set of drivers and you'll be all set.

This is not a hoax, you know:

Last login: Tue Jan 11 22:32:50 2005 from 213.85.115.74
Red Hat Linux release 6.2 (Zoot)
2.6.9 #3 Sun Nov 28 22:33:46 MSK 2004
[khim@mccme khim]$

I have more gadgets supported by Linux than I have gadgets supported by Mac OS X - despite its "oh-so-wonderful" binary driver support. My only grief is video card drivers - but I can live with old cards supported by GPL drivers just fine.

Oh - and I have a lot of hardware unsupported by Windows as well. GDI printers with drivers only for Windows 9x, USB gadgets without drivers for Windows XP (only Windows 2000 is supported) - I've seen it all. Even the SoundBlaster 16 is not supported by Windows 2000 anymore! And I've seen hardware without support for Windows 9x as well. Only if you use the "latest and greatest" hardware with the "latest and greatest" Windows are you in luck. Otherwise you are reduced to shamanic dances with tambourines - not such a big difference between Linux and Windows...

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 13:42 UTC (Sat) by zotz (guest, #26117) [Link]

"But I think it is important that the open source community demonstrate fairly that open source software presents a better cost/benefit case than Windows."

Look, if you were a slave and yearning to be free, and people were telling you how to get free, would you listen to other people telling you that you needed to wait to be free until the people telling you how to gain your freedom demonstrated that freedom presents a better cost/benefit case than slavery?

Not everything is money. I know I am much happier as a Free Software user than I was before I made the switch. (And here is a little secret, it is costing me less as well.)

Has anyone undertaken a legal cost/benefit analysis for homes and small businesses? That is, survey typical Windows and Linux desktops, add up the costs if all the software on those desktops were legally purchased, and compare. Also, if not all the software on said desktops is legal and provably so, what is the dollar figure at risk for the users? (I am not asking for the likelihood of having to pay; I am asking how much is at risk in a worst-case situation for an average desktop.)

all the best,

drew

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 17:43 UTC (Sat) by cpm (guest, #3554) [Link] (5 responses)

What's a corporate desktop exactly?

I always wonder.

Me, I am a corporation. I have a desktop. I use Linux because:

It has a better cost/benefit ratio than *anything* else.

It works when others don't.

I can do things with it others can't do.

The recent IBM paper "Linux Client Migration Cookbook:
A Practical Planning and Implementation Guide for
Migrating to Desktop Linux"
http://www.redbooks.ibm.com/abstracts/sg246380.html
is a very nice read.
TCO is not an issue. Free is free. Paid support is paid support.
On my level, supporting client networks of up to about 100 users,
(but mostly more like 10) all of the usual stuff about management doesn't
even apply. A copy of MS Office costs a lot of money, 10 copies cost
10 times that much. A copy of Open Office costs a whole heck of a
lot less, even with a 1 or 2K donation.

There is no contest.
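The licensing arithmetic behind that claim can be sketched in a few lines. All prices below are hypothetical placeholders, not real 2005 quotes; the point is only how the two cost curves scale with seat count:

```python
# Hypothetical per-seat prices -- real figures vary by vendor and volume.
seats = 10
ms_office_per_seat = 400   # assumed proprietary office suite price per seat
oo_donation = 2000         # assumed one-time voluntary OpenOffice.org donation

# Proprietary licensing scales linearly with seat count;
# the free suite's (optional) cost is fixed regardless of seats.
ms_total = seats * ms_office_per_seat
oo_total = oo_donation

print(ms_total, oo_total)
```

With these assumed numbers the proprietary bill grows by another full license for every seat added, while the free-software side stays flat; that is the whole of the "10 copies cost 10 times as much" argument.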

Yeah, maybe migrating the entire world-wide automotive industry
off of Windows might be a daunting task, especially considering
the deep investment they have in IT skills and knowledge on that
platform. So what? The sooner they do it, the sooner everyone benefits.

That aside, migrating the local library is a no-brainer.

TCO? There is no argument.

As to buying stacks of machines without windows preloaded,
Try ignoring that irritating guy on TV, and go visit your local
whitebox shops and find the local guy who is going to actually
offer you service, and buy from him/her. That's where you can
get machines without windows preloaded, and a smaller MS tax.

More work initially, sure. But worthwhile work.

On another side note, I'm finding that "fork-lift" migrations
are easier than incremental. Folks whine and carp about things
for a few weeks, then they are just working again. Every once
in a great while, a spreadsheet misbehaves, and the end-user
has to ask the author to save it as an older version and resend it.
That, coupled with no ActiveX web-desktop controls, kinda puts
a few so-called web services out of reach. But in the end,
if that's as clueful as the service provider is about "web" technology,
then no doubt there is someone else out there providing a similar
service that will get the job done without requiring Internet
Explorer 5.5 or better (on Windows; Macs need not apply). Choose
your vendors, and get back to work.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 17:46 UTC (Sat) by cpm (guest, #3554) [Link] (4 responses)

I'm sorry, I painted a brighter picture than reality.

Most of the "linux" migrations I've pulled off, there were
one or two holdouts who wouldn't release their MS boxes
until pried from their cold, dead fingers. So supporting
that platform is still a reality, even in shops
with a good amount of Linux. This is even more true of
Mac folks. But the newer Macs with OSX bother me a whole
lot less. Nearly anything that needs to be done, can be
done on a modern mac, and folks who choose to go that way
are fine by me. Little "tco" advantage there, but those
users don't seem to care.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 20:14 UTC (Sat) by MathFox (guest, #6104) [Link] (3 responses)

There also is something like "Return on investment". Assume someone is happy to use a Mac and produces more than he would produce with Linux. The difference in productivity (extra revenues) could pay for the higher cost of the Mac.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 5, 2005 20:45 UTC (Sat) by cpm (guest, #3554) [Link]

Good point.
You'll get no argument from me.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 6, 2005 13:17 UTC (Sun) by zotz (guest, #26117) [Link] (1 responses)

"The difference in productivity (extra revenues) could pay for the higher cost of the Mac."

It could, unless some of the problems that can pop up because of the non-free nature of the code rear their ugly heads.

Also, how do you handle the costs that come with the added complexity if you let each employee use the platform and stack that makes them individually the most productive?

A wants linux and abiword. B wants linux and emacs. C wants bsd and vi. D wants windows 95 and lotus wordpro. E wants an epson qx-10. F wants win xp and office. G wants a mac and office. Etc. While each, individually, may be more productive for you with their platform of choice, will your company overall be more productive if you go this route?

all the best,

drew

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 9:42 UTC (Mon) by massimiliano (subscriber, #3048) [Link]

Personally, I think you can have it both ways.

The company should both have a set of supported applications, and allow each employee to go his own way, provided that file formats are the same and general practices/processes are honored.

Of course, those users who "go their own way" should support their own systems, without impacting the existing IT infrastructure.

In fact, I used a Linux desktop at Siemens for seven years in a row, and I was so productive that nobody ever dreamed of telling me "please use Windows".

It worked like a charm, and I wasn't the only one doing so... and for me it ended only because I switched jobs.

But things here (Novell, Mono team) are not that different: in theory we should use NLD, but in practice, if we work well with our distribution of choice, everybody is happy.

Part I: Corporate Desktop Linux - The Hard Truth (OSDir.com)

Posted Feb 7, 2005 19:28 UTC (Mon) by thompsot (guest, #12368) [Link]

The author curiously skips over thin clients entirely. I have seen my corporation spend millions on desktop hardware, only to spend millions more to support it (maint. contracts, support staff, etc), only to spend millions more replacing it, and the cycle starts all over again.

With thin clients accessing a Linux instance (pick an underlying infrastructure... RLX, VMware, etc.), a company's divisional or departmental employees can have all the functionality they need to run the business without having to waste money on a low-end server class machine running a powerful virus/spyware/malware magnet sitting on each and every desk.

Why not take some of that money that's wasted on Microsoft CALs, desktop hardware, extra desktop support staff and/or contractors, antivirus software, vendor lock-in (which requires you to buy things you never counted on buying in order to get product A to work with product B), etc., spend part of it on the underlying infrastructure, and allocate the rest to thin-client hardware? I realize it won't all even out immediately, and this would need to be a multi-year project to allow currently leased machines to finish out their life cycle, but wouldn't the long-term benefits warrant such a project?
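The comparison being proposed can be roughed out as back-of-the-envelope arithmetic. Every figure below is an assumption for illustration only; real costs depend on hardware, leases, and staffing levels:

```python
# Hypothetical TCO comparison over one 4-year refresh cycle.
seats = 100
years = 4

fat_desktop_price = 900       # assumed purchase price per full desktop PC
fat_support_per_year = 300    # assumed annual support/AV/maintenance per seat

thin_client_price = 300       # assumed thin terminal price per seat
thin_support_per_year = 75    # assumed annual support per seat (no local state)
server_infrastructure = 30000 # assumed shared servers/VM hosts for all seats

# Fat desktops: every seat carries its own hardware and support burden.
fat_total = seats * (fat_desktop_price + fat_support_per_year * years)

# Thin clients: cheap terminals plus one shared back end amortized over seats.
thin_total = (seats * (thin_client_price + thin_support_per_year * years)
              + server_infrastructure)

print(fat_total, thin_total)
```

Under these (assumed) numbers the thin-client route costs well under half as much over the cycle. The sketch shows only the structural point the comment is making: shared infrastructure amortizes across seats, while per-desk hardware and support do not.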


Copyright © 2005, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds