Schaller: Preparing the ground for the Fedora Workstation
"So when we are planning the Fedora Workstation we are not just looking at what features we can develop for individual libraries or applications like GTK+, Firefox or LibreOffice, but we are looking at what we want the system as a whole to look like. And maybe most important we try our hardest to look at things from a feature/usecase viewpoint first as opposed to a specific technology viewpoint."
Posted Apr 16, 2014 15:39 UTC (Wed)
by epa (subscriber, #39769)
[Link]
Posted Apr 16, 2014 16:54 UTC (Wed)
by hallock (guest, #96510)
[Link] (5 responses)
Why would anyone, who doesn't want to use a totally opaque shit-layer (systemd + LinuxApps) between the kernel and himself, ever do development on such a system?
If this is the future of the Linux workstation, then there is no future.
Posted Apr 16, 2014 22:23 UTC (Wed)
by mathstuf (subscriber, #69389)
[Link]
Posted Apr 16, 2014 22:49 UTC (Wed)
by daniels (subscriber, #16193)
[Link]
You don't like it, fine. Some people do. Leave each to their own.
Posted Apr 18, 2014 19:07 UTC (Fri)
by rgmoore (✭ supporter ✭, #75)
[Link] (2 responses)
Yeah. <sarcasm>I mean who would want "defined versioned library bundles that 3rd party applications can depend on regardless of the version of the operating system"? Who would prefer stability over endlessly changing APIs.</sarcasm>
Posted Apr 23, 2014 10:30 UTC (Wed)
by gvy (guest, #11981)
[Link] (1 responses)
It's rather about gross ignorance towards ABI compatibility efforts (both needed and already done by e.g. the ISPRAS ABI compliance checker guys).
When Red Hat is unable to fix their damn anaconda and broken packages linking across /usr, they just tell us there's no separate /usr anymore.
When their update procedure is broken beyond repair, they just try to postpone facing it by having another iteration with initrd (I won't be surprised if they come up with a UEFI upgrade helper app someday).
But trying to shove things under a carpet doesn't actually work.
I think that Cormier and the kind of "ex"-windows folks RHAT seems to have hired without hiccup over these several years are destructive. :(
Posted May 4, 2014 14:51 UTC (Sun)
by rwmj (subscriber, #5474)
[Link]
Posted Apr 16, 2014 17:13 UTC (Wed)
by HelloWorld (guest, #56129)
[Link] (9 responses)
What we need is not more stuff (i. e. another distribution), but getting rid of stuff. Why do we still have rpm and deb, when they solve essentially the same set of problems? Why is there no unified component technology? Windows has had OLE and COM forever; on Linux we still have Gnome- and KDE-specific technologies such as KParts and whatever Gnome uses these days. There are many such examples, and I think this holds Linux back more than anything else.

And it is possible to fix this kind of situation: D-Bus is now the universally accepted IPC technology on Linux, and systemd is fixing the plumbing layer. But things need to be taken much further than that. As Hans Reiser said almost exactly 10 years ago, “the expressive power of an operating system is NOT proportional to the number of components, but instead is proportional to the number of possible connections between its components”. It ought to be possible to take any random GObject and connect its signals with the slots of a QObject and vice versa. IOW, there should be a common object model between them. I don't see how Mr Schaller's work contributes to that kind of vision.

I'm also deeply sceptical of his opinions on webapps and the “cloud”. We know that the US and UK governments and others do an insane amount of spying. Before we think about desktop integration, we need to think about how to make sure that those webapps don't spy on us! We need end-to-end public key cryptography everywhere *before* we get involved with webapps.
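The "common object model" wish - connecting one toolkit's signals to another toolkit's slots - can be sketched in miniature. The Signal and Widget classes below are made-up stand-ins for illustration, not the real GObject or Qt APIs:

```python
class Signal:
    """Stand-in for a toolkit-A signal (not the real GObject API)."""
    def __init__(self):
        self._handlers = []

    def connect(self, fn):
        self._handlers.append(fn)

    def emit(self, *args):
        for fn in self._handlers:
            fn(*args)


class Widget:
    """Stand-in for a toolkit-B object exposing a slot (not the real Qt API)."""
    def __init__(self):
        self.received = []

    def on_clicked(self, payload):  # the "slot"
        self.received.append(payload)


def bridge(signal, slot):
    """The common-object-model adapter: wire toolkit A's signal to B's slot."""
    signal.connect(slot)


btn_clicked = Signal()
w = Widget()
bridge(btn_clicked, w.on_clicked)
btn_clicked.emit("hello")
print(w.received)  # ['hello']
```

The hard part in reality is not the wiring shown here but agreeing on one type system and marshalling convention across toolkits, which is exactly what the comment says Linux lacks.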
Posted Apr 16, 2014 18:04 UTC (Wed)
by pizza (subscriber, #46)
[Link] (2 responses)
I agree with your conclusion, but not for the same reasons.
> I'm also deeply sceptical of his opinions on webapps and the “cloud”. We know that the US and UK governments and others do an insane amount of spying. Before we think about desktop integration, we need to think about how to make sure that those webapps don't spy on us! We need end-to-end public key cryptography everywhere *before* we get involved with webapps.
Unfortunately, the webapps battle was over long before most of us even realised it had started.
At the most fundamental level, webapps are *services* provided by some third party. No matter how much you secure your local system/client, said third party is completely outside of your control. The only way to ensure your data is protected is to run those services yourself, on systems you control completely. Which costs money *on an ongoing basis*.
Posted Apr 16, 2014 18:31 UTC (Wed)
by louie (guest, #3285)
[Link]
> third party. No matter how much you secure your local system/client,
> said third party is completely outside of your control. The only way
> to ensure your data is protected is to run those services yourself,...

...or worse for most people, time.
Posted Apr 17, 2014 9:19 UTC (Thu)
by davi (guest, #18853)
[Link]
We know we lose control of the data we feed into webapps.
What #56129 means is that webapp clients can steal a lot of 'extra' information from you. And that is where part of the business seems to be, at least in the US/UK.
Posted Apr 17, 2014 15:10 UTC (Thu)
by drag (guest, #31333)
[Link] (4 responses)
Because that would mean that Linux distributions would have to give up control.
The mentality where a distribution spends 10 months pre-packaging everything under the sun and releasing it as some great eruption of software updates every 10 months is a broken model.
It creates a huge swath of bugs. It eliminates the user's ability to install the software they want with the versions they need. Right now, if you have some piece of software on your current Linux system that you need an updated version of, the easiest and most common recommendation for solving this issue is to _reinstall your entire operating system_ (or do a major upgrade, which essentially amounts to the same thing).
And what if, god forbid, there is a major regression in the software that is released as a new update, and/or you need to run an older version of the software for whatever reason? There is no formal solution for this. As a user you are completely left in the dust.
And the 'rolling upgrade' type of distribution, like Debian unstable, isn't any better. Always being forced to run the latest version of the software is no better a solution than always being forced to use the version of software that happened to be the most convenient for the distribution to package 7 months ago.
I like Linux and I like Linux distributions, but the model for managing and updating software ends up being a sort of 'soft tyranny' over users. Most of the time it's unintentional, of course. All of this is just stuff that happens. Nobody is evil or incompetent here; it's just how things are.
If you want it to be easy to install any deb or rpm on any system then what is required is a Linux standard base AND to eliminate the traditional Linux approach that they need to try to pre-package everything under the sun. Instead it must be made easy for software authors to package and distribute their own software themselves and make it easy and supported that users can arbitrarily pick and choose the source of their own packages and use whatever versions of software they feel they need to use.
Linux distributions can, of course, continue to work with upstream authors to improve packaging and create package collections for users to easily find and install software.
I really don't see this happening though.
So the next best thing you can do is simply side step the whole distribution problem and use a new packaging system that is completely separate from the existing ones. Let the distributions continue to play in their own private ecosystem, while you use various techniques like sandboxing, containers, or abstractions to help isolate the users from the damage that distributions and library authors inflict on software compatibility. This way you create a universal software distribution method that is no longer tied to particular OS releases.
Hopefully then you can create something that is so useful and easy to use that end users stop trying to run 'apt-get' or 'yum' to fetch the software they need and start preferring your own 'universal' system. That way distribution authors themselves will begin to see the value in creating a layered OS, take a huge burden off themselves, and actually improve their own ability to change and improve the system.
Posted Apr 22, 2014 8:08 UTC (Tue)
by epa (subscriber, #39769)
[Link] (3 responses)
As with standards, I don't think the solution to multiple incompatible package systems is to create yet another 'universal' system. Instead, fix the ones we already have. A start would be to allow non-privileged users to install packages in their home directory without affecting the rest of the system.
If distributions also paid the slightest attention to cross-compatibility of packages then there might be some sliver of a chance that people would try to build packages that install on all distributions. Right now, if you file a bug saying that it is not possible to package program X as a single RPM that works on both Fedora and SuSE (even because of something trivial like different naming conventions for package dependencies) the bug would be rejected as invalid by both projects.
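Per-user installation without touching the rest of the system already exists in at least one corner of the stack: Python's user scheme (PEP 370). A small probe, assuming Linux and CPython, of where such installs land:

```python
import site
import sysconfig

# Per-user install locations (PEP 370): no root needed, nothing outside $HOME.
base = site.getuserbase()                            # typically ~/.local on Linux
pkgs = site.getusersitepackages()                    # ~/.local/lib/pythonX.Y/site-packages
bins = sysconfig.get_path("scripts", "posix_user")   # ~/.local/bin

print(base)
print(pkgs)
print(bins)
```

`pip install --user` targets exactly these directories; a distro-neutral package system could adopt the same per-user convention.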
Posted Jun 9, 2014 12:41 UTC (Mon)
by Duncan (guest, #6647)
[Link] (2 responses)
Packages then install on existing systems, generally regardless of the versions of various dependencies installed on the system, and with the proper tools, reverse-deps in the case of a library upgrade are either automatically managed by the package manager, or by running a tool to scan for problems and do rebuilds as necessary.
With such a system in place, distros can carry multiple versions of various packages, leaving the local sysadmin to choose which ones to install systemwide. Packages that don't have a distro-supplied package management script never-the-less remain easy enough to either build and install manually/directly (without package management), or to create local package management script for so they too are tracked by the package manager.
That nicely avoids versioning issues, although it does leave those who refuse to respect the rights of their users and who won't ship sources out in the cold, to a certain extent. Though even there, with multiple versions of various libraries, etc., available, package management scripts that specify the correct deps are very possible, and as always, if there's no such pre-made script available, the user is free to either create a package management script of their own or simply bypass the PM and do the installation manually.
IOW, most of the problems discussed above are easily solved at the package-sources level. It's mainstream Linux distros' (and users') weird insistence on prebuilt binaries, in this day and age of ever-increasing CPU power where building from source is trivial, that triggers most of the limitations and problems listed above.
Duncan
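The reverse-dependency handling described above can be sketched as a graph walk: find everything that transitively depends on the upgraded package, then rebuild in topological order. The package names here are hypothetical:

```python
from graphlib import TopologicalSorter

# Toy dependency graph: package -> set of packages it depends on.
deps = {
    "app":    {"libfoo", "libbar"},
    "libbar": {"libfoo"},
    "libfoo": set(),
}

def rebuild_order(changed, deps):
    """Packages needing a rebuild after `changed` is upgraded, in build order."""
    # Reverse the edges: for each package, who depends on it.
    rdeps = {p: {q for q, ds in deps.items() if p in ds} for p in deps}
    # Collect everything reachable from the changed package via reverse deps.
    todo, stack = set(), [changed]
    while stack:
        p = stack.pop()
        for q in rdeps[p]:
            if q not in todo:
                todo.add(q)
                stack.append(q)
    # Order the rebuilds so dependencies are rebuilt before their dependents.
    return [p for p in TopologicalSorter(deps).static_order() if p in todo]

print(rebuild_order("libfoo", deps))  # ['libbar', 'app']
```

Real package managers (e.g. Gentoo's revdep-rebuild) add ABI checks on top of this, but the ordering problem is the same.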
Posted Jun 9, 2014 15:12 UTC (Mon)
by mpr22 (subscriber, #60784)
[Link]
Building a Linux kernel (never mind any of the userland stuff running on it) takes tens of minutes on ext4fs on rotary storage with a 2.2GHz amd64 system with no memory pressure (I launched the build at 15:43; as of 16:11 it's partway through drivers/), so I contend that there is in fact nothing weird about wanting prepackaged binaries.
Posted Jun 9, 2014 15:49 UTC (Mon)
by pizza (subscriber, #46)
[Link]
When it takes upwards of a day to build the likes of a modern web browser on a typical user's system (and probably requiring more RAM than is available...) the notion of pre-built binaries is considerably more appealing.
Granted, most stuff isn't that insane to build, but those are the things that are most-often updated on my systems.
Posted Apr 21, 2014 14:34 UTC (Mon)
by Wol (subscriber, #4433)
[Link]
At least debs tend to install on any deb-based distro. That certainly didn't use to be true for rpms ... although Red Hat rpms did tend to work on Red-Hat-derived distros.
The only reason debs work everywhere debs are used is that all those distros are Debian derivatives and stick close(ish) to their parent. The two major rpm-based distros are not related (and indeed, Red Hat is the *younger* of the two). SuSE is a Slackware-derivative, although it's evolved an awful lot since then. So although it shares the same packaging format as Red Hat, it contains a lot of non-RH heritage.
If Lennart can successfully impose something similar to the LSB, it will give a major boost to the Linux ecosystem - not least because it will make software vendors' lives easier (and no, Free Software does *not* work that well in a fair few niches - if customers want to pay, then Free doesn't cut it :-( ).
Cheers,
Wol
Posted Apr 16, 2014 17:42 UTC (Wed)
by hitmark (guest, #34609)
[Link] (12 responses)
RH is gunning to take as complete control of Linux as they can.
Any project where a RH employee is top dog needs to be avoided like the plague by the larger community...
Posted Apr 16, 2014 17:57 UTC (Wed)
by pizza (subscriber, #46)
[Link]
You're also completely free to propose, design, and build your own system, building on as much or as little of the current status quo as you see fit.
Good luck.
Oh, one more thing:
> Any project where a RH employee is top dog needs to be avoided like the plague by the larger community...
I guess that means you won't be using gcc, binutils, glibc, or even the Linux kernel itself then. Try not to let the door hit you on your way out.
Posted Apr 16, 2014 18:17 UTC (Wed)
by zdzichu (subscriber, #17118)
[Link]
Posted Apr 16, 2014 19:58 UTC (Wed)
by JMB (guest, #74439)
[Link] (1 responses)

RHEL could not use GNOME3 in the form it is used right now - which was said, by developers paid by RH, to be good for everyone (I would not touch it - same for KDE4 and Unity) - but RH knows exactly this.
So maybe Fedora and several paid developers may be avoided (if you don't like rants etc.), but not _all_ developers of RH.
It's a big company and there are really good people working for them (as Alan Cox did - one of the best, IMHO).
I don't expect RH to help the desktop - SuSE was superior in this domain before, and Ubuntu really got it, as Mint does now on top of Ubuntu.
I would still make my bet on Ubuntu in that respect - they should have stayed more user-centric (Unity configuration was promised long ago; and the Apple design looks ugly :).
Xubuntu 13.10 was nice and maybe 14.04 LTS will be even better.
But unification - why? Interoperability and configurability to common standards is enough - isn't it? We don't need any monopolist in the desktop field - the experience is there ... and should be even more avoided than said developers. ;-)
Posted Apr 16, 2014 20:56 UTC (Wed)
by einstein (subscriber, #2052)
[Link]
Well, I for one rather like KDE4. I've been using Kubuntu 14.04 on my primary desktop since an early alpha in December. Over the years I've used twm, olvwm, fvwm, cde, gnustep, gnome, icewm, xfce, enlightenment and others, but KDE is IMHO the best DE of the bunch by a mile, at least for my workflow (online/social media, email, coding, movies & music and 3D FPS gaming).
Posted Apr 16, 2014 20:23 UTC (Wed)
by wmf (guest, #33791)
[Link] (7 responses)
It's interesting that you specifically mention Red Hat, since Ubuntu has also been following this same strategy for several years. One might say that RH has to do this to prevent Canonical from taking complete control of Linux.
Posted Apr 16, 2014 21:10 UTC (Wed)
by einstein (subscriber, #2052)
[Link] (6 responses)
I, too, am a Mac user. Well, once in a while, anyway. While my primary platform is Linux, I am in the habit of using the H&R Block software to do my taxes, and that does present a bit of a problem, as they have, to date, only released versions for Mac and pee cee, not Linux. OS X provides a fairly satisfactory interim solution, and it's fairly nice to use.
Posted Apr 16, 2014 23:20 UTC (Wed)
by mebrown (subscriber, #7960)
[Link] (5 responses)
Posted Apr 16, 2014 23:24 UTC (Wed)
by einstein (subscriber, #2052)
[Link] (4 responses)
Ah, interesting. My tax situation is a bit complicated, so I have been using the full-fat client version. I hadn't looked at any of the web versions lately, but somehow had the impression that approach was only practical for very simple, very typical, very straightforward tax situations. Is that no longer the case?
Posted Apr 18, 2014 16:01 UTC (Fri)
by speedster1 (guest, #8143)
[Link] (3 responses)
> Ah, interesting. My tax situation is a bit complicated, so I have been using the full-fat client version. I hadn't looked at any of the web versions lately, but somehow had the impression that approach was only practical for very simple, very typical, very straightforward tax situations. Is that no longer the case?
I've been using H&R Block online for years, since it is accessible from Linux, and it has served ok. Could you give an example of what sort of situation would make a tax return "not simple" in your classification scheme, such that online tax packages might not be practical?
Posted Apr 28, 2014 22:07 UTC (Mon)
by einstein (subscriber, #2052)
[Link] (2 responses)
Well, let me say that I have not checked the online versions in a few years, but last I checked, the sort of things that were problematic for them would include for instance, home mortgage deductions, home office deductions, sale of a house, in general anything too far removed from the short form. It would be a relief to hear from someone that they have successfully used an online form to deal with those sorts of tax scenarios.
Posted Apr 29, 2014 4:36 UTC (Tue)
by speedster1 (guest, #8143)
[Link] (1 responses)
Posted Apr 29, 2014 17:21 UTC (Tue)
by einstein (subscriber, #2052)
[Link]