Apple has a history with gcc.
Posted Feb 23, 2015 1:48 UTC (Mon) by Cyberax (✭ supporter ✭, #52523)
In reply to: Apple has a history with gcc. by pizza
Parent article: Emacs and LLDB
So does LLVM.
Patent MAD is fine with most companies, so that's not much of a point either. Besides, the Apache License has much milder patent language than GPLv3.
> Just off the top of my head, Apple's entire Swift language infrastructure, ARM's proprietary compilers, nVidia and AMD's shader compilers.
Swift is likely to be open-sourced soon. ARM's proprietary compilers are pretty much dead, AMD mainlined their LLVM shader compiler, and NVidia uses a completely proprietary infrastructure (no LLVM) for their internal OpenCL system.
So yes, there are certainly dark corners with proprietary forks of LLVM, but they are utterly insignificant.
I'd wager that there's no risk of significant proprietary forks of LLVM and other liberally-licensed projects. It just doesn't pay.
Apple has a history with gcc.
Posted Feb 23, 2015 13:01 UTC (Mon) by pizza (subscriber, #46)
[Link] (1 responses)
It requires a CLA in order to *contribute upstream*, but there's no "poison pill" clause in LLVM's license that prevents you from going after other users of LLVM without also losing the rights to use it yourself. That won't protect against NPEs (i.e. true trolls), but it's at least a start.
> Swift is likely to be open sourced soon.
That's really not much of a counterpoint. But we shall see, in any case.
> ARM proprietary compilers are pretty much dead.
Not by a very long shot. I know this because my employer just re-upped their licenses to the ARM compilers, which as of v6 are built on top of LLVM. That brought many improvements, but it's still just as proprietary as before -- if we stop paying, we lose the ability to take advantage of our own work.
> AMD mainlined their LLVM shader compiler
Oh, that's news to me, good for them.
> and NVidia uses a completely proprietary infrastructure (no LLVM) for their internal OpenCL system.
I'm sorry, nVidia is more credible than you: https://developer.nvidia.com/cuda-llvm-compiler
> So yes, there are certainly dark corners with proprietary forks of LLVM, but they are utterly insignificant.
That's how it always starts.
> I'd wager that there's no risk of significant proprietary forks of LLVM and other liberally-licensed projects. It just doesn't pay.
Come on, you've been around long enough to know that tangible short-term gains nearly always trump longer-term views.
Apple has a history with gcc.
Posted Feb 23, 2015 13:10 UTC (Mon) by Cyberax (✭ supporter ✭, #52523)
[Link]
> I'm sorry, nVidia is more credible than you: https://developer.nvidia.com/cuda-llvm-compiler
CUDA support is _mainlined_ in LLVM: https://github.com/llvm-mirror/llvm/tree/a7f8f932a67badb2... However, it compiles into an intermediate language (PTX) which is later optimized and executed by proprietary NVidia infrastructure.
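To make that pipeline concrete, here is a minimal sketch (mine, not taken from NVidia's documentation; the file name, build flags, and sm_35 target are illustrative assumptions): the device code below is lowered by LLVM's NVPTX backend to PTX, and the proprietary driver then compiles that PTX to native GPU code at run time.

// saxpy.cu -- minimal sketch of the CUDA-on-LLVM flow (illustrative only).
// The kernel is lowered by the NVPTX backend to PTX, an intermediate
// representation; NVidia's proprietary driver later optimizes that PTX and
// translates it to the GPU's native machine code (SASS) when the program runs.
//
// Assumed build commands (clang's CUDA support; flags and arch are assumptions):
//   clang++ --cuda-gpu-arch=sm_35 saxpy.cu -o saxpy -lcudart
//   clang++ --cuda-device-only --cuda-gpu-arch=sm_35 -S saxpy.cu -o saxpy.ptx

#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];                 // y = a*x + y
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory keeps the example short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();                    // wait before reading y on the host

    printf("y[0] = %f (expected 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The point being: everything up to the PTX is open LLVM code, but the piece that turns PTX into something the GPU actually executes has never left NVidia's proprietary stack.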
As for ARM's compilers, I've no idea why people still use them. Anyway, do they contain any significant improvements?

> Come on, you've been around long enough to know that tangible short-term gains nearly always trump longer-term views.
And so? I expect that a lot of companies are going to produce their "Super Mega Fork Of LLVM, Now With A Big Red Button!". However, they won't contain anything of interest.
