LWN: Comments on "Google's dropping H.264 from Chrome a step backward for openness (ars technica)" https://lwn.net/Articles/422872/ This is a special feed containing comments posted to the individual LWN article titled "Google's dropping H.264 from Chrome a step backward for openness (ars technica)". en-us Thu, 09 Oct 2025 10:19:29 +0000 Thu, 09 Oct 2025 10:19:29 +0000 https://www.rssboard.org/rss-specification lwn@lwn.net Usefulness of patents https://lwn.net/Articles/428066/ https://lwn.net/Articles/428066/ Randakar <div class="FormattedComment"> <p> No. Patents don't work for physical things either. If it doesn't work for the steam engine (<a rel="nofollow" href="http://discardedlies.com/entry/?52916_">http://discardedlies.com/entry/?52916_</a>) which is about as physical as it gets, how on earth is it supposed to work for anything else?<br> <p> There's a pretty good research paper out there (no link at hand right now, sorry) going into the question whether the early patents - steam engine, light bulb, telegraph, and so on - had a positive effect on innovation. The answer is a resounding: "No".<br> <p> <p> </div> Tue, 15 Feb 2011 14:10:27 +0000 Usefulness of patents in some fields https://lwn.net/Articles/423925/ https://lwn.net/Articles/423925/ coriordan <div class="FormattedComment"> In software, the manufacturers (software developers) generally are the innovators.<br> <p> In some other industries, innovation mostly comes from R&amp;D which is performed by people who are completely removed from the manufacturing process. 
In those industries, patents might be a good thing, because they allow for a link between the manufacturers and the innovators.<br> <p> In software, the way to let innovators earn money is by allowing them to write software.<br> <p> It's really essential to distinguish between different fields when proposing a patent policy.<br> </div> Thu, 20 Jan 2011 16:38:57 +0000 Usefulness of patents https://lwn.net/Articles/423882/ https://lwn.net/Articles/423882/ Felix.Braun <div class="FormattedComment"> Maybe, but Real Innovators also need to pay for food and housing. They should be able to live off their innovations, otherwise they'd have to work as banking clerks during the daytime. That wouldn't necessarily prevent them from making their inventions, but it would come close to being cruel and unusual punishment, which is unconstitutional ;-)<br> </div> Thu, 20 Jan 2011 12:32:00 +0000 Usefulness of patents https://lwn.net/Articles/423563/ https://lwn.net/Articles/423563/ rahvin <div class="FormattedComment"> I'll answer your question with a question. If RSA couldn't patent PKI, would they have invested the money to create the very creative mathematical models that allow PKI to exist? <br> <p> From the reports I read RSA invested millions creating the system, proving it worked and backing up the ideas behind it with hard proof. In fact that patent created a ton of research in cryptography, to the point where I'd argue that without the patent RSA probably would have never spent the money and if they had they probably would have refused to sell to anyone outside government. And beyond that had the patent not been granted all the secondary research to find other algorithms wouldn't have taken place and we would have a very limited cryptographic environment available now. How many crypto systems were developed to avoid the RSA patents?<br> <p> And see that's the whole point of patents and the basis behind them stated in the constitution. 
It was recognized even back in the late 18th century that by allowing a limited monopoly on an invention you would create an environment where people and companies would invest in research and development that benefits everyone in society later. I'll argue that the RSA patent directly caused the creation of an entire encryption ecosystem that today provides security in a digital world. We have more than a dozen different algorithms today that might not have existed without the RSA patent. That's a hell of a benefit for a decade and a half of single ownership. <br> <p> As an aside I don't think you should be able to patent raw mathematics, but in the case of RSA and PKI IMO the math was only part of the equation; the application and use of the underlying mathematics was what was patentable, not the mathematics themselves. Even if that's what appeared to be patented in the application, it was the application and use which was the unique and patentable idea. Maybe I'm full of crap but I believe without the patent system you would have an economy much like China, where very few people invent anything and those that do are quickly copied and put out of business without the means to invent further. There's always someone out there that can copy ideas and either tweak the design or have better advertising and put the real inventor out of business. 
Maybe that's the way things should work, but why would the ones that come up with the original ideas bother even telling anyone about it if someone can come along and market it better?<br> </div> Tue, 18 Jan 2011 18:59:12 +0000 Usefulness of patents https://lwn.net/Articles/423526/ https://lwn.net/Articles/423526/ nix <div class="FormattedComment"> That sounds more like the end-game of the *economy* to me.<br> <p> </div> Tue, 18 Jan 2011 15:11:54 +0000 Usefulness of patents https://lwn.net/Articles/423478/ https://lwn.net/Articles/423478/ butlerm <em>the problem is the quality of inventions being patented</em><br> <br> There is nothing the Supreme Court can do about that. Nor Congress (by all appearances) because there does not appear to be any objective way to determine what innovations are "obvious", or what it is that might make one government granted monopoly social welfare enhancing and others social welfare destroying.<br> <br> What is going on now is the end game of the patent system. The very idea sowed the seeds of its own destruction. There is no legally defensible way to distinguish a "good" patent from a "bad" patent, so the consequence is that the vast majority of patents are bad, which is death to the economy. Tue, 18 Jan 2011 07:50:20 +0000 Usefulness of patents https://lwn.net/Articles/423477/ https://lwn.net/Articles/423477/ butlerm <div class="FormattedComment"> "Best" is subjective of course - selection would be by committee. I suggest this as a compromise only. Real innovators don't need government rewards, monopolies, or subsidies. <br> </div> Tue, 18 Jan 2011 07:32:54 +0000 Usefulness of patents https://lwn.net/Articles/423475/ https://lwn.net/Articles/423475/ dark If the RSA patents were a good example, then what was their benefit? They were a massive obstacle in the deployment of secure internet protocols; I don't see any benefits to weigh up against that. 
Tue, 18 Jan 2011 07:22:44 +0000 Usefulness of patents https://lwn.net/Articles/423456/ https://lwn.net/Articles/423456/ rahvin <div class="FormattedComment"> I understand what you're arguing, but the problem with the patent system right now isn't that you are creating a government monopoly using patents; the problem is the quality of inventions being patented. The abuse of the system is through software patents and business method patents which are on ideas. The system as originally intended and run for most of this country's lifetime was only on actual physical objects. For example the light-bulb, where someone had to spend years, tons of money and research finding a material and system that would create light using electricity. Although it may seem trivial today, discovering that in a vacuum tungsten would create a white light was quite revolutionary, even if several people were working on the problem simultaneously. In addition even once granted the light-bulb patent would have only covered that specific implementation. Replacing the filament with another material would have avoided the patent. <br> <p> The problem is when you turn to software and business method patents, the patents themselves are so incredibly vague, they don't cover a specific implementation, they cover the very idea. That's the problem with the system: if software patents were restricted to a very specific implementation like we did in the past, such that even a single change such as storing data in a one-dimensional array versus a two-dimensional array would evade the patent, then the system would work. But the problem is that the patents as granted currently don't cover implementation, they cover the idea. Take multi-touch patents, they don't discuss the specifics such as input, variables, code and processing data, they discuss using more than one touch at a time. 
It's equivalent to patenting the existence of and use of a single button on a machine, something that in the past would have never been granted although in today's patent office very well could. <br> <p> The RSA patents were mentioned; I'd argue they were the prime example of a proper software patent. They were patenting the very specifics of the system because the encryption system is extremely specific in how the mathematical models work. A single change could have avoided the patents but in this case probably would have broken the system and resulted in non-functioning encryption. For a functioning system to work the patent should cover and require disclosure and filing of the code in question with the patent applying to that specific implementation. Admittedly this would gut almost the entire software patent system, but IMO that's exactly what needs to happen. Only the best of ideas should be patentable and worth patenting. Right now there are so many garbage patents filed yearly it's destroyed the credibility of the system. The great hope is that the Supreme Court will tackle this with an upcoming case, otherwise we are looking at the continual decline and destruction of the US and other western economies that adopt this crazy policy.<br> </div> Tue, 18 Jan 2011 02:01:46 +0000 Usefulness of patents https://lwn.net/Articles/423310/ https://lwn.net/Articles/423310/ dirtyepic <div class="FormattedComment"> Define "best".<br> <p> Not only are you just shifting the subjectivity from some idea of obviousness to some idea of importance, you've now created an environment of savage competition and rampant corruption, where everyone is focused on short-term results. 
Why fund a ten-year project that might not make the cut in the end when you can fund ten one-year projects and increase your odds?<br> </div> Sun, 16 Jan 2011 21:19:36 +0000 Usefulness of patents https://lwn.net/Articles/423272/ https://lwn.net/Articles/423272/ butlerm <em>You could make the same argument with land.</em><br> <br> You certainly could, but the argument with land would be entirely different, because land is a classic example of <em>natural</em> property, where there is no better example of <em>unnatural</em> property than a proprietary interest in what is rarely little more than an <em>idea</em>.<br> <br> Now while I hardly doubt that we would all be much better off in a world where patents were the rarest of exceptions (if that), how's this for an alternative? Have the government run a contest for the 1000 best basic technological innovations of the year. Grant each one a ten million dollar award. That is ten billion dollars a year, to promote (not pay for) basic research in a world (or at least a country) without patents. About the same as the yearly budget of the National Science Foundation. Sun, 16 Jan 2011 03:42:28 +0000 Usefulness of patents https://lwn.net/Articles/423266/ https://lwn.net/Articles/423266/ jthill <p><blockquote><i> I don't think there is any basis to claim that an idea of any usefulness is "proper" to anyone.</i></blockquote> <p>Which leads directly to regarding anyone wielding a patent as a troll: nobody has any right to, any proprietary interest in, any idea. Ok. You could make the same argument with land. Some cultures did, perhaps there are some left that do. Even here we know ownership is a gross tool. You can see the recognition of that with land use, where "owning" it doesn't allow you to stripmine prime farmland or to salt anything with toxic waste. 
<p>But if you're going to support the notion that patents are inherently bad, I think you're going to have to maintain that companies would fund and publish basic research even if they couldn't get patents on any resulting gizmos. Look at what IBM's doing with AI. Do you think we'd ever see the results of their work in the absence of patents? Sun, 16 Jan 2011 00:51:33 +0000 Usefulness of patents https://lwn.net/Articles/423263/ https://lwn.net/Articles/423263/ butlerm <div class="FormattedComment"> In an ideal world, if you come up with something that no one is liable to independently invent in the next *twenty* years there might be some small merit in the government granting a patent with mandatory licensing at regulated rates.<br> <p> The procedural problem is that the government has no means of telling whether that is actually the case, so even if such a standard was adopted thousands of illegitimate patents would still issue. The government is completely incompetent at evaluating a much looser standard. The USPTO's conception of "prior art" is so narrow as to be useless. Its conception of "obviousness" is vacuous - as in there is no procedural means to determine that _anything_ is obvious at the time of invention unless it is spelled out letter for letter in prior art, because obviousness is an inherently subjective evaluation. Neither the USPTO nor the courts have any reliable means to determine the answer either way. <br> <p> Finally, I don't think there is any basis to claim that an idea of any usefulness is "proper" to anyone. Perhaps if it is so off the wall and unique that no one would independently duplicate it in any amount of time (like a novel, or a musical composition). Anything less than that isn't readily attributable to an individual at all. Just an accident of timing and economics, if that. 
<br> <p> </div> Sat, 15 Jan 2011 21:58:25 +0000 Usefulness of patents https://lwn.net/Articles/423256/ https://lwn.net/Articles/423256/ jthill <p> Well, I think the RSA patent was a good one, no matter that I was champing at the bit for it to expire along with just about everyone else on the planet who could spell "RSA". That's a fairly weak example -- I think they'd have published even without a patent system -- but nothing else would ensure they could get some of the reward. <p> Or iow I think whether or not you regard someone demanding money for your use of "their" idea as trollish depends on whether you think they have a right to the idea and whether you think they're charging a fair price. Vorbis is technically a much better competitor for MP3 than Theora (but see the 1.2 preview) or VP8 for H.264, so Microsoft^WBright did raise one good point: if it were just about openness, Google would have dropped MP3. I suspect this is more about MPEG-LA's declared intent to abuse the monopoly position they're after. Sat, 15 Jan 2011 19:58:22 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423249/ https://lwn.net/Articles/423249/ kleptog <div class="FormattedComment"> The only decoder I have any experience with was the one in the Series 1 Tivo. It was an all-in-one chip essentially. You configured a circular buffer and it would DMA the MPEG-1 data straight from memory and decode it into a TV-compatible signal. It had some stuff to do overlays with text. It could send interrupts to the CPU which would then read blocks into the buffers as needed. That's why the system only needed an 8MHz CPU to run. It also couldn't do anything useful other than decoding/encoding MPEG data.<br> <p> It doesn't seem logical to decode MPEG and then transport that data to another device. 
Decoded video is quite high bandwidth; often if you know the output format you can be more efficient about the decoding process.<br> </div> Sat, 15 Jan 2011 17:36:03 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423239/ https://lwn.net/Articles/423239/ RogerOdle <div class="FormattedComment"> That's today. Reprogrammable logic extends into all areas more and more every day and it is adapting to the needs of battery powered devices. This is pushed by the need for field upgradable designs and reducing time to market. We are getting hardware in the field in advance of the standards they are supposed to implement and we are getting comfortable with that because we are getting used to connecting them to the Internet for little fixes. This is especially true for the phone market.<br> <p> Google is big enough to play the hardware game now. It can design chips for hardware decoding of non-H.264 video formats itself and offer them to the market. As more Internet content gets provided on alternative formats, the pressure for flexibility will increase. <br> <p> The problem with H.264 is that it got pushed on us whether we liked it or not. It is covered by vague and over-reaching patents that try to lock down the very concept of video encoding. It seems to me that patents were originally intended to cover whole things that were completely implemented. Now they are used to cover pieces of things which represent incomplete ideas and I do not think that this was intended when the idea of patents was created. Patents were created to promote innovation by encouraging people to share ideas and combine them in new ways. How does this modern interpretation of the patent system accomplish this?<br> <p> Remember that IP rights are not natural ones. They are not at the same level as the right to life, liberty, and the pursuit of happiness, which are natural rights. 
They are created by the state for a purpose and if that purpose is not fulfilled then they have no justification for existence. Is IP law being implemented and enforced in a manner that is in accord with its purpose? I do not think so. I see it as concentrating market share in the hands of the few who have the money to influence lawmakers. The law is being stood on its head where the IP system is used to stifle innovation instead of encouraging it.<br> <p> </div> Sat, 15 Jan 2011 16:48:35 +0000 re: don't knock all standards bodies https://lwn.net/Articles/423215/ https://lwn.net/Articles/423215/ rlhamil <div class="FormattedComment"> <font class="QuotedText">&gt; To me standards are something that should be implemented AFTER an implementation is made. </font><br> <p> I don't have a problem with that; the SVID came after SVR2, IIRC.<br> <p> The objective would be that the standard would be sufficient to guide the creation of an independent implementation capable of the specified interoperability.<br> <p> The end result should be that the standard is the reference, and the initial implementation(s) are simply _sample_ implementation(s). I understand that for any given implementation, the source is obviously authoritative as to what it does. 
But the point of the separation between standard and source is that the standard is authoritative for what it _should_ do (but not how).<br> <p> Most likely, I imagine as the ideal a prototype that becomes the first sample implementation, co-developed with the standard (but with some consideration to having the resulting standard able to work well on other platforms, etc), followed by one or more independent implementations based on the standard alone, and interoperability testing.<br> <p> </div> Sat, 15 Jan 2011 07:55:31 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423212/ https://lwn.net/Articles/423212/ butlerm <div class="FormattedComment"> I am aware of the conventional use of the term. I claim that it is a distinction without a difference, on the grounds that the entire purpose of patents - the reason why anyone wants one in the first place - is so that patent holders can engage in trollish behavior.<br> <p> What good is a patent that can be easily evaded? Nearly worthless, right? No better than a mere copyright. No, what everyone wants is a patent on a technique so fundamental that no rational person thinks it is an invention to begin with, so that they can collect royalties on entire fields of endeavor. An application so fundamental that there are no alternatives except to quit the business entirely.<br> <p> With perhaps the tiniest of exceptions the patent business is government malfeasance from first to last. The greatest impediment to science and the useful arts ever developed. Rotten to the core, enemy to human health and welfare, entirely contrary to its own stated objective. There doesn't appear to be anything that can be done to fix that, except allow patents for only the rarest of exceptions. 
Like pharmaceuticals with government-mandated, years-long tests, perhaps.<br> </div> Sat, 15 Jan 2011 06:42:08 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423185/ https://lwn.net/Articles/423185/ rahvin <div class="FormattedComment"> Patent Troll was a term coined by the industry to describe holding firms created to hold patents but no assets such that patent lawsuits could be launched without fear of financial repercussions to a working business. The key being that the company is entirely a law firm LLC with no assets and no business that the patent is protecting. Troll was used because much like the child's tale they sit under a bridge (the market) and extract a toll on industry that uses technology.<br> <p> To say every firm with patents is by definition a patent Troll does not fit with the history or use of the word combination.<br> </div> Fri, 14 Jan 2011 21:13:43 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423173/ https://lwn.net/Articles/423173/ marcH <div class="FormattedComment"> I find it was worth it just for the discussion happening here.<br> <p> </div> Fri, 14 Jan 2011 19:56:10 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423152/ https://lwn.net/Articles/423152/ jzbiciak <P>DCTs are cheap these days. Problem is, people keep changing them. You now have some integer DCTs that are specified to be <I>bit-exact</I>, for example, as opposed to relying on looser standards such as IEEE-1180. (Even IEEE-1180 sucks. I know, having implemented an IDCT that tested fine against it, very well in fact, but ended up still failing to be useful in some low bit-rate codecs due to limitations in IEEE-1180's testing model.)</P> <P>H.264's CAVLC and CABAC are a wondrous nest of complexity that are truly horrifying to behold. 
There are so many little twists and tweaks in there that it could drive an engineer mad trying to understand it all. (I watched my neighbor at work descend into that pit of madness, actually.) All the gates spent on hardware acceleration for those guys are likely to be wasted if you aren't running H.264 or something close enough to H.264 that it runs afoul of its patents.</P> <P>There's all sorts of other fun, such as loop deblocking filters and so on that also take part in the pipeline.</P> <P>Now, there's been work (and if you google for it, you'll find papers for them) on reconfigurable accelerators that hope to support standards other than H.264 and retain much of the efficiency. That's great, and I'd love to see such things shipping in volume. They aren't yet, though. In the meantime, there's oodles of smartphones and other gizmos out there today that won't benefit from that new work, but will take a hit when asked to display a codec their accelerators can't handle.</P> <P>I'm not arguing in favor of supporting H.264 everywhere, unless that also means donating its patents to the public domain and making it free for everyone. I'm just saying recognize that arguments mentioning hardware acceleration unfortunately have some merit for the time being.</P> Fri, 14 Jan 2011 17:31:40 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423149/ https://lwn.net/Articles/423149/ jzbiciak <div class="FormattedComment"> There are many SoCs out there with fixed or very-nearly-fixed function hardware accelerator blocks that support a small number of standards. 
These can't be upgraded to support arbitrary codecs merely by flashing a new image onto the device.<br> <p> Some SoCs have CPUs that are powerful enough to pick up the oddball codecs, but CPUs consume orders of magnitude more power to achieve the same end, and thus supporting arbitrary codecs represents a large battery life hit in a portable device.<br> <p> Some newer accelerators try to be more flexible, but there are limits to how much flexibility you can build in before you lose your energy efficiency advantage.<br> </div> Fri, 14 Jan 2011 17:17:57 +0000 re: don't knock all standards bodies https://lwn.net/Articles/423101/ https://lwn.net/Articles/423101/ farnz <p>It's necessary anywhere; the point of multiple independent implementations is to catch things that are unclear in the spec, and clarify them before they become an issue. <p>For a hypothetical example, imagine a spec that says "audio samples are stored as 24-bit values". The reference implementation may handle this as an unaligned 32-bit read from an array of bytes - in C-like pseudocode: <code> <pre> uint8_t *sample_array = get_samples(); for( size_t position = 0; position < sample_buffer_limit(); position += 3 ) { process_sample( *(uint32_t *)(sample_array + position) ); } </pre> </code> <p>Now, you have a problem. Is this meant to be little endian or big endian? Is the picking up of 8 bits too many deliberate or accidental? Was the unaligned read deliberate, in which case I have to write code to handle it, or did the designer just not realise that unaligned reads are expensive on some architectures? <p>These are all questions that get resolved as more implementations come along. Lots of the answers are going to be obvious, but some aren't - and the extra thought the questions trigger may result in a better format at the end of it all. 
Fri, 14 Jan 2011 14:43:01 +0000 VP8 specification https://lwn.net/Articles/423095/ https://lwn.net/Articles/423095/ bawjaws <div class="FormattedComment"> I've heard that Google has sponsored Ronald S. Bultje to spend a year on xvp8, an encoder counterpart to ffmpeg's ffvp8 decoder (which he also worked on). Apparently some work has begun but he's just had a baby boy and so is taking a break to focus on being a dad. I don't know if the name is intended to indicate that it builds on the x264 encoder's codebase.<br> </div> Fri, 14 Jan 2011 12:16:53 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423090/ https://lwn.net/Articles/423090/ sorpigal Especially "De-facto standard." Remember GIF vs PNG, the early years? Or it's like being told "Jabber is not open, use the standard OSCAR instead." Time will tell. Fri, 14 Jan 2011 11:27:12 +0000 re: don't knock all standards bodies https://lwn.net/Articles/423089/ https://lwn.net/Articles/423089/ sorpigal There is value in multiple implementations, even open source ones. If there's only one consumer it's hard to know whether the spec is good because all you can say is that one consumer managed an interpretation which functions, not that the consumer was able to write an interoperable program from the spec. A spec which is so unclear that you can't expect two implementations to interoperate is a bad spec and you really need two or three implementations to find out whether this is the case. 
Fri, 14 Jan 2011 11:17:00 +0000 re: don't knock all standards bodies https://lwn.net/Articles/423083/ https://lwn.net/Articles/423083/ nhippi <div class="FormattedComment"> <font class="QuotedText">&gt; In my view, the first importance of a standard is that it is a useful specification _separate_ from an implementation, that (with or without royalties) could be used as the basis for multiple, interoperable implementations</font><br> <p> However, "multiple implementations" is truly only necessary in the proprietary software world. <br> </div> Fri, 14 Jan 2011 10:46:14 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423078/ https://lwn.net/Articles/423078/ bawjaws <div class="FormattedComment"> The Tegra2 has accelerated Theora, Vorbis and VP8 decode, and VP8 *encode* and "videoconferencing":<br> <p> <a rel="nofollow" href="http://www.nvidia.com/object/tegra-2.html">http://www.nvidia.com/object/tegra-2.html</a><br> <p> There was an avalanche of Tegra2-based tablets (and a smattering of phones) at CES. I believe the first one is due out this month (the Dell Streak 7"):<br> <p> <a rel="nofollow" href="http://blogs.nvidia.com/2011/01/tegra-2-tablets-tear-up-ces/">http://blogs.nvidia.com/2011/01/tegra-2-tablets-tear-up-ces/</a><br> <p> The guys that make the chips for all the cheap, interesting/rubbish tablets are also shipping support for 1080p WebM this quarter, so devices in users' hands maybe in the first half of this year?:<br> <p> <a rel="nofollow" href="http://www.prnewswire.com/news-releases/rockchip-and-webm-release-rk29xx----worlds-first-soc-to-support-webm-hd-video-playback-in-hardware-113069829.html">http://www.prnewswire.com/news-releases/rockchip-and-webm...</a><br> <p> There were some valid arguments above that Google could have put WebM through a proper standardization process, or at least given the Xiph and FFMPEG guys a bit more time before locking everything down. 
The excuse at the time was that they needed to get the hardware guys working because of their longer lead times. Since they've delivered on that in spades, I'd say it was a good call in hindsight. The perfect is the enemy of the good etc.<br> <p> </div> Fri, 14 Jan 2011 10:12:45 +0000 re: don't knock all standards bodies https://lwn.net/Articles/423072/ https://lwn.net/Articles/423072/ salimma <div class="FormattedComment"> Indeed; so far we have two implementations of WebM (Google's and FFMPEG's); judging from the opinion of one of the lead developers of FFMPEG, it's probably a bit too early to get WebM standardized for now.<br> <p> Not saying that Google should not start documenting the codec better -- but this showcases what tends to happen when a codec is developed behind closed doors (when it was VP8); too much of the documentation is implicit, only existing in the heads of the developers working from the same office.<br> </div> Fri, 14 Jan 2011 09:44:42 +0000 VP8 specification https://lwn.net/Articles/423068/ https://lwn.net/Articles/423068/ tzafrir <div class="FormattedComment"> Good. Ffvp8 is a second independent implementation of the decoder. Is there a second independent implementation of the encoder?<br> </div> Fri, 14 Jan 2011 09:31:32 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/423014/ https://lwn.net/Articles/423014/ AndreE <div class="FormattedComment"> It's funny (or sad) that people willingly conflate "standard" with "open"<br> </div> Fri, 14 Jan 2011 00:02:31 +0000 re: don't knock all standards bodies https://lwn.net/Articles/423009/ https://lwn.net/Articles/423009/ drag <div class="FormattedComment"> To me standards are something that should be implemented AFTER an implementation is made. <br> <p> Actually, ideally, to have a good standard you should have no fewer than 3 separate competing implementations of the draft well in advance of it getting ratified. 
That way you're more likely to arrive at a superior solution, with competing answers to unknown questions. <br> <p> If you make the standard first then you have no idea of whether it works right or not. <br> </div> Thu, 13 Jan 2011 23:19:35 +0000 re: don't knock all standards bodies https://lwn.net/Articles/422986/ https://lwn.net/Articles/422986/ rlhamil <div class="FormattedComment"> In my view, the first importance of a standard is that it is a useful specification _separate_ from an implementation, that (with or without royalties) could be used as the basis for multiple, interoperable implementations.<br> <p> Something meeting that definition is a valuable engineering discipline that should often improve the quality (if not necessarily speed) of the resulting code. Such a standard, even if after-the-fact to an existing implementation, can serve as the basis for future development. Consider SVID (System V Interface Definition), which evolved into the more participatory standards POSIX, XPG, and Single Unix Specification.<br> <p> Nevertheless, one could argue that for widest implementation and support, a standard should be<br> * free of royalties or fees (aside from reproduction costs or those to recover the costs of the standards body)<br> * capable of being _implemented_ independently without royalties or fees<br> <p> The whole open-source-as-an-ideology line of thinking leaves me totally cold. I like open source because it lets me do as much of my own troubleshooting (or occasionally modifications or extensions) as I want to. But I want code based on standards (or at least some co-development there) rather than code in _place_ of a standard. 
So I'm more interested in the ability to implement something royalty-free (yes, I agree that software patents generally stink) than in what the license on the source code is, as long as I can look at (but perhaps not even modify) the source code without spending an arm and a leg.<br> <p> Still, the unfortunate reality, especially for codecs and data formats, is that people will probably want to be able to<br> * access existing media<br> * originate (or sometimes edit) media in a format for which most others will already have the facilities to view or otherwise play back (or sometimes edit) the media<br> <p> That means that for some time at least, until not so much the ideology as the practical value sinks in (i.e. until those that create and consume content take control away from those that merely provide the tools to them), like it or not, there will be a demand for interoperability with non-royalty-free codecs and formats; and quite understandably so: most don't care so much about the broad currents of history as they do about just getting something done today.<br> <p> That's one example of where I think a pragmatic rather than ideological approach could end up at the common goal of opening up communications and interoperability more quickly than going straight for the jugular by eliminating encumbered codecs and formats right away. Just draw a line, and say that after that line, new standards must be unencumbered, and leave it at that. That simple approach also takes away the foolishness of trying to jockey for market position by dominating, not the implementations, but by trying to sponsor or pick the dominant specification and make sure one's competitors pay for using it in ways other than just coming up with their own implementation. 
Compete on total results and value please, not on "tricks".<br> <p> <p> </div> Thu, 13 Jan 2011 21:25:11 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/422985/ https://lwn.net/Articles/422985/ sgros <div class="FormattedComment"> Many RFCs published are only informational, and actually, every vendor has the possibility of publishing something via RFC so that everyone can implement and use what's published.<br> <p> And, also, many RFCs that are on the standards track never reached that status, and thus formally are not "standards". You can take HTTP as an example. The last RFC was published in 1999 as a draft standard and never moved from that status.<br> <p> </div> Thu, 13 Jan 2011 21:10:31 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/422983/ https://lwn.net/Articles/422983/ elanthis <div class="FormattedComment"> Nonsense! Nerds are ridiculed and picked on. Geeks are where it's at.<br> <p> Nonsense! Geeks are awkward and unattractive. Dorks are hip, though.<br> <p> Nonsense! Dorks are lame and only good for dodgeball targets. Nerds are the future.<br> <p> Nonsense! Nerds are ridiculed and... <br> </div> Thu, 13 Jan 2011 20:57:46 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/422973/ https://lwn.net/Articles/422973/ Trelane <div class="FormattedComment"> "nerd"s are actually kinda popular. 
I think you mean "neckbeard."<br> </div> Thu, 13 Jan 2011 20:37:41 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/422968/ https://lwn.net/Articles/422968/ b7j0c <div class="FormattedComment"> the rancor over this discussion is hilarious, as if all the h264 advocates were assured world domination prior to this decision<br> <p> the day mozilla decided h264 would never be part of the base ff build, h264-uber-alles was doomed, the google decision just put one more nail in the coffin as it was being lowered into the dirt. you can say what you want about mozilla, but with 350 million users, they most certainly can kill a codec if they want to<br> <p> </div> Thu, 13 Jan 2011 20:28:14 +0000 Google's dropping H.264 from Chrome a step backward for openness (ars technica) https://lwn.net/Articles/422964/ https://lwn.net/Articles/422964/ drag <div class="FormattedComment"> <font class="QuotedText">&gt; But it's amazing to see how many people are screaming about this decision.</font><br> <p> They attract a lot of Apple users with their articles. <br> <p> Apple says that VP8 is slow and inefficient and H.264 is a standard, so therefore that is true and everybody who disagrees is a moron or a nerd and does not understand how the real world works.<br> </div> Thu, 13 Jan 2011 20:17:13 +0000 VP8 specification https://lwn.net/Articles/422958/ https://lwn.net/Articles/422958/ DonDiego <div class="FormattedComment"> The VP8 specification still has a very long way to go. It is neither complete nor does it agree with libvpx on all accounts. 
The FFmpeg implementors of VP8 had to use libvpx as a reference in many places while doing ffvp8.<br> </div> Thu, 13 Jan 2011 19:47:55 +0000 This is why I never follow Ars links anymore https://lwn.net/Articles/422957/ https://lwn.net/Articles/422957/ Trelane <div class="FormattedComment"> The thing that really got me turned off was the sanctioned MSFT astroturfing.<br> </div> Thu, 13 Jan 2011 19:42:05 +0000