It seems to me that the problem here isn't the robustness principle per se, but that any flexibility comes at a cost. It's harder to write code that interprets variable-length headers than fixed-length ones, for example. And that flexibility is also costly because it allows other people to complicate things.
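To make that cost concrete, here is a minimal sketch in Python, assuming a made-up protocol with an 8-byte fixed header on one side and a TLV-style options area on the other; none of the field names or layouts come from any real standard. The fixed parser is one unpack call; the variable one needs a loop, bounds checks, and a policy for unknown option types.

```python
import struct

# Hypothetical formats, purely to illustrate the cost of flexibility;
# neither layout is taken from a real protocol.

FIXED_FMT = ">HHI"  # type, flags, length: always 8 bytes

def parse_fixed(buf: bytes):
    # One size check, one unpack call.
    if len(buf) < struct.calcsize(FIXED_FMT):
        raise ValueError("short header")
    return struct.unpack_from(FIXED_FMT, buf)

def parse_variable(buf: bytes):
    # 4-byte preamble followed by TLV options: a loop, bounds checks on
    # every option, and a policy decision about unknown option types.
    if len(buf) < 4:
        raise ValueError("short header")
    msg_type, opt_len = struct.unpack_from(">HH", buf)
    options, offset, end = {}, 4, 4 + opt_len
    if end > len(buf):
        raise ValueError("options overrun the buffer")
    while offset < end:
        if end - offset < 2:
            raise ValueError("truncated option header")
        opt_type, length = buf[offset], buf[offset + 1]
        if offset + 2 + length > end:
            raise ValueError("option value overruns the option area")
        value = buf[offset + 2:offset + 2 + length]
        options[opt_type] = value  # unknown types tolerated: liberal in what we accept
        offset += 2 + length
    return msg_type, options
```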
The cost is incurred by those trying to obey the robustness principle. Those who assume that a reserved field can be used for their own ends aren't obeying it, but they're passing the likelihood of failure onto the people who do obey it. As Eric says, reserved fields have been reused for proprietary vendors' own purposes, usually to exclude other implementations that don't make the same assumptions (i.e. are more liberal) about that field's meaning. This is not the fault of the robustness principle per se; it's ultimately the fault of the abuser of the standard, even if they then want to hide behind it and say 'well, the standard doesn't say what that reserved field is for'.
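As a toy illustration (again in Python, with a hypothetical header layout and a made-up magic value, not anything from a real spec), the compliant receiver below simply ignores the reserved byte, while the vendor's receiver only accepts packets carrying its own marker, rejecting every implementation that zeroed the field as the standard intended.

```python
RESERVED_OFFSET = 6   # hypothetical: byte 6 is "reserved, must be sent as zero"
VENDOR_MAGIC = 0x42   # made-up proprietary marker

def compliant_receiver(header: bytes) -> bool:
    # Obeys the robustness principle: liberal in what it accepts, so it
    # ignores the reserved byte entirely and interoperates with everyone.
    return len(header) > RESERVED_OFFSET

def vendor_receiver(header: bytes) -> bool:
    # Repurposes the reserved byte and rejects anything without its magic
    # value, silently excluding the implementations that set it to zero.
    return len(header) > RESERVED_OFFSET and header[RESERVED_OFFSET] == VENDOR_MAGIC

standard_header = bytes(8)                              # reserved byte zero, per the spec
vendor_header = bytes(6) + bytes([VENDOR_MAGIC]) + bytes(1)

assert compliant_receiver(standard_header) and compliant_receiver(vendor_header)
assert vendor_receiver(vendor_header) and not vendor_receiver(standard_header)
```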
Ultimately the process of setting these protocol standards and their implementation has to take place in the open. Trying to bastardise the standard for corporate gain may work in the short term but - crucially - it doesn't invalidate the original standard or the other implementations. Sometimes these abusive implementations gain some traction, but rarely because of their changes to the standard per se - usually it's more because a large company is foisting its adware on everyone else. The implementations that survive, ultimately, are the ones that are flexible and obey the robustness principle.