There is a way in which OTP does benefit from standardization: suppose you buy a ton of tokens, give them out to your users, send the users around the world, and then the server software starts crashing. You might eventually give up on the server-side "one" in the one-to-one interaction and want to replace it with software from a vendor that writes more reliable code, without having to replace all the tokens in the field.
There's also the aspect that interoperability testing is a good way of catching implementation mistakes: the vendor may have designed a secure method, but shipped bugs in both the token and the server that make the implementation deviate from that method. If there's an independent implementation of each side, chances are someone will notice when one of them fails to interoperate. (I could imagine a buggy system truncating the shared secret at a hash input block size instead of padding it, accidentally discarding all but a few bits of the secret, and, because the same code runs on both sides, never noticing that there are only a few possible passwords to try at any given time.)
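A minimal sketch of how a matched bug like that can hide. The `hotp` function below follows RFC 4226; `buggy_hotp` is a hypothetical broken variant (my invention, not from any real product) that truncates the shared secret to two bytes before use. A buggy token and a buggy server still agree with each other on every code, so the flaw is invisible in normal operation, even though the effective keyspace has shrunk to 65,536 secrets; only checking against an independent, correct implementation exposes the mismatch.

```python
import hmac
import hashlib
import struct

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HOTP per RFC 4226: HMAC-SHA1 over the counter, dynamically truncated."""
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF)
    return f"{code % 10 ** digits:0{digits}d}"

def buggy_hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """Hypothetical broken variant: silently keeps only 2 bytes of the secret."""
    return hotp(key[:2], counter, digits)

secret = b"12345678901234567890"  # RFC 4226 test-vector secret

# Buggy token and buggy server run the same code, so their codes always match
# and the bug never surfaces in one-vendor testing.
assert buggy_hotp(secret, 1) == buggy_hotp(secret, 1)

# An independent, correct implementation disagrees, exposing the bug.
print(hotp(secret, 1), buggy_hotp(secret, 1))
```

This is exactly the kind of mistake a standard with published test vectors catches for free: RFC 4226 Appendix D lists expected codes (counter 0 yields `755224` for this secret), and the truncating implementation fails them immediately.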