...as long as you normalize against language, etc. In this case, LOC is used as a relative metric. The effort required to produce 100 LOC of C for the kernel differs from the effort required to produce 100 LOC of, say, Ruby for a web app.
I saw a study long ago with the remarkable result that there is nothing to normalize here. It looked specifically at the cost to develop and test new software, and found that 100 LOC costs the same regardless of the language or subject. What I've seen since is consistent with that.
The study did find a few variables that added precision to a LOC-based estimate. For modifications to existing code, certain measurements of the code base helped; I think the number of files touched added precision as well.
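To make the shape of such an estimate concrete, here is a minimal sketch of a LOC-based effort model of the kind described above. The coefficients and the function name are purely illustrative assumptions, not values from the study; the point is only that cost scales with LOC, with a factor like files touched refining the estimate.

```python
def estimate_effort_hours(new_loc, files_touched=1):
    """Estimate development-and-test effort in person-hours.

    Hypothetical model: a flat cost per line plus a per-file overhead.
    Both constants below are invented for illustration only.
    """
    HOURS_PER_LOC = 0.3    # illustrative constant, not from the study
    HOURS_PER_FILE = 2.0   # illustrative constant, not from the study
    return new_loc * HOURS_PER_LOC + files_touched * HOURS_PER_FILE

# A 100-LOC change touching 3 files:
print(estimate_effort_hours(100, 3))  # 36.0
```

Note that the per-line constant is deliberately the same whatever the language, reflecting the study's claim that there is nothing to normalize.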
Copyright © 2017, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds