These data points are then studied, analyzed, and used to predict the future. If the future is successfully predicted, we have a workable scientific theory.
It really annoys me when people say "that's just an anecdote". No it isn't! It's a data point! It may be an outlier, it may be invalid, it may be wrong, but it is also well known that people will dismiss inconvenient data points without seriously trying to understand them. Feynman gives a very good example of a rogue data point setting quantum mechanics (or something like that) back 20 years: nobody bothered to check whether any of the data points were rogue, so the "wrong" answer was used and skewed everything built on it until the problem was finally spotted, MANY years down the line.
Anyway, my take on this high/low-level stuff: for any given programmer, productivity measured in LOC is roughly constant regardless of the language. So if a task takes 10 lines of C or 100 lines of C++, use C; you'll get it done in one tenth the time. On the other hand, if it takes 10 lines of C++ or 100 lines of C... you get my drift. A sketch of that second case follows.
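To make that concrete, here's a minimal sketch (my own illustration, not from the original post): printing the distinct words from stdin in sorted order is a handful of lines in C++, because std::set does the storage, ordering, and deduplication. The equivalent C needs a growable array, a qsort comparator, manual string copies, and a dedup pass, easily several times the code.

    #include <iostream>
    #include <set>
    #include <string>

    // Read whitespace-separated words from stdin and print each
    // distinct word once, in sorted order. std::set handles the
    // storage, ordering, and duplicate removal for us.
    int main() {
        std::set<std::string> words;
        std::string w;
        while (std::cin >> w)
            words.insert(w);
        for (const std::string& word : words)
            std::cout << word << '\n';
        return 0;
    }

Run it as, say, ./uniqwords < input.txt (the program name is hypothetical). The point isn't that C++ always wins; it's that if LOC per day is roughly fixed, whichever language makes the program shorter wins on development time.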