I like programming languages and I like compilers. I especially like how the less-is-more rule applies to programming languages: whatever is not exposed to the programmer, the compiler and the run-time environment are free to do whatever they want with. For example, a language with pointer arithmetic cannot (easily) have a compacting garbage collector, while a language without pointer arithmetic can. The same goes for the optimizations the compiler is allowed to perform.
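To make the pointer-arithmetic point concrete, here is a rough sketch (my own illustration in C++, not tied to any particular language implementation): once an address has been offset or stashed in a plain integer, a compacting collector can no longer reliably find and update it when it moves the underlying object.

    #include <cstdint>
    #include <cstdlib>

    int main() {
        // Imagine this allocation is managed by a garbage collector.
        int* block = static_cast<int*>(std::malloc(100 * sizeof(int)));

        // Pointer arithmetic: a pointer into the middle of the block...
        int* middle = block + 50;

        // ...or the address hidden in a plain integer.
        std::uintptr_t hidden = reinterpret_cast<std::uintptr_t>(block);

        // If a compacting collector moved 'block', it would have to find and
        // rewrite 'middle' and 'hidden' as well. 'hidden' looks like any other
        // integer, so the collector cannot safely relocate the object -- which
        // is why unrestricted pointer arithmetic makes compaction hard.
        *middle = static_cast<int>(hidden);

        std::free(block);
        return 0;
    }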
In a language where the memory layout is implementation-defined, the compiler can optimize the layout of data structures so that they are ideal for the application. For instance, using run-time profiling information, the compiler can opt for a tree-based map instead of a hash-based map, or select a plain array of bools instead of an array of ints addressed bit-wise.
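As a hedged sketch of the bool-array example (the types and sizes are assumptions of mine, purely for illustration): the same logical array of flags can be laid out as one byte per element or packed bit-wise into machine words, and a compiler that owns the layout is free to pick either based on profiling.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Layout A: one byte per flag -- simple accesses, 8x the memory.
    struct ByteFlags {
        std::vector<std::uint8_t> data;
        explicit ByteFlags(std::size_t n) : data(n, 0) {}
        bool get(std::size_t i) const { return data[i] != 0; }
        void set(std::size_t i, bool v) { data[i] = v ? 1 : 0; }
    };

    // Layout B: flags packed bit-wise into 64-bit words -- denser and more
    // cache-friendly, at the cost of a little shifting and masking per access.
    struct BitFlags {
        std::vector<std::uint64_t> words;
        explicit BitFlags(std::size_t n) : words((n + 63) / 64, 0) {}
        bool get(std::size_t i) const {
            return (words[i / 64] >> (i % 64)) & 1u;
        }
        void set(std::size_t i, bool v) {
            if (v) words[i / 64] |=  (std::uint64_t{1} << (i % 64));
            else   words[i / 64] &= ~(std::uint64_t{1} << (i % 64));
        }
    };

Both structs expose the same logical interface; in a language where the layout is implementation-defined, the compiler could choose between them (or between a hash map and a tree map) from run-time profiles, without the programmer changing a line of code.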
However, this requires us programmers to let go of some of our control -- if we want the memory layout to be optimized by the compiler, then we can't control it ourselves. In theory, the resulting application should be slower, since the programmer usually knows more about the application than the compiler does. In practice, however, I believe the application will perform better, because the programmer rarely takes advantage of that knowledge to improve performance (it takes too much time and effort).
So, if we accept the fact that programmers are lazy (and compilers aren't) and design a language with this in mind, we would see applications with higher performance. This is especially true considering that programmers are supposed to make it work, make it right, and make it fast (in that order), and applications rarely reach the make-it-fast phase.
If you had asked me a few years ago, I wouldn't have thought that performance was such a big issue -- there is enough computing power in an average computer anyway. But considering that everything is turning into a computer application (I can book laundry slots using the remote control of my TV), it is nothing other than waste to not use that computing power properly.
I use the word waste here in an environmental sense. For instance, assume an application that runs on several million computers over 5 years had 25% better cache performance -- how much less energy would that consume? Also, assume that those computers are mostly servers located in huge server rooms -- how much less cooling would be needed? Maybe it would be possible to have fewer computers running in the server room, so that the building containing them could be smaller...
Seeing how more and more is driven by software, we should look more closely at how we use the available computing power. We don't want to waste such an incredible resource.