Thursday, March 27, 2003

Programming in Poland, and the end of Moore's law

Brian Micklethwait has a very interesting piece entitled "The Polish Software Miracle", which discusses just why so many smart and capable programmers and technical people have come and are coming out of Poland. I will add that it isn't just Poland: there are Russian, Bulgarian, and other nationalities in there as well. (The Slavic computer expert with an accent like Count Dracula who knows things about Unix that the rest of us cannot imagine and who we suspect may want to take over the world is something of a cliche in the IT world, and is parodied hilariously in the character of Pitr in the User Friendly comic strip). But I am digressing. There was one particular point that Brian made that I wanted to address.

It so happens that I have a tiny moment of experience of these people, because I visited Warsaw in 1986, where I was supposed to collect information about the computer hardware needs of the Polish political underground. I was as completely out of my depth as I have ever been in my life. Talk about level of incompetence. These guys knew more about computers then than I will ever know.

I don't believe it mattered, because the message I took back to London was very simple. Just send us anything you can, they said. Whatever you send, we'll get it working, they said.

So I learned then of the nascent Polish computer software miracle, and I also learned the reason for it. At that time, computer hardware in places like London was rocketing forward, leaping ahead in power, plunging in price, much as it has been doing ever since. Not so in Poland. Hardware there was called "hardware" because it was so hard to come by, and once you got your hands on a computer, you made it do things scarcely dreamed of outside Silicon Valley. If you were Polish in 1986, for example, you made a laser printer print out the Polish alphabet. Only God and the Poles knew how you made that happen, then, if the thing wouldn't do it already. Thus the Eastern European software miracle. These guys were and still are largely self-taught.

By the mid 1970s, the computer industry and computer science were quite advanced. Mainframes and minicomputers had substantial amounts of memory, ran sophisticated operating systems, and were programmed in high-level languages. Then, in Silicon Valley and Texas in the mid 1970s, the microcomputer was invented. Although these were real computers, they were incredibly unsophisticated compared to the state of the art in the minicomputer and mainframe world. They had very little computing power, tiny amounts of memory, no hard discs or anything similar, and they lacked the power to run anything but the simplest of operating systems or languages.

To get microcomputers to do anything useful, it was necessary for programmers to gain a profound understanding of how the computers worked and were designed. It was also necessary to learn how to program incredibly efficiently and write extraordinarily tight code.

And people did this. They got these computers, which were incredibly lacking in power, to do things that their manufacturers would never have imagined. Remarkably, if you have to, it is possible to write a word processor that runs in three kilobytes of memory, or quite sophisticated games that run in even less. Some of the software that was developed was extraordinary. The way in which such software was developed was different from the way in which the minicomputer and mainframe world developed software at the time, which had become a larger-scale industrial process. (I suspect that there was a similar "guru" period at the beginnings of the mainframe and mini industries too, but I do not know very much about this).
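To give a flavour of what "tight code" meant in practice, here is a minimal sketch in modern C (the names and numbers are invented for illustration, not taken from any actual period program): when an entire word processor has to fit in three kilobytes, per-character attributes such as bold and underline cannot each be given their own variable, so they get packed into the bits of a single byte.

#include <stdio.h>
#include <stdint.h>

/* Illustrative only: pack several per-character attributes into one byte,
   the kind of space-scrimping trick tiny word processors depended on. */
#define ATTR_BOLD      0x01
#define ATTR_UNDERLINE 0x02
#define ATTR_REVERSE   0x04

static void set_attr(uint8_t *cell, uint8_t attr)   { *cell |= attr; }
static void clear_attr(uint8_t *cell, uint8_t attr) { *cell &= (uint8_t)~attr; }
static int  has_attr(uint8_t cell, uint8_t attr)    { return (cell & attr) != 0; }

int main(void)
{
    uint8_t cell = 0;   /* one byte of attributes per character on screen */
    set_attr(&cell, ATTR_BOLD | ATTR_UNDERLINE);
    clear_attr(&cell, ATTR_UNDERLINE);
    printf("bold=%d underline=%d reverse=%d\n",
           has_attr(cell, ATTR_BOLD),
           has_attr(cell, ATTR_UNDERLINE),
           has_attr(cell, ATTR_REVERSE));
    return 0;
}

Eight flags in one byte rather than eight bytes is a meaningless saving on a modern machine, but multiplied across every character on screen it is the difference between a program that fits in three kilobytes and one that does not.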

This world did not last long. Moore's Law worked its magic, and microcomputers became twice as powerful every 18 months. One consequence was that there was less need for efficiency. If you wrote an inefficient program today, one that ran slowly and pushed the absolute limits of today's hardware, then within a very short while computers would be so much more powerful that it would run quickly on the new generation of hardware, regardless of how inefficiently it worked. You didn't have to find ways of making programs more efficient, because there were always more resources today than yesterday. Moore's Law meant that programmers no longer needed to be smart, and as a consequence they weren't. What is known as "software bloat" occurred. New versions of software that in a previous version had used up ten kilobytes of memory used up ten megabytes. They may have run twice as slowly as the old version, but as the user's computer was four times as fast, there was still at least some illusion that progress was being made. I am not sure this is a good trend. One consequence is that nobody studies the old code and figures out where it is inefficient and why, and so programs are often buggy and less stable. If resources are precious, then you cannot afford for there to be anything in the program that the programmer does not understand.
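The arithmetic behind that doubling is worth spelling out, because it compounds faster than intuition suggests. Here is a rough sketch (the 18-month figure is the rule of thumb quoted above; the rest is purely illustrative):

#include <stdio.h>
#include <math.h>

/* Rule of thumb: power doubles every 18 months,
   so the growth factor after m months is 2^(m / 18). */
int main(void)
{
    const double doubling_months = 18.0;
    for (int years = 2; years <= 10; years += 2) {
        double factor = pow(2.0, (years * 12.0) / doubling_months);
        printf("after %2d years: roughly %.1fx the power\n", years, factor);
    }
    return 0;
}

That works out at roughly a hundredfold increase per decade, which goes a long way towards explaining how a program can swell from ten kilobytes to ten megabytes over a few versions and still feel like progress: the hardware soaks up a great deal of the bloat as fast as it is created.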

This is what happens when hardware is plentiful. In Poland, and in the rest of the Eastern bloc, hardware was scarce. People had to write code themselves, and if they wanted to add new features, they probably had to take old features away to make space. If they wanted the computer to do things it was not designed to do, they had to get extremely close to the machine. What does this lead to? Really good programmers, but unconventional programmers who do not do things the way people are taught to do them. Something very similar to what came into being in the 1970s when microcomputers were first invented, and something quite similar to what also came into existence in the 1940s and 1950s. In recent months and years, something similar has been the case with software for cellphones, which have far less computing power than PCs do. (Interestingly, some of the extremely efficient software from the 1970s and early 1980s - mostly games - has actually been recycled for use on cellphones).

In the last five years, there has been a slight slowdown in this trend. This is not because Moore's Law has stopped, but because the key factor determining speed for most PC applications has ceased to be the power of the CPU or the amount of memory and has instead become the speed of the internet connection. A five-year-old computer is generally fine for most internet applications, because the average internet connection, although faster than it was five years ago, is not that much faster. Rather than buying a new PC every two years, people are now buying a new PC every three or four years. (In addition, the average cost of PCs sold has dropped dramatically, whereas previously prices had stayed about the same while power increased. People are aware that they do not need the latest state-of-the-art CPU, and are thus buying computers further from the cutting edge). Because of this, software companies are less able to assume that everyone will have more power and memory next year, and I think that the average quality of software has improved as a result. Certainly the latest versions of Microsoft Windows are more efficient, more stable, and less bloated than was the case a few years back. And I think what I have just explained is a factor in this.

But this is a temporary setback. At some point in the next few years I think we will go back to upgrading at the pace set by Moore's Law. Moore's Law looks like it will continue for at least a couple of decades, and the ingenuity of hardware engineers may ensure that it goes further than that, but the laws of physics do put finite limits on how far it can go. At some point it must stop. When it does, programmers will no longer be able to rely on next year's computers being faster and having more memory than this year's, and will instead have to face the fact that improving efficiency and intelligence is the only way to improve their products and develop new applications. The first way they will do this is by cutting through fifty years of software bloat, and that will be a lot. They will have to make their programs cleverer and smarter. They will have to understand their hardware better. We will be back to another version of Poland in the 1980s. It will not be quite the same, because the speeds and amounts of memory being worked with will be greater by a factor with a great many extra zeros on the end, but it will still be something similar. And it will certainly be an interesting time. However, it is still at least a few decades away.
