I stumbled upon the Programmers at Work postings (they have been there over a year - shame, shame).
These are interviews with people who were successful programmers at about the time I first laid hands on a keyboard (on a ZX Spectrum+).
It was a time when computers were very few and very far between in Greece (back then only one other friend of mine owned a ZX81, and it was years before I had access to an IBM PC - nowadays it's hard to find anyone between 15 and 40 who does not own a computer). Thinking about it, this means that my generation is the last one that knows what it means to live without a computer.
Maybe that is why John Warnock's (of Adobe fame) words from his 1986 interview resonate so much with me:
I went through the university, all the way to the master’s level, so I got a good, solid liberal education. I believe it’s really important to have a very solid foundation in mathematics, English, and the basic sciences. Then, when you become a graduate student, it’s okay to learn as much as you can about computers.
If you really want to be successful, being acculturated to the rest of the society and then going into computers is a much more reasonable way to approach the problem.
Substitute your own language for English there and you pretty much hit the nail on the head.
I have always believed that a good broad theoretical education is essential, more so for a software engineer. It is one of the reasons why I did not lament the sorry state of the labs and the lack of equipment in my university. Our profs could not let us loose in the labs to play, so they crammed us with theory, and to hell with knowing how to use a soldering iron.
Learn to solve problems with pen and paper and learn as much theory as you can. Learn the way problems are solved, how to think in search of a solution. Computer science moves so fast that by the time you have your degree, the libraries you know, the programming language you learned and your tools will be, if not obsolete, then going out of fashion very fast.
I will go as far as to say that engineers (software, civil, electrical etc.) and scientists (maths, physics, chemistry) should go out and read some history, anthropology and political science. Learn a second language. Get as broad a theoretical basis as possible before you specialize in anything.
In the same batch of 1986 interviews Butler Lampson says much the same thing (I find it funny as he more or less says that undergraduate CS is not "respectable"):
LAMPSON: I used to think that undergraduate computer-science education was bad, and that it should be outlawed. Recently I realized that position isn’t reasonable. An undergraduate degree in computer science is a perfectly respectable professional degree, just like electrical engineering or business administration. But I do think it’s a serious mistake to take an undergraduate degree in computer science if you intend to study it in graduate school.
LAMPSON: Because most of what you learn won’t have any long-term significance. You won’t learn new ways of using your mind, which does you more good than learning the details of how to write a compiler, which is what you’re likely to get from undergraduate computer science. I think the world would be much better off if all the graduate computer-science departments would get together and agree not to accept anybody with a bachelor’s degree in computer science. Those people should be required to take a remedial year to learn something respectable like mathematics or history, before going on to graduate-level computer science. However, I don’t see that happening.
A broad theoretical education gives you adaptability, multiple viewpoints on a problem, and, when (not if) your specialization field disappears from the job radar, the ability to pick a new field. Simply put, don't put all your eggs in one basket.
It may be that this viewpoint is fast becoming utopian. Computer science has exploded, and the amount of knowledge accumulated (and accumulating) requires more time than a post-graduate degree. On the other hand, optics helps you understand image editing algorithms, statistics is used heavily in natural language processing, group psychology and dynamics are key to applying development processes and building the next social web wonder, and maths is in everything from compressing audio and video to encrypting communications. Not knowing the science amounts to being a user, and users do not solve problems (1).
Looks to me like a good computer science undergraduate degree should have less programming and more science in the curriculum.
But as university degrees become aligned with current industry needs, cranking out work-ready programmers versed in the current hot technology becomes more important than educating scientists with a solid knowledge of the hows and whys, which to me seems shortsighted and plain stupid. Butler Lampson had a few choice words about this in 1986:
INTERVIEWER: So how should we prepare ourselves for the future?
LAMPSON: To hell with computer literacy. It’s absolutely ridiculous. Study mathematics. Learn to think. Read. Write. These things are of more enduring value. Learn how to prove theorems: A lot of evidence has accumulated over the centuries that suggests this skill is transferable to many other things. To study only BASIC programming is absurd.
So yeah, leave the keyboard be. Pick up a book, a pencil and a few sheets of paper. See if you can figure out how much money your bank account will have in 6 years without a calculator.
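(If you want to check your pen-and-paper answer afterwards, the exercise is plain compound interest: a balance P at annual rate r grows to P(1+r)^n after n years, and the rule of 72 gives a quick mental estimate of the doubling time. A minimal sketch, with made-up figures for the balance and the rate:)

```python
# Compound interest, the formula behind the pen-and-paper exercise:
# final = principal * (1 + rate) ** years
# The principal and rate below are made-up illustration figures.

principal = 1000.0   # starting balance
rate = 0.05          # 5% annual interest
years = 6

final = principal * (1 + rate) ** years
print(round(final, 2))

# Mental shortcut (rule of 72): money doubles in roughly 72 / (rate in %)
# years - at 5% that is about 14.4 years, so after 6 years the balance
# should still be well under double the principal.
doubling_time = 72 / (rate * 100)
print(round(doubling_time, 1))
```

The point of the exercise, of course, is that the closed-form formula and the rule of 72 let you get there with paper alone.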
(1) They create them :).