Several months ago I ran across a blog rant titled "Why a career in computer programming sucks" by an aging CS geek. While I don't agree with some of his points, I found that the kernel of his first paragraph really held some truth:
Computer programming is a job that’s heavily dependent on temporary knowledge capital. It’s temporary because the powers that be keep changing the languages and tools that programmers need to do their jobs. In nearly all other professions, knowledge capital increases as you grow older because you keep learning more about your field. But in computer programming, the old knowledge becomes completely obsolete and useless.
I've been in CS for more than 11 years now, and every time I've changed jobs I've found myself scrambling to learn new tools, operating systems, and libraries. This is despite the fact that I've intentionally steered my career within the narrow bounds of computational science and engineering on Unix, with C and C++ as my languages of preference. I started out developing on little purple and blue Irix workstations, but when SGI execs flew their company into the ground I migrated to Sun. On both of those systems I eventually became proficient enough to help with system administration, as well as Rational ClearCase administration.
In my next job much of that knowledge was useless, and I had to quickly learn Red Hat Linux, CVS, and a variety of obscure cluster utilities. Suddenly the code I was writing was Perl, not C++. Today I'm still on Linux, but it's Fedora. Patchadd became up2date, which became yum. The clusters I use have completely different queuing utilities, so I'm starting from scratch again. I'm back to writing C++, but now it's in Visual Studio on Windows XP. Raw OpenGL has given way to OpenSceneGraph. Motif GUIs are obsolete, so these days I use FLTK or another cross-platform toolkit. You get the picture.
My jobs have all been interesting and I like to learn, but sometimes I wonder how many more of these paradigm shifts I can weather. Will I be able to pick up C-Plus-Plus-Cubed when I'm 49? Even though some of these technologies are similar (flavors of Unix like Linux, Solaris, and Irix share many common commands), it's the last 10% (the patch administration, the dark corners of the development tools, the system libraries) that takes a couple of years to master. My Dad is a Mechanical Engineer. He is 70, and he still uses the same solid mechanics equations and techniques he learned in 1960.
The author tries to score a few more hits on the CS profession, but most of these fall flat:
- Low prestige - I don't care about prestige; my pay has been relatively good, and the engineers and other coworkers have generally treated me respectfully. I've accepted that I'm a nerd.
- Outsourcing - Outsourcing and economic competition are a fact of life in the global economy, especially for something as intangible as software. Get a security clearance; defense-related software development cannot be outsourced.
- Project management sucks too - Yes, but somebody has to do it. This is certainly not limited to software or IT projects.
- Working conditions - Not that bad. I've had some pretty good (albeit shared) offices. I also generally get to order my own workstations with custom hardware.
The final recommendation compares the lowly IT position unfavorably with a career in law. But a JD is a graduate degree; wouldn't it be fairer to compare against an M.S. or Ph.D. in Computer Science? Armed with a graduate degree, a CS major is more likely to have specialized skills that confer better job security, working conditions, and prestige.
At least until he gets too old to learn...