How rapidly does programming knowledge really become out-of-date? Do things change so frequently that it has become unreasonable to expect programmers to keep up with the pace of technology? I’m not so sure the pace is really that fast.

A few days ago, Half Sigma posted an article claiming that a career in programming sucks. I responded that no, it doesn’t. In turn, I got quite a few comments supporting Sigma. Several of them were centered around Sigma’s first argument, that programming knowledge becomes obsolete too quickly.

Rapid Obsolescence of Programming Knowledge

In an attempt to prove that programming knowledge has a very short expiration date, it’s easy — and common — to drag out an expired technology (punch cards, the Z80, Turbo Pascal, etc.), point at it, and say, “Look! Look! How does any of that still apply?”

Quite frankly, if you have to drag out punch cards to make your point, you don’t have one. Punch cards died 30 years ago. If that’s the best you can do, the computer science field must not be moving very quickly at all. The Z80 died in the mid-1980s. Even Turbo Pascal was gone before Windows 95 hit the streets. Each of those technologies had more than a 10-year lifespan before it became obsolete. Is it so unreasonable to tell programmers that they need to learn a new technology at least once every ten years?

Hey, remember when plumbers used to use lead-based solder? Plumbing knowledge is obsoleted so fast!

Just as the basics of soldering are partly independent of any specific solder compound, the basics of computer science are separate from any specific technology. Punch cards are just a storage medium, like hard disks, so much of the knowledge gained during the punch card era is still relevant. Similarly, a programmer with good knowledge of the Z80 should be able to map much of that knowledge to modern processors. And a skilled programmer who used Turbo Pascal should still be a skilled programmer in a new language, given a brief acclimation period.

If tomorrow morning we all switch to PowerPC computers, programmed in Python, using some funky crystal matrix for storage, my knowledge will not all be obsolete. Certainly, my knowledge of the more quirky aspects of C++ won’t be useful anymore, but the basics won’t have changed. It’s still a Von Neumann machine. Algorithms are still important. Software design practices are still relevant.
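The point about algorithms can be made concrete. Here’s a minimal sketch of binary search, written in Python only because the hypothetical above mentions it; the algorithm itself is identical on a Z80, an x86, or a PowerPC, and identical whether the data came off a punch card, a hard disk, or a funky crystal matrix:

```python
def binary_search(items, target):
    """Return the index of target in a sorted sequence, or -1 if absent.

    The algorithm is the same on any Von Neumann machine; only the
    surrounding syntax changes from language to language.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# binary_search([1, 3, 5, 7], 5) returns 2
```

A programmer who learned this technique on one platform loses nothing when the platform changes; only the surface syntax needs relearning.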

Nothing fundamental really changes very quickly. The superficial stuff evolves rapidly, but if your knowledge is only superficial, that’s a different problem altogether.

The Example Game, from the Other Side

If we want to play the “look at this example” game, I’ll just trot out C and the x86 architecture. Both of those have been around for more than 30 years, and they’re both still going strong. They’ve been around since before I was born, and I still use both every day at work. Likewise, C++ and IPv4 have both been around for more than 20 years, and show no sign of dying soon. Even Java and PostgreSQL have been around for 10 years now.

There are many examples of technologies that have been around for more than a decade. In fact, I challenge anyone to demonstrate a dead technology that was once popular and considered important, but that didn’t last at least ten years. I doubt there are very many examples. I think there’s a minimum lifespan before anything can even be considered important, if for no other reason than that it takes a while for a technology to become mature enough for business to adopt.

Knowledge Carry-Over

Even when particular technologies die, they still influence future technologies, and so the knowledge base doesn’t disappear or become useless.

Pick any current technology, and you can trace its roots back to previous technologies. C# was influenced by Java. Python was influenced by Lisp. Ruby was influenced by Python, Smalltalk, and Perl.

Knowledge of any predecessor technology will spill over to the newer technology. Certainly, not everything remains relevant, but there’s definitely some knowledge carry-over. If you’re skilled with Java, C# is not a huge leap. Things are different, but not completely alien. If you’re comfortable with functional and object-oriented programming, you can pick up Ruby.
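To illustrate the carry-over, here’s a trivial word-counting sketch, in Python only because the choice is arbitrary; the concept translates almost line for line into Java, C#, or Ruby, with nothing but surface syntax changing:

```python
from collections import Counter

def word_frequencies(text):
    # The concept -- split on whitespace, count occurrences -- carries
    # over directly: C# would use string.Split plus a
    # Dictionary<string, int>; Ruby would use split plus a Hash with a
    # default value. The knowledge is the same; the spelling differs.
    return Counter(text.lower().split())

freqs = word_frequencies("the cat and the hat")
# freqs["the"] is 2, freqs["cat"] is 1
```

A programmer who understands why a hash table is the right structure here already knows that in every one of those languages.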

What Do Employers Want?

Some commenters argued that employers will only hire programmers who are already skilled in the latest technologies. I agree that’s true for some employers. However, I’d argue that it’s not true for most, shouldn’t be true for any, and won’t be true for the employers good programmers should want to work for.

Would I hire an experienced Clipper/dBASE programmer for work on an Oracle project? Yes. Would I hire a good C++ programmer to work on a C# project? Yes.

Good people are far more valuable than specific knowledge. Any decent programmer can learn the syntax and APIs. If you have demonstrated a strong knowledge of database programming, why wouldn’t your knowledge carry over to Oracle? If you are a good C++ programmer, why wouldn’t you be a good C# programmer? If you cannot move from one language to another, then your knowledge is purely superficial, and you are not a good programmer.

If a potential employer cannot recognize that a good programmer is much more valuable than a mediocre programmer “skilled” in the latest buzzwords, then you don’t want to work there. You will almost certainly not be treated well, because the employer clearly doesn’t understand the value of a good programmer.

If your boss doesn’t want you to learn new technologies, then you’ve got a bad boss. Your boss should want, and expect, you to be constantly learning. What kind of idiot thinks that he’s hiring programmers with all the knowledge they’ll ever need?

Technology Changes

You people complaining about the obsolescence of knowledge sound like Luddites. Frankly, it sounds like you’re afraid of progress and unwilling to learn new technologies. You picked a fast-moving field. Accept that some of your specific knowledge will be subject to attrition. Knowing a particular object-oriented language’s syntax, for example, is transient knowledge. Understanding how to program using good object-oriented methodologies is not.
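As a sketch of that distinction (in Python, though any object-oriented language would do): the `class` and `def` keywords are the transient part, while the idea of programming against an interface rather than a concrete type carries over to every object-oriented language.

```python
class Shape:
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

# Polymorphism: the caller depends on the Shape interface, not on the
# concrete types. That design idea is the durable knowledge; the syntax
# for expressing it in C++, Java, C#, or Ruby is the disposable part.
total = sum(shape.area() for shape in [Circle(1), Square(2)])
```

Rewrite this in any of those languages and the keywords change, but the design, and the reasons for it, do not.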

Expect to learn new technologies. It’s part of the field. Learn them at work, or learn them on your own time. But don’t complain to me that your knowledge of Clipper isn’t useful anymore because the world now uses Oracle. If you didn’t learn anything useful while you were using Clipper that would be applicable to Oracle, then you probably didn’t do anything useful while working with Clipper.

Now, I’m not going to pretend that the computer science field doesn’t have any problems. Certainly it has problems. Life has problems. That’s just the way things are. But pretending that the problems are insurmountable doesn’t help anything. Pretending that technology evolves so fast that no one could possibly keep up long term is just a way of hiding the fact that you aren’t interested in keeping up.

If you want to switch fields because you can’t or won’t keep up with the pace of technology, please do so. If you find something you love, that’s far better than doing something you don’t care about. And if you find a field where your knowledge base never needs to evolve, let me know. I’ll be sure to pass the news on to others who don’t want to program anymore. I have to say, though, I can’t think of a faster way to obsolete all your knowledge than by switching to a different field.

8 Comments on “How Quickly Does Programming Knowledge Become Obsolete?”

  1. William Furr Says:

    Bravo!

  2. Joe Bochinski Says:

    Owned.

  3. Chris R Says:

    Found this blog as I’m trying to find out if Oracle can export in ancient punchcard format. This is because a mandatory insurance reporting bureau is still requiring punchcard. And I don’t know whether I have to signify negative digits–character by character–over the whole number, or just the first significant digit and/or including leading zeroes. This technology covers more time than I’ve been alive.

    The green screen is alive and well—just ask your bank.

  4. vick Says:

Well, I’m majoring in computer science at a CSU, but I still have my doubts, as most companies want an “experienced programmer”. Believe me, I’ve tried many companies, such as HP, Raytheon, Boeing, and Northrop, and all of them will choose the programmer that already has the latest technology. They said I “had” to know ADA programming, and that I couldn’t just apply my experience with C++, Java, and assembly language to any other language.

Plus, the above fella didn’t mention anything about outsourcing. Most of the world’s computer programmers reside in foreign countries, doing all the programming for Americans. What’s the point of learning to program, even when you’re good at it, when a person in another country is willing to take the same job for next to NOTHING? Trust me, it happens. Why else do you think BILL GATES lobbied Congress to approve 65,000 visas for software engineers every year?

  5. Derek Park Says:

    All companies want experience. That doesn’t mean they won’t hire people who don’t have it. You can also get experience without working a full-time job. Do internships. Take relevant classes. Study on your own time.

    Also, Ada? When were you applying for jobs, 1988? I have trouble believing that you were applying for Ada jobs at all, but even more trouble believing you were turned down for lack of experience. I’ve never touched Ada, and I’m 99.9% sure I could get a job programming Ada if I wanted/needed to. But I wouldn’t, because even the DoD is moving away from Ada.

    Outsourcing is an overhyped concern. People have been worried about outsourcing for years, and the wholesale outsourcing of programming is yet to be seen. Good overseas programmers are cheaper than Americans, but they are not free, and there are very real communication costs (which turn into monetary costs).

    If you want to be a programmer, I’d recommend getting some kind of projects under your belt, either on your own or with an employer. Go to your professors and see if any of them have interesting work, or can direct you to some. My first jobs were all from networking through my professors.

  6. Dustin Says:

    Actually, some colleagues of mine who worked at some rather large defense contractors within the last 3 years or so have told me that ADA is still alive and well in those environments — mostly for maintaining existing code. That they would be advertising for ADA programmers isn’t a stretch of the imagination at all. Of course, these companies would probably be an example of the type of place you’d want to avoid…

  7. john Says:

    I had to post because the Chris R comment made me lmao…
But all knowledge scenarios are covered nowadays. If you’re going corporate with large teams, they want you to be able to learn as a prerequisite, but of course there are those that need you to be THEIR resource. That creates a specific need for particular up-to-date languages, or even the legacy code they still maintain or need to port…
    All answers are correct…

  8. Dave Says:

    Ada is well known for software that just HAS to work.

    Air Traffic Management Systems
    Commercial Aviation
    Railway Transportation
    Commercial Rockets
    Commercial Imaging Space Vehicles
    Communication and Navigational Satellites and Receivers
    Data Communications
    Scientific Space Vehicles
    Desktop and Web Applications
    Banking and Financial Systems
    Information Systems
    Commercial Shipboard Control Systems
    Television/Entertainment Industry
    Medical Industry
    General Industry
    Military Applications

    http://www.seas.gwu.edu/~mfeldman/ada-project-summary.html

And now that it’s FLOSS, it’s much more of a draw to programmers who are just tired of the shoddy practices of other languages. Ada was clearly designed for software engineering, and it excels at it. Ada will be around for a long time yet; it is alive and well, still a preferred high-level language despite the plethora of other languages out there in the wild. Better to have a compiler that complains than a binary that causes death and injury.

    http://adacore.com
    http://libre.adacore.com
    http://adaic.org

    FWIW my company’s next commercial project WILL be using Ada after very careful consideration of the alternatives.