Tuesday, 12 August 2008

Quantity Always Trumps Quality...

... but only if you learn from your experiences. Ryan sent me a link to Jeff Atwood's post, Quantity Always Trumps Quality, via Hey! Heads Up. Go read Jeff's post before continuing with this one. Don't worry, I'll wait.

All done? Good.

The gist of it is that if you write lots of software and learn from your mistakes, you'll become better at writing software. But from my observation, the learning is the hard part, not the writing of software.

My current boss gave me a great analogy about "years of experience" and why it's a really crappy way to evaluate someone. One candidate might have 5 years of experience in job X, and a second might have 20 years. But those 20 years could be the same year repeated 20 times, while the person with 5 years might have had 5 totally different years.

I'm not sure if I've explained it well. The difference between the 20 and the 5 years is whether you learn from your mistakes. If you keep writing software the way you did last year just because it worked last year, you're probably stuck in an experience loop. If you're writing software and asking "how can I do this better / faster / cleaner than last year?", that shows you've moved ahead.

For me, it's a shitty day when I don't learn something new. For example, I've been working with Java for a while, and yesterday I learned that I should be using Integer.valueOf(...) instead of new Integer(...). I didn't learn that because I stumbled across that blog post, but because I was trying out PMD in an effort to learn how to make software better.
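
A minimal sketch of why PMD flags this (the class name here is just for illustration): Integer.valueOf(...) can hand back a cached instance for small values, while new Integer(...) always allocates a fresh object.

    public class ValueOfDemo {
        public static void main(String[] args) {
            // valueOf() may return a cached Integer for small values (-128..127),
            // so repeated calls can reuse the same object instead of allocating.
            Integer a = Integer.valueOf(42);
            Integer b = Integer.valueOf(42);
            System.out.println(a == b);      // true: same cached instance

            // new Integer() always allocates a brand new object.
            Integer c = new Integer(42);
            Integer d = new Integer(42);
            System.out.println(c == d);      // false: two distinct objects

            // Either way, equals() is the right call for comparing values.
            System.out.println(c.equals(d)); // true
        }
    }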

Make Software Better.

4 comments:

  1. Yeah, the title of that link is misleading. If you don't learn from your mistakes, quantity doesn't matter!

  2. I read that article too. And I think it holds a lot of truth. I think the stuff I learned in University was really important. But it's the time I actually spent developing software that I found most beneficial. I remember getting really interested in a couple of my assignments, and asking myself, "how can I make this better". Even though I'm sure a lot of the time the extra effort didn't get me any extra marks, I learned a lot in the process. Writing lots of software, looking at lots of software, and asking yourself, "how can I make this better", can teach you a lot about how to actually do things better.
    BTW, since you mentioned Jeff Atwood, have you been listening to stackoverflow podcast (http://blog.stackoverflow.com/)? I find it pretty interesting, and have learned a few things just from listening to it. Also, it's nice to hear .Net development talked about in a positive light, rather than the regular MS bashing you see on /.

  3. Just to be Devil's Advocate for a minute... :)
    Yes, the point of the article is to write a lot of code and learn from your mistakes. Very good advice. My best learning has been from actually "coding", not "reading" about coding.
    However, doing something "better" is pretty subjective. If you want to look at something from the perspective that really counts (the client, or the finished product that end users will interact with), then the perspective shifts a bit.
    If you're the guy that's got 20yrs of experience doing the same thing year after year, and your code isn't the greatest, and you never make your code cleaner or "better" BUT the systems always meet the expectations of the user, who cares?
    So the guy didn't use Integer.valueOf(...), and instead always creates new objects. One of the posts you linked to has a link to an IBM article talking about how fast Java now is at allocating objects, so the difference between "new" and ".valueOf" would probably never be noticed by anybody other than a developer who thinks this is too inefficient.
    Anyway, I completely agree with your post, and the post you linked to. I just felt like adding a different angle to the conversation. :)
    - Andrew

  4. Ah, you're right. "Better" can mean many things.
    I'm assuming that when people are becoming "better" at coding / designing they are writing clearer, cleaner, and *less* code. They are able to write features and fix bugs faster.
    As you pointed out, the customers / users probably couldn't tell good code from bad, but they can see the $ burn away.
    Let's say 5 years ago (let's talk internet timelines) everyone would give an estimate of 4 weeks to implement a small project. Person x wins the job.
    Person x and the client are happy with their relationship over the last 5 years. Then, to meet an RFP requirement, a project goes out to bid, and everyone else, who kept on learning and figuring out how to write better code, now sends in an estimate of 4 *days* while Person x still says 4 weeks.
    Here is the kicker: with Person x's skill set, it's impossible for them to match the timelines etc. of the other proposals. They are suddenly unemployed.
    The worst thing about this scenario is that the client might feel ripped off for the last 5 years of work without knowing how much extra they spent. They are left with a bitter taste in their mouth.
    Regardless of whether they were actually ripped off, it's the perception that would be worse.
    ...
    As for the valueOf vs. "new" thing, I find it more interesting than applicable. Hopefully the compiler would switch the bytecode of a "new Integer(...)" to "Integer.valueOf(...)" and you'd never know. It's pretty hard to keep up with small optimizations like that unless you're doing something crazy like calculating billions of things and actually need those tweaks.
