A few weeks back, at yet another conference, I watched a presentation about a coding club for young people: an after-school, extra-curricular activity where popular coding frameworks are taught so that those who attend regularly can build up their skills, perhaps going on to develop their own applications. Now, listening to the testimony of those involved tells a compelling story: young people with no previous experience of coding turning out mobile applications, even thinking about how to turn these early exercises into entrepreneurial endeavors, and gaining new confidence and competencies in the process. It would be hard to raise your hand against that, wouldn't it? Yet this and similar projects leave me somewhat conflicted, and here is why.

    On one hand, I can pretty much put my entire career trajectory down to the casual decision, made well over 20 years ago, to borrow some books from the technical library at the company where I was temping. I borrowed just two: one on SQL and the other on HTML. Just those two books, and the exercises within, allowed me not only to swap my temp status for a permanent one but to get my first development job in the early days of the consumer web and to remain a full-time developer for the following decade. I should be the poster boy for coding clubs. Yet, I'm not.

    For on the other hand, I come from what would be referred to in the US as a "liberal arts" background. That is to say, I studied subjects that these coding clubs of today wouldn't expect their graduates to follow: sociology, linguistics, film and television production, script writing, and political science come immediately to mind. Sure, when I entered the employment market in the trough of an economic downturn (albeit nothing of the scale we've seen in the UK post-2008), these skills didn't exactly make me especially employable; hence, the temping job. The fact that learning to write code made me immediately employable should be the object lesson here, right?

    I hear all the time about skill shortages in tech. It's this sort of thing that, on the face of it, makes coding clubs look like part of the supply chain of young developers who will graduate to meet that shortage. However, the skills shortage isn't just one of young, largely theoretical developers (though that shortage does exist). It's a shortage of skilled project managers, business analysts, digital marketing practitioners (the list goes on), but we are somehow in thrall to "rockstar developers" as the first and only resource requirement.

    There are two dangers we need to ensure we don't expose young people to. Firstly, there's a tendency toward being overspecific in the training. There's a reason why, in a cookery class, young people are taught scratch cooking and not how to heat up a microwave meal or fry bacon (even though it is likely that the latter will ultimately occupy most of the time spent in the kitchen). It's not designed to be an exercise in rote learning but in understanding the theory of how cookery works, basic food science, and food hygiene. If McDonald's had a shortage of fry cooks, I doubt that serious thought would be given to teaching that in schools, even if it would seem to meet the market need at that point.
    Secondly, there is also the implication that those who either make the conscious decision to specialize elsewhere or don't have the intellectual bent toward learning to code are somehow diminished as prospective employees. Spend even a few minutes talking to a human resources professional and they'll pretty much put you straight here. Yet the "Do you even code?" trope seems to have been amplified a great deal of late. For some of those aforementioned roles, a high-level understanding of programming might be useful, but so would the "soft skills" that come with non-technical study. There's an irony in the fact that in the failed information technology (IT) projects I've been involved in, the problem has rarely been with the technology but, rather, with the humans trying to coordinate their efforts to make it fit the business problem that inspired it in the first place.

    I was not a good developer by any stretch, but I got by until I reached the point when I realized I had run out of talent. What I had learned during that decade, added to my previous non-technical education, enabled me to find a way to continue to be slightly useful and eventually led me to move into my third career, as an analyst. I wouldn't have got to the lucky position I am in today without learning to code as part of an education that continues on a day-to-day basis, but that doesn't mean that it is a path that should be in any way mandatory, even for those destined for the tech industry.

    Matt Mullen is a senior analyst of social business for 451 Research, where some of his primary areas of focus are digital marketing and social media technology. Follow him on Twitter @MattMullenUK.

