"Full-stack" and why I don't like it

10 Oct 2016

Last week I went to a jobs event, recruiting for my company. I was there on my own, and recruiting is a pretty new experience for me, so I was kinda excited about it. The attendees were a mix of graduates, bootcampers, and a few more senior developers, but mostly the crowd were looking for their first or second job. I enjoy going out and talking to people about their experiences with software -- I think breadth of knowledge is really valuable -- so I found it interesting seeing what other people are looking for in a job. I've only ever had one; maybe I've aimed for all the wrong things!

But there was one comment that stuck out. It came from a developer at a similar stage in their career to me -- two years' professional experience -- who was looking to switch companies. He'd been at a large firm, and found things all a bit too lumbering to get anything done. All of that sounded very reasonable, and I was agreeing with a lot of what he said, until he used the phrase

"... of course I'm full-stack, you know, all that standard stuff really"

Now I know a lot has been written about the myth of the full-stack developer, but the fact that at a recruitment event this was considered "standard stuff" really took me by surprise. Full-stack, by definition, means you're experienced with the technologies at every level of a production application. After two years' professional experience? Even after ten? And even if that were true (hint: I don't think it is), it can't be a helpful way of identifying yourself to potential employers.

This myth is a buzzword on dodgy recruitment ads -- much like "10x", "rockstar", "ninja", or "guru". You'd have to be some kind of coding Mozart to have equal proficiency across all levels to someone who has specialised in one; there's only so far raw brainpower or talent can take you. But because it's a word on the job spec, it's the word applicants find themselves using to try to show they cover the requirements.

I think the word they're aiming for is generalist. Maybe I'm being picky, but I think the distinction here is important. Being a generalist suggests that the skills you value and are good at are generally applicable -- that the specific technologies you've used aren't what define your strengths as a developer.

Setting my hatred of buzzwords aside, it's worth thinking about what these two descriptions imply about you. When you're presenting yourself, especially to a potential employer, you need to think about what impression you're giving your audience. By describing yourself as a generalist, you're describing something about your skill-set, and giving useful information about what you consider to be the most applicable parts of your experience -- your abilities to learn and to turn your hand to new problems. "Full-stack" makes you sound like you're either ticking boxes or think you're perfect at everything.

By focussing on the technology stack, you invite negative criticism of your abilities. Any part of the Full Stack(TM) (which, by the way, will vary between companies) that you're not so proficient with becomes a flaw in your CV. And quite apart from this, you're failing to highlight all of the other skills you need to be a developer.

I think the point I'm trying to make is that the phrase "full-stack developer" shifts the conversation towards picking holes in the technologies you've worked with, and that's not a good thing, whether you're the employer or the applicant. It would be great if we could just stop saying it.
