Hey folks, welcome to The Imposters Club, the podcast for misfits in tech. You know who you are. Who am I? I'm your host, Teddy Kim. I'm a director of software engineering at a SaaS startup here in Minneapolis. Let's talk about experts. If you've ever read Malcolm Gladwell's book Outliers, you're familiar with the 10,000-hour rule: the idea that it takes around 10,000 hours of deliberate practice to achieve expertise in a given problem domain. Now, if you're a violinist or a tennis player, the 10,000-hour rule is pretty much table stakes for a career in your field. You need to put in the time, or you won't have the chops to be taken seriously in a professional setting. And that makes sense.
The fundamental rules and gameplay of tennis haven't changed in hundreds of years. Similarly, everywhere on the planet, new violinists are still cutting their teeth on Mozart and Haydn. The point is that the path to expertise is easy to see in fields that are largely static and tradition-bound. But what about tech? Is true expertise even possible in an industry where sea changes in basic computing paradigms occur seemingly every year? And even if it were possible, would it be worthwhile for tech workers to invest in expertise? Let me give you a specific example that illustrates the predicament. Back in 2003, I got it into my head to study for the OCP exam. That's the Oracle Certified Professional certification. To be honest, I don't even know if the OCP is still a thing. But back then, an Oracle cert was basically a golden ticket. Sure, it took literally years of study and thousands of dollars in study materials, but employers were desperate for Oracle admins. Oracle devs were billing $300 an hour, which was bananas in 2003. So for me, studying for the OCP was kind of a no-brainer.
Now fast forward to 2019. My hard-won knowledge of Oracle is worth exactly nothing. Nada. Zip. Goose egg. And all those years of study and toil? Well, let's just say I'm not getting that time back. Nowadays, techies face different challenges, and if you want to be effective, you need to take a different approach. For one thing, you probably don't want to specialize in anything. For example, it's hard to justify specializing in any single database when even small companies have multiple databases in play. You might have Firebase for your mobile apps, Elasticsearch for search-intensive use cases, MySQL as an operational data store, Redshift for a data warehouse, ephemeral storage like Redis, and the list goes on. And that's just the persistence layer. As you move up the stack, it gets even more fractured and incoherent.
For DevOps, over-specialization in any one competency actually lowers your overall effectiveness because, above all else, DevOps is about dealing with unknowns. How can you be an expert at the unknown? DevOps relies on adaptation, experimentation, and resilience. Expertise? Not so much. Those same pressures are how we ended up with the notion of a full-stack engineer. That's not to say there isn't value in specializing in either the front end or the back end. The question is whether being pretty good at both is preferable to being superlative at one. Speaking as someone who hires and runs build teams, I can tell you that nine times out of ten, I would much rather hire a full-stack dev. Devs who focus exclusively on the front end or the back end turn into intellectual bottlenecks.
Collaborative knowledge work is not where you want intellectual bottlenecks to form. And it's not just tech where experts are losing ground. The US Navy is transitioning to a so-called "optimal manning" model for its ships. What does that mean? Well, during World War II, a destroyer was staffed with 350 specialized sailors. Today, the Navy staffs combat ships with 40 sailors with hybrid skills. These hybrid sailors are jacks of all trades and masters of none. For more on this trend, follow the link in the show notes to an absolutely fascinating article. If you're in tech, this article is required reading. Here's one quote from the article that'll give you some food for thought. This is from Frida Polli, co-founder of pymetrics: "Fluid, learning-intensive environments are going to require different traits than classical business environments, and they're going to be things like the ability to learn quickly from mistakes, use of trial and error, and comfort with ambiguity."
The author of the article, Jerry Useem, riffs on this idea with a really important insight. He says, "a world in which mental agility and raw cognitive speed eclipse hard-won expertise is a world of greater exclusion: of older workers, slower learners, and the less socially adept." In other words, entire categories of people will be marginalized in this new world, in which the ability to learn and adapt is the coin of the realm.
What will it be like when tech turns into a demographic monoculture? Oh, wait, we're there already? Well, it can get worse. Much worse. But it's not all gloom and doom. I see one potentially beneficial trend developing. In the post-expertise world, tech hiring will need to be completely blown up and reorganized. It just has to be. A process-of-elimination system, designed to screen out candidates on functional competencies, is completely worthless in a world where functional competencies are being displaced by AI. So say goodbye to fizz-buzz interviews, programming exercises, and even managers creeping on your GitHub. It was always bullshit, to be honest. But companies have clung to these sacred cows because they haven't had to confront the brutal reality.
In the new world, companies are going to have to figure out how to assess candidates on behavioral rather than functional competencies. I'm not sure how this is gonna play out. But as someone who does a lot of recruiting and hiring, there's only one kind of expert I'm interested in hiring nowadays, and that's an expert learner.
I'm hiring software engineers! Check out the jobs page to see my open positions.