Around 2015-16, it was common in data/AI circles to lament the skills gap in the candidate market. Five years on, little has changed: recruiters and hiring managers still voice the same lamentations when asked about the hiring scene. The challenge is real for organizations, but I believe there are several reasons why this remains a subject of discussion half a decade later.
Perceptions and Reality
Hiring managers in large firms that want to jump-start their data and AI projects after missing the early-adopter window of the early 2010s may come to realize that the talent pool has since been flooded with candidates, not of the quality promised by the pace of research, but of a quality indicative of the education and certification opportunities out there. AI education has its share of problems, from an emphasis on tools rather than principles to a lack of focus on business value and operationalization, and it rarely fosters the kind of mindset required for scalable, enterprise-grade data science and AI solution development.
AI education also prepares students for roles in both research/academia and industry, and this two-pronged objective rarely yields benefits for recruiters. A greater focus on applied AI/ML has its own disadvantages, since it can lead to a lack of depth. There is no silver bullet, but one approach is to define multiple levels of skill, provide opportunities to back-fill talent into roles gradually, and build progression as a consequence.
It is also true that AI/ML lateral hires tend to be expensive, especially if they have real-world, production-scale AI/ML experience. This makes the challenge even more acute for those hiring on a budget. Upskilling existing employees is clearly the best long-term solution. For startups, attracting talent with an upside, namely the chance to grow into advanced AI/ML roles on niche products, may be an attractive selling strategy; even if the startup won't help them build deep skill in AI, it can give them a shot at being part of an AI product journey.
Writing effective, meaningful job descriptions and controlling the hiring and evaluation process well is key here: without meaningful job descriptions, and with ludicrous titles, it is easy to end up on a slippery slope.
Overall, whether hiring from university graduate pools or in the lateral market, recruiters and hiring managers have to manage both perceptions and reality in AI hiring.
Inability to Understand the Fast-Changing AI Landscape
The AI landscape changes almost daily, with ground-breaking research published almost every week. In 2020 alone we have seen significant new research, the highlight of which has been GPT-3. In earlier years, we reached very high levels of performance on language translation tasks and saw significant breakthroughs in generative modeling, such as GANs. We are now seeing significant new work in reinforcement learning that promises to make agent-based models easier to build and train. Even as an industry professional and leader in the AI/ML space who is constantly learning and upskilling, I find new research hard to keep up with. This is the gap between the research world in AI and practitioners such as myself.
There is another, even more worrying gap in understanding: the gap between practitioners and managers or recruiters. Managers and recruiters who don't understand the fundamentals of AI and machine learning, and who aren't clued into the fast-changing landscape, will be at a disadvantage when trying to hire good talent.
With lateral hires, the challenge is more nuanced. When bringing onboard data science candidates with prior experience, one needs to assess a vast range of skills and capabilities against the organization's current state and needs. There is the additional challenge of ensuring that lateral-hire candidates are current in their skills, and able to solve, or at the very least reason about, complex new problems.
Impostor Syndrome and Gatekeeping
The flip side of Steve Jobs' famous quote about hiring smart people so they can tell us what to do is, of course, that someone might choose not to hire a smart person at all, because they seem too smart or too capable.
Impostor syndrome (and its close cousin, the Dunning-Kruger effect) is widely seen in the AI practitioner world. Those new to the space, who nevertheless hold credentials in building AI solutions or products, find themselves falling short of expectations, on the one hand because of the sheer complexity of real-world requirements, and on the other because of an inability to retain the knowledge they earned those credentials for. This sends a signal about their competence that is contrary to their actual skill in the subject, and naturally leads to problems executing on the job for those suffering from impostor syndrome. Ultimately, I believe this stems from an attitude of credentialism (to borrow a phrase from entrepreneur and investor Balaji Srinivasan), the belief that earning a credential will get you to a destination, rather than from a genuine interest in AI and the curiosity and open mind that enable the development of new skills and capabilities.
Gate-keeping in the AI/ML world is a consequence of impostor syndrome: less-skilled incumbent employees, fearing replacement by new, higher-skilled hires, work to keep good talent out of the organization during the hiring process. This is neither new to organizations nor unique to AI/ML hiring, but in areas like data science and AI/ML, hiring managers with a limited understanding of the subject are especially prone to it. After all, power-poisoning, as seen in many management positions, is a strong enabler of both mono-cultures and gate-keeping, and it is a natural impulse for anyone in management who wants to keep their job, especially in a time of rapid change. One way to keep this from becoming a slippery slope is to build a culture of recognition, rewarding those who invest time in learning new skills to stay relevant on the job. Another way for leaders to make everyone feel welcome is to ensure that those on the job are constantly challenged: when teams are fully stretched, additional bandwidth can be justified, and with it new roles, positions, and the hires to fill them.
I hope this discussion of the AI skills gap and its implications for AI jobs and hiring has provided interesting and important insights from the frontlines of AI and ML. If you have faced challenges like these, whether from bosses or hiring managers who experience impostor syndrome, or from recruiters who don't understand the AI/ML hiring landscape, many of the ideas here will likely resonate with you. As with all things in leadership and management, there is rarely ever one good answer. However, constantly scanning the AI landscape and staying clued into its patterns and trends can help build great AI teams and sustainable data and AI organizations, be it in product development, research, or consulting.