
Recidivism Models vs. Recruitment Techniques


One of my all-time favorite Tom Cruise movies is ‘Minority Report’. It is based on the concept that, in the future, technology makes it possible for cops to catch criminals before a crime is committed. While the movie is brilliantly stunning and a work of fiction, the use of recidivism models to prevent crime in the real world is not completely unheard of.


Recidivism is defined as the tendency of a convicted criminal to re-offend. For years now, such models have been used by the guardians of the law to assess the danger posed by each convict and assign sentences accordingly.


The use of data-driven algorithms to predict human behavior and ability is not restricted to the field of law. One widely used practice in the field of human resources is personality assessment tests and automated screening processes.


Personality tests claim to predict how a candidate will perform once hired. But there is no clear, direct measure of future performance, so like many big data programs, they settle for proxies. And proxies are bound to be inexact and often unfair.


Candidate screening algorithms learn to replicate the very procedures that human beings had been following for applicant screening, and those inputs were the problem in the first place! The computers learned from humans how to discriminate, and then carried out that work with greater speed and efficiency. One way to circumvent these prejudices is blind screening of applicants, as in the sketch below.
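
To make the idea of blind screening concrete, here is a minimal sketch. The application fields and the toy skills-based scorer are hypothetical, purely for illustration; the point is simply that identifying attributes are stripped out before anything is scored.

```python
# A minimal sketch of blind screening: identifying fields are removed from an
# application before it reaches the scoring step, so the screener (human or
# algorithmic) can only judge what remains. All field names are hypothetical.

IDENTIFYING_FIELDS = {"name", "gender", "age", "photo", "address"}

def blind(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

def score(application: dict) -> float:
    """Toy scorer: fraction of the required skills the candidate lists."""
    required = {"python", "sql", "statistics"}
    skills = set(application.get("skills", []))
    return len(required & skills) / len(required)

candidate = {
    "name": "A. Candidate",
    "gender": "F",
    "age": 29,
    "skills": ["python", "sql"],
    "experience_years": 4,
}

print(score(blind(candidate)))  # scored without ever seeing name, gender or age
```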


A good algorithm is one that learns from its own results over multiple iterations, through precious feedback loops. Take Amazon, for example: the platform shows suggestions based on your past purchases and by matching your profile to other similar buyers’ profiles. If you order something it suggested and you do not like it, you are likely to give it a poor review, handing the site precious feedback with which to improve its suggestion algorithm. Personality assessment tools, by contrast, are calibrated by technical experts and hardly receive any feedback. Rejected candidates rarely get to investigate why the personality assessment test turned them down, or whether it was wrong to do so.
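
As a rough illustration of what such a feedback loop looks like, here is a minimal sketch in the spirit of the Amazon example. The catalog, ratings, and score-nudging rule are all hypothetical; the point is that a poor review flows back into the system and changes what gets suggested next. A personality test pipeline has no equivalent of that rating step.

```python
# A minimal sketch of a recommendation feedback loop: scores are nudged up or
# down as users rate the items that were suggested to them. Everything here is
# illustrative, not a real recommender system.

from collections import defaultdict

scores = defaultdict(lambda: 0.5)   # prior "appeal" of each product, on a 0..1 scale

def recommend(products, top_n=3):
    """Suggest the products currently believed to be most appealing."""
    return sorted(products, key=lambda p: scores[p], reverse=True)[:top_n]

def record_review(product, rating, lr=0.2):
    """Feedback step: move the product's score toward the observed rating (1-5 stars)."""
    target = (rating - 1) / 4                 # map 1..5 stars onto 0..1
    scores[product] += lr * (target - scores[product])

catalog = ["kettle", "headphones", "notebook", "desk lamp"]
print(recommend(catalog))                     # initial suggestions

record_review("kettle", rating=1)             # a poor review...
print(recommend(catalog))                     # ...pushes the kettle out of the suggestions
```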


In reality, these tests are not a good indicator of future performance; they are tools crafted to filter out applicants and save the man hours that would otherwise be spent scouting through piles of CVs. Their purpose is to exclude as many people as possible, as cheaply as possible.


The question, therefore, is this: with the advent of such techniques in recruitment, have we eliminated human bias, or simply camouflaged it with technology?
