The benefits of removing human bias

“It’s the great thing about code,” he said of computer language. “It’s largely merit-driven. It’s not about what you’ve studied. It’s about what you’ve shipped.” – Jade Dominguez, quoted in The New York Times, 27 April 2013

Gild, a San Francisco start-up, is taking a Moneyball approach to identifying potentially overlooked programming talent. As reported in the New York Times, the company uses an algorithm to find productive and well-respected programmers who may lack the traditional qualifications, such as graduating from a top school, working for a top company, or being referred by a current employee.

‘The start of something powerful’
This is more than a story about an interesting, possibly controversial new algorithm. The algorithm is, of course, limited to information that is measurable and publicly available. Even so, the traces we leave online can say a lot about us. Gild focuses on contributions to well-known sites where programmers share and discuss code, such as GitHub. The algorithm doesn’t stop at counting how many posts a programmer makes; it also weighs how each individual’s participation is valued by the programming community. This participation might not show up easily on a resume, but it can surface in a Google search: how much an applicant contributes to a community, and how those bits of code are taken up by others. The measure is skills-based, yes, but more than that, it reflects how an applicant has demonstrated the capacity to engage with a larger community through productive contributions.
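To make that distinction concrete, here is a minimal, purely illustrative sketch in Python. Gild’s actual model is proprietary, so nothing below reflects its real inputs or weights; the profile fields and numbers are invented, and serve only to show how a score might favor community uptake (forks, stars, merged pull requests) over raw output volume.

```python
# Toy sketch of the idea above, NOT Gild's actual algorithm:
# score a candidate by how the community takes up their work,
# not by sheer activity. All fields and weights are invented.
from dataclasses import dataclass

@dataclass
class PublicProfile:
    commits: int                  # raw activity: how much code was pushed
    repos_forked_by_others: int   # uptake: others building on the code
    stars_received: int           # uptake: community endorsement
    merged_pull_requests: int     # uptake: contributions accepted upstream

def community_score(p: PublicProfile) -> float:
    """Weight community uptake far more heavily than raw output."""
    uptake = (3.0 * p.repos_forked_by_others
              + 2.0 * p.stars_received
              + 5.0 * p.merged_pull_requests)
    # Volume still counts for something, but uptake dominates.
    return 0.1 * p.commits + uptake

# Two candidates with identical commit counts but different uptake:
prolific = PublicProfile(commits=5000, repos_forked_by_others=2,
                         stars_received=10, merged_pull_requests=1)
respected = PublicProfile(commits=5000, repos_forked_by_others=40,
                          stars_received=300, merged_pull_requests=25)

print(community_score(prolific))   # 531.0
print(community_score(respected))  # 1345.0
```

Under this toy weighting, the candidate whose code is forked, starred, and merged upstream scores far higher than the equally prolific one whose work goes unnoticed, which is the intuition the article attributes to Gild’s approach.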

Gild’s chief scientist, Dr. Vivienne Ming, offers a unique perspective on gender bias. With a PhD in psychology and computational neuroscience, Dr. Ming worked as a teacher and researcher before undergoing a gender transition. As a woman, she noticed that colleagues treated her differently, asking her fewer questions about mathematics, for example. More importantly, she points to a recent Yale study in which participants rated female applicants as less competent for a management position at a research university.

Perhaps reducing human bias isn’t a bad thing. Gild’s small team believes the algorithm is more merit-based than traditional hiring methods, and reviewing tangible accomplishments may be a powerful step toward reducing unnecessary legacy biases. Others might counter that the algorithm can only measure what is measurable.

Top schools have reputations for a reason, yet graduating from one doesn’t guarantee that a person plays well with others or is a creative, talented developer. Similarly, another Yale study cited in the NYT article found that the value of an employee referral depends on the productivity of the employee making it. On the other hand, selecting people who lack these traditional achievement markers is a gamble, one that Gild is currently testing through its own hires.

Returning to Jade Dominguez, quoted at the beginning of this post: he is a programmer Gild found through its own algorithm, where he scored highly, and he became one of the company’s first ten employees. He does not have a college degree, but he does have a large volume of code and, by the measures of Gild’s algorithm, a well-respected position in the coding community. His experience reflects substantial practice, trial and error, problem-solving, and a record of completed projects. There is increasing pressure to teach these skills at early levels of education, even before high school. The practice itself, however, has to be pursued by the individual, and it is a traditional marker of success in any field that involves mastery, as musicians, artists, and athletes demonstrate. Can machines predict talent? Can they predict employability? It will be interesting to watch how reducing human bias and focusing on merit might change the hiring landscape. Then, of course, no matter what the algorithm finds, there’s always the interview.
