
Can Résumé-Reviewing Software Be As Biased As Human Hiring Managers?

Computer software is increasingly involved in tasks like winnowing down job applicants' résumés or deciding whether a bank should grant a home loan. But what if that seemingly neutral algorithm was unwittingly built with human bias baked in? Researchers have developed a new test to measure whether these decision-making programs can be as biased as humans, along with a method for fixing them.

A team of computer scientists from the University of Utah, the University of Arizona and Haverford College presented research last week on a technique that can figure out whether these software programs discriminate unintentionally. The group was also able to determine whether a program violated the legal standards for fair access to employment, housing and other situations.

The test uses a machine-learning algorithm, which is itself similar to these decision-making programs. If the test can, for example, accurately predict an applicant's race or gender from the data provided to the algorithm, even though race and gender are explicitly hidden, there is a potential for bias.
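The core idea, that hidden attributes can leak through proxy features, can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual method: it builds synthetic applicant data in which a zip code correlates with a hidden group label, then checks whether a simple classifier can recover the "hidden" attribute better than chance. All variable names and the 90% correlation rate are invented for the example.

```python
import random
from collections import Counter, defaultdict

random.seed(0)

# Hypothetical synthetic applicant data: the protected attribute ("group")
# is never shown to a downstream model, but "zip_code" acts as a proxy.
def make_applicant():
    group = random.choice(["A", "B"])
    # Assume 90% of group A lives in zip 10001 and 90% of group B in 20002.
    if random.random() < 0.9:
        zip_code = "10001" if group == "A" else "20002"
    else:
        zip_code = "20002" if group == "A" else "10001"
    return {"zip_code": zip_code, "group": group}

train = [make_applicant() for _ in range(2000)]
test = [make_applicant() for _ in range(500)]

# "Bias test" model: predict the hidden group from each zip code's
# majority class in the training data.
by_zip = defaultdict(Counter)
for a in train:
    by_zip[a["zip_code"]][a["group"]] += 1
predict = {z: c.most_common(1)[0][0] for z, c in by_zip.items()}

accuracy = sum(predict[a["zip_code"]] == a["group"] for a in test) / len(test)
baseline = max(Counter(a["group"] for a in test).values()) / len(test)
print(f"proxy accuracy: {accuracy:.2f}, majority baseline: {baseline:.2f}")
```

If the accuracy is well above the majority-class baseline, the supposedly hidden attribute is recoverable from the remaining features, which is exactly the signal of potential indirect bias the article describes.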

