NYC Passes Bill To Restrict 'Racist, Sexist' Hiring Software
BY TYLER DURDEN SUNDAY, NOV 21, 2021 - 07:00 PM
New York City is on track to become the first city in the nation to ban automated hiring tools unless a yearly bias audit can prove the software won't discriminate based on an applicant's race or gender - a requirement that would force makers of said AI tools to open up their black-box algos to scrutiny.
The bill, which passed the city council in early November and would go into effect in January 2023 if signed into law, would also give candidates the option of choosing an alternative process (a human) to review their job applications, according to the Associated Press.
"I
believe this technology is incredibly positive but it can produce a lot of
harms if there isn’t more transparency," said Frida Polli,
co-founder and CEO of New York startup Pymetrics, which has lobbied for the
legislation that favors firms such as hers which publish 'fairness audits.'
Advocates point to a 2018 Reuters report that Amazon scrapped a similar AI recruiting tool because it favored men over women.
Pymetrics, whose core product is a suite of 12 games based on cognitive-science experiments, paid a third-party company to audit its software for bias and to see if it passed what's colloquially known as the 'four-fifths' rule - an informal hiring standard in the United States, according to Technology Review.
Pymetrics and Wilson [Christo Wilson of Northeastern University, who led the audit] decided that the auditors would focus narrowly on one specific question: Are the company's models fair?
They based the definition of fairness on what's colloquially known as the four-fifths rule, which has become an informal hiring standard in the United States. The Equal Employment Opportunity Commission (EEOC) released guidelines in 1978 stating that hiring procedures should select roughly the same proportion of men and women, and of people from different racial groups. Under the four-fifths rule, explains Kim [Pauline Kim, a law professor at Washington University in St. Louis], "if men were passing 100% of the time to the next step in the hiring process, women need to pass at least 80% of the time."
If a company's hiring tools violate the four-fifths rule, the EEOC might take a closer look at its practices. "For an employer, it's not a bad check," Kim says. "If employers make sure these tools are not grossly discriminatory, in all likelihood they will not draw the attention of federal regulators."
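The arithmetic behind the rule is simple enough to sketch in a few lines of Python; the pass counts below are invented for illustration and come from no real audit:

# Hypothetical four-fifths check; group labels and counts are
# illustrative, not drawn from any real audit.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants who pass to the next round."""
    return selected / applicants

rate_men = selection_rate(60, 100)    # 60 of 100 men pass -> 0.60
rate_women = selection_rate(50, 100)  # 50 of 100 women pass -> 0.50

# Adverse-impact ratio: the disadvantaged group's rate divided by
# the advantaged group's rate.
impact_ratio = min(rate_men, rate_women) / max(rate_men, rate_women)

# The EEOC's informal threshold flags ratios below 0.8 (four-fifths).
print(f"impact ratio = {impact_ratio:.2f}")          # 0.83
print(f"passes four-fifths: {impact_ratio >= 0.8}")  # True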
In theory, if Pymetrics' suite were disproportionately selecting white men for jobs, the software could correct for bias by comparing game data from those men with the results of women and people from other racial groups, keeping only the data points that distinguish successful employees without correlating with race or gender, according to the report, which notes that Pymetrics's system satisfies the four-fifths rule.
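Technology Review doesn't spell out Pymetrics's actual mechanics, but the general idea - keep the inputs that predict success, drop the ones that track a protected attribute - might look something like this rough sketch, where the column names and correlation cutoffs are our own assumptions:

import pandas as pd

def filter_features(df: pd.DataFrame, features: list[str],
                    protected: str, outcome: str,
                    bias_cutoff: float = 0.1,
                    signal_cutoff: float = 0.1) -> list[str]:
    """Keep features that predict the outcome but don't track the
    protected attribute (which must be numerically encoded, e.g. 0/1)."""
    kept = []
    for col in features:
        tracks_group = abs(df[col].corr(df[protected]))
        predicts_success = abs(df[col].corr(df[outcome]))
        if predicts_success >= signal_cutoff and tracks_group < bias_cutoff:
            kept.append(col)
    return kept

# Usage with hypothetical game-score columns:
# fair_cols = filter_features(scores, ["game1", "game2", "game3"],
#                             protected="gender", outcome="hired")

A real audit would of course use far more careful statistics than a simple correlation cutoff, but the shape of the procedure is the same.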
More problems
Despite Pymetrics meeting the four-fifths rule, Technology Review points out that the audit didn't actually prove that the tool is free of any bias whatsoever, nor that it picks the most qualified candidate for the job.
For example, the four-fifths rule only requires people from different genders and racial groups to pass to the next round of the hiring process at roughly the same rates. An AI hiring tool could satisfy that requirement and still be wildly inconsistent at predicting how well people from different groups actually succeed in the job once they're hired. And if a tool predicts success more accurately for men than women, for example, that would mean it isn't actually identifying the best qualified women, so the women who are hired "may not be as successful on the job," says Kim.
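A toy example makes the gap concrete: in the made-up data below, men and women are selected at identical rates - so the four-fifths rule is satisfied - yet the tool's predictions are twice as accurate for men:

# Invented labels and predictions; y_pred is who the tool advanced,
# y_true records actual on-the-job success.

def accuracy_by_group(y_true, y_pred, groups):
    """Prediction accuracy computed separately for each group."""
    out = {}
    for g in set(groups):
        pairs = [(t, p) for t, p, grp in zip(y_true, y_pred, groups)
                 if grp == g]
        out[g] = sum(t == p for t, p in pairs) / len(pairs)
    return out

y_true = [1, 0, 1, 0, 1, 0, 1, 0]   # actual success
y_pred = [1, 0, 1, 0, 1, 1, 0, 0]   # tool's picks
groups = ["men"] * 4 + ["women"] * 4

# Both groups are selected at the same 50% rate (four-fifths: passes),
# but accuracy is 1.00 for men and only 0.50 for women.
print(accuracy_by_group(y_true, y_pred, groups))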
Another issue that neither the four-fifths rule nor Pymetrics's audit addresses is intersectionality. The rule compares men with women and one racial group with another to see if they pass at the same rates, but it doesn't compare, say, white men with Asian men or Black women. "You could have something that satisfied the four-fifths rule [for] men versus women, Blacks versus whites, but it might disguise a bias against Black women," Kim says. -Technology Review
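Kim's intersectionality point is easy to reproduce with invented numbers. In the sketch below, the tool clears the four-fifths bar on gender alone and on race alone, yet Black women pass at only two-thirds the rate of white men:

# All counts invented for illustration: (gender, race) -> (passed, total).
counts = {
    ("man",   "white"): (45, 50),
    ("man",   "Black"): (10, 10),
    ("woman", "white"): (42, 50),
    ("woman", "Black"): (6, 10),
}

def rate(cells):
    """Aggregate pass rate over the listed (gender, race) cells."""
    passed = sum(counts[c][0] for c in cells)
    total = sum(counts[c][1] for c in cells)
    return passed / total

men   = rate([("man", "white"), ("man", "Black")])      # 0.92
women = rate([("woman", "white"), ("woman", "Black")])  # 0.80
white = rate([("man", "white"), ("woman", "white")])    # 0.87
black = rate([("man", "Black"), ("woman", "Black")])    # 0.80

print(women / men)    # 0.87 -> passes four-fifths on gender alone
print(black / white)  # 0.92 -> passes on race alone
# The intersectional cell still fails against the best-off group:
print(rate([("woman", "Black")]) / rate([("man", "white")]))  # 0.67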
We have a feeling that there's no AI in the world that will
satisfy various identity groups given how many genders, races, and species are
now recognized for preferential treatment in the name of nondiscrimination.