FairPlay Targets Algorithmic Bias, Raises $4.5m For ‘Fairness as a Service’

FairPlay is a compliance startup that seeks to offer ‘fairness as a service’, or FaaS, by scrutinising the algorithms that already shape our daily lives, such as those used to make mortgage decisions. It does this by comparing outcomes for protected groups against those of a control group.

The US-based company raised the $4.5m a few weeks ago, funding that will allow it to grow. Its aim is to ‘work with leading financial institutions and our plan is to expand our reach into other domains’, the company said.

Its founder and CEO is Kareem Saleh, who at ZestFinance ‘worked with lenders to adopt AI underwriting’ and who, during the Obama Administration, helped manage the team that negotiated the Paris Climate Agreement.

The idea is that ‘FairPlay will be the go-to FaaS solution for any company using an algorithm to make a high-stakes decision about people’s lives – making it easier for all organisations that use algorithms to treat their customers fairly’.

They added: ‘Decisions about whether to approve someone for a loan, a job, or even a kidney transplant were once made by humans, but today these decisions are made by algorithms. And many of these algorithms appear to be repeating the discrimination of the past.

‘This problem won’t go away on its own. We must take action to keep the bias of the past from being encoded into the algorithms that are deciding our future.’

How does it work? One area the company has examined is mortgage lending in America. They created a mortgage fairness map after analysing publicly available 2020 data from the Home Mortgage Disclosure Act (HMDA) database.

A screenshot from FairPlay’s mortgage fairness map, depicting mortgage approvals for female applicants across the US. Red marks counties where their analysis finds a negative bias.

And as they explained: ‘We limited our review to applicants seeking to buy a new home, rather than applicants seeking to refinance or obtain a line of credit. Then we computed the Adverse Impact Ratio (AIR) for every county in the US. We also computed AIR values for 20 major metropolitan areas. [AIR is a measure of demographic parity; it does not control for risk.]’

They added that the ‘Adverse Impact Ratio’, or AIR, is ‘a calculation that regulators and courts use as a means of determining whether members of a protected class receive a certain outcome, such as getting approved for a loan, as often as a control group, such as white male borrowers, does.’
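To illustrate with hypothetical numbers: if 40% of female applicants in a county were approved while 50% of white male applicants were, that county’s AIR would be 0.40 / 0.50 = 0.8, the ‘four-fifths’ threshold below which US regulators have often treated a disparity as evidence of adverse impact.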

A graphic from FairPlay showing how they calculate the AIR.
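To make the method concrete, here is a minimal sketch in Python of the computation they describe; it is not FairPlay’s own code, and the file name and column names (‘loan_purpose’, ‘county’, ‘sex’, ‘race’, ‘approved’) are hypothetical stand-ins for fields in the public HMDA dataset.

```python
import pandas as pd

def adverse_impact_ratio(county: pd.DataFrame) -> float:
    # Approval rate of the protected class (here, female applicants)
    # divided by the approval rate of the control group (white male applicants).
    protected = county[county["sex"] == "Female"]
    control = county[(county["sex"] == "Male") & (county["race"] == "White")]
    if protected.empty or control.empty or control["approved"].mean() == 0:
        return float("nan")  # too little data in this county to compare
    return protected["approved"].mean() / control["approved"].mean()

# Hypothetical local extract of the public 2020 HMDA data.
applications = pd.read_csv("hmda_2020.csv")

# As in FairPlay's analysis, keep only home-purchase applications,
# excluding refinances and lines of credit.
purchases = applications[applications["loan_purpose"] == "home_purchase"]

# One AIR value per county; values below roughly 0.8 suggest adverse impact.
air_by_county = purchases.groupby("county").apply(adverse_impact_ratio)
print(air_by_county.sort_values().head(10))  # counties with the lowest AIR
```

The same grouping, run over metropolitan areas instead of counties, would yield the 20 metro-level AIR values the company also reports.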

They concluded: ‘Despite many efforts, today’s financial system remains unfair to people of colour, women and other protected communities.’

The $4.5m seed round was led by Third Prime, with participation from FinVC, TTV, Financial Venture Studio, Amara and Nevcaut Ventures.