
Want to prove your business is fair? Audit your algorithm


YALE FOX’S BUSINESS doesn’t work unless everyone thinks it’s fair. His startup, Rentlogic, relies on an algorithm to score New York City landlords on how well they take care of their properties. It’s an easy way for tenants to avoid bedbugs and mold, and for landlords to signal that they maintain their buildings well. But it isn’t enough for Rentlogic’s score to just exist; Fox needs landlords and tenants to believe in it.

This was on his mind last fall when he heard Cathy O’Neil speak. O’Neil, a former Wall Street quant with a Harvard PhD in mathematics, wrote a book in 2016 to sound the siren of algorithmic injustice. That book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, argued that poorly rendered algorithms reinforce discrimination and widen inequality.

It’s a common argument, but most critics offer few solutions for fixing this bias. O’Neil, however, has come up with a novel idea: an auditing process that asks companies to open up their technology for evaluation. After the lecture, Fox asked to meet with her. By the end of their coffee, he’d signed on as her first client for an external algorithmic audit.

As tech’s titans like Facebook, Google, and Microsoft promise to better embed ethics into their practices, businesses are beginning to employ algorithmic audits to help them think through these systems. By opting in to an audit, many businesses believe they’re getting early insight into tools that will eventually be required by regulators. In 2016, Obama’s White House called on companies directly to audit their algorithms. Deloitte has developed a practice to help companies manage algorithmic risk, and Accenture also reports advising clients on the topic. Still, such audits are in their very early days. “Right now the ability for algorithms to be evaluated is an open question,” says DJ Patil, who was the United States Chief Data Scientist during the Obama administration. “We don’t know technically how to take the tech box and verify it.”


Fox isn’t alone in looking for outside approval. Frida Polli, who founded a company called Pymetrics that uses personality tests to help companies make hiring decisions, instituted an internal audit early on to make sure her hiring software didn’t discriminate. (She plans to make the auditing process available openly on GitHub shortly.)

Though there’s no standard protocol, an audit generally involves an outside entity coming in to review how a company develops its secret sauce—without compromising that company’s trade secrets. This measure is hardly altruistic. Businesses may eventually need to prove to regulators that their technology doesn’t discriminate against a protected class of people; such an audit might prevent future litigation. For others, having a third-party seal of approval is good marketing, like the “organic” sticker on milk, suggesting to potential customers that they can trust that a company is being thoughtful in its approach. Regardless of the reason, companies are opening themselves up to outside evaluations more frequently, a sign that such audits may become the norm.

Business Basics

Launched officially in 2013, Rentlogic’s algorithm grades the maintenance of every rental building in New York City, providing a label of A to F. To program the model, the company draws public information from the 311 calls that lodge violations against a building. When landlords choose to pay for an additional certification process, Rentlogic sends an inspector to ensure the information from the algorithm matches up with real life. For somewhere between $100 and $1,000 per year, depending on the size of the building, Rentlogic will certify its grade and provide a plaque for the company to display.
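Rentlogic hasn’t published its scoring formula, but the mechanism described here—public 311 violation records in, a letter grade out—can be sketched in rough form. Below is a minimal, hypothetical Python illustration; the field names, weights, and thresholds are assumptions made for the example, not Rentlogic’s actual model.

```python
# Hypothetical sketch of a building-grading step in the spirit of Rentlogic's
# description: open 311 violation records in, an A-F maintenance grade out.
# The weights and thresholds below are invented purely for illustration.

from dataclasses import dataclass

@dataclass
class Violation:
    severity: str  # e.g. "hazardous" (mold, pests, no heat) or "non_hazardous"

def grade_building(violations: list[Violation], units: int) -> str:
    """Map a building's open violations to a per-unit score, then to a letter grade."""
    weights = {"hazardous": 3.0, "non_hazardous": 1.0}
    score = sum(weights.get(v.severity, 1.0) for v in violations) / max(units, 1)

    # Assumed per-unit thresholds; a real model would be calibrated on city data.
    if score < 0.25:
        return "A"
    if score < 0.75:
        return "B"
    if score < 1.5:
        return "C"
    if score < 3.0:
        return "D"
    return "F"

# Example: a 20-unit building with two hazardous and five minor open violations.
sample = [Violation("hazardous")] * 2 + [Violation("non_hazardous")] * 5
print(grade_building(sample, units=20))  # -> "B" under these assumed thresholds
```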

For Rentlogic to work, both landlords and tenants, two groups with competing interests, have to believe the business is fair. Fox thought that having a neutral third party, like O’Neil, review the company’s technology and offer a stamp of approval would be an easy way to prove Rentlogic’s protocol was bias-free.


Over the course of four months, O’Neil reviewed the way Rentlogic had designed its algorithm, and how the resulting technology was tested and deployed. She looked at how the data was captured, how the code was tested, and how Rentlogic maintained the system. She also interviewed Fox as well as the programmer in charge of the algorithm.

As part of the audit, O’Neil created a tool she calls an ethical matrix, a worksheet that helps companies think through the consequences, intended and otherwise, of an algorithm’s results. Across the top of the matrix are a half-dozen traits: accuracy, consistency, bias, transparency, fairness, and timeliness. The vertical axis lists the stakeholders Rentlogic must consider in its model: building owners, renters, the company, and NYC officials. O’Neil says the matrix creates “a conversation around what you might need to worry about.” It’s intended to prompt programmers to consider important questions as they work. “Who cares if this algo works? Who cares if it fails?” she asks. “Who gets hurt if it’s wrong?” When these questions reveal ethically problematic consequences, like discriminating against a class of people, she flags the box yellow or red.
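The worksheet itself isn’t reproduced here, but the structure O’Neil describes—traits across the top, stakeholders down the side, each cell flagged by level of concern—amounts to a small table. A hypothetical sketch follows, with made-up flag values used only to show the shape of the exercise.

```python
# Hypothetical sketch of an "ethical matrix": stakeholders vs. traits,
# each cell flagged green/yellow/red as concerns surface during the audit.
# The specific flag values below are illustrative, not Rentlogic's results.

traits = ["accuracy", "consistency", "bias", "transparency", "fairness", "timeliness"]
stakeholders = ["building owners", "renters", "the company", "NYC officials"]

# Start every cell green, then flag cells as the audit questions
# ("Who cares if this fails? Who gets hurt if it's wrong?") raise concerns.
matrix = {s: {t: "green" for t in traits} for s in stakeholders}
matrix["renters"]["timeliness"] = "yellow"        # e.g. stale 311 data could mislead tenants
matrix["building owners"]["accuracy"] = "yellow"  # e.g. a wrong grade unfairly hurts a landlord

def cells_needing_attention(m):
    """Return (stakeholder, trait, flag) for every non-green cell."""
    return [(s, t, flag) for s, row in m.items() for t, flag in row.items() if flag != "green"]

for s, t, flag in cells_needing_attention(matrix):
    print(f"{flag.upper()}: {t} for {s}")
```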

Rentlogic’s ethical matrix was peppered with a few yellow boxes but had no red ones, prompting O’Neil to give the company her first official stamp of approval. She did have a few recommendations. She suggested, for example, that Rentlogic build in automated data checks so that new data, as it comes in, doesn’t corrupt the existing data set. Another conversation concerned where Rentlogic’s data comes from when it certifies buildings. Right now, the company requests that city inspectors visit buildings. Those inspectors could be biased, says Fox, or accept bribes from building owners. To counter this, Fox watches for patterns in the reports that come back from inspectors.
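The article doesn’t detail what those automated checks look like inside Rentlogic’s pipeline. As a rough illustration of the general idea, one common approach is to validate each incoming batch of records before merging it into the trusted data set; the field names and rules below are assumptions.

```python
# Hypothetical sanity checks run on a new batch of 311-style records
# before it is merged into the existing data set. Field names are assumed.

def validate_batch(batch: list[dict]) -> list[str]:
    """Return a list of problems found; an empty list means the batch looks safe to merge."""
    problems = []
    required = {"building_id", "violation_type", "reported_date"}
    for i, record in enumerate(batch):
        missing = required - record.keys()
        if missing:
            problems.append(f"record {i}: missing fields {sorted(missing)}")
        if record.get("violation_type") == "":
            problems.append(f"record {i}: empty violation_type")
    return problems

new_batch = [
    {"building_id": "B-102", "violation_type": "mold", "reported_date": "2018-03-01"},
    {"building_id": "B-103", "violation_type": ""},  # malformed record
]

issues = validate_batch(new_batch)
if issues:
    print("Rejecting batch:", issues)  # keep the existing data set untouched
else:
    print("Batch accepted for merge.")
```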

Of course, an audit doesn’t prove that a company has avoided all the unintended pitfalls of an algorithm. The auditor might not look at the right set of stakeholders, or pose the right set of questions. I asked O’Neil about this. “This is a subjective thing,” she said. “But what I would say is our biggest problem isn’t that we have the wrong conversations.” Her concern is that the conversations aren’t happening at all.

Since she began working with Fox, O’Neil says she’s taken on a half-dozen clients. A law firm has employed her to review the use of a recidivism-risk algorithm in a parole hearing; she’ll audit its white papers to determine whether the model is fair. Another client, the multinational enterprise tech company Siemens, hired her to help it develop an internal self-auditing system.

O’Neil’s seal of approval, which bears the logo of her newly formed company, ORCAA (O’Neil Risk Consulting and Algorithmic Auditing), has already proven useful to Fox. For one, it has helped with fundraising: the company is close to completing a seed round. “One investor said just the fact that you thought of doing an audit, when he’d never heard of one, impressed him,” says Fox, adding that it motivated the investor to write a check.

It’s a baby step toward a more transparent data future: If we cannot strip algorithms of all their bias, at least we should rid them of the bias we can identify.

 

By Wired

Source: https://www.wired.com/story/want-to-prove-your-business-is-fair-audit-your-algorithm/
