New York City moves to create accountability for algorithms

City Council passes bill addressing algorithmic discrimination in city government.

The algorithms that play increasingly central roles in our lives often emanate from Silicon Valley, but the effort to hold them accountable may have another epicenter: New York City. Last week, the New York City Council unanimously passed a bill to tackle algorithmic discrimination—the first measure of its kind in the country.

The algorithmic accountability bill, which awaits Mayor Bill de Blasio's signature, establishes a task force that will study how city agencies use algorithms to make decisions that affect New Yorkers' lives, and whether any of those systems appear to discriminate against people based on age, race, religion, gender, sexual orientation, or citizenship status. The task force's report will also explore how to make these decision-making processes understandable to the public.

The bill’s sponsor, Council Member James Vacca, said he was inspired by ProPublica’s investigation into racially biased algorithms used to assess the criminal risk of defendants.

“My ambition here is transparency, as well as accountability,” Vacca said.

A previous, more sweeping version of the bill had mandated that city agencies publish the source code of all algorithms being used for "targeting services" or "imposing penalties upon persons or policing" and make it available for "self-testing" by the public. At a hearing at City Hall in October, representatives from the mayor's office expressed concerns that this mandate would threaten New Yorkers' privacy and the government's cybersecurity.

The bill was one of two moves the City Council made last week concerning algorithms. On Thursday, the committees on health and public safety held a hearing on the city’s forensic methods, including controversial tools that the chief medical examiner’s office crime lab has used for difficult-to-analyze samples of DNA.

As a ProPublica/New York Times investigation detailed in September, an algorithm created by the lab for complex DNA samples has been called into question by scientific experts and former crime lab employees.

The software, called the Forensic Statistical Tool, or FST, has never been adopted by any other lab in the country.

Council Member Corey Johnson, chair of the health committee, quoted two key findings of our investigation: that FST’s inventors had acknowledged a margin of error of 30 percent for one key input of the program, and that the program could not take into consideration that family members might share DNA.

New York City no longer uses the tool for new cases. But officials at the hearing said they saw no need to revisit the thousands of criminal cases that relied on the technique in years past.

“Would you be open to reviewing cases in which testing was done on very small mixtures, or do you feel totally confident in all of the methods and science that were used on every case that’s come through your lab?” Johnson asked the officials.

“We are totally confident,” answered Dr. Barbara Sampson, the city’s chief medical examiner.

The algorithm’s source code was a closely held secret for years until a federal judge granted a motion filed by ProPublica to lift a protective order on it in October. We then published the code.

Defense attorneys testified at the hearing, criticizing the medical examiner’s office for what they saw as a dangerous lack of transparency in the development of its DNA tools.

Some had joined together to write to the state’s inspector general in September, demanding an investigation into the lab and a review of past cases. The inspector general, Catherine Leahy Scott, has not yet indicated whether she will pursue it. Meanwhile, the New York State Commission on Forensic Science, which oversees the use of forensic methods in the state’s labs, has discussed the criticisms in executive session meetings. Those sessions are closed to the public, and commission members are prohibited from speaking about them.

After the hearing, Johnson said he was concerned by the discrepancies between the medical examiner's testimony and that of advocates and intended to explore them further.

“This is a very, very, very important issue, and we have to ensure that methods that are used are scientifically sound, validated in appropriate ways, transparent to the public and to defense counsels, and ensure greater trust in the justice system,” said Johnson. “And I think that is what, hopefully, we can achieve, through asking more questions—and potentially thinking about legislation in the future.”

Source: https://arstechnica.com/tech-policy/2017/12/new-york-city-moves-to-create-accountability-for-algorithms/

By Lauren Kirchner
