Algorithm for screening child neglect raises concerns in Pennsylvania

For family law attorney Robin Frank, defending parents at one of their lowest points, when they are at risk of losing their children, has never been easy.

The job is never easy, but in the past she knew what she was up against when dealing with child protective services in family court. Now she worries she may be fighting something she cannot see: an opaque algorithm whose statistical calculations help social workers decide which families should be investigated in the first place.

“Many people don’t even know it’s being used,” Frank said. “Families should have the right to all of the information in their file.”

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including its potential to harden racial disparities in the child welfare system. Related issues have already derailed some jurisdictions’ plans to use predictive models, including a tool notably dropped by Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

County officials dismissed the research as “hypothetical,” saying that social workers can always override the tool.

Child welfare officials in Allegheny County, the cradle of Mister Rogers’ TV neighborhood and the icon’s child-centric innovations, say the cutting-edge tool, which is attracting national attention, uses data to support agency workers as they try to protect children from neglect. That nuanced term can cover everything from inadequate housing to poor hygiene, but it is a different category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.

“No worker should be asked to make 14, 15, or 16,000 of these kinds of decisions in a given year using incredibly imperfect information,” said Erin Dalton, the county’s human services director and a pioneer in implementing predictive child welfare algorithms.

____

This story, supported by the Pulitzer Center for Crisis Reporting, is part of the ongoing Associated Press series “Tracked,” which investigates the power and consequences of decisions driven by algorithms in people’s everyday lives.

____

Critics say programs that rely mostly on data collected about poor people are given an outsized role in deciding families’ fates, and they warn against local officials’ growing reliance on artificial intelligence tools.

If the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children be investigated, compared with about half of all other children reported, according to another study published last month and co-authored by a researcher who audited the county’s algorithm.

Advocates worry that if similar tools are used in other child welfare systems with minimal or no human intervention, much as algorithms have been used to make decisions in the criminal justice system, they could widen existing racial disparities.

“(The county is) making strong statements that I think are misleading in terms of accuracy and disparity,” said Logan Stapleton, a researcher at Carnegie Mellon University.

Because family court hearings are closed to the public and records are sealed, AP could not directly identify families whom the algorithm recommended be mandatorily investigated for child neglect, or cases that resulted in a child being sent to foster care. Families and their lawyers can never be sure of the algorithm’s role in their lives either, because they are not allowed to know the scores.

Child welfare agencies in at least 26 states and Washington, D.C., have considered using algorithmic tools, and at least 11 have deployed them, according to the American Civil Liberties Union.

Larimer County, Colorado, home to Fort Collins, is now testing a tool modeled on Allegheny’s and plans to share scores with families if it moves forward with the program.

“It’s their life and their history,” said Thad Paul, manager of the county’s Children Youth & Family Services. “We want to minimize the power imbalance that comes with being involved in child welfare … we really think it’s unethical not to share the scores with families.”

Oregon does not share risk score numbers from its statewide screening tool, which was first implemented in 2018 and was inspired by Allegheny’s algorithm. The Oregon Department of Human Services, currently preparing to hire its eighth new child welfare director in six years, explored at least four other algorithms while under the scrutiny of a crisis oversight board ordered by the governor.

It recently paused a pilot algorithm built to help decide when foster children can be reunified with their families. Oregon also explored three other tools: predictive models to assess a child’s risk of death and severe injury, whether a child should be placed in foster care, and if so, where.

California explored data-driven approaches to its statewide child welfare system for years before abandoning a proposal to use a predictive risk modeling tool in 2019.

“During the project, the state also explored concerns about how the tool may impact racial equity. These findings resulted in the state ceasing exploration,” department spokesman Scott Murray said in an email.

Los Angeles County’s Department of Children and Family Services, which has been audited following high-profile child deaths and is seeking a new director after its previous one stepped down late last year, is piloting a “complex risk algorithm” to help isolate the highest-risk cases under investigation, the county said.

In the first few months that social workers in the Mojave Desert city of Lancaster started using the tool, however, county data show that Black children were the subject of nearly half of all the investigations flagged for additional scrutiny, despite making up 22% of the city’s child population, according to the U.S. Census.

The county did not immediately state the reason, but said it would decide whether to expand the tool later this year.

___

Associated Press reporter Camille Fassett contributed to this report.

___

Follow Sally Ho and Garance Burke on Twitter at @_sallyho and @garanceburke.

___

Contact AP’s Global Investigative Team (Investigative@ap.org or https://www.ap.org/tips/).
