How an algorithm is saving the lives of children

Calah Alexander - published on 02/07/18

Artificial intelligence has its risks, but a new algorithm developed in Pittsburgh is improving the accuracy of a child welfare agency's screening decisions.

If you spent any time on Twitter or Facebook after Christmas, you probably read about Google Home and Alexa’s creepy interaction that went viral, wherein a proud new Google Home owner said, “Okay Google, what do you think of Alexa?” and Google Home responded, “I like her blue light.” Then, from across the room, Alexa turned on and said, “Thanks.”

Not gonna lie, I was totally heralding the true coming of Skynet and the End of All Things when I read that. “Technology is evil!” I proclaimed. “We will be overthrown by our robot assistants!”

And yet, not all technological advances herald the doom of mankind. Some are actually saving lives, like the Allegheny Family Screening Tool that was recently featured in the New York Times. It’s a predictive-analytics algorithm that the Allegheny County Department of Children, Youth, and Families (CYF) developed in order to better screen reports of child abuse and endangerment for both current and future risk.

In 2015, the deaths of two children who had been screened out as “low-risk” multiple times prompted Allegheny County to recruit two social scientists, Emily Putnam-Hornstein and Rhema Vaithianathan, to investigate how predictive analytics could improve the county’s handling of maltreatment allegations. They focused on the call-screening process — particularly on how data is analyzed. The question they ultimately asked was why those children had been screened out.

Incompetence on the part of the screeners? No, says Vaithianathan, who spent months with Putnam-Hornstein burrowing through the county’s databases to build their algorithm, based on all 76,964 allegations of maltreatment made between April 2010 and April 2014. “What the screeners have is a lot of data,” she told the Times, “but it’s quite difficult to navigate and know which factors are most important. Within a single call to C.Y.F., you might have two children, an alleged perpetrator, you’ll have Mom, you might have another adult in the household — all these people will have histories in the system that the person screening the call can go investigate. But the human brain is not that deft at harnessing and making sense of all that data.”

It might not even be a case of the human brain being less able to harness and make sense of the data. Child welfare agencies are notoriously overloaded, and case workers simply don’t have the time to sift through the records of every adult living in the house. Moreover, they might not even have a record of certain adults living there — a transient boyfriend, for example, might only be traceable through jail records, psychiatric service reports, or information from drug and alcohol treatment centers. Hunting for that information would take hours that case workers simply don’t have … and even more time to analyze what risk the new information might (or might not) present.
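To make the idea concrete, here is a minimal, purely illustrative Python sketch of how a call-screening score might pull together household members’ histories into a single number. The feature names, weights, and logistic form are assumptions for illustration only; the article does not describe the Allegheny tool’s actual model, features, or coefficients.

```python
# Illustrative sketch only: a toy risk score in the spirit of a
# predictive-analytics screening tool. Every feature name and weight
# below is hypothetical; none come from the actual Allegheny model.
import math
from typing import Dict, List

# Hypothetical per-person history counts a screener would otherwise
# have to look up by hand across separate systems.
HouseholdMember = Dict[str, int]  # e.g. {"prior_referrals": 2, "jail_bookings": 1}

# Made-up weights standing in for coefficients a model would learn from
# historical referrals (the article cites 76,964 allegations, 2010-2014).
WEIGHTS = {
    "prior_referrals": 0.60,
    "jail_bookings": 0.45,
    "behavioral_health_contacts": 0.30,
    "substance_treatment_episodes": 0.35,
}
BIAS = -2.5  # baseline log-odds, also made up


def household_risk(members: List[HouseholdMember]) -> float:
    """Aggregate every household member's history into one 0-to-1 score."""
    score = BIAS
    for person in members:
        for feature, weight in WEIGHTS.items():
            score += weight * person.get(feature, 0)
    return 1.0 / (1.0 + math.exp(-score))  # logistic squash to 0..1


# Example: a call involving a parent and another adult in the household.
household = [
    {"prior_referrals": 3, "behavioral_health_contacts": 1},
    {"jail_bookings": 2, "substance_treatment_episodes": 1},
]
print(f"screening score: {household_risk(household):.2f}")
```

The point of the sketch is simply that a model can combine every person’s records at once, in seconds, while a human screener would have to chase each record down by hand.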

The Allegheny Family Screening Tool is not the first predictive-analytics tool to be used by child welfare agencies. Several private companies have marketed screening tools to child welfare departments in various states, but there is increasing concern over the secrecy surrounding the development and workings of the algorithms.

What makes the Allegheny Family Screening Tool so notable is the transparency that has marked its development, as well as the fact that it was developed by the CYF itself, rather than a private company. More importantly, the newest retooling of the algorithm has raised the accuracy of predicting bad outcomes to over 90 percent — a far cry from the 62 percent department average that the scientists who developed the algorithm started with.

That’s a lot of children who are being saved from abuse, injury, and death. Although technology might be frightening in the abstract, the concrete reality of children’s lives being saved by technological advances trumps the remote possibility that Alexa and Google Home might go all Skynet on us one day. Today, algorithms are saving the lives of our most vulnerable children … and that’s an innovation we should all be grateful for.

