Tuesday, August 23, 2022

The violence of family policing, in analog and digital form

J. Khadijah Abdurahman speaking at a virtual panel on predictive analytics
in “child welfare” in New York City.  Organizers had to be pressured to include her.

In this powerful story from Logic Magazine, J. Khadijah Abdurahman, who is both a parent with lived experience dealing with New York City’s family policing agency, the Administration for Children’s Services, and a Tech Research Fellow at the UCLA Center for Critical Internet Inquiry, ties it all together: her own experience of retaliation after she complained about one of the private foster care agencies with which ACS contracts, an overview of how “predictive analytics” makes things worse, and a call for everyone to dig deeper into how ACS is using it. 

In a couple of ways, Abdurahman was fortunate: She has the education and the experience with the system – she was a kinship foster parent for her brother’s children – to know how to fight back, how to document everything and how to go on offense.  Most parents facing the kind of assault her family faced wouldn’t stand a chance on their own.  (ACS is doing its best to keep it that way, successfully lobbying against legislation to simply inform families of the few rights they already have.  So JMac For Families has launched a campaign to tell them – including ads on city buses.) 

Oh, and Abdurahman had one other thing going for her: When the family police pounded on the door and demanded entry at 2 a.m., the home was tidy.  

I’ve often written that there is no line of work I know of that takes more seriously than family policing the idea that cleanliness is next to godliness – and none where the consequences can be so awful.  Sure enough, Abdurahman writes, one of the family police caseworkers said: “We don’t take kids from apartments that look nice like yours …”  (So remember, if you really are one of the very few parents who brutalizes their children, just be sure to keep the home spotless!) 

But even so, Abdurahman writes, the caseworkers 

insisted on completing the most dreaded aspect of an investigation: waking up the kids for strip searches to check them for bruises. I marched each of them out one at a time into the bathroom, where they had to remove all of their clothes down to their underwear, including the baby. 

I mention this because, for some reason, the single element of trauma inflicted on children by the family police that reporters find hardest to believe really happens is the widespread practice of strip-searching.  (That’s why I so often cite a story from The New Yorker, which has an outstanding record for fact-checking.) 

Perhaps reporters find it hard to believe because they are mostly white and middle-class. And, as Abdurahman writes: 

While what happened to us might seem shocking to middle-class readers, for [us,] family policing … is the weather. 

Abdurahman doesn’t stop with her own case. Rather, she uses it to show how much worse things would have been had she lived in, say, Pittsburgh, which has the most highly developed system of computerized racial profiling.  No, that’s not what they call it.  But it’s now well-documented that Pittsburgh’s highly-touted Allegheny Family Screening Tool predictive analytics algorithm bakes in racial bias. 

Abdurahman writes: 

What AFST presents as the objective determinations of a de-biased system operating above the lowly prejudices of human caseworkers are just technical translations of long-standing convictions about Black pathology. 

In New York, one leader of ACS during the administration of former Mayor Bill de Blasio, Gladys Carrion, understood that.  Abdurahman reminds us that when Carrion was asked about predictive analytics in child welfare: 

“It scares the hell out of me… I think about how we are impacting and infringing on people’s civil liberties,” she replied. She added that she ran an agency “that exclusively serves black and brown children and families” and expressed her concern about “widening the net under the guise that we are going to help them.” 

But Carrion misunderstood her job.  She thought her first responsibility was to protect children. But in the de Blasio Administration the first responsibility of any commissioner was to protect de Blasio.  So after the death of a child “known to the system” made headlines, Carrion was out and David Hansell – who perfectly understood de Blasio’s priorities – was in. 

He started the drive to bring predictive analytics to ACS.  As Abdurahman notes, ACS is notoriously “opaque,” so it is unclear how it is being used.  A 2019 PowerPoint presentation offers some hints, suggesting that, at least at that time, they were not using it the way Pittsburgh uses AFST – or its even more Orwellian “hello baby” algorithm.  But almost from the start, ACS officials have made clear they’re open to that. 

And as Abdurahman makes clear, had it been applied in her case, everything could have been far worse.  Maybe it’s time for reporters in New York to take a close look at exactly what ACS is doing with predictive analytics, what it plans to do and whether the current commissioner, Jess Dannhauser, is willing to put a stop to it. 

After reading the article in Logic Magazine, you’ll see why predictive analytics proponents were so desperate to try to keep Abdurahman off a panel that was billed as a virtual “examination” of the topic.  Family advocates and defenders had to fight to get her included.  Start the event video here and you’ll see why they were so afraid – and why the rest of us have reason to be grateful that J. Khadijah Abdurahman and others like her are fighting back.  Because, as Abdurahman writes: 

Data and predictive risk modeling is not something that exists outside obscene forms of analog violence; it is an inextricable part of it.