This post introduces a new public FSFN dashboard on permanency timing in Florida’s child welfare system. If you want to just play with the dashboard, you can find it here. All but one of the graphics in this post come from the dashboard.
Every year, in legislatures across the country, well-meaning people propose bills to speed up permanency for foster kids. Permanency is a psychological concept focused on attachment, belonging, and community. Those are hard to legislate, so people focus instead on procedural definitions. The legal meaning of permanency is to close the court case and get the state out of a family’s life for as long as possible. In the process, hopefully leaving the child better than the system found them.
That closure could happen by returning a child home to a parent, placing the child in a guardianship, or having the child adopted. It could also mean a child aging out. The end result is largely the same to the state: one less case on its docket, and varying ongoing financial obligations depending on the way the child exited care. The path a case takes can change a child’s life.
This post introduces a new public FSFN dashboard: the Placement Provider Info Dashboard. If you want to jump straight there, feel free. You should click the fullscreen button in the bottom right corner. Below is the why and how of it.
There is a well-meaning bill working through the legislature that would exempt the names of foster parents from Florida’s public record laws. (Current law exempts their addresses, financials, and the floorplans of their houses.) The bill cites four “public necessities” to bar access to foster parents’ names: (1) it will help keep foster children’s names confidential, (2) it will prevent “unwanted contact” by the press, (3) it will prevent “unwanted contact” by the child’s relatives [i.e., parents], (4) not doing so would compromise foster parents’ privacy. The reasons don’t really stand up to scrutiny. More importantly, public access to information on foster placements is actually a good thing.
The elephant in the room is named Candi Johnson
Let’s start by acknowledging that Candi Johnson, the mother of two children in foster care, orchestrated the shooting of an elderly foster parent in Miami. She went to the foster home with her teenage son and demanded the children. When the foster parent fought back, the son shot her and fled with Candi Johnson and the kids. The foster parent is a hero for defending the kids even when she had no idea who was after them. The media reports that Candi Johnson had a long history of violence and had absconded with her children before. She is currently pending trial for attempted murder, kidnapping, armed burglary, and interfering with child custody.
The public records exemption would not have prevented Candi Johnson and her son from shooting the foster parent. The list of foster parents in the FSFN database has nearly 68,000 people on it. Candi Johnson’s kids would have aged out before she figured out which provider was caring for her children that way. More importantly, Candi Johnson did not use public records to find the foster home. Everyone in the neighborhood knew the woman was a foster parent. Everyone on the case knew Candi Johnson was violent. The foster mother didn’t know who Candi Johnson was when she banged on the door — or else she wouldn’t have opened it and maybe wouldn’t have accepted the placement. Having foster caregivers meet with parents in a supervised setting when they first take in kids could have actually prevented this. Further increasing the separation between them would not.
The bill isn’t about the kids
Candi Johnson stirred up a lot of latent anxieties that some (but certainly not all) foster caregivers feel about the families of the children they take in. The sponsor of the bill says that DCF received calls from “several” foster parents that they would quit if their names were not protected. I received a comment from one foster parent saying the same. The problem is that the bill doesn’t protect foster parents from the people they (rightly or wrongly) are afraid of. It does, however, make it harder to identify wrongdoing by DCF or other foster parents towards the kids they care about.
Let’s start with the bill’s first goal: protecting children’s privacy. The bill doesn’t actually do that. Having the names of foster parents does not tell me the names of the kids in their homes. If we want to keep foster children’s information confidential then we would also include provisions making it illegal for foster care providers to post about the child online, including pictures and over-sharing facebook posts. That’s how most parents in the system find their kids — a mutual friend spots the pictures and forwards them.
If we were serious about not disclosing a child’s foster care status, the bill would have provisions aimed at school personnel who tell a child’s classmates and protective investigators who question neighbors and disclose more than they should. We would also shut down National Adoption Day and Heart Gallery events where kids are brought to one place with giant signs that say “foster care” and television cameras rolling. Nobody is particularly worried about any of that.
A public records request on foster parents would currently give you a name and maybe a zip code, but nothing on the individual kids. You can get that much from a google search (and more). There is actually a much bigger and more immediate leak of information about foster children: the court hearings are open to the public. I have been in countless hearings where essentially this exchange happened in front of a room full of strangers waiting on other cases:
CLERK: Calling the Case of [insert actual name of the child or parents]. All parties please announce.
[everyone, including the parents and children go around and say their actual legal names]
JUDGE: Are the foster parents here? Please just use their initials.
[nobody mentions that there is no law that says foster parents get to be anonymous in court hearings]
FOSTER PARENT: J.M. Good morning, Your Honor.
JUDGE: Ok, we’re here today for a status on medical treatment. Did the child go to the gynecologist for an STD check? She was sexually assaulted. I am very concerned that you did not take her sooner.
CASE MANAGER: Yes, the child is present and can report. She tested negative.
We accept that the hearings are open because we believe the system is better when it works in public. In that context, foster parents’ names are not more sensitive than a child’s history of abuse. Foster parents are good people who largely volunteer to help kids and families that need it. They do not, however, have stronger privacy interests than the actual children and families in the system. When you sign up for this work, you sign up for it in public.
Second, the bill seeks to limit foster parents’ names because disclosure could lead to “unwanted contact” from the child’s “relatives.” This also won’t work and is actually a bad policy goal. Foster parents, parents, relatives, case managers, guardians ad litem, and therapy dogs all sit outside court together, sometimes waiting hours for the case to be called. In most courthouses it’s impossible to hide. And they shouldn’t want to hide: if foster parents are following the Quality Parenting Initiative co-parenting guidelines, then they use that time to talk with the parents and relatives to get to know them and the children better. I hope this sounds as direct as I mean it: a foster parent who doesn’t want contact with a child’s relatives should look for other ways to help children. Fostering isn’t for you.
There may be times when it’s not safe to engage in co-parenting with the child’s family, such as in Candi Johnson’s case. In those situations, court orders and injunctions directed at the parties on the case are the right remedy. Limiting the public’s access to information about foster care providers doesn’t solve the problem: the child’s family already has the foster caregiver’s identity information, or can easily get it by reading the Case Plan or by waiting in the parking lot for 20 minutes. In cases where there are serious safety concerns, there should be serious security responses. A general public records exemption is not a cure.
Finally, I’m not a First Amendment scholar, but “unwanted contact by the press” is why we have public records laws — the press and other watchdogs are supposed to investigate and sometimes that investigation is unwanted. The foster care system is a billion-dollar-a-year government industry. The fact that it recruits and underpays volunteers to perform some of its essential functions does not insulate it from scrutiny.
Now for why it’s actually good to make this information public.
Public information helps make better decisions
After publishing the Visualizing Foster Care Instability project, I received a lot of comments asking for a dashboard that gives information about the foster care providers. I attended a Florida Youth Shine quarterly meeting where a young person still in foster care said in a session, “There should be some way for us to know about placements before we go there. We should know as much about them as they do about us.” That resonated with me. We wouldn’t stay at a hotel without reading the reviews, but we expect foster kids to just show up at a house in the middle of the night and take it on faith that it will be safe.
So I made something: The Florida Foster Care Provider Dashboard. (I’m not really good at naming things.) The goal is to put everything we know about foster care providers in one place so that advocates and the public can make better decisions on how the system operates.
It’s functional, not pretty. I recommend viewing it in full screen because it has a lot of parts. Here’s what it looks like:
Here’s what you can learn from it:
How long do kids stay with ____? This chart in the top-right shows the distribution of how long a provider’s placements lasted. A provider’s average placement length may be skewed due to a few kids they kept for years. This chart shows the real breakdown. You can set it to measure in days, weeks, months, or years. Above, you can see that this provider had 315 placements in total and 134 that lasted less than a week. The average placement length was 45 days, but the median was only 9 days. This is not a stable placement for most kids who go there.
What kind of placement is this? The two boxes in the bottom left break down DCF’s own designation of the placement type. Above you can see that this provider was almost exclusively a foster care provider (orange bar), but spent 16 days as a relative placement, and 4 as a group home. The Service Type shows that this home was mostly for kids aged 13-17.
Why do kids leave this placement? The third box on the bottom left shows the reasons that placements with this provider ended. You can see above that 101 placements ended “in accordance with the case plan,” which usually means pursuant to a court order, while 61 placements ended because the child ran away. Twenty kids aged out of this home.
How has it changed over time? The box in the bottom right corner show the complete placement history for this provider. You can see that they started fostering in 2003 and had their last placement in 2018. Over time, placements with this provider have gotten shorter and shorter. That’s fairly normal for the long-time placements. The first few placements are usually the longest.
What about some summary stats? Right in the middle are the summary stats that I’ve calculated for the provider: number of children, number of placements, average placement length, median placement length, average miles kids moved from the last placement, and average concurrent kids (meaning how many children were placed there simultaneously on average).
I’ve also created Provider Flags that alert you to certain questions you might want to ask about a placement before putting a child there. They are found in the pink box right in the middle of the dashboard. The flags are based on the objective criteria below.
Death of Child: The provider had at least one placement where the end reason was “Death of Child”.
High Reunification, Adoption, Age Out, or Guardianship: The provider was placement to at least 6 children and more than 50% of them went on to reach the stated permanency goal. This does not mean the children exited care from the provider directly.
High Runaway: The provider had at least 6 placements and more than 25% of placements ended in the child running away.
High Turnover: The provider had at least 25 placements and more than 50% of placements ended in under 30 days.
High Disruption: The provider had at least 25 placements and more than 50% of placements ended because the provider requested a change, the child requested a change, or the placement “disrupted.”
High Concurrency: Children with this provider had an average of 10 or more other children placed there concurrently. This could be either because the provider’s capacity is 10 or more children or because the high turnover rate caused 10 or more children to pass through the provider.
High Mileage: The provider had at least 25 placements and the average child moved more than 50 miles from their previous placement. Miles are calculated from the center of a provider’s zip code region.
High Hospitalization: The provider had at least 6 placements and more than 25% of placements ended due to hospitalization of the child.
First Run Warning: The provider had at least 6 placements and more than 10% of children placed there ran for their first time while with the provider.
Baker Act Warning: The provider had at least 6 placements and more than 5% of placements ended because the child was Baker Acted. Baker Acts were calculated by finding children whose next placements were for “Routine/Emergency Mental Health Services”. Note that for small placements, even one Baker Act will raise this warning.
Arrest Warning: The provider had at least 6 placements and more than 5% of placements ended because the child was arrested. Arrests were calculated by finding children whose next placements were for “Correctional Placement” or whose placement end reason was “Incarceration/Detention”. Note that for small placements, even one arrest will raise this warning.
Night to Night Warning: The provider had at least 25 placements and more than 25% of placements were for 2 or fewer days.
I joked that this would be like Yelp for Foster Care, so I went ahead and added three more tabs to make that a reality:
School Map: This tab allows you to click on a provider and see the greatschools.org map for the school in its zip code.
Walk Score: This tab allows you to click on a provider and see the walkscore.com ratings for the zip code.
Yelp: This tab allows you to click on a provider and see the yelp.com most popular places in the zip code.
What can we learn from this?
The Herald Tribune did a story on the public records bill a few days ago. In it, the sponsor is quoted as saying:
“The foster parents are not the people who have been suspected of doing anything wrong,” Roach said. “It’s the parents themselves. … Those are the people that need scrutiny, not the foster care parents.”
Hold on now, nobody said bio parents shouldn’t be scrutinized — and they are heavily scrutinized, in the form of evaluations, classes, supervised visits, and home inspections. The question is whether a higher-than-zero level of public scrutiny of foster parents is warranted.
Yes, it is.
If we didn’t know who they are, we wouldn’t know that foster parent “Lor. Hic” has taken in 525 kids for 626 placements and asked for their removal 360 times (“High Disruption”, “Night to Night Warning”). If she’s agreeing to be a night-by-night placement, then she’s running a a shelter with a foster care license. We need to talk about that practice.
We wouldn’t know that foster parent “Tif. Gip.” has an average placement length of 29 days, but a median placement length of only 2 days (“High Turnover”). The average child placed with Tif. Gip. would experience over 7.5 roommates during their time there. That’s essentially a group home with a foster home license. We need to talk about that, too.
We wouldn’t know that foster parent “Sha. Rob.” experienced the Death of a Child in 2002, or know whether the teams involved in placing kids there for the next three years were aware of that fact.
We also wouldn’t know that foster parent “Kat. Mel.” had over 10% of her 138 kids run for the first time while in her care (“First Run Warning”). Same for “Pat. Fau.”, “Ann. Gre.”, “Gen. Zie.” and many others. What’s going on in these homes?
If the confidentiality gets expanded to institutional providers, then we wouldn’t know that the Hibiscus Vero Group Home has six flags: Arrest Warning, High Turnover, High Concurrency, High Mileage, First Run Warning, and the Death of a Child in 2012. It may be a perfectly lovely place, but I would want to ask questions about all of those issues before I sent a child there.
I’ve requested the provider payment database from DCF as well, and it’s pending. I plan to add an overlay on how much providers get paid. An earlier version of the payment data that I have showed that there was a foster parent in Miami (Jef. Hor.) that received $15,000 per month to care for one child. That is not a typo. If foster parent info is exempted from public records, we wouldn’t know that. The public has the right to scrutinize how the government spends its money. Fifteen-thousand a month is either excessive or what everyone else should get.
I’m also working on a version that overlays the Florida Sex Offender database. The foster home for “Tra. Dav.” that I used in the first example above is in a zip code with nearly 250 registered offenders. Most zip codes have a fraction of that. Maybe we want to think about that when we place kids there, especially certain kids.
Beyond just risk factors, there’s also the problem of the foster parents who have literally hurt kids. If their names are exempt from public records, we don’t know who they are unless they kill a child. (And kids are placed with providers even after others die, as seen above.) I appreciate what the sponsor is saying, but the statewide Guardian ad Litem Program was created and funded with taxpayer money and then significantly strengthened in response to a foster parent murdering a child. The potential for misconduct with our most vulnerable children warrants constant vigilance regardless of who the caregiver is. Trust should not be blind.
Okay Robert Latham, you’re a hypocrite for not publishing the foster parents’ names
Here is where some astute commenter sends me a pointed note that I have chosen to use initials instead of names. I must, therefore, agree that publishing the names would be dangerous.
I believe that publishing the full names would kick the anthill and close down access to a valuable source of public information. I’m using the initials so that readers can focus on the importance of the information and not the hypothetical problems that can come from releasing it (even though it’s been public literally forever). For the public version, I think the three-letter initials are sufficient to find a specific foster parent you’re looking for if you already know the name. I am happy to share the unredacted version with anyone working directly in the system who could use the information.
Public information helps us make the system better, but we can’t do that if we’re not allowed to know things.
Christopher O’Donnell and Nathaniel Lash at the Tampa Bay Times recently published an outstanding investigative piece on the harmful number of placement changes some kids experience while in foster care. They write:
Foster care is intended to be a temporary safety net for children at risk of neglect and abuse at home. Those children, many already traumatized, need love and stability to recover and thrive, child psychologists say.
But thousands of Florida’s foster children were put at risk of further psychological damage by an overburdened system that repeatedly bounced them from home to home and family to family, a Tampa Bay Times investigation found.
Times reporters analyzed more than one million child welfare records recording the movements or placements of about 280,000 foster children under Florida’s care between 2000 and 2017. They show that thousands of foster children led transient lives, many staying only a few nights in one place before being moved on to the next foster family or group home.
For those of us working in the system, placement instability isn’t news (many professionals are numb to it). But it is news for the rest of the world, whose picture of foster care is based on the heartstrings marketing of charitable agencies or the five o’clock stories of deaths and abuses seen in the news. The daily pains and indignities of foster care are rarely discussed by a public who doesn’t have the information or language to talk about them. I was so happy for the Times article because it gave people a new idea: many foster kids move around a lot and that’s a bad thing.
This blog has a different audience, though. The readers here know about the system, often from deep in the weeds, handling cases or overseeing agencies and programs. We have seen placements disrupt both in 30-person staffings and via unexpected text messages that our client’s been kicked out of a home we thought would last — if not forever, at least for a week. We need no emotional priming on this topic. Short of telling a child he can’t go home, the hardest thing we sometimes have to say to them is they “can’t stay there anymore.”
It’s awful. Hold on to that feeling for these next parts. I want to show you placement instability from a thousand miles up, where the people look like ants. I want to multiply that gut-wrench feeling by 17,000 to break through the numbness and help you remember that this is not okay.
The database that the Times used in its reporting is a public record in Florida. I don’t know that any newspaper had ever written a story using it before, and I commend them for doing so. I also have it. I’ve been reluctant to share it largely because it is 77.8 million data points and completely overwhelming. The article and the discussion around it, though, made me believe that it’s time.
Instead of presenting the data with statistics and aggregates, I’m giving it to you how I first began to really understand it: as maps. Every dot on the map is at least one placement for a child. The colors show what type of placement: blue is foster home, purple is relative, orange is group homes, and so forth. The size of the dot shows the length of time the child spent there, and the lines show the moves from placement to placement. Sometimes there are breaks in the lines when run episodes, visitations, or administrative entries intervene. For the most part, though, it’s one continuous path from a child’s first removal placement to their last.
Here is an example. Below is the child with the second most number of placement entries in the database: 286 lines out of a million. He was removed twice: once in 2009 and once in 2011. He spent 1,211 days in institutions, 678 with relatives, 543 in group homes, and 201 on run. Only 22 days were spent in foster care. This child averaged a new placement every 9.3 days, and was moved over 3,700 miles from placement to placement, back and forth along the I-4 corridor. He had approximately 817 DCF roommates over the years and his last entry in the database was a juvenile facility in Orange County. He’s probably long gone now.
Every subsequent placement dot on the map means another “you can’t stay there anymore.” It means leaving your things behind or taking what you can carry. It means a new house with new people and rules, including other foster kids who may have already staked out their territory. You have to learn a new way to turn on a shower and hope there’s a toothbrush for you if you didn’t bring one. You get a new time to eat dinner, go to bed, and wake up — and if you don’t adjust fast enough then you might get kicked out just for that and start all over again. Imagine if you woke up with a different family every 9.3 days for years. That is not okay.
When I was working on this project last year, I showed a former foster youth his map. It was complicated, with lots of dots and lines crisscrossing Florida. He stared at it quietly for a while, looked up and said, “I remember every one of those places.” He asked me to print it out, and now he keeps it in a folder and takes it out when he wants people to understand what foster care was like for him.
I’m publishing the maps for 17,305 anonymized kids using Tableau. Instead of showing all 280,000 children’s maps, I’ve instead created groups of children by notable categories. For best results, I suggest opening it on a computer or tablet and hitting the full-screen button in the bottom right corner. If you’re interested in the details on each category or the database itself, there are tabs at the top of the Tableau with more information.
Below is a list of the categories you can view using the drop-down menu on the Tableau. I’ll do write-ups on them later, but I hope you will take the time to explore through the maps and imagine what life was like for the kids in these groups. It’s important to note that most kids in foster care have 3 or fewer placements and reach permanency in reasonable times. Those aren’t the kids we’re looking at here.
10+ Baker Acts
10+ Correctional Placements
Incarcerated Over a Year
Substance Abuse Programs
10+ Night-to-Night Placements
Longest time in care
Top Movers – No Admin
Top Movers – Post Privatization
Mom & Baby Placements
Most Non-relative Placements
Went to Camp
Failed Reunifications (<30 days)
Group Home Dwellers
Before I end, there is one more map below that captures what it can mean to be in foster care. Child 310000648701 came into care on February 11, 2005. We don’t know his exact age (or his gender actually), but his placements are marked as “Traditional 0-5” foster homes. (In 2010 he makes the transition to “6-12” homes, so he has to be on the young side of 0-5 in 2005.) By the end of February, five foster parents had kicked him out. He had two placements in March, three in April, two in May, and only one in June — that one was a non-relative placement and lasted 41 days before they kicked him out, too. Then another placement for two days; then one for one day. Then he was placed with a relative who kept him for 918 days — that’s two and a half years — before the placement ended “in accordance with case plan” (which I think means pursuant to a court order) and he went back to foster care.
He bounced around some more through regular and therapeutic foster homes, landed briefly in a group home in 2010 for eight days of “respite” care, and was finally placed (in entry 42) in a non-relative placement that adopted him after 175 days.
This child had 36 placement providers and only one was a group home. Families kicked him out, again and again, and for much of that time he was under the age of five. He was with relatives for two and a half years without permanency, and then removed presumably by a judge. After six or so years, it ended in adoption, which is good. We can celebrate the adoption while simultaneously asking hard questions about his experience with 34 other families who failed to make that connection or possibly even try.
These maps tell stories that placement stability statistics cannot. Over the next few weeks I’ll share examples and more thoughts on the categories above. I hope they will have the same impact on others as they did on me.
A video of a child being forcibly removed from his mother has been in the news lately. It’s brutal to watch. A group of police officers and security guards yank at the one-year-old while another swings a taser wildly around the room at anyone who gets too close. The woman is on the floor. Her sin is apparently trespassing, i.e,. sitting on the floor instead of standing when there were no seats available in the four-hour line. The charges are later dropped because she was trespassing at a government office (to extend daycare for her child) where people are actually allowed to be, and people sit on the floor in airports all the time and nobody rips their kids from them. The harm’s already done to the child, though. Nobody in the crowd intervenes — they don’t want to get arrested, shot, or lose their place in line — but they document it for the world to see.
(For added absurdity and likely thanks to the word “baby” in the title, the video I watched was preambled with an ad for Zales’s Enchanted Disney collaboration wherein a rich-looking white lady finds a diamond ring on a table, puts it on, and imagines she’s become a Disney Princess®. She has no idea whose ring it even is, but it’s hers now (hey she deserves it). Nobody arrests her for theft and wrestles her baby out of her arms. She does not spend a few nights in jail for having the audacity of self-worth. She, in fact, lives happily ever after.)
Children are removed from their parents every day, and it frequently looks just like that video. It is traumatic for everyone (especially the child) and is supposed to be only for very exigent reasons. In Florida, we know that anywhere from 1,000 to 1,500 kids enter the foster care system each month. The video got me thinking about when those removals happen.
I had a lot of assumptions. I’ve heard that removals go up around the holidays, and that they go down in the summers when kids are home. I felt sure that fewer removals would happen on the weekends, but also that there should be no reason for that because kids would be in more danger when not in school. Removals are supposed to happen when they are unavoidable, not when they are convenient to the investigator.
And I thought: how many kids are actually removed on Christmas? That would be horrible.
As part of a project we are working on, our office came into possession of the entire Florida DCF placement database (anonymized and unforgivably massive). This database includes the removal dates and details of 280,839 kids going back to some who entered care in the 1980s. Looking only at the removals from January 1, 2007 to December 31, 2017, we have data on 156,357 kids. Some of those kids came into care multiple times, so the time period covers 181,799 removals. (Caveat: as with all real-world records, the data is only as good as it is. Given the large numbers here, it is reasonable to assume that errors are evenly spread out and not biased in any given direction.)
How many of those removals were on a holiday?
Since we have the dates for all the removals, it isn’t hard to count them. On a normal, non-holiday day, the average number of removals is about 46. For Christmas, the average is six. For Thanksgiving, it’s eight. Only 67 kids were removed on Christmas Day in Florida from 2007-2017.
There are a few takeaways there. First, holidays seem to suppress removal numbers, at least on the day of the holiday itself. The only holiday that appears above average is Columbus Day; but, with year-to-year variations, it — along with President’s Day and New Year’s Eve — is not statistically different than a normal non-holiday. There is no holiday where more kids are removed than average (though there are periods of the year when removals are up, discussed below).
So what about the theory that more kids are removed around the holidays? If you thought (like I did) that there would be a giant spike in removals before or after, say, Christmas — well, there isn’t. In fact, removals bump up slightly and then start dropping off around the week before Christmas (a.k.a. now). The slight bump isn’t enough to call a correction. A glance at DCF’s dashboard on investigations shows that Decembers are usually high points for closing investigations during the year, so these numbers are even more pronounced. Removals don’t pick up again until January when school is back in session.
Here is the year-round chart. This confirms that removals go down in the Summer and rise again in the new school year. If you thought sentimentality was what kept DCF from removing kids on Christmas, think again. You see similar holiday drops on the 4th of July and Veterans Day. The dips would be more pronounced for MLK Day, Labor Day and Memorial Day, except those holidays don’t happen on the same calendar day each year. Kids don’t get removed on holidays because investigators are on vacation.
So that’s the question: if only 15 kids had to be removed on the 4th of July, why did 48 have to be removed on the 7th of July?
What about the weekends?
On average, only 14 kids per day were removed on Saturdays and Sundays. The highest day of the week for removals was Thursday — probably because court is held one day after a removal, and nobody wants to go to court on a Saturday. At 14 removals per day, the weekend was on par with the 4th of July.
And, while we’re at it, what about time of day? Kids get removed during business hours. The later in the day, the higher the number of removals. Below you can see three distinct peaks in the following hours: 9:00am, 1:00pm, and 5:00pm. Only 416 kids in this dataset were removed in the 6:00am hour. Maybe children are safer before dawn? (Those 5,580 kids removed at midnight include entries without a valid time recorded — i.e., “00:00:00”. Don’t read into that spike.) I’m not including a graph because it’s messy, but if you look at the whole week, the highest rate of removals happened during the 5:00pm hour on Thursdays. No surprise.
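The midnight artifact is easy to handle once you know it’s there. A minimal sketch, with hypothetical timestamps, that tallies removals by hour and sets aside the suspect hour-zero entries:

```python
from collections import Counter
from datetime import datetime

# Hypothetical removal timestamps. Entries with no recorded time show up
# as exactly midnight ("00:00:00"), so hour 0 is set aside, not trusted.
stamps = [
    datetime(2016, 5, 2, 9, 15), datetime(2016, 5, 2, 13, 40),
    datetime(2016, 5, 3, 17, 5), datetime(2016, 5, 3, 0, 0),
]

by_hour = Counter(ts.hour for ts in stamps)
# Likely missing-time entries, not real 12am removals:
suspect_midnight = by_hour.pop(0, 0)
```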
So removals happen all the time, except holidays, weekends, and usually not outside of business hours. And they especially happen when investigators first get to work, after lunch, and right before they go home for the day. Something feels very wrong about that. It’s as if “risk” is also a product of convenience, which is not how child protection is supposed to work.
If the averages hold, around six (and up to twelve) kids will be removed on Christmas Day this year. Let those removals be necessary and kind.
The First DCA published statistics on its caseloads and decisions. But notably (as appellate judges like to say), the length of time the court takes to resolve cases was not reported. That motivated me to update the How Long do Appeals Take tableau.
The answer? Probably 120 to 170 days for a Dependency case, 260 to 576 days for a Criminal case, and 345 to 603 days for a Civil case.
In July of this year, the Florida foster care system did something unseen since February 2014: it shrank. For the first time in over 50 months, the year-over-year (YOY) change in out-of-home care numbers was negative: down 45 children. By August it was down 118, and the reports out this month for October show a contraction of 165.
While a reduction of 165 kids does not seem like much in a system with over 24,000 children in it, the slowing actually started in January 2016, when the system was growing at a staggering YOY rate of 2,540 kids. The month before that saw a YOY increase of 2,683, the fastest growth since data became available in 2003.
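The YOY numbers above are nothing fancy: each month’s count minus the count from the same month a year earlier. A toy sketch with made-up monthly figures:

```python
# Monthly out-of-home care counts, oldest first (hypothetical numbers
# showing steady growth of 1,200 kids per year).
counts = [20000 + 100 * i for i in range(24)]

def yoy_change(series):
    """YOY change for each month that has a month 12 back to compare to."""
    return [series[i] - series[i - 12] for i in range(12, len(series))]

changes = yoy_change(counts)
```

A contraction is just this list going negative, the way it did in July.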
There’s no official definition of a contraction period, or any way to tell if one is real or a blip. I actually sat on this post for a few months to make sure the trend was stable — we’ve had hurricanes, elections, resignations, and other unusual events recently, so I wanted to let those pass.
In reality, though, changes from positive to negative OOHC growth (expansions to contractions) do not happen quickly and appear largely driven by intentional policies and not outside events. The expansion under Secretary Hadi in 2004-2006 lasted 19 months and ended abruptly with the Secretary’s resignation from office in December 2006. The subsequent contraction during the Butterworth and Sheldon administrations lasted 50 months and never wavered until three months into Secretary Wilkins’ term. That change of direction occurred in March 2011, right in the middle of the public hearings and media frenzy on the Barahona case, though the contraction had been slowing since 2009 and was well on the way to reversing course even without the public outrage to speed it along. (That is, media frenzy tends to reinforce — not set — existing child welfare policy positions.)
Oddly, Secretary Wilkins’ DCF changed its expansionary course by August 2012 and entered a contraction period that continued sharply until the month that he resigned in July 2013. (I’ve never heard a good explanation for that period.) The tide immediately turned back toward expansion, continuing through Interim Secretary Jacobo and halfway through Secretary Carroll’s tenure. Growth peaked in December 2015 and then precipitously fell, flattened, and then fell again. (Note that steady growth is still growth — the chart above shows change. The charts below show the actual counts.)
Even though the system as a whole tends to move in unison, not every geographic area shifts course at the same time. The current contraction has been driven largely by sharp decreases in OOHC in three circuits — 17 (Broward), 11 (Miami), and 18 (Brevard) — which shrank by a total of 670 children over the previous year as of October. The top three growth circuits — 1 (Pensacola), 7 (DeLand), 9 (Orange/Osceola) — only grew by 127 kids in all.
Decreases were clustered largely, but not exclusively, in the southern regions. Here are the changes by county.
The contractions appear driven largely by reductions in removals. (I’ve chosen to use seasonal trends below to make the changes over time more clear. The actual numbers for removals and discharges have large but regular oscillations month to month due to seasonal effects like summer and national adoption day. The raw numbers are much harder to read.)
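For those wondering what a “seasonal trend” means mechanically: one simple stand-in is a trailing 12-month moving average, which washes out the regular monthly oscillation. A toy sketch (the charts’ actual smoothing method may differ):

```python
# Strip seasonal oscillation from monthly counts with a trailing
# 12-month moving average. The numbers below are made up: a flat
# 100 removals per month, plus a +24 spike in one month each year.
def trailing_12mo(series):
    """Average of each month with the 11 months before it."""
    return [sum(series[i - 11:i + 1]) / 12 for i in range(11, len(series))]

monthly = [100 + (24 if m % 12 == 6 else 0) for m in range(36)]
trend = trailing_12mo(monthly)
```

Every 12-month window catches the spike exactly once, so the trend comes out flat — which is the point: a real shift in removals moves this line, while the annual rhythm does not.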
You can see the same decreases in removals statewide. Here’s the statewide seasonal trend graph.
Here are the seasonal trend graphs for all circuits. If you notice anything interesting or know why any of these charts look the way they do, let me know.
The ACLU of Florida did a fantastic (and super data-heavy) study of racial and ethnic disparities in the Miami criminal justice system called Unequal Treatment. It’s amazing and you should check it out. The study reminded me that DCF publishes its own statistics on race, but they are buried in the Trend Report excel graveyard. This weekend I decided to dig them up for folks to see.
All of the diagrams in this post are in tableaus here:
The analysis is based on data from May 2017 to April 2018.
The gist: DCF’s out-of-home care population is racially disparate. Start with the hypothesis that child abuse is equally likely across all racial populations and that the system treats everyone the same; if so, the OOHC population should mostly look like the general population. It doesn’t. Black kids are over-represented by 33.1% in OOHC. So-called “Other” kids (mostly mixed race and Asian kids) are over-represented by 37.4%. White kids, on the other hand, are under-represented by 15.5%. Divide the non-white representation by the white representation and you get a ratio of approximately 1.59. This means non-white kids are 1.59x as represented as white kids.
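The over/under-representation figures and the disparity index reduce to two divisions. A sketch with round hypothetical shares (not DCF’s actual numbers):

```python
# Representation factor: a group's share of the OOHC population divided
# by its share of the general child population. 1.0 means proportional.
# The shares below are round hypotheticals for illustration only.
def representation(oohc_share, population_share):
    return oohc_share / population_share

nonwhite = representation(oohc_share=0.60, population_share=0.45)  # over-represented
white = representation(oohc_share=0.40, population_share=0.55)     # under-represented

# Disparity index: non-white representation relative to white representation.
disparity_index = nonwhite / white
```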
The differences aren’t uniform across the state. So your next hypothesis might be that whatever is causing the differences is systemic across the state. It’s not. Racial disparity in OOHC varies greatly among the counties, with some even having a bias toward White kids. The map below shows the disparity index (i.e., the ratio of non-white to white bias in the system). Orange counties have a Non-white bias. Blue counties have a White bias. (Counties with no statistically significant difference are shaded a neutral taupe color.)
What does a White-bias county look like? Dixie County has the out-of-home care numbers most biased toward White kids (it’s the dark blue county in the map above). The county has approximately 16,000 people, skews slightly Democrat, and has about 14.5% of its population below the poverty line. It is 77% rural and approximately 9.0% Black. It is the third-whitest county in Florida. Based on the race demographics, you would naively expect about four Black kids and 48 White kids in its OOHC population. What you get is 0 Black kids and 51 White. It’s not huge, but it is statistically significant. Compare the next example to see why.
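How can 0-of-51 be statistically significant? One way to check (an assumption on my part — the post doesn’t specify its test) is an exact binomial tail: with zero observed cases, the one-sided p-value collapses to (1 - p)**n. Using the figures quoted above:

```python
from math import comb

def binom_tail_at_most(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p): exact lower-tail probability."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Figures from the Dixie County example: 51 kids in care, roughly 9% of
# the county is Black, and 0 of the 51 kids are Black.
p_value = binom_tail_at_most(0, 51, 0.09)
```

The tail probability comes out under 1%, so even with only 51 kids, all-White is unlikely to be chance alone.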
What does a Non-white-bias county look like? Miami. Miami is obviously huge and Latin — it has 2.7M people and is 65% Hispanic (any race). It is 17.1% Black (non-Hispanic) and 15.4% White (non-Hispanic). About 51% of its population was foreign-born. It voted 63% Democratic in the 2016 elections. Its racial disparity is extreme: Non-white kids are over-represented by 140%, while White kids are under-represented by 45%. You would expect about 1,400 White kids in foster care in Miami — you get around 775. Meanwhile, you would expect 435 Black kids, and you find about 1,050. The racial disparity index is 4.25.
Racial disparity generally increases the deeper into the system you get. Your next hypothesis may be that once kids are in the system they are treated by the same rules and same players, and should therefore have similar outcomes. No again. DCF breaks its numbers down by the stage of a case: Investigation, Verification, Removal, OOHC, Spending more than 12 months in OOHC, and Discharge from care. Racial disparity tends to rise the farther into a case you get.
The disparity index numbers go something like this. Remember that a positive number means that Non-white kids are represented that many times more than White kids. A negative number is biased towards White kids. A (*) indicates no statistically significant value.
If you look at the Statewide column, you can see that Investigations have a stronger bias than Verifications. Once a child is in care, Discharges tend to be less racially biased than Removals, which actually increases OOHC and 12+ bias over time. The pattern is on steroids for Miami where non-White kids are 4.44x more represented in the 12+ population than White kids.
What about placements? If the process itself has racial bias in it, then it may be safe to bet that placements have a similar bias. This time we assume that the breakdown of kids in a given placement type will be the same as the general OOHC numbers. It’s not. Statewide, Non-white kids are over-represented in the Runaway, Facility, and Other populations, while White kids are slightly over-represented in the Relative and Foster Care populations. The non-Relative caregiver placement did not show any statistically significant differences, possibly because it’s a smaller population and therefore requires more difference to be significant.
The expected vs. actual values for Facility placements look like this.
Breaking the data down by county makes it harder to find statistically significant values. For example, only eight counties show significant differences in their facility placement numbers.
Four counties had significant disparities in their foster home placements, and three of those were White-biased.
This isn’t to say that the other counties are perfectly balanced. When we parse the numbers down to levels as tiny as “the four kids on runaway in Dixie County,” differences have to be more pronounced to distinguish a real effect from random noise, and the techniques I’m using here aren’t very good at small numbers. This data says “we can’t see a difference with the tools we’re using,” not “there is no difference.”
We can’t tell why from this data. This is also important: this type of observational data does not show causation or even hint at underlying causes. A lot of writing has been done on systemic racism in the child welfare system, and the expert consensus is that the disproportionalities we see here are a consequence of (1) interplay between poverty and race at the individual and community level, (2) heightened governmental surveillance and intervention in non-white communities (like the ACLU report highlights), and (3) personal bias in individual decision-makers (for example the family that only wants to adopt a child of their own race or the judge who is less likely to approve the removal of a child of their own race).
Even if these effects may be undetectable in an individual case (or, more likely, they’re one of a hundred other things going on in a case), when you multiply them across tens-of-thousands of kids and decades, you can start seeing the cumulative impact. You only have to remove one more kid than you discharge each month to grow a population over time. If racial factors increase removals and suppress discharges even marginally, that can explode into real differences that must be addressed. For a full discussion see Shattered Bonds: The Color of Child Welfare by Dorothy Roberts.
Our office has been handling more appeals lately, and I am learning the rhythm of the process a little better each day. Appeals seem to go like this: (1) you lose or win at trial and feel really emotional about it, (2) you file your appeal or get notice that someone filed one against you, and (3) you wait until you don’t feel anything at all anymore. Somewhere in there you file a brief. Then you wait some more and file other briefs. Sometimes a court reporter loses your transcripts and tells you your trial never happened. That can rouse some feelings, but they pass. Because mostly you just wait.
And while you’re waiting, everyone is constantly asking you how much longer they’ll have to wait. I haven’t yet mastered delivering earnest but vague statements of reassurance, such as “waiting is good because it means you haven’t lost yet.” I’ve heard that’s what appellate lawyers do. The people waiting don’t think waiting is good, because it means they haven’t won yet either.
I wanted a real answer to the question how much longer? I looked all over the internet. There were reports (cited below) on dependency and TPR appeals from 2010 and 2015, but no follow-ups or ongoing data on whether the reforms they recommended were successful. There were also lengthy reports on trial court clearance statistics. There was nothing (that I could find) on the district courts. So I decided to create something.
But first, an answer to How long do I have to wait on my appeal?
Probably at least 122 days for a dependency or TPR case.
Probably at least 293 days for anything else.
Probably a little longer if your case is in the Second DCA.
There. Quit asking.
The details are really interesting, if you’re into numbers. I put it all into a tableau, a quick version of which should appear here:
A full version with more stats is available here. (You can also use the link if the embedded tableau above didn’t show.) The full version breaks things down by DCA, case type, wins and losses, and originating divisions. I will commit to updating it for a few months to test for stability. I can’t promise after that.
The process – also, why didn’t this already exist?
My plan was basically to dive in, coming up for air every now and then to run the same “florida district court statistics” google search to see if I missed something. If anyone wants to recreate (or check) my work, here’s how it went.
Step 1 to finding an answer was to see what information I even had access to. All of the DCAs report their opinions on their websites. Three of them use a searchable system that creates spreadsheets by month. Two publish weekly text lists that you have to go through on your own. All of the DCAs use an online docket system that has a very convenient URL interface for going right to the case you want, unless that case is a dependency case.
Step 2 was figuring out how many cases they’re even putting out. My curiosity knows no bounds, but my actual time to spend on this was limited to a week or so. The answer was about 200 cases per month per DCA. That wasn’t bad. I planned to do a 10% sample of three months anyway, so 60 cases per DCA felt reasonable.
Step 3 was dealing with the fact that dependency cases are restricted from the public, so they are not available on the online docket system. Instead, I had to look them up on Westlaw and pull out their appellate case numbers and the outcomes. Fortunately, all of the DCAs use a linear case numbering system (for example 15-001 was filed earlier than 15-055 in the year 2015). Once I had case numbers and filing dates of known cases, I could interpolate the dependency filing dates to within a few days. That was good enough for these purposes.
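The interpolation in Step 3 is ordinary linear interpolation between known case numbers and filing dates. A sketch with illustrative numbers:

```python
from datetime import date, timedelta

# Two known (sequence number, filing date) pairs from public cases in the
# same year. The numbers are illustrative. A dependency case number that
# falls between them gets a filing date interpolated proportionally.
known = [(100, date(2015, 3, 1)), (200, date(2015, 4, 30))]

def interpolate_filing_date(case_no, lo, hi):
    """Linearly interpolate a filing date between two known cases."""
    (n0, d0), (n1, d1) = lo, hi
    frac = (case_no - n0) / (n1 - n0)
    return d0 + timedelta(days=round(frac * (d1 - d0).days))

estimated = interpolate_filing_date(150, known[0], known[1])
```

Because the DCAs number cases sequentially, this lands within a few days of the true filing date — good enough for these purposes.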
Step 4 was pulling all of the data on 459 cases and punching it into a spreadsheet. I then crunched some probabilities, ran some ANOVAs, generated a few survival reports, and made some tableaus based on what was statistically relevant. Some people have other hobbies, I guess.
What I could tell you about appellate cases would not fill a book
The sample size of three months was enough to get a big picture number, but not enough to do a lot of fine parsing of the data. As I add months in the future, maybe things will stand out. In the meantime, here is what I can say with a reasonable amount of confidence.
More people won than I expected, but still not that many. About 11% of the cases were “wins.” I defined win very broadly to include anything that wasn’t a straight affirmance or dismissal of a petition.
The DCAs were surprisingly similar. I was concerned that a 10% sample would result in garbage. It didn’t. All of the samples were roughly normal. The 1st, 3rd, 4th, and 5th all had numbers that were statistically indistinguishable. (A bigger dataset may eventually tease them apart, but this one didn’t.) Only the 2nd DCA stood out, with statistically longer processing times than the rest. For example, the 2nd DCA processed half of its cases in 282 days (+/- 18), while the statewide average was 208 days (+/- 11).
Below is a survival graph. Imagine the top left corner as the starting line, and each district racing to the bottom. The cumulative survival of 1.0 equals 100% of cases still open (“surviving”), and 0.4 would equal 40% of cases still open. The first to the bottom (measured in days across the bottom) is the fastest. As you can see below, four of the DCAs reach the bottom at about the same time. The 2nd DCA stands out as statistically different, in large part because it was slower off the line and struggled with its last 20% of cases compared to other districts.
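Mechanically, a survival curve like this is just the fraction of cases still open after each number of days. A toy sketch with hypothetical days-to-disposition:

```python
# Hypothetical days-to-disposition for a handful of closed appeals.
durations = [90, 120, 150, 210, 300]

def surviving_fraction(durs, t):
    """Fraction of cases still open ('surviving') after t days."""
    return sum(1 for d in durs if d > t) / len(durs)

still_open_at_150 = surviving_fraction(durations, 150)
```

Plot that fraction at every t and you get the racing curves below; the district whose curve hits zero first is the fastest.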
There wasn’t much variation among the types of cases, except for dependency. The average of 208 days also applied to case types, but dependency stood out as significantly faster. It took the DCAs only 121 days (+/- 2) to process half of their dependency cases. Civil and criminal were indistinguishable in this dataset, though more info later may tease them out as well. There weren’t enough probate, worker’s comp, family, or administrative appeals to say much about them individually yet.
You can see below that dependency cases resolved much faster than anything else. Civil, criminal, and family are pretty consistent in the middle. (Civil starts out slower, but eventually catches up to criminal.) The jaggy curves are probate and worker’s comp cases, which only had a few examples of each.
There was no measurable difference between writs and appeals. Again, a larger dataset may tease out a difference, but the line for writs and appeals were indistinguishable in this one.
Dependency “wins” follow the curve, but exaggerate it a little. Again again, there aren’t that many dependency wins either. But in this dataset at least, they tended to come out faster at first, then move closer to the win curve above after a case has already taken about 150 days. This is a slight exaggeration of the full win curve above, which also flips somewhere around 150 days.
You can’t predict a win based solely on amount of time open. Again, I want to stress that there are very few wins in general (11%) and they are scattered across the timeline. Knowing that an appeal has been open for 600 days doesn’t tell you much about its eventual outcome because the last 10% of the “loss” line accounts for far more cases than the last 10% of the “win” line.
Even though wins are a little faster or slower as a group, you can only know that after you know the outcome of the case. I ran the numbers — if you only know how many days the appeal took, you can predict a win with 5% accuracy. Adding in the DCA, appeal type, and division only gets you to 11% accuracy. That’s worse than guessing.
The good news is that this data supports a claim that the Court’s previous efforts (below) to speed up dependency appeals actually worked. Only time will tell if that is a stable finding or if I just happened to look at a particularly fast few months. Stay tuned.
I was wondering who holds the largest DCF contracts in Florida. The answer was right on the Florida Department of Financial Services website (thank you, Mr. Atwater), which lists public contracts with an ending date of February 29, 2012 or later.
I created a tableau where you can explore the DCF vendors by name, and see the list of contracts with details on their purpose, dates, and amounts. Click on the contracts to see their entry in the Florida Accountability Tracking System, including the contract documents, deliverables, payments, and audits.
The answer is that (depending on how you count) 12 organizations have received about half of DCF’s business since DFS started keeping track online. Of that dozen, six organizations were CBCs, four were behavioral health networks, and the final two work with sexually violent offenders and psychiatric patients. Smaller CBCs and BHNs make up the next 25%, with the final quarter split among hundreds of small organizations, all the way down to air conditioner repair jobs and copying fees.
The total contract amounts need to be understood with a dose of context. Our Kids, for example, is the vendor for $1 billion over 10 years (5 years original, with 5 years renewed). The payment amounts get adjusted year to year based on statutory and contractual terms. And the contract amount is not the total cost of the child welfare system when you also factor in state, county, municipal, and charitable funding for all of the people and organizations who make their living adjacent to the system (including, for now at least, me).
Still, a billion dollars is a huge contract, and the question of how it is being managed in Miami is particularly relevant today, when Our Kids’ leadership team has resigned but not yet left office and DCF is holding stakeholder interviews to determine how the people fighting to drink from that spigot think things are going.