Categories
Charts & Graphs

This is not okay – Visualizing foster care placement instability

Christopher O’Donnell and Nathaniel Lash at the Tampa Bay Times recently published an outstanding investigative piece on the harmful number of placement changes some kids experience while in foster care. They write:

Foster care is intended to be a temporary safety net for children at risk of neglect and abuse at home. Those children, many already traumatized, need love and stability to recover and thrive, child psychologists say.

But thousands of Florida’s foster children were put at risk of further psychological damage by an overburdened system that repeatedly bounced them from home to home and family to family, a Tampa Bay Times investigation found.

Times reporters analyzed more than one million child welfare records recording the movements or placements of about 280,000 foster children under Florida’s care between 2000 and 2017. They show that thousands of foster children led transient lives, many staying only a few nights in one place before being moved on to the next foster family or group home.

(Nowhere to call home: thousands of foster children move so much they risk psychological harm)

For those of us working in the system, placement instability isn’t news (many professionals are numb to it). But it is news for the rest of the world, whose picture of foster care is based on the heartstrings marketing of charitable agencies or the five o’clock stories of deaths and abuses seen in the news. The daily pains and indignities of foster care are rarely discussed by a public that doesn’t have the information or language to talk about them. I was so happy for the Times article because it gave people a new idea: many foster kids move around a lot, and that’s a bad thing.

This blog has a different audience, though. The readers here know about the system, often from deep in the weeds, handling cases or overseeing agencies and programs. We have seen placements disrupt both in 30-person staffings and via unexpected text messages that our client’s been kicked out of a home we thought would last — if not forever, at least for a week. We need no emotional priming on this topic. Short of telling a child he can’t go home, the hardest thing we sometimes have to say to them is they “can’t stay there anymore.”

It’s awful. Hold on to that feeling for these next parts. I want to show you placement instability from a thousand miles up, where the people look like ants. I want to multiply that gut-wrench feeling by 17,000 to break through the numbness and help you remember that this is not okay.

The database that the Times used in its reporting is a public record in Florida. I don’t know that any newspaper had ever written a story using it before, and I commend them for doing so. I also have it. I’ve been reluctant to share it largely because it is 77.8 million data points and completely overwhelming. The article and the discussion around it, though, made me believe that it’s time.

Instead of presenting the data with statistics and aggregates, I’m giving it to you how I first began to really understand it: as maps. Every dot on the map is at least one placement for a child. The colors show what type of placement: blue is foster home, purple is relative, orange is group homes, and so forth. The size of the dot shows the length of time the child spent there, and the lines show the moves from placement to placement. Sometimes there are breaks in the lines when run episodes, visitations, or administrative entries intervene. For the most part, though, it’s one continuous path from a child’s first removal placement to their last.

Here is an example. Below is the child with the second-highest number of placement entries in the database: 286 lines out of a million. He was removed twice: once in 2009 and once in 2011. He spent 1,211 days in institutions, 678 with relatives, 543 in group homes, and 201 on run. Only 22 days were spent in foster care. This child averaged a new placement every 9.3 days and was moved over 3,700 miles from placement to placement, back and forth along the I-4 corridor. He had approximately 817 DCF roommates over the years, and his last entry in the database was a juvenile facility in Orange County. He’s probably long gone now.
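The per-child numbers are straightforward arithmetic over the placement rows. Here is a minimal sketch of how they might be computed, assuming (hypothetically) that each placement row carries a start date, an end date, and a geocoded location; the real database also records placement types, run episodes, and administrative entries.

```python
from datetime import date
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))  # Earth radius ~3,956 miles

def placement_stats(placements):
    """placements: list of (start, end, lat, lon) tuples, sorted by start date.
    Returns (total_days, average_days_per_placement, total_miles_moved)."""
    total_days = sum((end - start).days for start, end, _, _ in placements)
    miles = sum(
        haversine_miles(a[2], a[3], b[2], b[3])
        for a, b in zip(placements, placements[1:])
    )
    return total_days, total_days / len(placements), miles
```

This only shows the shape of the computation; straight-line miles between placements will understate actual travel.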

Every subsequent placement dot on the map means another “you can’t stay there anymore.” It means leaving your things behind or taking what you can carry. It means a new house with new people and rules, including other foster kids who may have already staked out their territory. You have to learn a new way to turn on a shower and hope there’s a toothbrush for you if you didn’t bring one. You get a new time to eat dinner, go to bed, and wake up — and if you don’t adjust fast enough then you might get kicked out just for that and start all over again. Imagine if you woke up with a different family every 9.3 days for years. That is not okay.


“I remember every one of those places.”

former foster youth

When I was working on this project last year, I showed a former foster youth his map. It was complicated, with lots of dots and lines crisscrossing Florida. He stared at it quietly for a while, looked up and said, “I remember every one of those places.” He asked me to print it out, and now he keeps it in a folder and takes it out when he wants people to understand what foster care was like for him.

I’m publishing the maps for 17,305 anonymized kids using Tableau. Instead of showing all 280,000 children’s maps, I’ve created groups of children by notable categories. For best results, I suggest opening it on a computer or tablet and hitting the full-screen button in the bottom right corner. If you’re interested in the details of each category or the database itself, there are tabs at the top of the Tableau with more information.

The Tableau is located here.

Below is a list of the categories you can view using the drop-down menu on the Tableau. I’ll do write-ups on them later, but I hope you will take the time to explore through the maps and imagine what life was like for the kids in these groups. It’s important to note that most kids in foster care have 3 or fewer placements and reach permanency in reasonable times. Those aren’t the kids we’re looking at here.

10+ Baker Acts
10+ Correctional Placements
10+ Night-to-Night Placements
Adult Jail
APD Kids
Deaths
Failed Adoptions
Failed Reunifications (<30 days)
Group Home Dwellers
Hotel Kids
Incarcerated Over a Year
Institution Dwellers
Longest Time in Care
Married Out
Mom & Baby Placements
Most Non-relative Placements
Most Roommates
Out-of-State Dwellers
Shelter Dwellers
Substance Abuse Programs
Top Movers
Top Movers – No Admin
Top Movers – Post Privatization
Unlikely Adoptions
Went to Camp

Before I end, there is one more map below that captures what it can mean to be in foster care. Child 310000648701 came into care on February 11, 2005. We don’t know his exact age (or his gender actually), but his placements are marked as “Traditional 0-5” foster homes. (In 2010 he makes the transition to “6-12” homes, so he has to be on the young side of 0-5 in 2005.) By the end of February, five foster parents had kicked him out. He had two placements in March, three in April, two in May, and only one in June — that one was a non-relative placement and lasted 41 days before they kicked him out, too. Then another placement for two days; then one for one day. Then he was placed with a relative who kept him for 918 days — that’s two and a half years — before the placement ended “in accordance with case plan” (which I think means pursuant to a court order) and he went back to foster care.

He bounced around some more through regular and therapeutic foster homes, landed briefly in a group home in 2010 for eight days of “respite” care, and was finally placed (in entry 42) in a non-relative placement that adopted him after 175 days.

This child had 36 placement providers and only one was a group home. Families kicked him out, again and again, and for much of that time he was under the age of five. He was with relatives for two and a half years without permanency, and then removed presumably by a judge. After six or so years, it ended in adoption, which is good. We can celebrate the adoption while simultaneously asking hard questions about his experience with 34 other families who failed to make that connection or possibly even try.

These maps tell stories that placement stability statistics cannot. Over the next few weeks I’ll share examples and more thoughts on the categories above. I hope they will have the same impact on others as they did on me.


Only 67 kids have been removed on Christmas Day in Florida (since 2007)

A video of a child being forcibly removed from his mother has been in the news lately. It’s brutal to watch. A group of police officers and security guards yank at the one-year-old while another swings a taser wildly around the room at anyone who gets too close. The woman is on the floor. Her sin is apparently trespassing, i.e., sitting on the floor instead of standing when there were no seats available in the four-hour line. The charges are later dropped because she was trespassing at a government office (to extend daycare for her child) where people are actually allowed to be, and people sit on the floor in airports all the time and nobody rips their kids from them. The harm’s already done to the child, though. Nobody in the crowd intervenes — they don’t want to get arrested, shot, or lose their place in line — but they document it for the world to see.

(For added absurdity and likely thanks to the word “baby” in the title, the video I watched was preambled with an ad for Zales’s Enchanted Disney collaboration wherein a rich-looking white lady finds a diamond ring on a table, puts it on, and imagines she’s become a Disney Princess®. She has no idea whose ring it even is, but it’s hers now (hey she deserves it). Nobody arrests her for theft and wrestles her baby out of her arms. She does not spend a few nights in jail for having the audacity of self-worth. She, in fact, lives happily ever after.)

Children are removed from their parents every day, and it frequently looks just like that video. It is traumatic for everyone (especially the child) and is supposed to be only for very exigent reasons. In Florida, we know that anywhere from 1,000 to 1,500 kids enter the foster care system each month. The video got me thinking about when those removals happen.

I had a lot of assumptions. I’ve heard that removals go up around the holidays, and that they go down in the summers when kids are home. I felt sure that fewer removals would happen on the weekends, but also that there should be no reason for that because kids would be in more danger when not in school. Removals are supposed to happen when they are unavoidable, not when they are convenient to the investigator.

And I thought: how many kids are actually removed on Christmas? That would be horrible. 

As part of a project we are working on, our office came into possession of the entire Florida DCF placement database (anonymized and unforgivably massive). This database includes the removal dates and details of 280,839 kids, going back to some who entered care in the 1980s. Looking only at removals from January 1, 2007 to December 31, 2017, we have data on 156,357 kids. Some of those kids came into care multiple times, so the time period covers 181,799 removals. (Caveat: as with all real-world records, the data is only as reliable as the record-keeping behind it. Given the large numbers here, it is reasonable to assume that errors are spread evenly and not biased in any particular direction.)

How many of those removals were on a holiday?

Since we have the dates for all the removals, it isn’t hard to count them. On a normal, non-holiday day, the average number of removals is about 46. For Christmas, the average is six. For Thanksgiving, it’s eight. Only 67 kids were removed on Christmas Day in Florida from 2007-2017.
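The tally itself is simple once each removal has a date attached. A sketch under the assumption that the removals arrive as a flat list of `datetime.date` values (a fuller version would also count calendar days on which zero removals occurred, which matters for sparse days like Christmas):

```python
from collections import Counter
from datetime import date  # removal_dates are assumed to be date objects

def holiday_vs_baseline(removal_dates, holiday_md=(12, 25)):
    """Average removals per day on a fixed-date holiday (month, day),
    versus the average on every other day that appears in the data."""
    per_day = Counter(removal_dates)
    on_hol = [n for d, n in per_day.items() if (d.month, d.day) == holiday_md]
    off_hol = [n for d, n in per_day.items() if (d.month, d.day) != holiday_md]
    def avg(xs):
        return sum(xs) / len(xs) if xs else 0.0
    return avg(on_hol), avg(off_hol)
```

Floating holidays like Thanksgiving would need a lookup of actual dates per year rather than a fixed (month, day) pair.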

There are a few takeaways there. First, holidays seem to suppress removal numbers, at least on the day of the holiday itself. The only holiday that appears above average is Columbus Day, but with year-to-year variation it — along with Presidents’ Day and New Year’s Eve — is not statistically different from a normal non-holiday. There is no holiday on which more kids are removed than average (though there are periods of the year when removals are up, discussed below).

So what about the theory that more kids are removed around the holidays? If you thought (like I did) that there would be a giant spike in removals before or after, say, Christmas — well, there isn’t. In fact, removals bump up slightly and then start dropping off around the week before Christmas (a.k.a. now). The slight bump isn’t big enough to call it a correction. A glance at DCF’s dashboard on investigations shows that Decembers are usually high points for closing investigations, which makes the drop in removals even more pronounced. Removals don’t pick up again until January when school is back in session.

Here is the year-round chart. This confirms that removals go down in the Summer and rise again in the new school year. If you thought sentimentality was what kept DCF from removing kids on Christmas, think again. You see similar holiday drops on the 4th of July and Veterans Day. The dips would be more pronounced for MLK Day, Labor Day and Memorial Day, except those holidays don’t happen on the same calendar day each year. Kids don’t get removed on holidays because investigators are on vacation.


So that’s the question: if only 15 kids had to be removed on the 4th of July, why did 48 have to be removed on the 7th of July? 

What about the weekends?

On average only 14 kids per day were removed on Saturdays and Sundays. The highest day of the week for removals was Thursday — probably because court is held one day after a removal and nobody wants to go to court on a Saturday, for sure. At 14 removals per day, the weekend was on par with the 4th of July.

And, while we’re at it, what about time of day? Kids get removed during business hours. The later in the day, the higher the number of removals.  Below you can see three distinct peaks in the following hours: 9:00am, 1:00pm, and 5:00pm. Only 416 kids in this dataset were removed in the 6:00am hour. Maybe children are safer before dawn? (Those 5,580 kids removed at midnight include entries without a valid date — i.e., “00:00:00”. Don’t read into that spike.) I’m not including a graph because it’s messy, but if you look at the whole week the highest rates of removals happened during the 5:00pm hour on Thursdays. No surprise.
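The hour-of-day tally works the same way, with one hygiene step for the midnight artifact noted above. A sketch, assuming the timestamps arrive as `datetime` objects:

```python
from collections import Counter
from datetime import datetime

def removals_by_hour(timestamps):
    """Count removals per hour of day, dropping entries stamped exactly
    00:00:00, which encode a missing time rather than a midnight removal."""
    valid = [t for t in timestamps
             if not (t.hour == 0 and t.minute == 0 and t.second == 0)]
    return Counter(t.hour for t in valid)
```

The cost of this filter is losing any genuine midnight removals, but with 5,580 entries piled on one second, the bad-data explanation dominates.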

So removals happen all the time, except holidays, weekends, and usually not outside of business hours. And they especially happen when investigators first get to work, after lunch, and right before they go home for the day. Something feels very wrong about that. It’s as if “risk” is also a product of convenience, which is not how child protection is supposed to work.

If the averages hold, around six (and up to twelve) kids will be removed on Christmas Day this year. Let those removals be necessary and kind.


How long do Florida appeals take?

The First DCA published statistics on its caseloads and decisions. But notably (as appellate judges like to say), the length of time it takes to resolve cases was not reported. That motivated me to update the How Long do Appeals Take tableau.

Statewide – 10% sample of all cases Feb. to Nov. 2018

The answer? Probably 120 to 170 days for a Dependency case, 260 to 576 days for a Criminal case, and 345 to 603 days for a Civil case.

You can explore each district and case type at the tableau here.


It’s sort of official: Florida foster care is in a contraction period

In July of this year, the Florida foster care system did something unseen since February 2014: it shrank. For the first time in over 50 months, the year-over-year (YOY) change in out-of-home care numbers went down, by 45 children. By August it was down 118, and the reports out this month for October show a contraction of 165.

While a reduction of 165 kids does not seem like much in a system with over 24,000 children in it, the slowing actually started in January 2016, when the system was growing at a staggering YOY rate of 2,540 kids. Just the month before saw a YOY increase of 2,683 — the fastest growth since data became available in 2003.

There’s no official definition of a contraction period, or any way to tell if one is real or a blip. I actually sat on this post for a few months to make sure the trend was stable — we’ve had hurricanes, elections, resignations, and other unusual events recently, so I wanted to let those pass.

In reality, though, changes from positive to negative OOHC growth (expansions to contractions) do not happen quickly and appear largely driven by intentional policies and not outside events. The expansion under Secretary Hadi in 2004-2006 lasted 19 months and ended abruptly with the Secretary’s resignation from office in December 2006. The subsequent contraction during the Butterworth and Sheldon administrations lasted 50 months and never wavered until three months into Secretary Wilkins’ term. That change of direction occurred in March 2011, right in the middle of the public hearings and media frenzy on the Barahona case, though the contraction had been slowing since 2009 and was well on the way to reversing course even without the public outrage to speed it along. (That is, media frenzy tends to reinforce — not set — existing child welfare policy positions.)

Oddly, Secretary Wilkins’ DCF changed its expansionary course by August 2012 and entered a contraction period that continued sharply until the month that he resigned in July 2013. (I’ve never heard a good explanation for that period.) The tide immediately turned back toward expansion, continuing through Interim Secretary Jacobo and halfway through Secretary Carroll’s tenure. Growth peaked in December 2015 and then precipitously fell, flattened, and then fell again. (Note that steady growth is still growth — the chart above shows change. The charts below show the actual counts.)

Even though the system as a whole tends to move in unison, not every geographic area shifts course at the same time. The current contraction has been driven largely by sharp decreases in OOHC in three circuits — 17 (Broward), 11 (Miami), and 18 (Brevard) — which together shrank by 670 children over the previous year as of October. The top three growth circuits — 1 (Pensacola), 7 (DeLand), 9 (Orange/Osceola) — grew by only 127 kids in all.

Decreases were clustered largely, but not exclusively, in the southern regions. Here are the changes by county.

The contractions appear driven largely by reductions in removals.  (I’ve chosen to use seasonal trends below to make the changes over time more clear. The actual numbers for removals and discharges have large but regular oscillations month to month due to seasonal effects like summer and national adoption day. The raw numbers are much harder to read.)

You can see the same decreases in removals statewide. Here’s the statewide seasonal trend graph.

Here are the seasonal trend graphs for all circuits. If you notice anything interesting or know why any of these charts look the way they do, let me know.

 


Florida’s foster care system is racially biased, too

The ACLU of Florida did a fantastic (and super data-heavy) study of racial and ethnic disparities in the Miami criminal justice system called Unequal Treatment. It’s amazing and you should check it out. The study reminded me that DCF publishes its own statistics on race, but they are buried in the Trend Report Excel graveyard. This weekend I decided to dig them up for folks to see.

All of the diagrams in this post are in tableaus here:

The analysis is based on data from May 2017 to April 2018.

The gist: DCF’s out-of-home care population is racially disparate. Start with the hypothesis that child abuse is equally likely across all racial populations and that the system treats everyone the same; if so, the OOHC population should mostly look like the general population. It doesn’t. Black kids are over-represented by 33.1% in OOHC. So-called “Other” kids (mostly mixed-race and Asian kids) are over-represented by 37.4%. White kids, on the other hand, are under-represented by 15.5%. Dividing those representation rates gives a ratio of approximately 1.59, meaning non-white kids are represented at about 1.59 times the rate of white kids.

Actual vs. Expected child counts in OOHC
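The over- and under-representation figures above are just each group’s share of OOHC compared to its share of the general child population, and the disparity index divides the two. A sketch of that arithmetic (these definitions are my reading of the numbers, not DCF’s published methodology):

```python
def representation(group_in_oohc, oohc_total, group_in_pop, pop_total):
    """Relative representation of a group in out-of-home care:
    +0.331 means 33.1% over-represented, -0.155 means 15.5% under."""
    return (group_in_oohc / oohc_total) / (group_in_pop / pop_total) - 1.0

def disparity_index(nonwhite_rep, white_rep):
    """Ratio of non-white to white representation rates."""
    return (1.0 + nonwhite_rep) / (1.0 + white_rep)
```

With the statewide Black and White numbers alone, `disparity_index(0.331, -0.155)` comes out around 1.58, in the neighborhood of the 1.59 quoted above.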

The differences aren’t uniform across the state. So your next hypothesis might be that whatever is causing the differences would be systemic across the state. It’s not. Racial disparity in OOHC varies greatly among the counties, with some even having a bias towards White kids. The map below shows the disparity index (i.e., the ratio of non-white to white bias in the system). Orange counties have a Non-white bias. Blue counties have a White-bias. (Counties with no statistically significant difference are shaded a neutral taupe color.)

Racial disparity in DCF out-of-home care by county

What does a White-bias county look like? Dixie County has the out-of-home care numbers most biased toward White kids (it’s the dark blue county in the map above). The county has approximately 16,000 people, skews slightly Democrat, and has about 14.5% of its population below the poverty line. It is 77% rural and approximately 9.0% Black. It is the third-whitest county in Florida. Based on the race demographics, you would naively expect about four Black kids and 48 White kids in its OOHC population. What you get is 0 Black kids and 51 White. It’s not huge, but it is statistically significant. Compare the next example to see why.

Dixie County, OOHC expected vs. actual counts

 

What does a Non-white bias county look like? Miami. Miami is obviously huge and Latin — it has 2.7M people and is 65% Hispanic (any race). It is 17.1% Black (non-Hispanic) and 15.4% White (non-Hispanic). About 51% of its population was foreign-born. It voted 63% Democratic in the 2016 elections. Its racial disparity is extreme: Non-white kids are over-represented by 140%, while White kids are under-represented by 45%. You would expect about 1,400 White kids in foster care in Miami — you get around 775. Meanwhile, you would expect 435 Black kids, and you find about 1,050. The racial disparity index is 4.25.

 

Miami-Dade County, OOHC expected vs. actual counts

 

Racial disparity generally increases the deeper into the system you get. Your next hypothesis may be that once kids are in the system they are treated by the same rules and same players, and should therefore have similar outcomes. No again.  DCF breaks its numbers down by the stage of a case: Investigation, Verification, Removal, OOHC, Spending more than 12 months in OOHC, and Discharge from care.  Racial disparity tends to rise the farther into a case you get.

The disparity index numbers go something like this. Remember that a positive number means that Non-white kids are represented that many times more than White kids. A negative number is biased towards White kids. A (*) indicates no statistically significant value.

         Statewide    Dixie    Miami
Inv         1.93       2.11     3.78
Ver         1.73        *       2.97
Rmvl        1.6         *       3.79
OOHC        1.59     -19.03     4.25
12+         1.65                4.44
Dschrg      1.57        *       3.62

If you look at the Statewide column, you can see that Investigations have a stronger bias than Verifications. Once a child is in care, Discharges tend to be less racially biased than Removals, which actually increases OOHC and 12+  bias over time. The pattern is on steroids for Miami where non-White kids are 4.44x more represented in the 12+ population than White kids.

What about placements? If the process itself has racial bias in it, then it may be safe to bet that placements have a similar bias. This time we assume that the breakdown of kids in a given placement type will be the same as the general OOHC numbers. It’s not. Statewide, Non-white kids are over-represented in the Runaway, Facility, and Other populations, while White kids are slightly over-represented in the Relative and Foster Care populations. The non-Relative caregiver placement did not show any statistically significant differences, possibly because it’s a smaller population and therefore requires more difference to be significant.

The expected vs. actual values for Facility placements look like this.

Breaking the data down by county makes it harder to find statistically significant values. For example, only eight counties show significant differences in their facility placement numbers.

Disparity in facility placements

 

Four counties had significant disparities in their foster home placements, and three of those were White-biased.

Disparity in foster homes

 

This isn’t to say that the other counties are perfectly balanced. When we parse the numbers down to tiny levels like “the four kids on runaway in Dixie County,” differences have to be more pronounced to distinguish a real effect from random noise, and the techniques I’m using here aren’t very good at small numbers. This data says “we can’t see a difference with the tools we’re using,” not “there is no difference.”
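The small-numbers point can be made concrete with an exact binomial tail probability, which is roughly the kind of test at work here (a sketch of the idea, not necessarily the exact procedure behind the maps):

```python
from math import comb

def binom_tail_low(n, p, k):
    """P(X <= k) for X ~ Binomial(n, p): the chance of seeing k or fewer
    kids from a group among n, if the group makes up share p of the pool."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k + 1))
```

Seeing 0 Black kids among 51 placements when roughly 4 of 52 would be expected has a tail probability under 5%, so Dixie’s zero registers as significant; seeing 0 among only 4 kids would not, no matter how lopsided the county.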

We can’t tell why from this data. This is also important: this type of observational data does not show causation or even hint at underlying causes. A lot of writing has been done on systemic racism in the child welfare system, and the expert consensus is that the disproportionalities we see here are a consequence of (1) interplay between poverty and race at the individual and community level, (2) heightened governmental surveillance and intervention in non-white communities (like the ACLU report highlights), and (3) personal bias in individual decision-makers (for example the family that only wants to adopt a child of their own race or the judge who is less likely to approve the removal of a child of their own race).

Even if these effects are undetectable in an individual case (or, more likely, they’re one of a hundred other things going on in a case), when you multiply them across tens of thousands of kids and decades, you can start seeing the cumulative impact. You only have to remove one more kid than you discharge each month to grow a population over time. If racial factors increase removals and suppress discharges even marginally, that can explode into real differences that must be addressed. For a full discussion see Shattered Bonds: The Color of Child Welfare by Dorothy Roberts.


Is my appeal taking a long time? Probably not.

Our office has been handling more appeals lately, and I am learning the rhythm of the process a little better each day. Appeals seem to go like this: (1) you lose or win at trial and feel really emotional about it, (2) you file your appeal or get noticed that someone filed one on you, and (3) you wait until you don’t feel anything at all anymore. Somewhere in there you file a brief. Then you wait some more and file other briefs. Sometimes a court reporter loses your transcripts and tells you your trial never happened. That can rouse some feelings, but they pass. Because mostly you just wait.

And while you’re waiting, everyone is constantly asking you how much longer they’ll have to wait. I haven’t yet mastered delivering earnest but vague statements of reassurance, such as “waiting is good because it means you haven’t lost yet.” I’ve heard that’s what appellate lawyers do. The people waiting don’t think waiting is good, because it means they haven’t won yet either.

I wanted a real answer to the question how much longer? I looked all over the internet. There were reports (cited below) on dependency and TPR appeals from 2010 and 2015, but no follow-ups or ongoing data on whether those reforms were successful. There were also lengthy reports on trial court clearance statistics. There was nothing (that I could find) on the district courts. So I decided to create something.

But first, an answer to How long do I have to wait on my appeal?

Probably at least 122 days for a dependency or TPR case.

Probably at least 293 days for anything else. 

Probably a little longer if your case is in the Second DCA.

There. Quit asking.

I put it all in a tableau so you can play with it.

The details are really interesting, if you’re into numbers. A quick version of the tableau should appear here:

A full version with more stats is available here. (You can also use the link if the embedded tableau above didn’t show.) The full version breaks things down by DCA, case type, wins and losses, and originating divisions. I will commit to updating it for a few months to test for stability. I can’t promise after that.

The process – also, why didn’t this already exist?

My plan was basically to dive in, coming up for air every now and then to run the same “florida district court statistics” google search to see if I missed something. If anyone wants to recreate (or check) my work, here’s how it went.

Step 1 to finding an answer was to see what information I even had access to. All of the DCAs report their opinions on their websites. Three of them use a searchable system that creates spreadsheets by month. Two publish weekly text lists that you have to go through on your own.  All of the DCAs use an online docket system that has a very convenient URL interface for going right to the case you want, unless that case is a dependency case.

Step 2 was figuring out how many cases they’re even putting out. My curiosity knows no bounds, but my actual time to spend on this was limited to a week or so.  The answer was about 200 cases per month per DCA. That wasn’t bad.  I planned to do a 10% sample of three months anyway, so 60 cases per DCA felt reasonable.

Step 3 was dealing with the fact that dependency cases are restricted from the public, so they are not available on the online docket system. Instead, I had to look them up on Westlaw and pull out their appellate case numbers and the outcomes. Fortunately, all of the DCAs use a linear case numbering system (for example 15-001 was filed earlier than 15-055 in the year 2015). Once I had case numbers and filing dates of known cases, I could interpolate the dependency filing dates to within a few days. That was good enough for these purposes.
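The interpolation in Step 3 is simple linear estimation between anchor cases whose filing dates are public. A sketch of the idea (the helper and its inputs are illustrative, not my actual script):

```python
from datetime import date, timedelta

def interpolate_filing_date(case_no, anchors):
    """Estimate a filing date from a sequential case number.
    anchors: (case_no, filing_date) pairs from cases with public dockets,
    sorted by case number, all from the same filing year."""
    for (n0, d0), (n1, d1) in zip(anchors, anchors[1:]):
        if n0 <= case_no <= n1:
            frac = (case_no - n0) / (n1 - n0)
            return d0 + timedelta(days=round(frac * (d1 - d0).days))
    raise ValueError("case number falls outside the anchor range")
```

This assumes filings arrive at a roughly steady rate between anchors, which is why the estimates are good only to within a few days.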

Step 4 was pulling all of the data on 459 cases and punching it into a spreadsheet. I then crunched some probabilities, ran some ANOVAs, generated a few survival reports, and made some tableaus based on what was statistically relevant. Some people have other hobbies, I guess.

What I could tell you about appellate cases would not fill a book

The sample size of three months was enough to get a big picture number, but not enough to do a lot of fine parsing of the data. As I add months in the future, maybe things will stand out. In the meantime, here is what I can say with a reasonable amount of confidence.

More people won than I expected, but still not that many. About 11% of the cases were “wins.” I defined win very broadly to include anything that wasn’t a straight affirmance or dismissal of a petition.

The DCAs were surprisingly similar.  I was concerned that a 10% sample would result in garbage. It didn’t. All of the samples were roughly normal. The 1st, 3rd, 4th, and 5th all had numbers that were statistically indistinguishable. (A bigger dataset may eventually tease them apart, but this one didn’t.) Only the 2nd DCA stood out as statistically higher than the rest. For example, the 2nd DCA processed half of its cases in 282 days (+/- 18), while the statewide average was 208 days (+/- 11).

Below is a survival graph. Imagine the top left corner as the starting line, and each district racing to the bottom. The cumulative survival of 1.0 equals 100% of cases still open (“surviving”), and 0.4 would equal 40% of cases still open. The first to the bottom (measured in days across the bottom) is the fastest. As you can see below, four of the DCAs reach the bottom at about the same time.  The 2nd DCA stands out as statistically different, in large part because it was slower off the line and struggled with its last 20% of cases compared to other districts.
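For readers who want to reproduce the curves, the empirical survival function is just the fraction of cases still open after each duration. A minimal sketch over a sample of closed cases:

```python
def survival_curve(durations):
    """Empirical survival function for a set of closed appeals.
    durations: days from filing to resolution for each case.
    Returns (days, fraction_still_open) points for plotting."""
    n = len(durations)
    return [(t, sum(1 for d in durations if d > t) / n)
            for t in sorted(set(durations))]
```

A proper Kaplan-Meier estimator would also handle still-open (censored) cases; this version assumes every sampled case has already resolved.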

There wasn’t much variation among the types of cases, except for dependency. The statewide figure of 208 days also held across case types, but dependency stood out as significantly faster: it took the DCAs only 121 days (+/- 2) to process half of their dependency cases. Civil and criminal were indistinguishable in this dataset, though more data may eventually tease them apart. There weren’t enough probate, workers’ comp, family, or administrative appeals to say much about them individually yet.

You can see below that dependency cases resolved much faster than anything else. Civil, criminal, and family are pretty consistent in the middle. (Civil starts out slower, but eventually catches up to criminal.) The jagged curves are probate and workers’ comp, which had only a few cases each.

There was no measurable difference between writs and appeals. Again, a larger dataset may tease out a difference, but the lines for writs and appeals were indistinguishable in this one.

Dependency “wins” follow the win curve, but exaggerate it a little. Again, there aren’t that many dependency wins. But in this dataset at least, they tended to come out faster at first, then fall closer to the overall win curve once a case had been open about 150 days. That slightly exaggerates the full win curve above, which also flips somewhere around 150 days.

You can’t predict a win based solely on how long a case has been open. Again, I want to stress that there are very few wins in general (11%) and they are scattered across the timeline. Knowing that an appeal has been open for 600 days doesn’t tell you much about its eventual outcome because the last 10% of the “loss” line accounts for far more cases than the last 10% of the “win” line.

Even though wins are a little faster or slower as a group, you can only know that after you know the outcome of the case. I ran the numbers: if you only know how many days the appeal took, you can predict a win with 5% accuracy. Adding in the DCA, appeal type, and division only gets you to 11% accuracy. That’s worse than guessing.
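To make the “worse than guessing” point concrete, here is a back-of-the-envelope baseline using the roughly 459 cases and 11% win rate from above (the exact counts are my approximation):

```python
# Back-of-the-envelope baseline: with roughly 459 cases and an 11% win
# rate, always predicting "loss" is right about 89% of the time. Any
# predictor has to beat that bar before it is telling you anything.
total_cases = 459
wins = round(total_cases * 0.11)             # roughly 50 "wins"
losses = total_cases - wins
always_loss_accuracy = losses / total_cases  # about 0.89
```

A model that can only hit 11% accuracy is doing far worse than the dumbest possible guess of “loss, every time.”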

The takeaway

The good news is that this data supports the claim that the Court’s previous efforts (below) to speed up dependency appeals actually worked. Only time will tell if that is a stable finding or if I just happened to look at a particularly fast few months. Stay tuned.

More info on the appellate courts

Recommendations for Dependency and Termination of Parental Rights Appeals (Florida Supreme Court – Commission on District and Trial Court Performance & Accountability – April 2016)

Dependency and Termination of Parental Rights Appeals (Florida Supreme Court – Commission on District and Trial Court Performance & Accountability – June 2014)

A Review of the Florida District Courts of Appeal Boundaries and Workload (OPPAGA – February 2017)

Examining the Work of State Courts: An analysis of 2010 state court caseloads (Court Statistics Project – December 2012)

Clearance Rate Dashboards (flcourts.org – September 2009 to present)


The Florida foster care stats you’ve been waiting for: 24,152 kids in out-of-home care & which CBC is the biggest?

It’s been a while since I posted a pile of graphs, so here we go. Out-of-home care numbers are still rising. That’s probably not a good thing.

[Chart: out-of-home care and in-home supervision counts, October 2017]
The number of children in out-of-home care continues to rise. The number of children in their homes under DCF supervision is a little down. This suggests we’re removing kids faster than we can obtain permanency for them.

 

All is not equal in the types of placements. Relatives are still shouldering most of the burden for foster care.

[Chart: placement type breakdown, October 2017]
Relative placements continue to be the largest single placement type for kids, growing faster than licensed family care. Facility placements have been flat for some time. Non-relative placements have increased by more than 60% since funding was authorized for these placements in 2014.

 

Finally, which CBC is really the biggest? It has always depended on how you measure.

[Chart: regions ranked by size]
The relative ranking of each region has been fairly consistent, even with increases and decreases in the statewide number of children in care. Most regions are rebounding to their pre-2007 numbers, with the exception of the Southern Region, which is still comparatively low.

 

[Chart: CBCs ranked by size]
The rankings of CBCs by size have also been fairly consistent over time. Our Kids and Eckerd Hillsborough have consistently supervised the largest numbers of children. Again, some CBCs have rebounded, others haven’t. (Note these numbers don’t show where new CBCs took over for closed ones.)

 

[Chart: children in care per 1,000 children in the community, by CBC]
The rankings are very different when viewed as children in care per 1,000 children in the community. For example, Our Kids has consistently ranked as one of the smallest CBCs in proportion to the child population of its area.

 

The numbers here are from DCF’s Child Welfare Trend Reports. DCF keeps their own visualizations at the Child Welfare Dashboard. If you see anything wrong, interesting, or that-I-missed, let me know.  Have a great weekend!


Florida DCF contracts are worth billions – where does the money go?

I was wondering who holds the largest DCF contracts in Florida. The answer was right on the Florida Department of Financial Services website (thank you, Mr. Atwater), which lists public contracts with an ending date of February 29, 2012 or later.

I created a tableau where you can explore the DCF vendors by name, and see the list of contracts with details on their purpose, dates, and amounts. Click on the contracts to see their entry in the Florida Accountability Tracking System, including the contract documents, deliverables, payments, and audits.

The answer is that (depending on how you count) 12 organizations have received about half of DCF’s business since DFS started keeping track online.  Of that dozen, six organizations were CBCs, four were behavioral health networks, and the final two work with sexually violent offenders and psychiatric patients. Smaller CBCs and BHNs make up the next 25%, with the final quarter split among hundreds of small organizations, all the way down to air conditioner repair jobs and copying fees.

[Chart: DCF contract amounts by vendor]

The total contract amounts need to be understood with a dose of context. Our Kids, for example, is the vendor for $1 billion over 10 years (5 years original, with 5 years renewed). The payment amounts get adjusted year to year based on statutory and contractual terms. And the contract amount is not the total cost of the child welfare system when you also factor in state, county, municipal, and charitable funding for all of the people and organizations who make their living adjacent to the system (including, for now at least, me).

Still, a billion dollars is a huge contract, and the question of how it is being managed in Miami is particularly relevant today, when Our Kids’ leadership team has resigned but not yet left office and DCF is holding stakeholder interviews to determine how the people fighting to drink from that spigot think things are going.


Florida foster care numbers in historical context

I’ve added a new tab to the Child Welfare System Dashboard that shows the out-of-home care population annotated with historical events: governors’ tenures, legislative history, and Florida Supreme Court opinions. Each picture tells a different part of the story about what drives child welfare policy and the rise and fall of the OOHC population.

Leadership

The saying goes “personnel is policy.” The chart below shows historical trends in the statewide out-of-home care numbers as a function of who was secretary and who was governor. Be careful with the vertical axis: it starts at 14,000 to make room for the labels, so the proportions may be misleading. The current OOHC population is 19% lower than when Governor Crist took office and 30% higher than when Governor Scott took office.

[Chart: OOHC population by governor and secretary]

 

Legislation

The next chart shows major state legislative enactments. It’s a little hard to read because the major overhaul bills do lots of things all at once. That’s not exactly how you want to run an evidence-based system. (The chart below is just major legislation — the tableau.com version lets you view all legislation during the tenure of each secretary.)

The chart shows that Secretary Butterworth took over right after the passage of SB 1080, which greatly expanded both the permanency options and case planning procedures. OOHC plummeted during this time. Secretary Sheldon took over right after the passage of HB 7077, which restricted case plan duration to 9 months before triggering a TPR ground. The size of OOHC continued to decrease through this period. At the end of Secretary Sheldon’s tenure, the Legislature passed HB 5303, which changed the funding and risk pool models for CBCs.

With a new governor and a new secretary in 2011, the Legislature passed SB 2146, creating the Equity Allocation Model in statute, which based funding on the proportion of children in the CBC’s area (30%), the proportion of children in care (30%), the proportion of hotline workload (30%), and the proportion of reductions in the size of OOHC (10%). (This is a gross oversimplification.) By 2015, the formula had been tweaked multiple times to condition 80% of funding on the CBC’s size of OOHC, 15% on hotline workload, and only 5% on the size of the child population. Between January and June 2015, when the bill that cemented OOHC as the primary driver of funding was being considered in the Legislature, CBCs permitted their OOHC populations to grow by 2,000 children in what appears to be the largest and steepest consecutive increase in documented history. Aside from seasonal variations, OOHC rates have increased ever since.
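As a sketch of the weighted-share idea described above, and nothing more precise than that, here is the blending of proportions into a single funding share. The function name and the example proportions are hypothetical; the real statutory formula is considerably more complicated:

```python
# Hypothetical sketch of the Equity Allocation Model's weighted-share idea
# (a gross oversimplification, as noted above; weights from the post).
def funding_share(p_child_pop, p_in_care, p_hotline, p_reduction=0.0,
                  weights=(0.30, 0.30, 0.30, 0.10)):
    """Blend a CBC's statewide proportions into one funding share."""
    parts = (p_child_pop, p_in_care, p_hotline, p_reduction)
    return sum(w * p for w, p in zip(weights, parts))

# The same hypothetical CBC under the 2011 weights vs. the ~2015 rebalance
# (80% on OOHC size, 15% hotline, 5% child population, no reduction term).
share_2011 = funding_share(0.10, 0.12, 0.11, 0.05)
share_2015 = funding_share(0.10, 0.12, 0.11,
                           weights=(0.05, 0.80, 0.15, 0.0))
```

Under the rebalanced weights, this hypothetical CBC’s share rises simply because its OOHC proportion counts for more, which is the incentive problem described above.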

[Chart: OOHC population with major legislation]

 

Adjudication

This last chart shows Florida Supreme Court opinions. You can see that in the early 2000s, when the OOHC population was still high, the main issue was the level of due process afforded parents in dependency and TPR proceedings. (Very little.) Most opinions were answered with a legislative amendment. In 2004, the Court issued an opinion requiring a showing of substantial risk of harm to a child in order to terminate their parents’ rights. That was the Court’s last substantive child welfare opinion until 2015, when the Court held that parents have a constitutional right to effective assistance of counsel in TPR proceedings. The next year, the Court ruled that the existence of a bond between the parent and the child is not fatal to a TPR under Least Restrictive Means analysis.

It’s difficult to say that any particular Florida Supreme Court decision had a steering effect on child welfare policy. Instead, the opinions seem to have nudged the Legislature and Department to modify existing procedures to achieve their desired results.

[Chart: OOHC population with Florida Supreme Court opinions]

 

I suppose the takeaway is that if you want to shift child welfare policy you should become the Governor or Secretary. If you can’t do that, you should at least become a legislator. If you’re not interested in all that work, filing a lawsuit here or there can’t hurt. I’m apparently in the wrong business.


Florida DCF numbers for Nov: up 5.2% over last year, racial disparity dropping, Suncoast and NW still on fire

[Chart: statewide OOHC trend]

The DCF trend report numbers are out, and the expansion is continuing statewide. You can see in the numbers below that, due to its size, the Suncoast Region continues to be the largest driver of the statewide expansion, but the Northwest Region continues to show the largest individual growth. The Southeast Region contracted again this month, and the Southern Region flattened and may be entering an expansion soon.

The statewide racial disparity index has been dropping, driven largely by an increase in white children entering care in the northern parts of the state. That raises the question of whether policy changes are removing any “white bonus” that may factor into the decision to remove children. The southern areas still show incredibly high racial disparity indices that are worth digging into deeper.

I’m running short on time today, so the rest of the charts are below. Or you can explore in unbearably more detail at tableau.com.

Region      Nov 2015   Nov 2016   Change   % of OOHC   Contribution to Change
Statewide     22,556     23,737    +5.2%      100.0%                    +5.2%
Suncoast       6,063      6,774   +11.7%       28.5%                    +3.3%
NW             1,894      2,291   +21.0%        9.7%                    +2.0%
Central        4,959      5,267    +6.2%       22.2%                    +1.4%
NE             3,232      3,438    +6.4%       14.5%                    +0.9%
S              2,094      1,930    -7.8%        8.1%                    -0.6%
SE             4,314      4,037    -6.4%       17.0%                    -1.1%
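For anyone reproducing the derived columns, “Change” appears to be the year-over-year change rate, and “Contribution to Change” that rate weighted by the region’s share of current statewide OOHC. This is my reconstruction, not DCF’s documented method:

```python
# Reconstructing the derived columns for two rows of the table above.
# My reading: contribution = regional change rate x share of current
# statewide OOHC (which is why the contributions roughly sum to 5.2%).
regions = {"Suncoast": (6063, 6774), "NW": (1894, 2291)}
statewide_nov_2016 = 23737

derived = {}
for name, (nov_2015, nov_2016) in regions.items():
    change = (nov_2016 - nov_2015) / nov_2015   # year-over-year change
    share = nov_2016 / statewide_nov_2016       # % of statewide OOHC
    derived[name] = (round(change * 100, 1), round(change * share * 100, 1))
```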