The First DCA published statistics on its caseloads and decisions. But notably (as appellate judges like to say), the length of time the court takes to resolve cases was not reported. That gap motivated me to update the How Long do Appeals Take tableau.
The answer? Probably 120 to 170 days for a Dependency case, 260 to 576 days for a Criminal case, and 345 to 603 days for a Civil case.
In July of this year, the Florida foster care system did something unseen since February 2014: it shrank. For the first time in over 50 months, the year-over-year (YOY) change in the out-of-home care count was negative, down 45 children. By August the decline was 118, and the reports out this month for October show a contraction of 165.
While a reduction of 165 kids does not seem like much in a system with over 24,000 children in it, the slowing actually started in January 2016, when the system was growing at a staggering YOY rate of 2,540 kids. The month before that had posted a YOY increase of 2,683, the fastest growth since the data begins in 2003.
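If you want to reproduce the YOY numbers yourself, the calculation is nothing more than subtracting the same month from a year earlier. A minimal sketch, assuming a simple monthly file with my own column names, not DCF's:

```python
import pandas as pd

# Hypothetical monthly statewide out-of-home care counts from the trend reports.
oohc = (pd.read_csv("oohc_monthly.csv", parse_dates=["month"])
          .sort_values("month")
          .set_index("month"))

# Year-over-year change: this month's count minus the same month last year.
# (Assumes one row per month with no gaps.)
oohc["yoy_change"] = oohc["oohc_count"] - oohc["oohc_count"].shift(12)

# Negative values are contraction months.
print(oohc[["oohc_count", "yoy_change"]].tail(12))
```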
There’s no official definition of a contraction period, or any way to tell if one is real or a blip. I actually sat on this post for a few months to make sure the trend was stable — we’ve had hurricanes, elections, resignations, and other unusual events recently, so I wanted to let those pass.
In reality, though, changes from positive to negative OOHC growth (expansions to contractions) do not happen quickly and appear largely driven by intentional policies and not outside events. The expansion under Secretary Hadi in 2004-2006 lasted 19 months and ended abruptly with the Secretary’s resignation from office in December 2006. The subsequent contraction during the Butterworth and Sheldon administrations lasted 50 months and never wavered until three months into Secretary Wilkins’ term. That change of direction occurred in March 2011, right in the middle of the public hearings and media frenzy on the Barahona case, though the contraction had been slowing since 2009 and was well on the way to reversing course even without the public outrage to speed it along. (That is, media frenzy tends to reinforce — not set — existing child welfare policy positions.)
Oddly, Secretary Wilkins’ DCF changed its expansionary course by August 2012 and entered a contraction period that continued sharply until the month that he resigned in July 2013. (I’ve never heard a good explanation for that period.) The tide immediately turned back toward expansion, continuing through Interim Secretary Jacobo and halfway through Secretary Carroll’s tenure. Growth peaked in December 2015 and then precipitously fell, flattened, and then fell again. (Note that steady growth is still growth — the chart above shows change. The charts below show the actual counts.)
Even though the system as a whole tends to move in unison, not every geographic area shifts course at the same time. The current contraction has been driven largely by sharp decreases in OOHC in three circuits — 17 (Broward), 11 (Miami), and 18 (Brevard) — which together shrank by 670 children over the previous year as of October. The top three growth circuits — 1 (Pensacola), 7 (DeLand), 9 (Orange/Osceola) — only grew by 127 kids in all.
Decreases were clustered largely, but not exclusively, in the southern regions. Here are the changes by county.
The contractions appear driven largely by reductions in removals. (I've chosen to use seasonal trends below to make the changes over time clearer. The actual numbers for removals and discharges have large but regular month-to-month oscillations due to seasonal effects like summer and national adoption day. The raw numbers are much harder to read.)
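I won't pretend to document the exact smoothing behind the charts, but one standard way to pull the regular oscillations out of a removals series looks roughly like this (the file and column names are placeholders):

```python
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical monthly removal counts for one circuit.
removals = (pd.read_csv("removals_monthly.csv", parse_dates=["month"])
              .set_index("month")["removals"])

# Split the series into trend + seasonal + residual; period=12 captures
# the yearly cycle (summer, national adoption day, and so on).
parts = seasonal_decompose(removals, model="additive", period=12)

# The trend component is the de-seasonalized line the charts are meant to show.
print(parts.trend.dropna().tail())
```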
You can see the same decreases in removals statewide. Here’s the statewide seasonal trend graph.
Here are the seasonal trend graphs for all circuits. If you notice anything interesting or know why any of these charts look the way they do, let me know.
The ACLU of Florida did a fantastic (and super data-heavy) study of racial and ethnic disparities in the Miami criminal justice system called Unequal Treatment. It’s amazing and you should check it out. The study reminded me that DCF publishes its own statistics on race, but they are buried in the Trend Report excel graveyard. This weekend I decided to dig them up for folks to see.
All of the diagrams in this post are in tableaus here:
The analysis is based on data from May 2017 to April 2018.
The gist: DCF’s out-of-home care population is racially disparate. Start with the hypothesis that child abuse is equally likely across all racial populations and that the system treats everyone the same; if both were true, the OOHC population would mostly look like the general child population. It doesn’t. Black kids are over-represented in OOHC by 33.1%. So-called “Other” kids (which are mostly mixed-race and Asian kids) are over-represented by 37.4%. White kids, on the other hand, are under-represented by 15.5%. Divide the non-white representation ratio by the White representation ratio and you get approximately 1.59, meaning non-white kids are represented at about 1.59 times the rate of White kids.
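If you want to reproduce that arithmetic, here is a minimal sketch using the rounded figures above (it lands a hair under 1.59 because of the rounding, and I'm not claiming these few lines are how the tableau computes it):

```python
# Representation ratios implied by the figures above: a ratio of 1.0
# means a group's share of OOHC matches its share of the child population.
black_rep = 1 + 0.331   # Black kids over-represented by 33.1%
other_rep = 1 + 0.374   # "Other" kids over-represented by 37.4%
white_rep = 1 - 0.155   # White kids under-represented by 15.5%

# Disparity index: non-white representation relative to White representation
# (using the Black figure as the non-white example).
print(round(black_rep / white_rep, 2))   # ~1.58, close to the 1.59 in the post
```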
The differences aren’t uniform across the state. Your next hypothesis might be that whatever is causing the differences is systemic across the state. It’s not. Racial disparity in OOHC varies greatly among the counties, with some even showing a bias towards White kids. The map below shows the disparity index (i.e., the ratio of non-white to White bias in the system). Orange counties have a Non-white bias. Blue counties have a White bias. (Counties with no statistically significant difference are shaded a neutral taupe color.)
What does a White-bias county look like? Dixie County has the out-of-home care numbers most biased toward White kids (it’s the dark blue county in the map above). The county has approximately 16,000 people, skews slightly Democrat, and has about 14.5% of its population below the poverty line. It is 77% rural and approximately 9.0% Black. It is the third-whitest county in Florida. Based on the race demographics, you would naively expect about four Black kids and 48 White kids in its OOHC population. What you get is 0 Black kids and 51 White. It’s not huge, but it is statistically significant. Compare the next example to see why.
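For the statistically inclined, here's a minimal sketch of the kind of significance check I mean, using SciPy's exact binomial test on the Dixie numbers (I'm not claiming this is the precise test behind the tableau):

```python
from scipy.stats import binomtest

# Dixie County: 51 kids in OOHC, 0 of them Black, in a county whose
# demographics would naively predict roughly 9% Black (about 4 or 5 kids).
result = binomtest(k=0, n=51, p=0.09)
print(result.pvalue)   # comes out well under 0.05, so unlikely to be chance alone
```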
What does a Non-white bias county look like? Miami. Miami is obviously huge and Latin — it has 2.7M people, and is 65% Hispanic (any race). It is 17.1% Black (non-Hispanic) and 15.4% White (non-Hispanic). About 51% of its population was foreign-born. It voted 63% Democratic in the 2016 elections. Its racial disparity is extreme: Non-white kids are over-represented by 140%, while White kids are under-represented by 45%. You would expect about 1,400 White kids in foster care in Miami — you get around 775. Meanwhile, you would expect 435 Black kids, and you find about 1,050. The racial disparity index is 4.25.
Racial disparity generally increases the deeper into the system you get. Your next hypothesis may be that once kids are in the system they are treated by the same rules and same players, and should therefore have similar outcomes. No again. DCF breaks its numbers down by the stage of a case: Investigation, Verification, Removal, OOHC, Spending more than 12 months in OOHC, and Discharge from care. Racial disparity tends to rise the farther into a case you get.
The disparity index numbers go something like this. Remember that a positive number means Non-white kids are represented that many times more than White kids. A negative number means the bias runs toward White kids. A (*) indicates no statistically significant difference.
If you look at the Statewide column, you can see that Investigations have a stronger bias than Verifications. Once a child is in care, Discharges tend to be less racially biased than Removals, which means the bias in the OOHC and 12+ populations actually compounds over time. The pattern is on steroids in Miami, where non-White kids are 4.44x more represented in the 12+ population than White kids.
What about placements? If the process itself has racial bias in it, then it may be safe to bet that placements have a similar bias. This time we assume that the breakdown of kids in a given placement type will be the same as the general OOHC numbers. It’s not. Statewide, Non-white kids are over-represented in the Runaway, Facility, and Other populations, while White kids are slightly over-represented in the Relative and Foster Care populations. The non-Relative caregiver placement did not show any statistically significant differences, possibly because it’s a smaller population and therefore requires more difference to be significant.
The expected vs. actual values for Facility placements look like this.
Breaking the data down by county makes it harder to find statistically significant values. For example, only eight counties show significant differences in their facility placement numbers.
Four counties had significant disparities in their foster home placements, and three of those were White-biased.
This isn’t to say that the other counties are perfectly balanced. When we parse the numbers down to levels as tiny as “the four kids on runaway in Dixie County,” differences have to be more pronounced to distinguish a real effect from random noise, and the techniques I’m using here aren’t very good at small numbers. This data says “we can’t see a difference with the tools we’re using,” not “there is no difference.”
We can’t tell why from this data. This is also important: this type of observational data does not show causation or even hint at underlying causes. A lot of writing has been done on systemic racism in the child welfare system, and the expert consensus is that the disproportionalities we see here are a consequence of (1) interplay between poverty and race at the individual and community level, (2) heightened governmental surveillance and intervention in non-white communities (like the ACLU report highlights), and (3) personal bias in individual decision-makers (for example the family that only wants to adopt a child of their own race or the judge who is less likely to approve the removal of a child of their own race).
Even if these effects may be undetectable in an individual case (or, more likely, they’re one of a hundred other things going on in a case), when you multiply them across tens-of-thousands of kids and decades, you can start seeing the cumulative impact. You only have to remove one more kid than you discharge each month to grow a population over time. If racial factors increase removals and suppress discharges even marginally, that can explode into real differences that must be addressed. For a full discussion see Shattered Bonds: The Color of Child Welfare by Dorothy Roberts.
Our office has been handling more appeals lately, and I am learning the rhythm of the process a little better each day. Appeals seem to go like this: (1) you lose or win at trial and feel really emotional about it, (2) you file your appeal or get notice that someone filed one against you, and (3) you wait until you don’t feel anything at all anymore. Somewhere in there you file a brief. Then you wait some more and file other briefs. Sometimes a court reporter loses your transcripts and tells you your trial never happened. That can rouse some feelings, but they pass. Because mostly you just wait.
And while you’re waiting, everyone is constantly asking you how much longer they’ll have to wait. I haven’t yet mastered delivering earnest but vague statements of reassurance, such as “waiting is good because it means you haven’t lost yet.” I’ve heard that’s what appellate lawyers do. The people waiting don’t think waiting is good, because it means they haven’t won yet either.
I wanted a real answer to the question how much longer? I looked all over the internet. There were reports (cited below) on dependency and TPR appeals from 2010 and 2015, but no follow-ups or ongoing data on whether those reforms were successful. There were also lengthy reports on trial court clearance statistics. There was nothing (that I could find) on the district courts. So I decided to create something.
But first, an answer to How long do I have to wait on my appeal?
Probably at least 122 days for a dependency or TPR case.
Probably at least 293 days for anything else.
Probably a little longer if your case is in the Second DCA.
There. Quit asking.
I put it all in a tableau so you can play with it.
The details are really interesting, if you’re into numbers. I put it all into a tableau, a quick version of which should appear here:
A full version with more stats is available here. (You can also use the link if the embedded tableau above didn’t show.) The full version breaks things down by DCA, case type, wins and losses, and originating divisions. I will commit to updating it for a few months to test for stability. I can’t promise after that.
The process – also, why didn’t this already exist?
My plan was basically to dive in, coming up for air every now and then to run the same “florida district court statistics” google search to see if I missed something. If anyone wants to recreate (or check) my work, here’s how it went.
Step 1 to finding an answer was to see what information I even had access to. All of the DCAs report their opinions on their websites. Three of them use a searchable system that creates spreadsheets by month. Two publish weekly text lists that you have to go through on your own. All of the DCAs use an online docket system that has a very convenient URL interface for going right to the case you want, unless that case is a dependency case.
Step 2 was figuring out how many cases they’re even putting out. My curiosity knows no bounds, but my actual time to spend on this was limited to a week or so. The answer was about 200 cases per month per DCA. That wasn’t bad. I planned to do a 10% sample of three months anyway, so 60 cases per DCA felt reasonable.
Step 3 was dealing with the fact that dependency cases are restricted from the public, so they are not available on the online docket system. Instead, I had to look them up on Westlaw and pull out their appellate case numbers and the outcomes. Fortunately, all of the DCAs use a linear case numbering system (for example 15-001 was filed earlier than 15-055 in the year 2015). Once I had case numbers and filing dates of known cases, I could interpolate the dependency filing dates to within a few days. That was good enough for these purposes.
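If you're recreating step 3, the interpolation itself is the easy part once you have a handful of known (case number, filing date) pairs from the public docket. A minimal sketch, with made-up sequence numbers and dates:

```python
import numpy as np
import pandas as pd

# Known sequence numbers within one filing year (the "055" in 15-055),
# paired with filing dates pulled from the public docket. All made up.
known_seq = np.array([10, 550, 1200, 2000])
known_dates = pd.to_datetime(["2015-01-05", "2015-03-20", "2015-06-15", "2015-10-01"])

def estimate_filing_date(seq):
    """Linearly interpolate a filing date from the sequential case number."""
    ordinal = np.interp(seq, known_seq, known_dates.map(pd.Timestamp.toordinal))
    return pd.Timestamp.fromordinal(int(round(ordinal)))

# A confidential dependency case whose number came from Westlaw:
print(estimate_filing_date(875))   # should land within a few days of the true date
```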
Step 4 was pulling all of the data on 459 cases and punching it into a spreadsheet. I then crunched some probabilities, ran some ANOVAs, generated a few survival reports, and made some tableaus based on what was statistically relevant. Some people have other hobbies, I guess.
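I won't reproduce the whole spreadsheet here, but the ANOVA piece is close to a one-liner once the days-to-disposition are grouped by district. A sketch, assuming a CSV export with `dca` and `days` columns (my column names, not any official format):

```python
import pandas as pd
from scipy.stats import f_oneway

# Hypothetical export of the hand-built spreadsheet: one row per case,
# with the DCA and the number of days from filing to disposition.
cases = pd.read_csv("appeals_sample.csv")

# One-way ANOVA: do mean days-to-disposition differ across the DCAs?
groups = [grp["days"].values for _, grp in cases.groupby("dca")]
stat, pvalue = f_oneway(*groups)
print(f"F = {stat:.2f}, p = {pvalue:.4f}")
```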
What I could tell you about appellate cases would not fill a book
The sample size of three months was enough to get a big picture number, but not enough to do a lot of fine parsing of the data. As I add months in the future, maybe things will stand out. In the meantime, here is what I can say with a reasonable amount of confidence.
More people won than I expected, but still not that many. About 11% of the cases were “wins.” I defined win very broadly to include anything that wasn’t a straight affirmance or dismissal of a petition.
The DCAs were surprisingly similar. I was concerned that a 10% sample would result in garbage. It didn’t. All of the samples were roughly normal. The 1st, 3rd, 4th, and 5th all had numbers that were statistically indistinguishable. (A bigger dataset may eventually tease them apart, but this one didn’t.) Only the 2nd DCA stood out as statistically slower than the rest. For example, the 2nd DCA processed half of its cases in 282 days (+/- 18), while the statewide average was 208 days (+/- 11).
Below is a survival graph. Imagine the top left corner as the starting line, and each district racing to the bottom. The cumulative survival of 1.0 equals 100% of cases still open (“surviving”), and 0.4 would equal 40% of cases still open. The first to the bottom (measured in days across the bottom) is the fastest. As you can see below, four of the DCAs reach the bottom at about the same time. The 2nd DCA stands out as statistically different, in large part because it was slower off the line and struggled with its last 20% of cases compared to other districts.
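The curves themselves are nothing exotic. With every case in the sample already closed, there's no censoring to worry about, and the survival function is just the fraction of cases still open after each number of days. A sketch using the same hypothetical spreadsheet as above:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

cases = pd.read_csv("appeals_sample.csv")   # hypothetical columns: dca, days

for dca, grp in cases.groupby("dca"):
    days = np.sort(grp["days"].values)
    # Fraction of this DCA's cases still open ("surviving") after each duration.
    still_open = 1.0 - np.arange(1, len(days) + 1) / len(days)
    plt.step(days, still_open, where="post", label=dca)

plt.xlabel("Days since filing")
plt.ylabel("Fraction of cases still open")
plt.legend()
plt.show()
```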
There wasn’t much variation among the types of cases, except for dependency. The average of 208 days also applied to case types, but dependency stood out as significantly faster. It took the DCAs only 121 days (+/- 2) to process half of their dependency cases. Civil and criminal were indistinguishable in this dataset, though more info later may tease them out as well. There weren’t enough probate, worker’s comp, family, or administrative appeals to say much about them individually yet.
You can see below that dependency cases resolved much faster than anything else. Civil, criminal, and family are pretty consistent in the middle. (Civil starts out slower, but eventually catches up to criminal.) The jaggy curves are probate and worker’s comp cases, which had only a few examples each.
There was no measurable difference between writs and appeals. Again, a larger dataset may tease out a difference, but the lines for writs and appeals were indistinguishable in this one.
Dependency “wins” follow the curve, but exaggerate it a little. Again again, there aren’t that many dependency wins either. But in this dataset at least, they tended to come out faster at first, then move closer to the win curve above after a case has already taken about 150 days. This is a slight exaggeration of the full win curve above, which also flips somewhere around 150 days.
You can’t predict a win based solely on amount of time open. Again, I want to stress that there are very few wins in general (11%) and they are scattered across the timeline. Knowing that an appeal has been open for 600 days doesn’t tell you much about its eventual outcome because the last 10% of the “loss” line accounts for far more cases than the last 10% of the “win” line.
Even though wins are a little faster or slower as a group, you can only know that after you know the outcome of the case. I ran the numbers — if you only know how many days the appeal took, you can predict a win with 5% accuracy. Adding in the DCA, appeal type, and division only gets you to 11% accuracy. That’s worse than guessing.
The good news is that this data supports a claim that the Court’s previous efforts (below) to speed up dependency appeals actually worked. Only time will tell if that is a stable finding or if I just happened to look at a particularly fast few months. Stay tuned.
I was wondering who holds the largest DCF contracts in Florida. The answer was right on the Florida Department of Financial Services website (thank you, Mr. Atwater), which lists public contracts with an ending date of February 29, 2012 or later.
I created a tableau where you can explore the DCF vendors by name, and see the list of contracts with details on their purpose, dates, and amounts. Click on the contracts to see their entry in the Florida Accountability Tracking System, including the contract documents, deliverables, payments, and audits.
The answer is that (depending on how you count) 12 organizations have received about half of DCF’s business since DFS started keeping track online. Of that dozen, six organizations were CBCs, four were behavioral health networks, and the final two work with sexually violent offenders and psychiatric patients. Smaller CBCs and BHNs make up the next 25%, with the final quarter split among hundreds of small organizations, all the way down to air conditioner repair jobs and copying fees.
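For anyone who wants to check the “about half” figure, it's a simple cumulative-share calculation over the contract export. A sketch, with column names that are my guesses rather than the official FACTS layout:

```python
import pandas as pd

# Hypothetical export of DCF contracts from the FACTS site.
contracts = pd.read_csv("dcf_contracts.csv")   # columns: vendor, amount

by_vendor = (contracts.groupby("vendor")["amount"].sum()
                      .sort_values(ascending=False))

# Cumulative share of total contract dollars, largest vendors first.
cumulative_share = by_vendor.cumsum() / by_vendor.sum()

print(cumulative_share.head(15))
print((cumulative_share < 0.5).sum() + 1, "vendors account for roughly half the dollars")
```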
The total contract amounts need to be understood with a dose of context. Our Kids, for example, is the vendor for $1 billion over 10 years (5 years original, with 5 years renewed). The payment amounts get adjusted year to year based on statutory and contractual terms. And the contract amount is not the total cost of the child welfare system, which also includes state, county, municipal, and charitable funding for all of the people and organizations who make their living adjacent to the system (including, for now at least, me).
Still, a billion dollars is a huge contract and the question of how it is being managed in Miami is particularly relevant today when Our Kids’ leadership team has resigned but not left office and DCF is holding stakeholder interviews to determine how people fighting to drink from that spigot think things are going.
I’ve added a new tab to the Child Welfare System Dashboard that shows the out-of-home care population annotated with historical events: governors’ tenures, legislative history, and Florida Supreme Court opinions. Each picture tells a different part of the story about what drives child welfare policy and the rise and fall of the OOHC population.
The saying goes “personnel is policy.” The chart below shows historical trends in the statewide out-of-home care numbers as a function of both who was secretary and who was governor. Be careful about the vertical axis — it starts at 14,000 to make room for the labels, so the proportions may be misleading. The current OOHC population is 19% lower than when Governor Crist took office and 30% higher than when Governor Scott took office.
The next chart shows major state legislative enactments. It’s a little hard to read because the major overhaul bills do lots of things all at once. That’s not exactly how you want to run an evidence-based system. (The chart below is just major legislation — the tableau.com version lets you view all legislation during the tenure of each secretary.)
The chart shows that Secretary Butterworth took over right after the passage of SB 1080, which greatly expanded both the permanency options and case planning procedures. OOHC plummeted during this time. Secretary Sheldon took over right after the passage of HB 7077, which restricted case plan duration to 9 months before triggering a TPR ground. The size of OOHC continued to decrease through this period. At the end of Secretary Sheldon’s tenure, the Legislature passed HB 5303, which changed the funding and risk pool models for CBCs.
With a new governor and a new secretary in 2011, the Legislature passed SB 2146 creating the Equity Allocation Model in statute, which based funding on proportions of children in the CBC’s area (30%), proportion of children in care (30%), proportion of hotline workload (30%), and proportion of reductions in the size of OOHC (10%). (This is a gross oversimplification.) By 2015, the formula had been tweaked multiple times to condition 80% of funding on the CBC’s size of OOHC, 15% on the hotline workload, and only 5% on the size of the child population. Between January and June 2015 when the bill that cemented OOHC as the primary driver of funding was being considered in the legislature, CBCs permitted their OOHC populations to grow by 2,000 children in what appears to be the largest and steepest consecutive increase in documented history. Aside from seasonal variations, OOHC rates have increased ever since.
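To make the formula change concrete, here is a toy allocation for one hypothetical CBC under the 2011 weights versus the 2015 weights. It carries the same gross-oversimplification warning as above; the real statute has floors, adjustments, and other terms I'm ignoring:

```python
# A hypothetical CBC's share of each statewide measure.
shares = {
    "child_population": 0.10,   # share of children living in the CBC's area
    "children_in_care": 0.14,   # share of the statewide OOHC population
    "hotline_workload": 0.11,   # share of hotline/investigation workload
    "oohc_reduction":   0.05,   # share of statewide reductions in OOHC
}

# 2011 weights (SB 2146) vs. the 2015 tweak described above.
weights_2011 = {"child_population": 0.30, "children_in_care": 0.30,
                "hotline_workload": 0.30, "oohc_reduction": 0.10}
weights_2015 = {"child_population": 0.05, "children_in_care": 0.80,
                "hotline_workload": 0.15, "oohc_reduction": 0.00}

def allocation_share(shares, weights):
    """Weighted share of the statewide appropriation under a given formula."""
    return sum(weights[k] * shares[k] for k in weights)

print(f"2011 formula: {allocation_share(shares, weights_2011):.1%}")
print(f"2015 formula: {allocation_share(shares, weights_2015):.1%}")
```

In this toy example the CBC's OOHC share (14%) is larger than its child-population share (10%), so the 2015 weights hand it a bigger slice of the money, which is exactly the incentive problem the 2015 growth numbers suggest.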
This last chart shows Florida Supreme Court opinions. You can see that in the early 2000’s, when the OOHC population was still high, the main issue was the level of due process afforded parents in dependency and TPR proceedings. (Very little.) Most opinions were answered with a legislative amendment. In 2004, the Court issued an opinion requiring a showing of substantial risk of harm to a child in order to terminate their parents’ rights. That was the Court’s last substantive child welfare opinion until 2015, when the Court held that parents have a constitutional right to effective assistance of counsel in TPR proceedings. The next year, the Court ruled that the existence of a bond between the parent and the child is not fatal to a TPR under Least Restrictive Means analysis.
It’s difficult to say that any particular Florida Supreme Court decision had a steering effect on child welfare policy. Instead, the opinions seem to have nudged the Legislature and Department to modify existing procedures to achieve their desired results.
I suppose the take-away is that if you want to shift child welfare policy you should become the Governor or Secretary. If you can’t do that, you should at least become a legislator. If you’re not interested in all that work, filing a lawsuit here or there can’t hurt. I’m apparently in the wrong business.
The DCF trend report numbers are out, and the expansion is continuing statewide. You can see in the chart below that, due to its size, the Suncoast Region continues to be the largest driver of the statewide expansion, but the Northwest Region continues to show the fastest percentage growth. The Southeast Region contracted again this month, and the Southern Region flattened and may be entering an expansion soon.
The statewide racial disparity index has been dropping, driven largely by an increase in white children entering care in the northern parts of the state. That raises the question of whether policy changes are removing any “white bonus” that may factor into the decision to remove children. The southern areas still show incredibly high racial disparity indices that are worth digging into deeper.
I’m running short on time today, so the rest of the charts are below. Or you can explore in unbearably more detail at tableau.com.
I have read a lot in the news lately about the foster care crisis. By many accounts, the growth in out-of-home care (OOHC) has been driven in part by a growing epidemic of drug cases. In previous posts, I’ve shown that the data does not exactly bear that out and that the growth is more likely a result of policy changes, especially policies on how cases with available relatives are handled. I don’t deny there’s been an uptick in drug cases, but the expansion is probably a result of fewer cases being referred to voluntary services while the children stay with a relative under a safety plan.
Another theme in these news reports is the lack of foster homes. So let’s take a look at those numbers. In March 2016, I requested the count of licensed beds in each zip code in Florida. The data went into the Licensed Placements by CBC & Zip Map. Last week I made the same request again, and can now compare the numbers between March and now.
The results: the number of licensed beds has grown 0.8% while the number of children in OOHC has grown 3.9%, or almost 5x as fast.
While 0.8% statewide is probably not a significant change, the numbers swing higher and lower around the state. Losing 16 beds in Miami-Dade County is essentially no change, while gaining 60 beds in Duval County is an almost 8% expansion.
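The county comparisons are just percent changes between the two snapshots. A minimal sketch, assuming two bed-count files with `county` and `beds` columns (my names for the columns, not the format the data arrived in):

```python
import pandas as pd

march = pd.read_csv("licensed_beds_2016_03.csv")    # hypothetical columns: county, beds
latest = pd.read_csv("licensed_beds_latest.csv")

merged = march.merge(latest, on="county", suffixes=("_march", "_latest"))
merged["change"] = merged["beds_latest"] - merged["beds_march"]
merged["pct_change"] = 100 * merged["change"] / merged["beds_march"]

# Largest swings in either direction.
print(merged.sort_values("pct_change").head())
print(merged.sort_values("pct_change").tail())
```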
The chart below shows the changes in licensed bed numbers against the changes in OOHC placements. Non-relatives continue to make up the fastest growing placement type, which may be concerning if agencies are using this category to avoid licensing and support while also reducing board payments. I would be curious to know how many of these placements would convert to licensed placements if given an efficient way to do so. I would also be curious to know how many of these “non-relatives” would be more appropriately licensed as group homes.
On the other end, both facility placements (actual kids placed in a facility) and therapeutic beds are decreasing. The number of Child Caring Agency beds has remained almost even.
Family Foster Placements
Family Foster Beds
All Licensed Beds
Child Caring Agency Beds
Therapeutic Foster Beds
Another way to view these numbers is by how full-to-capacity each type is. The placement data and the licensing data aren’t broken up in exactly the same way, so I’ve combined family foster beds and therapeutic beds. The result is that only two-thirds of family foster beds are actually filled, and that number has crept up since March 2016. Simultaneously, a little over half of child caring agency beds (i.e., group homes) are filled by child welfare kids.
Family Foster + Therapeutic Capacity
Child Caring Agency Capacity
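The capacity figures are just placements divided by licensed beds, after lumping family foster and therapeutic beds together because the two data sets don't split categories the same way. A sketch with placeholder counts, not the real ones:

```python
# Placeholder counts; the real numbers come from the placement report
# and the licensed-bed data request described above.
licensed_beds = {"family_foster_plus_therapeutic": 14_000, "child_caring_agency": 5_500}
placements    = {"family_foster_plus_therapeutic":  9_300, "child_caring_agency": 2_900}

for category, beds in licensed_beds.items():
    occupancy = placements[category] / beds
    print(f"{category}: {occupancy:.0%} filled, {beds - placements[category]} beds vacant")
```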
The 33% vacancy rate is probably due to lots of factors. Some families are licensed for more children than they want to take in at any given time. Other beds cannot be filled because a child in the home has a safety plan that prevents other children from being placed there. Still other homes are temporarily not accepting children at all. In group homes, not all kids in those homes are child welfare placements. What is important is that we’re pushing deeper into the foster home capacity and reducing the reliance on group home programs.
Below are the breakdowns for foster beds, therapeutic beds, and group home beds. You can explore the data in more detail on tableau.com. As always, if there’s something you want to see or know, just leave a message in the comments.
Florida’s statewide out-of-home care population rose by 232 children in October, maintaining a consistent 5% growth over October of last year. In-home care numbers decreased by 29 children and are expected to remain flat. Growth was largest in the Northwest and Suncoast regions, while the Southeast and Southern regions experienced contractions. Only the Southeast region is projected to continue its contraction into next year. More details are below.
[Regional summary table: Dec 2017 (Projected) counts and % of OOHC (Oct 2016)]
The Northwest region appears to be continuing its massive expansion, growing 17% over last year, the highest of any region. The expansion appears driven by an increase in removals concurrent with a flattened discharge rate. Relative placements have expanded the fastest, while permanent guardianships are nearing zero. Without some change in either removal rates or discharge rates, the Northwest is currently projected to grow another 27% in the next year.
The Suncoast region continues its OOHC expansion that began around January 2014 and its IHC expansion that began in June 2015. The massive expansion appears driven by increased removal rates concurrent with flattened discharge rates. Relative placements continue to be the largest placement while discharges through permanent guardianships continue to decrease. The Suncoast region continues to be the largest region with 28% of all children in out-of-home care.
The Northeast region continues its expansion this month, up 8% over last year. The expansion appears driven by an increase in removals concurrent with steady discharge rates beginning around June 2014. Relative and non-relative placements continued to show expansion.
The Central region continues its expansion this month, up 7% over last year. The expansion appears driven by heightened removal rates concurrent with steady discharge rates beginning around November 2014. Relative and non-relative placements show the most growth. Reunification and guardianships both have decreased as a proportion of OOHC, suggesting a bottleneck in resources.
The Southeast region’s contraction appears largely driven by an increase in discharges, a larger reliance on permanent guardianships, and a slightly rising adoption rate. The Southeast region has also experienced a downsizing of its facility-based care population, which began a sharp drop-off in May 2016 and has decreased 19% to 501 since October 2015.
The Southern region’s contraction that began in May 2015 may finally be turning, though in-home care placements may continue to fall. This region is projected to grow 8% in the next year. The Southern region continues to have the highest racial disparity of any region, with Black youth represented at 4.3 times the rate of White youth.
A word about placement types
Expansions in OOHC have been largely located in relative placements (pink line below), which have remained approximately 40-45% of OOHC since their initial expansion under Secretary Wilkins in June 2010. Family foster homes (green line) have risen gradually in the post-Wilkins era. Notably, non-relative placements (purple line) have continued to outnumber facility foster homes (blue line). The types of non-relatives (families, facilities, or informal group home settings) and the amount of social and financial support provided to these placements are currently unknown and probably deserve some attention as more children are placed in these settings.