child welfare, law, and lots of graphs

Charts & Graphs

Miami’s foster children are going missing at historically high rates. Why?

There has been a lot of discussion in Miami lately about the appropriate response to kids who run away from foster care. The perception here is that foster kids are running away more often. I wondered whether that was true, so I checked the Department of Children and Families’ Child Welfare Services Trend Report (January 2016). A few caveats about the numbers I present below: the CWST Report counts a child as “runaway/absconded/abducted” only if they have left the placement without permission and their whereabouts are not known on the last day of the month. This seems to under-count by excluding kids who run away for short periods of time, unless those periods cross a month boundary. The report does not break down missing youth by age, sex, or other demographics (but DCF’s Missing Child List suggests that most missing foster children are teenagers). Nor does it tell us why a youth left, where they went, or how long they were missing.

The question being raised in Miami is whether children in the foster care system can be judicially ordered not to run away, and then held in secure detention for contempt if they don’t comply. Florida statutes and the Florida Supreme Court are clear that dependent youth cannot be placed in secure detention for contempt; moreover, running away, in and of itself, is not a criminal offense in Florida. Police are required to return children to their custodians in their role as so-called “community caretaker.” If a child is a habitual runaway, the family can be referred for services, but that program does not apply to children in foster care, who already receive services.

I checked DCF’s procedures on runaway youth. The Department’s operating procedures require the case manager to request court and law enforcement help to return the child if located. I was happy to see that the operating procedures also state that “When the child returns, the child must hear and see statements of concern regarding the child’s safety and well-being from the adults who have significant relations with the child.” I have seen adults respond with anger and frustration when youth run away, or adults who seek to punish the youth for running instead of exploring the cause and purpose.

I certainly understand the frustrations motivating people who want to reduce the number of missing youth. The numbers discussed below substantiate the perception that kids are running away at high rates: Miami’s rate of missing kids is four times that of the lowest Florida regions. We should be asking why that is. Running away is not inherently a form of misbehavior. It can be a developmentally appropriate behavioral response to stressors and problems that need to be addressed in the home or the child’s life. It can also be a safer alternative to an abusive or neglectful home, especially if the adults charged with protecting the youth are not responsive to complaints. Ordering a child not to run away without addressing those underlying problems is equivalent to ordering a child not to cry.

Based on my own experience, threatening foster youth with incarceration also has the counterproductive effect of undermining the child’s trust and belief in the fairness of the system, thus making the desire to run away even stronger. It also has the negative consequence of normalizing the idea of incarceration, i.e., “the judge is going to throw me in jail anyway, so I may as well do X.” Foster youth are already at significantly higher risk of juvenile justice involvement. Secure detention would only exacerbate that risk.

The numbers below suggest that the current missing child rate is a symptom of something going very wrong. We need more information about why these kids are running and how to address their needs. Now for the numbers.

Missing children over time

Since 2003, there has been a very large variation in the number of children listed as missing in the CWST Report. The peak occurred in June 2007 with 470 missing children statewide. I checked that against the number of kids in out-of-home care that month: 17.5 missing children per 1,000 kids in out-of-home care.
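For clarity, the per-1,000 figures in this post are just the missing-child count divided by the out-of-home population, scaled to 1,000. A minimal sketch in Python; the June 2007 out-of-home count below is back-calculated from the 17.5 figure rather than taken directly from the report:

```python
# Missing-child rate per 1,000 kids in out-of-home care.
missing = 470          # children listed as missing statewide, June 2007
out_of_home = 26_857   # approximate out-of-home population that month (back-calculated)

rate_per_1000 = missing / out_of_home * 1000
print(f"{rate_per_1000:.1f} missing children per 1,000 in out-of-home care")  # ~17.5
```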

[Chart: Statewide over time]

This peak appears to have gone largely unnoticed and uncommented upon. A search of Google News for that time period turned up nothing. A chart in a DCF training bulletin from January 2007 listed the missing child rate at 1.67%, with no mention that this was nearly double the rate from just four years prior.

In the next chart, you can see that not all DCF regions had equal numbers of missing kids. The Suncoast Region had the highest number of missing children in 2007, but all regions were slightly up that year.

[Chart: Regions over time]

When these numbers are viewed as a percentage of kids in out-of-home care, though, a different picture emerges. Remember from previous posts that the Southern Region (Miami & the Keys) has historically had relatively fewer kids in out-of-home care than other regions. In January 2016, for example, the Southern Region had 1,999 children in out-of-home care, compared to 6,234 children in the Suncoast Region (Tampa, St. Petersburg).

[Chart: Regions per 1,000]

Viewed relative to the number of kids in out-of-home care, the Southern Region has four times the rate of missing children as the region with the lowest rate. The historical statewide high was 17.5 missing children per 1,000 in 2007. The Southern Region has hit 30 missing kids per 1,000 twice. Moreover, the Southern Region’s rate appears to have grown since 2014 and is currently hovering around 20 per 1,000. Other regions are currently averaging 5 to 7 per 1,000.

What is going on? With this data, I can offer only theories. One theory could be that the expansion of out-of-home care has led to an increase in the number of missing kids. The graphs below show the number of missing kids versus the out-of-home care population. You can see that the rise in missing kids is partially a product of the rise in the out-of-home care population: as the population goes up, the number of missing kids goes up. But the differences in the slopes and the differences among regions suggest that more is going on in the South than just additional kids coming into care.
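If you want to check the slope comparison yourself, this is a rough sketch of the kind of per-region fit I have in mind. It assumes a CSV extract of the trend reports with region, month, missing, and out_of_home columns; the file name and column names are mine, not DCF’s.

```python
import pandas as pd
from scipy.stats import linregress

# Hypothetical extract of the CWST trend reports: one row per region per
# month with the missing-child count and the out-of-home population.
df = pd.read_csv("cwst_missing_by_region.csv")  # region, month, missing, out_of_home

# Fit missing ~ out_of_home separately for each region and compare slopes.
for region, grp in df.groupby("region"):
    fit = linregress(grp["out_of_home"], grp["missing"])
    print(f"{region:>12}: slope = {fit.slope:.4f}, r^2 = {fit.rvalue ** 2:.2f}")

# If growth in out-of-home care fully explained the growth in missing kids,
# the slopes and fits would look broadly similar across regions.
```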

Another theory could be that the missing child rate is an indicator of system instability. Miami has experienced a lot of changes over the last 4 years, including the closing of several case management agencies and a restructuring of its placement system. Even among agencies that have remained open, the case manager turnover rate has been high. That upheaval has not been easy on anyone, most especially the kids. It could be that the missing kids aren’t running away from “care” as much as running to find care somewhere they believe will be more stable. Without more information from the youth themselves, it is very difficult to say.

Seasonal Factors

Foster kids don’t go missing at equal rates throughout the year. Statewide missing child numbers tend to seasonally increase 7.5% in June and then decrease 6.6% in September. I suspect these are summer runners who return to school, and thus to their placements, in the Fall. There’s a similar spike of 4.1% in December, with a dip of 2.8% in January when school is back in session. These are likely unsanctioned holiday visits.
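Those seasonal percentages look like average month-over-month changes by calendar month; a sketch of that calculation, assuming a monthly statewide series in a CSV (the file and column names are hypothetical):

```python
import pandas as pd

# Hypothetical statewide series of monthly missing-child counts.
missing = pd.read_csv("cwst_missing_statewide.csv",
                      index_col="month", parse_dates=True)["missing"]

# Average month-over-month percent change, grouped by calendar month,
# to surface recurring spikes (June) and dips (September).
pct_change = missing.pct_change() * 100
seasonal_profile = pct_change.groupby(pct_change.index.month).mean().round(1)
print(seasonal_profile)
```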

Breaking up the numbers shows some regional differences. Every region experiences the back-to-school dip in recovered kids. Interestingly, however, the Northwest Region (Pensacola, Tallahassee) doesn’t have a pronounced summer spike. It does, however, have a very large increase — 20.5% — during the winter holidays. The Central Region, on the other hand, has a significantly smaller increase in missing kids over the winter holidays. This might suggest the need to look at how the Home for the Holidays program is being implemented in these regions to see if that is affecting the number of missing kids.

The numbers in context

If you know your region is experiencing a missing child crisis, what do you do? A 2011 study from the Urban Institute and Chapin Hall (which I will paraphrase liberally for the next three paragraphs) found that the reasons kids run away from foster care fall into two basic categories: (1) wanting to be with family and friends and (2) disliking their placement. After a runaway episode, most youth return to care voluntarily. The reasons for returning include wanting to be back at their home, wanting to go to school, and avoiding getting themselves or others in trouble.

The majority of youth in the Chapin Hall study ran away to a friend’s home, including boyfriends and girlfriends, and about one-third spent their first night at a relative’s home. Only three youth reported spending any part of their most recent episode in an outside location such as a park or an abandoned building. As a result, these youth generally do not seek services while they are on the run from a placement.

Consequently, foster youth are often less sophisticated in terms of street knowledge. Youth who run away from foster care have more knowledge of available services than high-school-aged youth in general, but youth with foster care experience generally had similar or less knowledge than other runaway youth. In particular, youth in the study’s foster care sample had less knowledge of the services that typically come with street experience, including free meals, drop-in centers, street outreach, and free showers. The biggest barrier to foster youth seeking services while on the run is the belief that they will be turned in, either directly to the Department or to the police.

The National Runaway Safeline has a forum dedicated to youth who run away from foster care. Posts from youth in the forum mirror many of these sentiments.

The Chapin Hall study’s review of literature also found the following:

  • Females are more likely to run away than males.
  • Runaway behavior is not linked to a particular race or ethnicity.
  • Runaways tend to have more school problems, higher rates of suicidal ideation, more reported behavioral problems, and more alcohol, substance abuse, and mental health disorders.
  • Foster youth are more likely to run away the first time if they entered care due to lack of supervision and less likely if they entered due to sexual abuse or physical abuse.
  • The more placements they have, the more likely youth are to run.
  • Youth in group homes or residential facilities are more likely to run away than youth in foster homes; youth placed with relatives are least likely to run away.
  • Length of time in care does not necessarily predict running away; in fact, the older the youth is when entering care, the more likely they are to run away.

There’s no reason to believe that securely detaining kids for running away is a productive intervention strategy, and it may have unforeseen consequences if children remain in unsafe placements or stay on the run in unsafe situations out of fear of incarceration. There are evidence-based intervention models that could be implemented as alternatives to secure detention: researchers at the University of South Florida published a 2008 study on approaches to intervening in runaway behavior. The study found that a functional behavioral approach was significantly more successful than “services as usual.” The Child Welfare Information Gateway has a collection of resources on running away.

Whatever is going on in the Southern Region does not appear universal or unavoidable. We need to figure out why these kids are leaving at such high rates — and we must welcome them back warmly and with concern, so that they never have to think twice about reaching out for help.

Charts & Graphs

The good old days, part 2 (DCF abuse investigations over time)

Over the weekend, I posted some charts showing the number of kids under DCF supervision since 2004. Today we’re going to look at a slightly different measure: the number of abuse investigations between September 2006 and September 2015. Here goes.

[Chart: calls per month]

  1. This shows the total number of investigations from September 2006 to September 2015. Buried in DCF’s spreadsheets is a caveat that these numbers do not include calls that were screened out as “no jurisdiction.” Therefore be careful — this is the number of investigations, not the number of calls. This measures DCF’s response to calls, not the calls themselves.
  2. The statewide total is at the top and the regional totals are at the bottom. Immediately you can see the stratification that we saw in the last post, this time in three clear groups: (1) Suncoast and Central, (2) Northeast and Southeast, and (3) Northwest and Southern.
  3. All of the regions are strongly correlated (p < 0.01) — they all rise and fall together (a quick way to check this is sketched below). This implies that whatever is moving the numbers is either a change in DCF policy or some statewide phenomenon.
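A minimal sketch of that check, assuming a wide table with one column of monthly investigation counts per region; the file and column layout are mine:

```python
import pandas as pd

# Hypothetical wide table: one column of monthly investigation counts per
# DCF region. The pairwise correlation matrix shows whether the regional
# series rise and fall together.
inv = pd.read_csv("investigations_by_region.csv",
                  index_col="month", parse_dates=True)
print(inv.corr().round(2))  # Pearson r for every pair of regions
```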

Let’s look at the numbers again controlling for the size of each region.

[Chart: calls per capita]

  1. This chart shows how many investigations per child in the region were conducted each month. The three groups we saw above have fallen away and we now see that the farther north you go, the more investigations per child are conducted.  The Northwest and Northeast regions vie for the highest rate, while the Southern and Southeast Regions are consistently lowest.
  2. Without knowing the number of actual calls per region, we cannot say any more than that. It could be that people in Miami do not call the abuse hotline as much as people in Pensacola. It could be that southern callers are screened out at much higher rates than callers from the northern parts of the state. The most striking difference between these regions of the state, of course, is the mix of languages. I’ve never called the abuse hotline in Spanish or Haitian Creole, so I do not know what that experience is like.

The zig-zags from month to month imply that there are seasonal effects going on. Let’s remove those to get a better picture of the trend lines.
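I won’t claim this is exactly the smoothing behind the next chart, but one standard way to strip out monthly seasonality is an STL decomposition, keeping only the trend component. A sketch for a single region’s per-capita series; the file and column names are hypothetical:

```python
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Hypothetical monthly per-capita investigation series for one region
# (assumes one observation per month, no gaps).
series = pd.read_csv("investigations_per_capita_northwest.csv",
                     index_col="month", parse_dates=True)["rate"]

# Decompose into trend + seasonal + residual; the trend component is the
# "line without the noise."
result = STL(series, period=12).fit()
print(result.trend.tail())
```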

[Chart: calls per capita STC]

  1. This chart shows the trend lines without all the noise from the regular month-to-month ups and downs. You can see the clear dip in the 2009-2010 period. Again, this means that fewer investigations were conducted. Without knowing the total number of calls, we cannot say more.
  2. Some statewide event happened at the end of 2010, right at the end of Secretary Sheldon’s tenure. Nowhere else in the data is there such a sharp change across all regions. I suspect there was a policy shift there, probably dealing with how calls were screened out, but I have not been able to find anything on DCF’s websites documenting it.
  3. More recently, something is occurring in the Northwest (and to a lesser extent in the Northeast) beginning around November 2014. Either callers are getting better at bringing maltreatment to the attention of the Department or there is some policy encouraging investigations at higher rates in these areas. The same increase is not found in the south.
  4. The regional rankings are amazingly consistent over time, with the exception of the two northern regions jockeying back and forth. I would expect to see more changes in the rankings. I am not a conspiracy theorist, but it looks as if there were almost quotas keeping the numbers even.
  5. Remember the sharp increase in kids under DCF’s supervision starting in 2013? That’s nowhere to be seen in these numbers. DCF is not handling more investigations; they’re handling investigations differently.
  6. I was curious about the effect that media reports have on DCF investigations, so I looked up the date of the Miami Herald’s Innocents Lost series. It was published in March 2014 and spurred a series of town halls and other events over the following months. There is an inflection point in the data at March 2014, but notably it does not occur in the Southern Region, where the Miami Herald is located. Complicating the picture is the fact that there were already very public discussions about DCF’s investigation policies in the preceding months, and reforms were already working their way through the legislature. I know of one study finding that, instead of driving changes, news reports tend to follow the same forces that spur change in the system. I can’t say any differently with this data.

Interestingly, there is no correlation between the number of investigations and the number of kids under DCF supervision.

[Chart: oohc v. investigations]

  1. I checked the numbers and there is no statistically significant correlation here. Except for the slight bump in both lines around the end of 2010, the numbers of investigations and children under DCF supervision are independent.
  2. I even ran cross-correlations over time to see whether more investigations in one month resulted in higher numbers of kids in care in later months (roughly the check sketched below). Again, there was no correlation.
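For concreteness, this is the shape of that lagged check, assuming aligned monthly statewide series in one table; the file and column names are mine:

```python
import pandas as pd

# Hypothetical aligned monthly statewide series: investigation counts and
# kids under DCF supervision, one row per month.
df = pd.read_csv("statewide_monthly.csv", index_col="month", parse_dates=True)

# Same-month correlation, then lagged: do investigations in month t track
# the supervision count in month t + lag?
print("lag 0:", round(df["investigations"].corr(df["under_supervision"]), 2))
for lag in range(1, 7):
    r = df["investigations"].corr(df["under_supervision"].shift(-lag))
    print(f"lag {lag}:", round(r, 2))
```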

That’s it for now. If you read this far, you like graphs as much as I do. Next time I’ll look at the number of verified abuse reports and the types of maltreatment.

Charts & Graphs

The good old days (or “Florida foster care rates since 2003”)

I’ve been running a lot of numbers this week for an article that I’m working on. I’ll post a few of the more interesting charts over the next few days, beginning with these, which I created from DCF’s Child Welfare Services Trend Reports. The trend report contains the number of children in out-of-home and in-home care at the end of each month from 2003 to 2015. Taking the monthly average for each year and splitting it up by DCF Region shows a few interesting facts:

[Chart: total]

  1. The number of children under DCF supervision has been rising since 2013, but it used to be a lot higher statewide. It was lowest in 2010 at the end of Secretary Sheldon’s tenure.
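For anyone who wants to reproduce the yearly averages behind these charts, the aggregation is along these lines. The CSV extract and column names are hypothetical stand-ins for the trend report data:

```python
import pandas as pd

# Hypothetical long-format extract of the CWST trend reports: one row per
# region per month with out-of-home and in-home counts.
trend = pd.read_csv("cwst_trend_2003_2015.csv", parse_dates=["month"])
trend["year"] = trend["month"].dt.year
trend["total_supervised"] = trend["out_of_home"] + trend["in_home"]

# Monthly average for each year, split by DCF region.
yearly_avg = (trend.groupby(["year", "region"])["total_supervised"]
                   .mean()
                   .unstack("region")
                   .round(0))
print(yearly_avg)
```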

 

[Chart: under supervision]

  1. Breaking down the numbers by region shows that the DCF regions have never been equal in size. The Southern Region (Miami) was the smallest in relative terms until recently. That history helps explain why the Southern Region has one CBC and the Central Region has five. The size difference among the regions was at its peak in 2005, when the Central Region averaged 12,193 kids per month and the Southern Region averaged only 3,944.

 

[Chart: monthly oohc]

  1. Separating out the children in out-of-home care, the number is indeed rising, in the south since 2013 and in the north since 2014. The Suncoast Region consistently has the most kids in out-of-home care. The Southern and Northwest regions have vied for the title of lowest number of kids in out-of-home care.
  2. The Southeast Region has the largest increase in out-of-home care use of any region beginning in 2013.

 

[Chart: monthly in-home]

  1. Looking at the number of children who are receiving services in their home, however, is a different story. The Central Region in the early 2000s relied heavily on in-home services, but came into line with the other regions by 2009.  Whereas most regions’ in-home numbers are steady or decreasing, the Southern Region’s have gradually gone up.

 

[Chart: ratio of out to in]

  1. By dividing the number of out-of-home children by the number of in-home children (not the clearest way of doing this — sorry; a sketch of the calculation follows this list), we can see the mix of kids in each region. Our first graph showed that the number of kids under DCF supervision was rising, but this shows that those kids are being taken out of their homes at increasing rates.
  2. The difference between now and the early 2000s is stark, when the regions were far more varied in their use of out-of-home care. The Southern Region’s total numbers were small, but their mix was mostly out-of-home, by almost 3:1. To the contrary, there have been times when the Northwest Region had more children receiving in-home services than were in foster care.
  3. The statewide tilt upwards we see in early 2014 signifies a policy shift towards favoring out-of-home care. At least in this recorded history, we have never seen such a uniformly implemented change. (I’m not counting 2010 when Secretary Sheldon left and all of the regions’ numbers immediately started climbing again.)
  4. As of September 2015, the last month with data in these charts, 64% of kids under DCF supervision statewide were in out-of-home care. That number was at its smallest, 59%, in December 2011.
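Here is the promised sketch of the out-to-in calculation, alongside the out-of-home share, which reads a little more naturally as a percentage; the file and column names are mine:

```python
import pandas as pd

# Hypothetical monthly counts per region, as in the earlier sketch.
df = pd.read_csv("cwst_trend_2003_2015.csv", parse_dates=["month"])

# The ratio used in the chart above, plus the out-of-home share.
df["out_to_in_ratio"] = df["out_of_home"] / df["in_home"]
df["out_of_home_share"] = df["out_of_home"] / (df["out_of_home"] + df["in_home"])

# Statewide share over time; the September 2015 value should land near 64%
# if the extract matches the trend report.
statewide = df.groupby("month")[["out_of_home", "in_home"]].sum()
share = statewide["out_of_home"] / statewide.sum(axis=1)
print(share.tail())
```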

That’s it for now. If there are charts or numbers you’d like to see, just let me know in the comments.