Making Sense of the 2012 Failed States Index (Part I)

Last week, Fund for Peace released the 2012 Failed States Index. Accordingly, my next two posts will cover my thoughts on how the index may or may not be useful (Part I) and how Africa fares on the index (Part II).

But before I start, two important caveats: First, if you are looking for a debate on the utility of the terms failed state, weak state, collapsed state, etc., and the policy implications that accompany those designations, this post is not going to provide one, as that topic has been covered in depth elsewhere. Second, when I turn to what the index means for Africa, I’ve scoped “Africa” to include 52 countries. I’ve excluded Egypt because it falls outside the AFRICOM area of responsibility (AoR), which is how I’m accustomed to scoping the continent. Like the 2012 Failed States Index, I’ve also excluded South Sudan, which became a state midway through the data collection year. Though South Sudan is unranked in the 2012 iteration of the Failed States Index, Fund for Peace did determine where it would have ranked had it been included.

Methodology

First, a word on Fund for Peace’s methodology. (No, don’t skip this section – it’s important!)

Every year, the Fund for Peace triangulates content analysis, quantitative data, and qualitative input to develop the final scores for the Failed States Index. Aggregated data are normalized and scaled from 0 to 10 (best to worst) for each of 12 indicators spanning three categories – social, economic, and political/military. No single indicator is meant to signal state instability on its own; together, the indicators cover a wide range of state failure risk elements. Total Failed States Index scores range from 0.0 to 120.0 (least failed to most failed), and the Fund for Peace sorts those scores into four bands: Alert (90 to 120), Warning (60 to 89.9), Moderate (30 to 59.9), and Sustainable (0 to 29.9).
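
To make the scoring arithmetic concrete, here is a minimal sketch of how twelve 0-10 indicator scores sum to a 0-120 total and map to the four bands described above. It assumes only the cutoffs listed in this post; the function name and example values are hypothetical and are not Fund for Peace's actual implementation.

```python
# Illustrative sketch only (not Fund for Peace's actual code): sums twelve
# 0-10 indicator scores into a 0-120 total and assigns one of the four bands
# described in this post. Function and variable names are mine.

def fsi_total_and_band(indicator_scores):
    """Return the total score and band for a list of twelve 0-10 indicator scores."""
    if len(indicator_scores) != 12:
        raise ValueError("expected 12 indicator scores")
    total = sum(indicator_scores)  # 0.0 (least failed) to 120.0 (most failed)
    if total >= 90:
        band = "Alert"
    elif total >= 60:
        band = "Warning"
    elif total >= 30:
        band = "Moderate"
    else:
        band = "Sustainable"
    return total, band

# Hypothetical example: twelve scores of 9.5 sum to 114.0, which lands in Alert.
print(fsi_total_and_band([9.5] * 12))  # (114.0, 'Alert')
```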

Data for each year’s index are collected between January 1 and December 31 of the previous year. As a result, each year’s Failed States Index does not account for developments that took place after that December 31 cutoff. This is important to keep in mind when trying to make sense of where each country lies on the index in light of current developments.

You can find out more about the Fund for Peace’s methodology here and here.

What the Failed States Index Does/Does Not Do

The Failed States Index is not a crystal ball. It is not intended to predict when, how, or under what circumstances a state will fail. It is an analytic tool that suggests which states may be more vulnerable to state failure than others. Furthermore, one should not read too much into a country’s numbered ranking, because ranking one state as slightly “more failed” than another is ultimately meaningless. It is more useful to think of the data the index provides in terms of tiers or categories (e.g., Somalia is more vulnerable to state failure than Malawi, and Malawi is more vulnerable to state failure than Mauritius). More important still than rankings or categories are the changes in a country’s scores over time.

Looking at the Failed States Index from year to year, one can see that many of the countries that top the list have topped it every year – some just trade places with each other. So if a policymaker or military strategist were trying to make sense of this index, one takeaway is that some countries are chronically vulnerable to state failure, but absent a specific catalyst (e.g., a coup, natural disaster, or mass population displacement), the state is unlikely to go “over the edge,” so to speak. What they then do with that information will of course depend on whether a particular state’s failure matters to their country, or even to their bureaucracy.
