The poverty rate in the United States fell to 11.8% in 2018, according to data released last week by the Census Bureau – the lowest it's been since 2001. But this estimate significantly understates the extent of economic deprivation in the United States today. Our official poverty line hasn't kept up with economic change. Nor has it been modified to take into account widely held views among Americans about what counts as "poor."

A better, more modern measure of poverty would set the threshold at half of median disposable income – that is, median income after taxes and transfers, adjusted for household size – a standard commonly used in other wealthy nations. According to the Organization for Economic Cooperation and Development – which comprises 36 wealthy democracies – 17.8% of Americans were poor by this standard in 2017, the most recent year available for the United States.

To be sure, there is no such thing as a purely scientific measure of poverty. Poverty is a social and political concept, not merely a technical one. At its core, it is about not having enough income to afford what's needed to live at a minimally decent level. But there's no purely scientific way to determine what goods and services are "necessary" or what it means to live at a "minimally decent level." Both depend in part on shared social understandings and evolve over time as mainstream living standards evolve.

At a minimum, we should set the poverty line in a way that is both transparent and roughly consistent with the public's evolving understanding of what is necessary for a minimally decent life. The official poverty line used by the Census Bureau fails that test. It was set in the early 1960s at three times the value of an "economy food plan" developed by the U.S. Department of Agriculture.

The plan was meant for "temporary or emergency use when funds are low" and assumed "that the housewife will be a careful shopper, a skillful cook, and a good manager who will prepare all the family's meals at home." The decision to multiply the cost of the economy food plan by three was based on a 1955 food consumption survey showing that families at the time spent about one-third of their income on food. Since then, the measure has stayed the same, adjusted only for inflation.
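To make the mechanics concrete, here is a minimal sketch of the official method in Python – food plan cost times three, carried forward by price inflation alone. The function names and inputs are illustrative placeholders, not official figures.

```python
def orshansky_threshold(food_plan_annual_cost):
    """1960s-style poverty line: the economy food plan cost times three,
    reflecting the 1955 finding that food took about a third of family budgets."""
    return 3 * food_plan_annual_cost

def carry_forward(threshold, cpi_base, cpi_current):
    """Since the 1960s, the line has been updated only for price inflation."""
    return threshold * (cpi_current / cpi_base)

# Illustrative only: a $1,000-a-year food plan implies a $3,000 threshold.
print(orshansky_threshold(1000))  # 3000
```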

Assumption of a frugal 'housewife'

No expert today would argue that multiplying by three the cost of an antiquated government food plan – one that assumes the existence of a frugal "housewife" – is a sensible way to measure poverty in 2019, even if you adjust it for inflation. However meaningful this was as a measure of poverty in the 1960s, which is debatable, it makes even less sense to apply it today to an American population in which most people were born after 1980.

In 2018, the official poverty threshold for a family of two adults and two children was $25,465, or about $2,100 a month. If it had instead been set at half of median disposable income, it would have been $38,098, or $3,175 a month. Ask yourself: If you were part of a couple raising two children, could you afford the basics on $25,000 a year without going into debt or being evicted? Do you think other people would view you as no longer poor if your family's income was a bit over $25,000?
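For readers who want to check the arithmetic, here it is in a few lines of Python, using only the dollar figures cited above:

```python
official_2018 = 25465   # official threshold, two adults and two children
relative_2018 = 38098   # half of median disposable income, same family type

print(round(official_2018 / 12))  # 2122 -> "about $2,100 a month"
print(round(relative_2018 / 12))  # 3175 -> "$3,175 a month"
```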

For context, if you were living on $25,000 a year in Baltimore and paying the U.S. Department of Housing and Urban Development's "fair market rent" for a two-bedroom apartment in that city – $1,411 a month in 2018 – you'd be spending just over two-thirds of your income on rent and utilities alone. (HUD's fair market rent, used to set the value of benefits such as housing vouchers, is set at the 40th percentile of actual market rents.)
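The rent burden is just as easy to verify – a quick Python check using the figures above:

```python
annual_income = 25000
monthly_rent = 1411  # HUD fair market rent, two-bedroom, Baltimore, 2018

rent_share = (monthly_rent * 12) / annual_income
print(f"{rent_share:.1%}")  # 67.7% -> just over two-thirds of income
```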

As it happens, when the official poverty line was first developed in the early 1960s, it was equal to roughly half of median disposable income. Median disposable income back then was roughly $6,200 for a four-person family, and the official poverty threshold was $3,166. Research using Gallup and other public opinion data, from the 1960s to the present, has found that even as median income rose, most Americans continued to believe a family was "poor" if its income fell below roughly half of median disposable income. In other words, Americans have for decades instinctively thought of poverty partly as a matter of relative, not just absolute, deprivation.
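The ratio is simple to confirm from the figures just cited:

```python
threshold_1960s = 3166          # official poverty threshold, four-person family
median_disposable_1960s = 6200  # rough median disposable income, same family

print(f"{threshold_1960s / median_disposable_1960s:.0%}")  # 51% -> roughly half
```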

This common-sense notion is backed up by research documenting that relative deprivation is bad for health, well-being and social participation. And the negative impact of low income on health and well-being isn't limited to those who are most absolutely deprived: It is apparent at every step of the income ladder.

The dominant framework for measuring poverty in the United States is too technocratic and too ideologically conservative. There's never going to be unanimity on what counts as "poor," but we ought to give more weight to the views of ordinary Americans on that subject – which would also mean shifting toward the kind of metric used by our economic peer countries.


Shawn Fremstad is a senior policy fellow at the Center for Economic and Policy Research.
