The United States is in the midst of a major demographic event: the depopulation of a significant portion of the nation's rural counties. Although in many rural counties the population has been growing since World War II, in a large number of others there has been a persistent pattern of population decline. Rural depopulation has ramifications for the future economic viability of the counties involved and for the banks that serve these counties. This three-part article spells out the causes and ramifications of depopulation, explores the effects of depopulation on community banks in the depopulating regions, and discusses possible policies for coping with the phenomenon.
Specifically, in part 1, after locating the major areas of rural depopulation in four regions—the Great Plains, the Corn Belt, the Delta-South, and Appalachia-East—we focus on the relationship between agriculture and population density; the relationship between agriculture and depopulation; the contributing factors of technological change, organizational innovation, and change in fertility patterns; the demographic components of depopulation (the increase in the proportion of elderly people in depopulating counties, and the exodus of the most educated and skilled young people); and the commercial structure of rural counties and how it affects—and is affected by—depopulation. We conclude this part of the article by discussing the vicious circle of decline. Because the Great Plains is undergoing the most serious depopulation and is exposed most deeply to its effects, we examine that region in special detail.
In part 2 we look at community banks in the Great Plains. Across the nation, more than 1,400 insured financial institutions with total assets of more than $131 billion are based in counties with declining populations. Many of these banks will face challenges on both sides of the balance sheet: funding becomes increasingly difficult, and the demand for loans continues to wane. Rural depopulation therefore has significant implications for the U.S. banking industry, especially with regard to the long-term health of rural community banks. The Great Plains is where the problem is most advanced.
Part 3 of the article is a brief look not only at policy approaches to depopulation but also at the prospects for the banking industry in depopulating rural areas.
Part 1. Rural Depopulation
Here we identify the areas where depopulation is occurring and quantify its extent, discussing the significant differences in population density and depopulation across rural counties. We also explain the causes of depopulation, its demographic components, and the implications of all of this for the economic viability of the communities involved.
Regions Where Depopulation Is Occurring
Although the U.S. population as a whole continues to increase, many rural areas are experiencing persistent population outflows. According to Census figures, between 1970 and 2000 the nation's population rose from 203 million people to 282 million, for an average annual increase of 1.1 percent, but this increase was not evenly distributed across the country. Our analysis of Census data at the county level shows that during the 30-year period 1970-2000, 779 of the nation's 3,141 counties (both rural and metropolitan) lost population. It is important to note that in 232 of the depopulating counties the rate at which the population declined actually accelerated during the 1990s.
For purposes of analysis, we divided the nation's counties into categories depending on each county's rurality and then on its population trend between 1970 and 2000. First we identified metropolitan counties (the overwhelming majority of which added population during our 30-year period) and separated them out.1 We considered the remaining counties to be rural and classified them into three groups according to the nature and extent of population growth: growing rural counties, declining rural counties, and accelerated-declining rural counties ("depopulating" refers to the second and third groups combined):
Growing rural counties added population between 1970 and 2000.
Declining rural counties lost population between 1970 and 2000, but not at a faster rate during the 1990s.
Accelerated-declining rural counties not only experienced a population decline between 1970 and 2000 but also lost population more rapidly in the 1990s than in the prior two decades.
Figure 1 locates these three types of rural counties on a map of the United States. As the figure indicates, depopulation is taking place mainly in the middle of the country, in the South, and in the Northeast. For purposes of analysis, we have identified four regions where the depopulation of the past 30 years has been significant: the Great Plains, the Corn Belt, the Delta-South, and Appalachia-East (see figure 2). These regions capture just under 66 percent of all rural counties in the nation but 91 percent of all depopulating rural counties. As we discuss below, although each of these regions has experienced depopulation during the past three decades, the nature, severity, and causes of depopulation vary.
The Great Plains
The Great Plains is defined as the continental slope of the west-central United States, bounded on the north by Canada and on the west by the Rocky Mountains.2 The Great Plains includes North Dakota and portions of Montana, Minnesota, South Dakota, Wyoming, Nebraska, Colorado, Kansas, Oklahoma, New Mexico, and Texas. Of the four depopulating regions, this one is the most rural—only 11 percent of the region's counties are metropolitan—and its rural depopulation trends are the most significant. That is, depopulation here has been more prevalent and more severe than in the other three regions. As shown in table 1, the Great Plains is home to 304 of the country's 662 depopulating rural counties. In this region, 72 percent of rural counties have lost population since 1970, and more than one-third of those counties experienced increasing outflows during the 1990s (for a comparison with the numbers in the other three regions discussed here, see table 1). In 2000, 16.1 percent of the region's population lived in depopulating counties. Furthermore, populations in rural counties in the Great Plains are significantly smaller than populations in the three other depopulating regions, and the population density (people per square mile) is substantially less.
Table 1 - Average Population and Density for Each Type of County, by Region
The connection between larger sizes of farms and ranches and lower population densities is twofold: obviously the population density of agricultural workers is lower, but in addition the towns that support them are fewer and smaller. Both the smaller size of the population (which means communities are relatively isolated) and the low population density greatly exacerbate the economically debilitating effects of depopulation (these effects are spelled out below). Businesses require a minimum number of customers to remain viable, so businesses in less densely populated areas must draw customers from a wider area. Thus, low-density counties are most in danger of losing economic viability.3
The dominant industry in the Great Plains is agriculture: 85 percent of the region's geographical area is devoted to agriculture (the largest percentage of our four regions). As discussed below, structural changes in agriculture are the root cause of the region's demographic and economic predicament, which has been aptly summarized as a "patterned movement of people" in response to these structural changes.
The Corn Belt
The Corn Belt consists of the states identified by the U.S. Department of Agriculture (USDA) as major producers of corn across the central-eastern part of the country.4 The Corn Belt includes Iowa, Wisconsin, Illinois, Indiana, Michigan, and parts of Ohio, Missouri, Minnesota, South Dakota, Nebraska, and Kansas. As table 1 indicates, 40 percent of the Corn Belt's rural counties lost population between 1970 and 2000, but few lost population at an accelerating rate in the 1990s. The average population of the depopulating counties in the Corn Belt is almost three times the average in the Great Plains (17,500 versus just over 6,000); in 2000, only 5.7 percent of the Corn Belt's population lived in declining or accelerated-declining counties; and the population density is much higher than in the Great Plains.
In one respect, though, the Corn Belt is similar to the Great Plains: agriculture is an important industry, with farmland accounting for 69 percent of total land area. But because of differences in topography and weather, the types of agriculture practiced in the Corn Belt differ from the types practiced in the Great Plains. Over time, these differences have meant that in the comparatively fertile Corn Belt farmers require smaller acreages to earn a living. Therefore, population densities (as we have seen) are higher, and cities and towns form a more dense and extensive network. As a result, although portions of the Corn Belt are vulnerable to the effects of ongoing rural depopulation, these effects tend to be less severe and more localized than those observed in the Great Plains. In other words, quantitative differences in average population and population density are associated with qualitative differences in economic complexity and future viability.
The Delta-South
The Delta-South includes Arkansas, Louisiana, and Mississippi (encompassing the part of the Mississippi Delta that falls in those states), along with Alabama and Georgia.5 As figure 1 shows, a great deal of depopulation has occurred in the Mississippi Delta area—more than a quarter of the region's rural counties have lost population since 1970—but the depopulating counties are scattered throughout the region. In the region as a whole, population trends have actually improved during the past 30 years. In fact, much more of the Delta-South region was depopulating between 1940 and 1970 than depopulated in the 30 years after 1970 (see figure 3).
In the period 1940-1970, the mechanization of agriculture and the consequent consolidation of farms displaced farm workers, many of whom migrated to the growing urban industrial centers in the Midwest and West.6 But the industrial resurgence of the South that began in the 1970s led much of the region to experience sustained economic and population growth. Despite the overall improvement in the region, some clusters of counties, including much of the Mississippi Delta, were unable to compete with other southern areas because of extreme poverty and low levels of educational attainment (conditions that still exist), and these counties have continued to depend heavily on the agricultural sector.7 In the meantime, the growing prosperity of many other areas in the South has attracted workers from the Delta region, contributing to its persistent decline in population.
The Appalachia-East
The Appalachia-East region includes part of Ohio and all of West Virginia, Pennsylvania, and New York.8 Just over a quarter of the rural counties in this region lost population between 1970 and 2000, but unlike the case in the other three regions discussed here, depopulation in this area was not driven primarily by an exodus from farming. Rather, it reflects an ongoing decline in the coal-mining industry, a decline caused by technological advances and the restructuring of the steel industry that occurred in the 1970s.9 Figure 3 shows that coal-intensive Appalachia (a region that is not coterminous with Appalachia-East and includes Kentucky, West Virginia, southern Ohio, and western Pennsylvania) also experienced widespread out-migration three decades earlier, between 1940 and 1970. The population of West Virginia, for example, peaked in 1950;10 the number of coal miners employed in the state declined from 150,000 in 1945 to fewer than 19,000 in 2002.11
Correlation between Agriculture and Population Density
Low population density puts a region at risk for depopulation, but low population density by itself is not synonymous with depopulation. In this section we examine the high correlation between agriculture and low population density; in the next section we examine the correlation specifically between agriculture and depopulation.
Agriculture tends to be a land-extensive enterprise, requiring substantial tracts of land for field crops and cattle raising. The result is relatively low population density—a characteristic of rural counties. However, rural population densities vary widely, depending largely on topographical conditions, the type of agriculture practiced, and differences in per acre production. For example, wheat is tolerant of a wide variety of natural conditions, including low rainfall and less-than-ideal soil conditions, so it can be grown on land unsuitable for crops such as corn and soybeans. Cattle grazing, requiring little labor or other inputs, represents an ingenious use of extensive areas of short grasslands that are unsuitable for other purposes: the vast grasslands of the Great Plains are converted to meat by the cattle that graze over them extensively. In contrast, the greatest proportion of the cattle in the Corn Belt are in the finishing sector, where they are fed locally grown corn and soybean products in confined feedlots (see table 2). As can be expected, all these differences translate into corresponding differences in the typical size of farms or ranches across the depopulating regions: where per acre productivity is lower, farms are larger and population density is correspondingly lower.
Table 2 - Agricultural Output per Acre, by Type of County and Region, 1997
A comparison between Iowa (a Corn Belt state) and North Dakota (a Great Plains state) is illustrative. Both states are highly dependent on agriculture, with 91 and 89 percent of land area, respectively, in farms (see table 3). But agricultural revenue (annual per acre cash receipts) in Iowa is almost five times that of North Dakota. The land in North Dakota is not as fertile as the land in Iowa and rainfall is less plentiful, so the predominant products are wheat and cattle, whereas the commodities produced in Iowa are corn, soybeans, and hogs. Corn, soybeans, and hogs typically generate comparatively high returns per acre; returns per acre for wheat and cattle are much lower. Where productivity per acre is relatively low, farmers and ranchers require larger operations to make a living; consequently, farms in North Dakota are four times the size of those in Iowa, and population density in North Dakota is much lower.
Table 3 - Population Densities and Type of Agriculture Practiced, Selected States, 2000
We can also illustrate the relationship between population density and the characteristics of the underlying land (and the resulting commodities produced there) by looking at cattle raising in Nebraska, a Great Plains state (see figure 4). Nebraska has the second-largest population of cattle among the 50 states, with 6.7 million head of cattle in 2000; in comparison, the state had only 1.7 million people in the same year.12 As the legend in the figure indicates, the ratio of cattle to people depends on the type of county: in declining rural counties, the ratio is 12.4:1, and in accelerated-declining counties the ratio grows to 16.5:1. This pattern of ratios suggests an association between this land-extensive sector of agriculture and the low population densities that are typical of counties where populations are declining.
In the Delta-South, where the crops grown are rice and cotton, the farms are even larger than those in the Plains because of the economies of scale associated with the rice and cotton production practiced there. But the linkage to population density is less direct because the states in the Delta-South are near or below the national average for the relative importance of farmland (figure 5).
A way of portraying the difference in population density between the Great Plains and the other regions with declining populations is to compare the distribution of county sizes (see table 4). The data indicate that in 2000 more than 85 percent of the Great Plains' depopulating counties had populations of 10,000 or fewer, compared with 32 percent in the Corn Belt, 25 percent in the Delta-South, and 17 percent in the Appalachia-East. Many analysts consider a county population of 10,000 the minimum threshold of long-term economic viability.
Table 4 - Distribution of Counties by Population Size, by Region, 2000
Correlation between Agriculture and Depopulation
Rural-to-urban migration has been common around the world since the rise of cities and towns; and at least since the end of the nineteenth century, farm populations in industrialized nations have declined and become a minority of total populations. Analysis of the geographic importance of agriculture in the United States suggests a clear connection between the prevalence of agriculture and the tendency toward rural depopulation: the distribution of significant concentrations of farmland (figure 5) corresponds with the distribution of rural depopulation. In fact, the states where farmland covers the greatest percentage of land area—North Dakota, South Dakota, Nebraska, Kansas, and Iowa—are the states where depopulation has been most extensive in the past 30 years.
Researchers at the USDA recently identified three factors that characterize rural counties that lost population in the 1990s: (1) a location away from metropolitan areas, (2) a low population density, and (3) a low level of natural amenities (as measured by climate, topography, and the presence of lakes and ponds).13
These researchers argue that a meaningful measure of economic activity is a density cutoff of 10.1 persons per square mile (this cutoff represents the lowest population quartile of nonmetropolitan counties).14 This measure is superior in most respects to the size of the largest town in the county, for community boundaries have become increasingly diffuse as people commonly live in one town, shop in another, and work in yet a third. Furthermore, service providers such as governmental units and retailers tend to locate their branches on the basis of population densities rather than the sizes of specific towns.
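A quartile-based cutoff of this kind can be illustrated with a brief sketch. The densities below are hypothetical, and the percentile method is deliberately simplified; it is not the USDA researchers' actual procedure.

```python
# Find the density separating the lowest quartile of (hypothetical)
# nonmetropolitan counties -- a simplified stand-in for the 10.1
# persons-per-square-mile cutoff discussed in the text.

def quartile_cutoff(densities):
    """Return the density at roughly the 25th percentile."""
    ordered = sorted(densities)
    index = len(ordered) // 4  # position of the 25th percentile
    return ordered[index]

county_densities = [2.3, 4.8, 7.5, 9.9, 14.2, 21.0, 38.5, 96.1]
cutoff = quartile_cutoff(county_densities)
low_density = [d for d in county_densities if d <= cutoff]
```

Counties in `low_density` would be the ones flagged as most at risk under the researchers' second criterion.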
The Great Plains, where the average size of farms and ranches is large, meets the first two criteria set forth by the USDA researchers: many counties are characterized not only by low population densities but also by remoteness from urban areas. A look at two road maps, one of Iowa (a typical Corn Belt state) and the other of Kansas (a typical Great Plains state), is suggestive. Iowa comprises seven metropolitan areas and hundreds of small cities and towns spread across its landscape, whereas Kansas comprises only four metropolitan areas, and its smaller communities are spread much more thinly over the landscape.
Counties that depend on agriculture also tend to be the counties that are least endowed with natural amenities. One USDA researcher notes:
Population change in rural counties since the 1970s has been strongly related to their attractiveness as places to live. Natural aspects of attractiveness can be summarized in three types of amenities: mild climate, varied topography, and proximity to surface water (ponds, lakes, and shoreline). Counties scoring high in a scale of these amenities had substantial population growth in the last 25 years. High-scoring counties tended to double their population, while the average gain for the low-scoring counties was only 1 percent, and over half lost population.15
Unfortunately, the characteristics that distinguish areas covered by extensive farms are not those that define high-amenity areas. The best cropland tends to be in areas lowest in natural amenities—areas where the land is flattest and least broken up by ponds and lakes, where the winters are the coldest, and where the summers are the hottest and the most humid. In general, the lower a county's score on the scale of natural amenities, the higher the proportion of land that is in crops and the less likely the area is to be classified as a recreationally oriented county.16 Much of the Great Plains receives very low amenity scores.
Depopulation and the Roles of Technological Change, Organizational Innovation, and Change in Fertility Patterns
As noted above, rural depopulation has been occurring at least since the end of the nineteenth century. During the twentieth century, however, the decline in the U.S. farm population became dramatic. At the beginning of the century, nearly 40 percent of the population lived and worked on farms; by the close of the century, that proportion had declined to just over 1 percent (see figure 6). During this hundred-year period, the population of the United States grew from 76 million people to 281 million, but ongoing improvements in the technology of agriculture enabled the ever-increasing population to be provided with food and fiber by a continually shrinking number of farmers.17 Contributing to the decline in the farm population have been organizational innovations within agriculture and the trend in fertility rates since World War II.
As noted by one agricultural economist, agricultural technology has changed radically, especially with the changes since 1950 such as mechanization, the development of herbicides and insecticides, and the availability of genetically improved crops and animals—all of which have made possible production techniques that economize on labor.18
Technological progress also had a significant effect on trends in the number and size of farms. The number of farms declined from 5.7 million in 1950 to 2.2 million in 2000, while the average size more than doubled, going from 213 acres to 434 acres (see figure 7).19 As farmers adopt improved technologies that require greater capital investment, the optimal farm size increases.20 Farmers who adopt new technologies are able to achieve lower costs of production by applying the new methods to larger land areas. Looking forward, we believe that ongoing research in both the public and private sectors will continue to yield technological improvements in agriculture, perhaps at an even faster rate.
Tractors and other machinery continue to become larger, more complex, and more specialized. Crop yields continue to increase steadily over time, as seed quality improves and fertilizers, insecticides, and herbicides become more effective.21 If recent advances in the genetic engineering of plants can gain public acceptance, they hold the potential for enormous advances in agricultural productivity in the near future.22
Also contributing to continued consolidation are organizational innovations in many agricultural operations, especially innovations affecting the integration of supply chains.23 Supply chains usually consist of contractual alliances between specialized businesses at successive stages of the production process, a business model that was especially successful in the chicken industry in the 1960s and 1970s. In that industry, chicken processors contract with growers who typically provide the labor and facilities to raise chickens. The processors own the chickens throughout their lifetimes and provide feed, veterinary care, and management to their network of growers. This arrangement, also known as vertical integration, has produced rapid and sustained productivity improvements in the industry, and the resulting decline in production costs has allowed chicken to dominate the meat menu of the U.S. consumer.24 This business model has led to significant consolidation in the sector: in 2002, 42 firms accounted for more than 99 percent of the chickens produced in the United States.25
As other sectors emulate the poultry industry, organizational innovation, together with the long-term trend of technological innovation, will probably drive the continuing and perhaps accelerating consolidation of agriculture. Consolidation will dramatically reduce the demand for agricultural labor for the foreseeable future, and areas with the largest farm populations stand to lose the most workers. As table 5 shows, the Great Plains, where rural depopulation is already the most severe, nevertheless has the highest proportion of farm workers. Thus, this region's risk from the ongoing technological and organizational change in agriculture continues to increase.
Table 5 - Proportion of Farm Population by Type of County, by Region, 1990
Another reason for the accelerated pace at which population in agriculturally dependent counties has declined in the past generation is the drop in fertility rates: these rates—and therefore the number of children per family—have fallen significantly in agriculturally dependent counties, especially recently, and now are only slightly higher than fertility rates in urban areas.
Traditionally families on farms and in small towns had many more children per family than their urban counterparts. The higher number of children born into rural families served partly to offset the steady departure of working-age migrants to employment opportunities in the cities. After World War II, however, rural women began to bear fewer children, as technology evolved and fewer farm workers were required. In addition, rural women came to be affected by the same trends that reduced fertility among urban women, including rising levels of education, greater participation in the labor force, and delayed marriage.26 A noted agricultural economist has quantified this effect: "In 1990 there were 2.1 persons per farm household. In 1940 there had been 5.2. The major reduction in household size did not begin until 1940, but after that, change came quickly."27
Demographic Components of Depopulation
Technically, changes in population are a function of migration (in or out) and natural increase (or decrease), defined as the difference between births and deaths. Table 6 displays the change in population in the 1990s for the depopulating regions, broken down into changes due to migration and changes due to natural increase.
Table 6 - Rate of Population Growth Due to Migration and Natural Increase by Type of County, by Region, 1990s
The first thing to notice in the table is the difference in growth rates between the depopulating rural counties and the growing and metropolitan counties across the board. Much of that difference is due to the fact that people who leave depopulating counties tend to migrate to growing rural counties and metropolitan counties. In addition, metropolitan counties are more likely to attract migrants from outside the state because their larger economies are more completely integrated into regional and national labor markets.
The second thing to notice is that the rates of natural increase are often highly correlated with rates of migration. There are two reasons for the high correlation. One is that out-migrants are usually young people in their prime child-bearing stages of life, and therefore birth rates in counties experiencing out-migration tend to be lower than average. The other reason for the correlation is that counties experiencing out-migration typically have larger proportions of the elderly, so death rates are higher than average. The combination of lower birth rates and higher death rates results in lower rates of natural increase in declining and accelerated-declining counties, except in the Delta-South region.
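The components just described form a simple accounting identity: the change in population equals natural increase (births minus deaths) plus net migration. A minimal sketch of the decomposition, using hypothetical county figures rather than the Census data behind table 6:

```python
# Decompose a county's population change into natural increase
# (births minus deaths) and net migration. All figures hypothetical.

def decompose_change(births, deaths, in_migrants, out_migrants):
    natural_increase = births - deaths
    net_migration = in_migrants - out_migrants
    return natural_increase, net_migration, natural_increase + net_migration

natural, migration, total = decompose_change(
    births=90, deaths=140, in_migrants=60, out_migrants=210)
# A depopulating county typically loses on both fronts: deaths exceed
# births and out-migration exceeds in-migration, so both components
# of the total change are negative.
```

The correlation noted above arises because heavy out-migration of the young simultaneously lowers births and, by leaving behind an older population, raises deaths.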
In other words, depopulating counties—especially those in the Great Plains—are losing an important demographic battle on two fronts.28 First, they have a disproportionate number of elderly people. Second, they are rapidly losing well-educated people of working age.
The Age Structure of Depopulating Rural Counties
One of the key predictions of human-capital theory is that young people are more likely to invest in education or migration because present income forgone is less for the young, and they are able to benefit from improved earnings over a longer period.29 This prediction has been validated many times throughout history, including after World War II in the United States. The rural-to-urban migration observed in this country at that time consisted overwhelmingly of young people seeking either advanced education or improved employment opportunities.30
Whereas the young seek more and better employment opportunities, those who have retired are, by definition, no longer part of the workforce and are largely indifferent to the quantity and quality of employment opportunities. Thus, it is reasonable to expect that where there has been significant out-migration of the young, there will tend to be disproportionate numbers of elderly people.31 In addition, there is evidence that a significant number of the "oldest elderly," or those over age 85, return to their home rural communities to take advantage of support by their families, after spending their early retirement years in high-amenity areas far from home.32
Data from the 2000 Census are consistent with this scenario (see table 7). The Great Plains—the depopulating region with the most significant out-migration in the 1990s—shows the greatest proportion of elderly and oldest elderly people in its depopulating counties. Conversely, the relatively low proportions of elderly people in Great Plains metropolitan and growing rural counties at least partly reflect the large inflows of young migrants to those areas.
Table 7 - Elderly People as a Proportion of Total Population by Type of County, by Region, 2000
The most serious outcome when populations are disproportionately older is that the high number of retired elderly people diminishes productive capacity in the communities where the retirees live, relative to counties with fewer elderly people.33 If historical trends persist, the concentration of elderly in depopulating counties is expected to grow substantially in the next 20 years.
The dramatic difference in age structures among counties can be seen in age pyramids, a graphical technique used by demographers to portray the joint distribution of ages and sexes in a given population. Using 2000 Census data, we constructed three such pyramids by dividing the population into five-year intervals, dividing the population in each interval by total population, and graphing the male populations on the left and the female populations on the right, consistent with traditional practice (see figure 8).34 These pyramids contrast the age structures of three counties in Nebraska:
Douglas County (population 464,000), the metropolitan county where Omaha is located
Hall County (population 54,000), a growing rural county in south-central Nebraska
Holt County (population 12,000), an accelerated-declining county in north-central Nebraska.
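The construction described above, five-year intervals each expressed as a share of total population, can be sketched as follows. The ages are hypothetical, not Census data, and a real pyramid would compute these shares separately for each sex.

```python
# Bucket a population into five-year age cohorts and normalize by
# total population -- the computation underlying an age pyramid.
# The ages below are hypothetical.

def cohort_shares(ages, width=5, max_age=85):
    """Map each cohort's lower bound to its share of total population."""
    bins = {}
    for age in ages:
        lower = min(age // width * width, max_age)  # 85+ is one open bin
        bins[lower] = bins.get(lower, 0) + 1
    total = len(ages)
    return {lower: count / total for lower, count in sorted(bins.items())}

ages = [3, 8, 17, 22, 24, 41, 44, 47, 63, 71, 78, 88]
shares = cohort_shares(ages)
# In a full pyramid the male shares are graphed to the left of the
# axis and the female shares to the right.
```

A "pinched waist" like Holt County's would show up here as unusually small shares in the 20-34 cohorts relative to the cohorts above and below them.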
Visually, the differences in the age structures of the three counties are striking and largely typical of the differences observed across categories of all the counties in the Great Plains region.
The shape of the Douglas County age pyramid is typical of shapes associated with moderately growing metropolitan areas.35 The proportions of population in the 0-35 range are rather uniform, with differences in birth rates across the cohorts masked by net positive in-migration, both from rural areas in the state and, in this case, from rural areas in neighboring states. A metropolitan area the size of Omaha will have an economy large and complex enough to draw a variety of migrants from relatively great distances.36 The cohorts in the 35-44 age range are the largest in the population, representing the end of the post-World War II baby boom phenomenon that has been extensively documented.37 After age 55, the decline in the relative size of the age cohorts results from the deaths and out-migration of retirees. The proportion of the population older than 65 is 11.0 percent, and the proportion of the subset older than 85 is 1.4 percent.
The shape of the age pyramid of Hall County is similar to the shape for Douglas County except that the 20-30 age cohort is noticeably smaller, a difference reflecting a small net out-migration of these groups. Although growing rural counties tend to lose some young people to larger urban areas, they also tend to be destinations for young migrants from more-rural counties. As an agricultural economist has stated, "It is noteworthy that the heaviest off-farm migration is to rural nonfarm or smaller urban areas rather than to large central cities."38 Hall County, where Grand Island is located, is home to a community college, a satellite campus of the University of Nebraska, several farm equipment manufacturers, and a meat-packing plant. Notably, Interstate 80 passes through Hall County—a defining characteristic of many growing rural counties in Nebraska.
The shape of the age pyramid of Holt County is typical of the shape for many accelerated-declining counties. The most distinctive attribute of this pyramid is its "pinched waist" in the 20–34 age cohorts, representing the significant out-migration of high school graduates presumably seeking higher education or employment opportunities in other counties. In addition, the relatively narrow 0–5 age cohort probably results from the out-migration of fertile young people, illustrating the link between out-migration and natural population increase as discussed above. Also apparent here are the relatively high values in the over-65 cohort (as discussed above). It is noteworthy that Holt County reached its maximum population in 1920, whereas Douglas and Hall counties continue to reach new highs.39
The high proportion of retired elderly people in low-population counties compounds the economic disadvantage of a small workforce, which limits the scale of businesses that can locate there. Even if labor quality is assumed to be homogeneous, the small size of the typical population in a rural county in the Great Plains means that only a short list of industries can locate in those markets. In May 2003 we met with bankers from small-population rural counties in western Kansas, and one banker from a county of fewer than 5,000 people discussed his county's experience in trying to persuade a telemarketing operation to relocate to the county. Advances in communications technology are sometimes touted as a way for rural communities to compete and diversify away from dependence on agriculture, and telemarketing is an example of a business that may be able to conduct its operations far from urban centers. The banker told us, however, that the community, despite offering tax incentives and a building appropriate for the telemarketer, was unable to lure the company. The firm opted instead to relocate to a community larger than the banker's county, citing concerns both about housing for the relocated workers and about the small size of the available labor force.
This already unfavorable labor-force situation is exacerbated when a small community has a high proportion of elderly people, who typically lack both the economic motivation and the skills needed to work. In addition, elderly people as a group place a disproportionate demand on medical services, but specialized care centers tend to concentrate in urban areas that are often distant from small rural communities.40 This demand tends to strain local and state taxing jurisdictions—another factor reducing the areas' relative attractiveness as locations for new businesses.
The Phenomenon of "Brain Drain"
A second significant demographic effect of out-migration in depopulating rural counties is a phenomenon that development economists (economists who study differences in economic growth between countries) have long identified as the "brain drain":
Immigrants are often different from the natural citizens of a country in terms of their skills, motivation, education, and social behavior. It has often been noted that immigration has not been undertaken by the average person. Rather, groups of immigrants tend to be especially ambitious, more willing to take risks, harder working, more open to new ideas, and more willing to innovate. This is so because the act of moving from one country to another generally involves risks, temporary hardship, and a willingness to experience major changes in lifestyle; immigrants are seldom "average" compared to the population they left behind or the one they join. The emigration of educated people from developing countries to the most developed economies is often referred to as the brain drain. This is not by any means a minor phenomenon: the number of well-educated emigrants from developing countries to developed economies is large.41
With the existence of the brain drain well established at the international level, it is reasonable to suggest that an analogous effect may be associated with rural-to-urban migration within the United States. This effect is hard to quantify at the county level because data are usually unavailable. However, a study conducted by the Federal Reserve Bank of Minneapolis at the state level suggests that the effect is real.42 The researchers used Census data to estimate the number of people who were older than age 25 and held bachelor's degrees in 1989 and 1999 in each of the states in the Minneapolis Federal Reserve Bank district. They then subtracted the total number of bachelor's degrees granted between 1989 and 1999 by all degree-granting institutions in the particular state, arriving at an estimate for each state of its net brain drain or gain (see table 8).43 The data suggest that Minnesota, the most urbanized of the states studied, is the destination of many migrants leaving the Dakotas and northern Wisconsin, although many migrants from Wisconsin probably also move to the Chicago metropolitan area.
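The Minneapolis Fed estimate described above reduces to simple arithmetic. The sketch below illustrates the calculation with made-up figures (not the study's data); the function name and inputs are our own illustrative choices.

```python
def net_brain_gain(holders_start, holders_end, degrees_granted):
    """Estimate a state's net 'brain gain' over a period.

    holders_start / holders_end: residents over age 25 holding
    bachelor's degrees at the start and end of the period
    (e.g., 1989 and 1999).
    degrees_granted: bachelor's degrees awarded by in-state
    institutions over the same period.

    A negative result suggests a net brain drain: the state produced
    more graduates than it retained or attracted.
    """
    return (holders_end - holders_start) - degrees_granted

# Hypothetical figures for illustration only:
print(net_brain_gain(100_000, 120_000, 50_000))  # -30000: a net drain
```

The state gained 20,000 resident degree-holders but produced 50,000 graduates, so on net 30,000 graduates left.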
Table 8 - Migration of College Students in the Upper Great Plains
North Dakota in particular has an increasingly critical problem with the out-migration of educated people. According to Roger Johnson, North Dakota's commissioner of agriculture and the leader of a task force that examined this issue, 60 percent of those earning bachelor's degrees or higher in the state leave North Dakota within one year of graduation. "One thing is clear: A lot of people leave. No other state faces the [brain-drain] problem to the degree that North Dakota does. There's nobody that's worse off than us."44
Further research on North Dakota's brain drain suggests that the state's highest achievers are the people most likely to leave. A 1995 survey of the state's graduating high school students who took college entrance examinations found that high scorers were the most likely to leave the state: five years after graduating from high school, only one in four remained in North Dakota.45
At the state level, much of the concern with the brain drain is fiscal, as rural states such as North Dakota subsidize the education of their young citizens only to see them leave. Here the correspondence with the international brain drain is nearly exact. Low-population, rural states such as North Dakota already face comparatively high per capita costs for university-level education but are able to capture only a small fraction of the benefits for their local economies.
The outflow of college-educated people also suggests a broader policy issue, for most development experts consider the supply of highly educated workers to be a key contributor to the future prosperity of a state or region. Such workers are necessary to provide leadership in the local economy and to attract outside investment.46 The depopulating counties most in need of economic and policy leadership may have populations least likely to supply these skills and least likely to attract outside investment. Like the small size of the labor force in many depopulating counties, the quality of the labor force may raise concerns that shorten the list of companies willing to locate in those communities.
Depopulation and the Commercial Structure of Rural Counties
Above, we discuss how variations in agricultural practices influence differences in population density and how advances in agricultural technology are related to persistent declines in population. We also discuss the effect on a county's prosperity of the size and quality of its labor force. Another relationship that is at least equally important is the one we now discuss: that between trends in commercial activity and population in rural counties.
Economic geographers have developed a model known as "central-place theory" that provides insights into the distribution of commercial activity across a landscape. Central-place theory holds that
Towns and cities (central places) in a region may be thought of as organized into a hierarchy.
The greater the number and complexity of goods and services available in a central place, the higher its rank in the hierarchy.
Lower-order places offer convenience goods, such as groceries or gasoline that are consumed frequently and are provided by small-scale businesses that can be viable with only a small number of customers.
Higher-order places are fewer and farther apart and are home to larger-scale businesses whose survival requires a greater number of customers.47
Central-place theory also holds that businesses require a minimum number of customers to be viable. Over time, as the number of farms has dwindled in many rural areas, fewer customers are available to shop in the grocery stores, hardware stores, and agricultural supply facilities that are common in small rural towns.48 Thus, businesses in many of these areas have declined. Because the Great Plains has the largest farms and the fewest of them, its commercial decline has been most profound.
When the decline in the number of farm customers leads to a decline in the number and complexity of businesses in lower-order central places, those places become less important as destinations for people who live in the surrounding countryside. In many cases they are able to support only businesses that provide the most basic needs of the people who live there.
Furthermore, as farms become larger they often outgrow the ability of local small-town businesses to serve their needs. In the Great Plains, where farms are few and far apart, the towns that support them are also fewer and smaller and are able to support only the simplest businesses. Consequently, people who live in rural areas in the Great Plains have access to only a restricted range of goods and services. Moreover, according to recent research by the USDA, more than 40 percent of farmers have Internet access, and increasing numbers of them are using it to procure supplies from regional or national providers, bypassing local businesses even where these exist.49
In addition to the challenge of declining demand from the countryside, lower-order central places have also faced the challenge of increasing competition from businesses in larger towns. Much of this competition can be ascribed to the increased availability of inexpensive and reliable automobiles and vastly improved networks of roads, both of which allow residents of the countryside and smaller towns alike to visit larger central places to purchase a wider variety of goods and services. In fact, residents of smaller towns are willing to drive great distances to shop in larger market areas. More broadly, the increasing convergence between rural and urban cultures, an effect of education and the mass media, has stimulated the demand for a greater variety and volume of the consumer goods and services that are available in the larger towns.50
Retail businesses—even those in larger towns—are affected, in addition, by the consolidation of retail activity, as national retail chain stores present businesses in the rural Great Plains and in smaller towns elsewhere with strong and growing competition. Smaller retail stores have succumbed in great numbers to competitors that offer a larger variety of goods and services at lower prices. Many sources have dubbed this phenomenon the "Wal-Mart effect" because that chain offers the most prominent example.
Professor Ken Stone of Iowa State University, an economist who studies rural retail activity, declares:
There is strong evidence that rural communities in the United States have been more adversely impacted by the discount mass merchandisers (sometimes referred to as the Wal-Mart phenomenon) than by any other factors of recent times. Studies of Iowa have shown that some small towns lose up to 47 percent of their retail trade after 10 years of Wal-Mart stores nearby.51
Professor Stone's findings are summarized in figure 9, which shows that the communities with the smallest populations are the ones most affected when Wal-Mart stores open nearby. Although local businesses have been losing revenue to national chains since early in the last century, when Sears and Montgomery Ward began mailing catalogues, the effect has accelerated since 1970, with the massive proliferation of discount merchandisers.52 Although Wal-Mart and chains like it have been criticized for generating stiff competition for hundreds of Main Street retailers, comparative surveys have shown that traditional retailers are only 60 percent as productive as mass retailers—of which Wal-Mart is the leading, though not the only, example.53
The consolidation of retail activity in larger towns has been accompanied by the consolidation of other businesses in higher-order central places. For example, agricultural suppliers, such as machinery dealers and fertilizer and chemical suppliers, have consolidated to achieve economies of scale.
Central-place theory predicts that the increasing importance of multipurpose shopping trips leads to a self-reinforcing trend of the consolidation of commercial activity.54 The more activities of all kinds that are concentrated in larger towns, the more willing small-town and rural residents are to make the trip to the larger towns. For example, if small-town residents travel to a nearby large town once a week to buy the agricultural goods and services available there, they may begin buying groceries at the large supermarket as well, bypassing the local store. The proliferation of mass discount stores that carry thousands of items increases the opportunity for multipurpose shopping trips, thereby increasing the traffic to larger central places.
This loss of retail activity can be quantified. One measure of the loss of business from rural counties to nearby larger counties is a trade "pull-factor," a statistic that measures the retail activity of a county in relation to the activity in nearby counties.55 A researcher calculates trade pull-factors by dividing a county's per capita retail sales for a given year by the state average per capita sales. This calculation is then adjusted to take into account differences in per capita income between the counties.56
A pull-factor of 1.0 implies that the county's retail sales are proportional to the income of its residents, or that its residents are spending their dollars in their home county. A pull-factor greater than 1.0 suggests that a county is drawing business from adjoining counties, for its retail sales figures are higher than its population and per capita income levels would suggest. On the other hand, a pull-factor of less than 1.0 suggests that a county is losing business to neighboring counties.
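The pull-factor computation described above can be sketched as follows. The exact form of the income adjustment varies by source, so deflating by relative per capita income, as below, is an assumption on our part, and the figures used are hypothetical.

```python
def pull_factor(county_sales_pc, state_sales_pc,
                county_income_pc, state_income_pc):
    # Base ratio: county per capita retail sales relative to the state average.
    base = county_sales_pc / state_sales_pc
    # Income adjustment (one common convention; sources differ): deflate by
    # the county's relative per capita income, so that a richer county is not
    # credited with "pulling" trade its own residents would buy anyway.
    return base / (county_income_pc / state_income_pc)

# A county with below-average sales but equally below-average income
# scores 1.0: its residents are spending their dollars at home.
print(pull_factor(9_000, 10_000, 27_000, 30_000))  # 1.0
```

A value above 1.0 would indicate retail sales running ahead of what the county's population and income alone would predict, consistent with trade pulled in from neighboring counties.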
To illustrate county pull-factors, we chose Nebraska (see figure 10). As expected, metropolitan and growing rural counties have aggregate pull-factors greater than 1.0, a score suggesting that they are attracting business from nearby counties. Conversely, depopulating counties have aggregate pull-factors of less than 1.0, a score suggesting that they lose business to nearby counties. The band of counties with pull-factors greater than 1.0 across the southern third of the state corresponds to the path of Interstate 80; this correlation suggests spending by tourists or travelers on the highway. Like the pull-factors of the counties in the path of the interstate, the unexpectedly high pull-factors of some other depopulating counties tend to reflect special circumstances, such as very small populations combined with locations on other heavily traveled roads.
Pull-factors are greatly influenced by discounters such as Wal-Mart, especially in rural counties. Figure 11 shows the location (by type of county) of Wal-Mart stores in Nebraska—a distribution that is typical in Midwestern states.57 A majority of growing rural counties have Wal-Marts, and figure 10 shows that these counties have the highest aggregate pull-factor, at 1.13. Although Wal-Mart is not the only reason for the favorable pull-factors in the counties where it is located, the Wal-Mart stores are emblematic of concentrations of retail activity.
Demographic Conclusion: The Threat to Viability and the Vicious Circle of Decline
Many demographers argue that communities whose populations fall below a critical mass are destined for irreversible decline because they no longer have sufficient resources to maintain economic viability. Given their low populations and low population densities, many rural counties, especially those in the Great Plains, face a number of interrelated difficulties. First, with small workforces and populations that are relatively unskilled and uneducated, they have a hard time appealing to prospective employers to relocate. Second, the shrinking customer base, as well as the Wal-Mart effect, drains scope and vitality from the commercial activity in these counties. Third, the per capita costs of services provided by governments—for example, law enforcement, maintenance of infrastructure (roads, bridges, and so forth), education of a quality comparable to that found in more populated areas, health care of a quality commensurate with the needs of a disproportionately elderly population—are high in areas of low population densities, where relatively few people must share the fixed costs associated with such investments.58 Consequently, low-population counties not only find it difficult to maintain the existing level of services but also lack the resources to improve their infrastructures to the point at which they can attract new businesses. In addition, small adjoining counties often find that they are maintaining redundant public resources as they struggle to provide a full menu of governmental services.59 Yet efforts to consolidate or share services (as frequently proposed) typically face strong political opposition, for residents of small-population counties are reluctant to surrender their separate identities.
Thus, many counties may face a self-reinforcing cycle of decline: declining populations lead to decreased economic vitality, and both lead to higher per capita costs; the higher costs provide incentives for continued out-migration—and the downwardly spiraling quality of life and of the supporting infrastructure in these counties makes it increasingly difficult for the counties to attract new businesses to the area.60 Counties with accelerating population declines may already be experiencing this phenomenon.
Part 2. The Banking Implications of Rural Depopulation
Rural depopulation—which is long-term and continuing and has serious consequences for the communities involved—is also significant for the banking industry. At year-end 2003, there were 1,451 banks and thrifts (16 percent of all insured financial institutions in the nation) headquartered in rural counties with declining populations (see table 9).61 For financial institutions, declining populations equate to declining customer bases.
Table 9 - Number and Assets of Banks and Thrifts by Type of County, by Region
The demographic data discussed above indicate clearly that the Great Plains is far more vulnerable to depopulation trends than other regions, and the banking data reinforce this conclusion. In absolute terms, most of the institutions headquartered in depopulating rural counties are located in the Corn Belt (48 percent) or the Great Plains (35 percent); in the rest of the country, including the two other depopulating regions, there are significantly fewer such institutions. But in proportional terms—the banks located in depopulating counties as a proportion of all banks in the region—the Great Plains stands out: approximately 46 percent of all banks headquartered in the Great Plains are in declining or accelerated-declining counties. This percentage is far higher than that of any other depopulating region. Furthermore, 17 percent of all Great Plains institutions are in accelerated-declining counties.
The relative size of institutions is another indication that Great Plains institutions are at a disadvantage compared with banks in more vibrant areas (size correlates with an institution's ability to grow its business). The median asset size of a bank in the Great Plains is only $56 million, and in rural counties with declining populations it is only about $39 million. Institutions in other regions are significantly larger: even the Corn Belt's median bank holds $89 million in assets. Thus, although other areas may also be experiencing depopulation, they begin with much larger customer bases.
Here we analyze patterns of consolidation among Great Plains rural community banks. Then we survey the performance of Great Plains community banks, comparing them first with community banks in the nation as a whole and then among themselves.62 Next we analyze profitability and asset growth among these banks, which are not homogeneous in either regard; our focus is on asset size, branching, risk taking, and net interest margins. In the final section in this part of the article, we consider how the Internet may affect rural banks' customer base. Overall, we identify strategies that some banks in depopulating areas have used to remain successful.
Community Bank Consolidation in the Great Plains, Past and Future
The number of insured banks and thrifts in the United States has been declining for two decades, primarily because state unit-banking requirements were weakened (and then eliminated), many banks failed and merged during the banking and thrift crises of the 1980s and early 1990s, and many banks wished to grow larger to achieve economies of scale. Between year-end 1984 and year-end 2003, the number of financial institutions in the nation shrank to slightly more than half what it had been. Because of the large number of depopulating rural counties in the Great Plains, one might expect that bank consolidation would have been more robust in that region; after all, wouldn't fewer people require fewer banking institutions? However, the reductions in bank numbers that have occurred in the Great Plains are similar to the reductions in rural areas in the rest of the nation (see figure 12). At year-end 1984, the Great Plains was headquarters to 1,559 rural banks and thrifts (of all sizes); this number declined to 813 by the end of 2003, or 52 percent of the total from 19 years earlier.63 At year-end 2003, rural areas outside the Great Plains had 54 percent of their earlier total. And the reduction in insured institutions is consistent across all three types of Great Plains rural counties (see figure 13).
Where we do see differences is in the number of counties that are not home to the headquarters of a bank. Of the 424 rural counties in the Great Plains, 76 of them, or 18 percent of the total, do not have a headquartered bank or thrift. By contrast, of the 890 rural counties in the other depopulating regions, 13 percent do not have a headquartered institution. Of the 76 rural Great Plains counties that do not have headquartered banks, 18 did not have an institution headquartered there over the entire 19-year period we studied. The other 58 had at least one institution at the beginning of the period, but those institutions either failed or were purchased by other institutions in the succeeding years.
As one would expect, the vast majority of the counties without headquartered banks are experiencing population declines. Only 11 percent of Great Plains rural growing counties have no headquartered institution, but the comparable figure for declining and accelerated-declining counties is more than 20 percent. Of the states in the region, South Dakota has the largest proportion (and greatest number) of counties with no headquartered institution, or 32 percent (21 counties) of its 66 counties. Montana, at 20 percent (or 11 counties), has the second-highest proportion and number.
Even though many Great Plains rural counties lost their only bank headquarters after 1984, few actually lost a bank facility; rather, in most instances what had once been a main office became a branch office of an institution headquartered in another county. In most counties this consolidation activity has had a relatively neutral effect on branch totals, but a qualitative decline in bank service is possible. The conversion of a former main office to a branch is sometimes accompanied by reductions in customer services, customer service hours, and managerial authority and decision-making discretion.
Although consolidation trends in rural community banks in the Great Plains have been stable and representative of national figures, two pieces of evidence suggest that consolidation in the Great Plains may accelerate in the future. One is the significant number of elderly people living in depopulating counties. In part 1 of this article, figure 8 depicted the age pyramid of a depopulating Nebraska county. That age pyramid—representative of many Great Plains counties—shows a large pocket of elderly people. At some point in the relatively near future, these people will pass away, and as indicated above, their banking business may move outside the area with their heirs. Because many elderly customers also carry large deposit balances, their passing may result in a major loss of funding that could be difficult for many small banks to withstand.
The second factor that could increase consolidation is the lack of a succession plan at many community banks in the Great Plains. Community banks in the Great Plains are typically small—as noted above, the median asset size of a bank in depopulating counties is only about $39 million—and are owned and managed by the same person. In many cases, the owner/operators do not have family members groomed to take their place when they retire because, like other young people, the family members have migrated to counties where economic opportunities are greater. And because of the brain drain in rural areas, there may not even be suitable nonfamily members to assume operations.
During outreach meetings in the Great Plains the problem of succession plans has been a common theme, and bankers do not seem to have identified solutions. The typical short-term plan is for owner/operators to delay retirement, since other suitable options do not exist. The most likely outcome when these bankers do retire is the sale of their institutions, which could dramatically increase the pace of rural bank consolidation.
The Performance of Great Plains Community Banks: External and Internal Comparisons
In this section we examine the performance of rural banks in the Great Plains. Given the relative severity of rural depopulation trends in the region, it would seem reasonable to assume that insured institutions based in the Great Plains would be in a worse condition than banks headquartered in other regions' rural counties. It would also seem reasonable to assume that performance data within the region itself would vary by type of county. Neither of these assumptions is borne out.
Comparison with Community Banks outside the Region
Surprisingly, when the financial ratios of community banks in the Great Plains are compared with the ratios of community banks headquartered outside the Great Plains, evidence of depopulation-induced deterioration does not emerge (see table 10). From 1999 to 2003, the overall earnings, net interest margins, and asset-quality ratios reported by rural community banks in the Great Plains were similar to those reported by rural community banks headquartered outside the Great Plains. A notable difference is the loan-to-asset ratio: community banks based in the Great Plains report lower loan-to-asset ratios than their counterparts across the country. These lower ratios are probably explained by a comparative lack of lending opportunities in the market areas of Great Plains rural community banks.
Table 10 - Financial Ratios, Rural Banks in the Great Plains Compared with Rural Banks in the Rest of the United States, 1999-2003
Thus, despite the lack of strong loan demand and a shrinking customer base in the Great Plains, community banking performance there is similar to what it is across the entire nation. How have community banks in the Great Plains been able to report similar operating results when such a large number of them are located in dwindling markets? One possible answer is that, to date, depopulation has been occurring very slowly, and bankers have been able to adjust capably to their economic environments. Anecdotal evidence from our outreach meetings with rural bankers suggests that this is the case.
An additional, quantitative answer can be found in the final pair of lines in table 10, which indicate that community banks in the Great Plains have nearly three times the exposure to agricultural lending that community banks in the rest of the nation have. In fact, 80 percent of community banks in the Great Plains are considered farm banks, compared with just 28 percent elsewhere.64 This is a key point, especially when one considers government assistance to farmers and, by extension, to their lending institutions during the past three decades. Farming has been, and continues to be, one of the most heavily subsidized industries in the United States. Government payments nationally averaged $19 billion per year from 1999 through 2003, representing about 40 percent of net farm income over that period. Although not all farm products nationwide are subsidized, the primary crops of the Great Plains—wheat, corn, and soybeans—tend to be supported more generously than products grown outside the region.65 As a result, farms in the Great Plains have received higher subsidies as a proportion of net farm income than farms elsewhere in the nation (see figure 14). Such support has certainly helped farmers repay their farm loans and has helped offset whatever negative consequences farm banks might otherwise have experienced from adverse demographic trends.
Just as performance data are similar for rural banks in the Great Plains and rural banks located elsewhere, performance data within the Great Plains itself are also relatively similar across the different types of county. Table 11, which shows community bank performance broken down by growing, declining, and accelerated-declining county types, indicates that banks in depopulating areas continue to perform well. Institutions in growing counties have earned a bit more pretax revenue, largely through higher sources of noninterest income, but institutions in declining and accelerated-declining counties have not fared poorly. Net interest margins are similar in the three types of county, for banks in declining and accelerated-declining counties have offset lower loan yields with lower funding costs. Loan-quality measures tend to modestly favor institutions in growing counties, but the other institutions offset this with higher levels of equity capital.
Table 11 - Financial Ratios for Community Banks by Type of County, Rural Great Plains, 1999-2003
However, significant disparities in lending activity exist among institutions in the three types of county. Growing counties, which are probably adding to their populations through growth in the number of nonagricultural jobs, tend to offer community banks more diversified opportunities for lending. Although community banks in growing counties continue to hold concentrations in farm lending, they make significantly fewer farm loans than their counterparts in declining or accelerated-declining counties, and fewer of the institutions in growing counties have enough farm lending to be labeled farm banks. The ability to diversify out of agriculture offers benefits, such as spreading risk across various industries and reducing dependence on federal farm assistance. Such assistance may not always be as generous as it has been in the recent past.
Beyond issues of performance, however, overall asset growth rates indicate that depopulation in rural counties has adversely affected community banks. Declining populations translate into dwindling borrower and depositor bases; and compared with community banks in growing counties, community banks in declining and accelerated-declining counties have lower growth rates for total assets, loans, and deposits. Table 12 shows annualized growth rates for Great Plains community bank balance-sheet accounts for the ten years ending December 31, 2003. The first thing to note is the tremendous difference between community banks based in metropolitan areas and those based in rural areas. Across the board, the economic vibrancy of metropolitan areas has contributed to higher growth rates in the banks headquartered there, even when these areas are compared with rural counties where populations have been increasing.
Table 12 - Balance-Sheet Growth Rates by Type of County, Great Plains, Year-end 1993 to Year-end 2003
When we look only at the rural counties in the Great Plains, the differences among them are evident, although far less striking than the metro-rural disparity. Not surprisingly, community banks in growing counties reported the greatest asset growth during the past decade, commensurate with their expanding communities: annualized asset growth was over two-thirds of a percentage point higher in growing-county community banks than in banks in declining or accelerated-declining counties. Although at first glance this disparity does not appear significant, its cumulative effect is more striking (see figure 15). Growing-county community banks expanded aggregate assets by 60 percent over the past decade, compared with 49 percent for banks in declining and accelerated-declining counties.
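The cumulative figures cited here follow directly from compounding the annualized rates. A minimal sketch (the growth figures are taken from the text; the code itself is purely illustrative) shows how a gap of roughly two-thirds of a percentage point per year widens into an 11-percentage-point cumulative gap over a decade:

```python
# Converting between annualized and cumulative growth over a ten-year
# window (illustrative arithmetic only; figures are from the text).

def cumulative(annualized_rate: float, years: int = 10) -> float:
    """Cumulative growth implied by a constant annualized rate."""
    return (1 + annualized_rate) ** years - 1

def annualized(cumulative_rate: float, years: int = 10) -> float:
    """Annualized rate implied by a cumulative growth figure."""
    return (1 + cumulative_rate) ** (1 / years) - 1

# A 60 percent cumulative expansion works out to about 4.8 percent per
# year; 49 percent cumulative works out to about 4.1 percent per year.
gap = annualized(0.60) - annualized(0.49)
print(round(gap * 100, 2))  # roughly 0.74, i.e. just over two-thirds of a point
```

The point of the arithmetic is that seemingly small differences in annualized rates compound: the modest annual edge enjoyed by growing-county banks accounts for the full 60-percent-versus-49-percent cumulative disparity.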
The three county types are clearly differentiated in terms of deposit growth. Community banks in growing counties reported growth in deposits of 4.3 percent per year between 1993 and 2003, whereas institutions in declining and accelerated-declining counties posted annual growth rates of 3.5 and 3.6 percent, respectively. Even more important than growth in total deposits is growth in core deposits. These are stable funds that have traditionally provided the backbone of community bank funding sources and consist of noninterest-bearing, savings, and money market deposit accounts, as well as time deposits of less than $100,000.66
Core deposits are generally less expensive and less sensitive to interest-rate movements than other funds, such as large time deposits, brokered deposits, and other borrowings such as Federal Home Loan Bank advances. As shown in figure 15, growing-county community banks reported cumulative growth in core deposits of 41 percent, or 3.5 percent annually, from 1993 to 2003; by comparison, community banks in declining counties reported cumulative growth in core deposits of 30 percent (or 2.6 percent annually), and for community banks in accelerated-declining counties the comparable figures were 32 percent (2.8 percent annually).
Declining population over the past decade is one reason institutions in depopulating counties have difficulty raising core deposits, but the problem goes even deeper. The pronounced aging of depopulating areas (discussed above) has caused significant problems for community banks. Many rural bankers tell the same story: an elderly depositor with large accounts in the bank passes away, and the deposits that the community bank had used to fund loans and other investments are quickly withdrawn by heirs who have long since moved to more thriving metropolitan counties. These funds are very hard to replace, and the large population of elderly people in Great Plains rural counties suggests that this problem will only intensify in coming years.
Analyses of Profitability and Asset Growth among Great Plains Community Banks
Although, as noted, many counties in the Great Plains face similar economic issues, not all community banks have responded in the same way or have reported the same operating results. Our goal in these analyses was to determine whether some banks located in counties with declining populations had identified successful techniques for overcoming local economic problems. Defining success is a somewhat subjective exercise, but we chose two community bank metrics that generally indicate banking success: profitability and asset growth.
Most analysts would agree that profitability is an appropriate measure of success, and we measured profitability by the five-year (1999-2003) pretax return on assets (ROA) ratio.67 Asset growth also indicates success, though some banks may experience success in other variables (such as profitability) without achieving growth. We measured growth by the five-year annualized merger-adjusted asset growth rate. To prevent new banks from distorting the results, we looked only at the 483 depopulating-county community banks that had been operating for at least 10 years.
The two banking metrics—profitability and growth—are shown in figure 16, with each community bank's performance indicated by a single dot. The figure clearly shows the significant disparity in operating results: annualized profitability ranged from a low of -1.07 percent to a high of 3.53 percent, with the middle 80 percent of banks in the range of 0.62 percent to 2.10 percent. Only nine community banks were unprofitable over the five-year period.
Annualized asset growth ranged from -11.71 percent to 79.65 percent, with the middle 80 percent of banks falling between -0.51 percent and 9.04 percent. Sixty-two institutions, or 12.8 percent, reported declining assets over the five-year period. The trend line is interesting: it is nearly flat and slopes slightly downward, indicating a slight negative correlation between earnings and growth. Typically, healthy asset growth would be joined by strong earnings, but in this case the results raise the question of whether some institutions are trading profitability for asset growth.
To analyze the data further, we divided each metric into thirds, creating a nine-cell matrix. For profitability, one-third of institutions reported annualized pretax ROA of less than 1.05 percent; the middle third, between 1.05 percent and 1.57 percent; and the upper third, at least 1.57 percent. For asset growth, the lower third of institutions reported annualized growth of less than 1.91 percent; the middle third, 1.91 percent to 4.88 percent; and the upper third, at least 4.88 percent. The lines on figure 16 indicate these breakdowns and the resulting matrix.
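The segmentation just described can be sketched in a few lines. The breakpoints below are the tercile boundaries reported above; the sample banks are hypothetical, chosen only to illustrate the classification:

```python
# Nine-cell matrix: each bank falls into a profitability tercile and an
# asset-growth tercile. Breakpoints are those reported in the text
# (pretax ROA and annualized asset growth, both in percent).

ROA_BREAKS = (1.05, 1.57)
GROWTH_BREAKS = (1.91, 4.88)

def tercile(value: float, breaks: tuple) -> str:
    low, high = breaks
    if value < low:
        return "low"
    if value < high:
        return "middle"
    return "high"

def classify(pretax_roa: float, asset_growth: float) -> tuple:
    """Return a bank's (profitability, growth) cell in the 3x3 matrix."""
    return tercile(pretax_roa, ROA_BREAKS), tercile(asset_growth, GROWTH_BREAKS)

# Two hypothetical banks in the matrix's opposite corners:
print(classify(2.10, 9.04))   # ('high', 'high')
print(classify(0.62, -0.51))  # ('low', 'low')
```

Splitting each metric at its own tercile boundaries guarantees roughly equal counts along each axis, which is what makes the sparsely and densely populated corner cells informative.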
The corners of the matrix are of particular interest. For example, what is the secret of the 49 community banks in the upper right-hand corner (those that reported high asset growth and high profitability)? By contrast, why do the 61 institutions in the lower left-hand corner report both low growth and low profitability? The other corners indicate, respectively, institutions that were able to achieve high profits despite low growth and institutions that reported high growth but low profits. We lump the 280 institutions in the matrix's other five cells into a single unit that we term the "middle cross," to use as a control group for analysis. Figure 17 puts the data from the scatter plot of figure 16 into a simpler format.
Our analysis points to several key factors that indicate why groups of institutions are faring so differently:
Significantly higher asset size appears to result in lower operating costs through economies of scale.
Branching into other counties has benefited some banks but possibly hindered others.
Risk taking differs considerably between the groups of banks.
Net interest margins differ significantly between the groups of banks.
Community banks that have achieved high earnings and high asset growth are the largest community banks, at a median $54.8 million in total assets. Banks that have achieved high earnings without commensurate growth also have relatively high levels of assets, at $41.2 million. By contrast, institutions that have achieved lower profitability are significantly smaller—$37.5 million for those with high asset growth, and just $21.5 million for those with low asset growth. These figures suggest that asset size is a significant determinant of success, and particularly of earnings.
Larger asset sizes can result in certain economies of scale, helping institutions keep operating costs relatively low. Our analysis indicates that larger banks posted significantly lower noninterest expenses (in relation to average assets) than smaller institutions (see table 13). When the earnings of banking groups that are most different—those with high growth/high earnings and those with low growth/low earnings—are compared with each other, operating expense is one factor that stands out. High-growth/high-earning banks reported annual noninterest expenses of 2.67 percent of average assets, whereas low-growth/low-earning banks reported expenses of 3.18 percent. The primary difference between these groups is salaries expense, which accounts for more than half the difference in noninterest expenses between the two groups of banks. Apparently, larger institutions are able to spread managerial and other salaries across larger asset bases. A similar but smaller difference can be seen in premises expenses, which again are significantly lower in larger institutions because these banks can spread the expenses further.
Table 13 - Operating Performance Measures of Community Banks in Great Plains Depopulating Counties, by Segment, 1999-2003
Banks reporting low growth but high earnings have the tightest control on operating expenses: these banks reported noninterest expenses of just 2.48 percent of average assets. We noted above that these banks, too, are relatively large in size, with size again making possible some efficiencies of scale. In addition, perhaps the management teams of these institutions, realizing that opportunities for robust asset growth do not exist, have streamlined their organizations to maximize profitability. As we show below, these institutions tend to operate a single branch, albeit a large one, and this allows them to keep costs down. At the opposite end of the spectrum, banks with high growth and low earnings have reported the highest operating expenses, at 3.25 percent of average assets. Salaries, premises costs, and other noninterest expenses are all high in this group of banks compared with other groups.
Another significant factor in the success of community banks in depopulating areas is the willingness and ability to add branches appropriately. For many banks in the rural Great Plains, branching into areas that are more economically vibrant than the county of the bank's headquarters is a relatively popular strategy. But although such a strategy can certainly be expected to add to a bank's asset base, it may not always prove profitable.
Community bank managers have many branching choices available to them, including operating a single branch. In fact, just over half of Great Plains community banks located in depopulating counties are unit banks. As table 13 shows, the unit-bank option is most popular with low-growth/high-earning banks (70 percent), which appear to achieve high profits by keeping operating costs low. By contrast, far fewer high-growth/low-earning banks (35 percent) operate a single branch, but these banks may have sacrificed profits for growth. Even when we add in multiple branches inside the bank's "home" county, we find these same differences in branching patterns persisting. Low-growth/high-earning banks tend to have all branches within the home county, while high-growth/low-earning banks tend to operate branches outside their home county.
The question is whether branching outside a bank's home county can be expected to improve a bank's prospects, and the answer is unclear. A case can be made that branching into other counties, especially those with more vibrant economies, was a primary factor in high-growth/high-earning banks' success, for 47 percent of these banks operate branches outside their home counties. These banks have achieved asset growth because of the branch expansion, but they have also been able to report high profitability. By contrast, only 15 percent of low-growth/low-earning banks have branched into other counties, at the cost of both growth and profit potential.
But branching can also be a risky proposition because management's knowledge of new markets, its expertise in new types of lending activities, and its ability to control expenses become more important. It would be reasonable to assume that high-growth/low-earning banks, nearly half of which operate branches outside their home county, might have lacked the management skills necessary to make such bold branching moves successful. Sixteen percent of these banks have branched into metropolitan counties, where the competitive arena—and therefore the required managerial expertise—is much different from what it is in rural areas.
Other balance-sheet components besides total assets are affected by branching decisions. For example, banks with high asset growth have been able to achieve relatively strong loan and core-deposit growth, but they have also significantly increased noncore funding. Low-growth banks have had difficulties retaining core deposits; in fact, from 1999 through 2003 low-growth/low-earning banks lost $22 million in core deposits and posted little loan growth.
Another factor that appears to influence community banks' success is risk taking. Management's tolerance for risk is apparent in branching activities, capital levels, and asset composition, and differs significantly among the groups of banks we studied. Although high-growth banks tend to show increased levels of risk tolerance, the fact that significant earning disparities exist suggests that risk taking can be a double-edged sword.
Adding branches, especially well outside a bank's headquarters county, is certainly a risky proposition, depending on management's abilities. Still, many institutions have proved successful at such branching moves.
Another area that evidences management's tolerance for risk is capital levels. As table 13 indicates, equity capital levels range from 9.32 percent for high-growth/low-earning institutions to 13.07 percent for low-growth/high-earning banks. Banks with high growth tend to have significantly lower equity capital levels than banks with low growth. As we saw with branching decisions, banks with high growth are willing to take greater risk, and whereas some have been rewarded, others have experienced far fewer benefits.
A significant divergence in risk tolerance is indicated by the share of assets held in loans. High-growth community banks hold substantially more loans (and, conversely, fewer securities) than low-growth banks. Since loans tend to have far greater credit risk than securities, these holdings tend to indicate management's greater tolerance for risk. In fact, researchers have found that in the agricultural crisis of the 1980s, the primary factor influencing whether a bank failed was the loan-to-asset ratio.68
Interestingly, despite high-growth banks' willingness to take on additional credit risk, an examination of loan composition within the different groups of banks reveals only relatively minor differences among the groups. The most significant differences are that low-growth/high-earning banks make substantially more agricultural loans and fewer single-family housing loans than the other groups, and that high-growth banks make slightly fewer farm loans but more commercial real estate loans. The fact that loan composition is comparable for all groups indicates that high-growth banks, despite taking on more loans, continue to make particular types of loans in roughly the same proportion as low-growth banks.
Although high-growth banks have made substantially more loans, high growth alone does not appear to indicate how the loans will perform. During the past five years, low-earning banks—whether or not they have been growing assets significantly—have reported elevated levels of past-due loans and significantly higher loan charge-off rates than high-earning institutions. In fact, charge-off levels at low-growth/low-earning institutions were more than four times higher than levels at low-growth/high-earning banks.
Net Interest Margins
When the earnings performance of community banks that are based in depopulating areas is examined, the disparity in net interest margins (NIMs) is particularly striking. The range of NIMs reported for 1999-2003 went from 3.87 percent for low-growth/low-earning institutions to 4.49 percent for high-growth/high-earning institutions. A considerable majority of community bank revenue is generated through the NIM; as a result, this difference is significant.
Differences in the NIM can be attributed to a variety of causes. First, some of the disparity can be linked to the substantial difference in loan-to-asset (LTA) ratios. Typically loans are characterized by far higher yields than securities, federal funds sold, or other "earning" investments; as a result, higher loan volume usually translates into higher levels of net interest income. Thus, high-growth/high-earning banks, with an aggregate LTA ratio of 65 percent, report higher yields on earning assets than low-growth/low-earning banks, with an aggregate LTA of only 52 percent.
However, low-growth/high-earning banks have achieved the second-highest aggregate NIM, despite having a relatively low (54 percent) LTA ratio. These banks appear to have achieved their NIMs through a combination of a very low cost of funds (at 2.94 percent, by far the lowest of the groups) and relatively high loan yields. Low funding costs have been achieved through high levels of core deposits (the second highest of the groups) and low-growth prospects that do not require the raising of higher-cost funds. High loan yields appear to be the product of the group's loan mix, which has more agricultural loans and fewer residential loans than the mixes of the other groups, but could also be the product of stable lending relationships and the fact that these banks are not entering new, highly competitive lending areas.
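The link between the loan-to-asset ratio and the margin is simple weighted-average arithmetic. In the stylized sketch below, the yields are hypothetical and the funding cost is treated as a flat charge against earning assets, a simplification of how the NIM is actually computed; it shows only how shifting assets from securities into higher-yielding loans widens the margin:

```python
# Stylized NIM arithmetic: the yield on earning assets is a weighted
# average of loan and securities yields, so a higher loan-to-asset (LTA)
# ratio lifts the margin when loan yields exceed securities yields.
# All rates below are hypothetical, in percent.

def nim(lta: float, loan_yield: float, securities_yield: float,
        cost_of_funds: float) -> float:
    """Net interest margin: asset yield less funding cost (simplified)."""
    asset_yield = lta * loan_yield + (1 - lta) * securities_yield
    return asset_yield - cost_of_funds

# Two banks identical except for the share of assets held in loans:
high_lta = nim(0.65, 7.5, 4.5, 3.2)
low_lta = nim(0.52, 7.5, 4.5, 3.2)
print(round(high_lta - low_lta, 2))  # 0.39: roughly 0.4 pp of extra margin
```

The same arithmetic explains the low-growth/high-earning exception: a below-average LTA ratio can be offset on the other side of the equation by an unusually low cost of funds.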
The Effect of the Internet on Customer Base
Beyond these differences in bank performance, does a cure exist for community banks in depopulating rural areas? One common response from rural bankers is that the Internet could be the elixir that helps them to overcome their problems, but this remains to be seen.
Use of the Internet in rural America is widespread and growing.69 In fact, the adoption of computers by farm households is similar to that by U.S. households in general.70 Clearly, rural populations can benefit from using the Internet, which expands their choices for goods and services and reduces the burden of being located in geographically remote areas. Although it may be an overstatement to suggest that the Internet could abolish distance entirely, it is certainly true that the Internet can enhance the ability of farmers, rural consumers, and rural businesses to access information, goods, and services from faraway sources and that such access may perhaps increase the economic viability of rural areas. Thus, some economists view the Internet as the possible savior of rural areas, for companies could locate their businesses in rural areas, taking advantage of lower costs for labor and land and less-stringent environmental regulations while still marketing their products to urban end-users.
Although many economists argue that the Internet has the potential to improve the economic prospects of rural communities, the history of earlier technological innovations suggests otherwise. In the early 1900s, for example, it was widely thought that expanding telephone service to rural areas would solve the depopulation problems of that time.71 As we point out above, similar claims were made when the automobile became available in rural areas in the 1920s and when rural electrification became widely available after World War II, but some believe that these innovations actually increased the pace of rural-to-urban migration rather than decreasing it.
Proponents of the Internet see it as a bridge from rural communities, in that rural populations can reach beyond their local communities to shop and conduct business, but those who are more skeptical about the rural benefits see the potential for the Internet to provide a bridge to rural areas, in which non-local businesses can easily enter rural areas to compete. Rural residents are increasingly able to use the Internet to shop for goods and services anywhere in the country, rather than use the products and services of local businesses that have long served them. For community banks, the spread of the Internet, in the best-case scenario, would allow them to expand their customer bases electronically even while their local populations are declining. However, in that scenario, the banks also would effectively be undoing the geographic ties that bind them to their customers.
Furthermore, the Internet may allow larger banking companies to market their products in rural areas where locating a physical branch might never have been feasible. Large banks typically offer a wider array of products than rural banks, and their size gives them scale advantages in the cost of providing banking services. When use of the Internet is widespread in rural areas, therefore, these larger companies may become very formidable competitors of rural institutions.
Part 3. Policy Approaches and Prospects
What does the future hold for depopulating rural counties in the Great Plains and for the insured financial institutions that are headquartered there? As we have seen, of the four regions studied, the Great Plains is the one where rural depopulation seems most extensive and severe. The low population densities, the relative isolation of the population, the lack of natural amenities, and the dearth of opportunities for nonagricultural industries all pose significant obstacles to any strategies to reverse the trend. In addition, the very low populations of many Great Plains communities, in tandem with high concentrations in agriculture, make these communities highly vulnerable to slipping below the threshold of continued economic viability.
Policy makers at every level continue to search for solutions to the problem of rural depopulation in the most severely affected counties. The question is what public policies are appropriate responses to the continuing depletion of the populations of many rural areas.
One viewpoint holds that rural depopulation is the result of fundamental economic forces, or the cumulative effect of millions of individuals responding to market forces. The proponents of this view maintain that the role of public policy should be limited to programs that facilitate migration from the rural areas. These programs may include educating and training rural residents to improve their skills, thereby presumably improving their attractiveness to employers. Such programs would typically have a short-term orientation and would work in concert with the underlying market forces.72 These policies would be expected to adversely affect community banks in depopulating areas, for the banks' customer bases would continue to erode. The programs favored by the advocates of this viewpoint are labeled by some observers as "rural transition programs."
Advocates of the opposing viewpoint favor an "economic development strategy" that would use government funds to reverse market forces and restore viability to declining rural areas. Theirs would be a long-run strategy, addressing the needs of those left behind—those who are unwilling or unable to migrate. Economic development policies are usually justified by arguments that lie beyond economics, such as the social value of the rural lifestyle. Such policies typically include expenditures for the development of infrastructure and the enhancement of business opportunities.73 These policies could ultimately benefit community banks in counties where such policies were implemented, but the ultimate cost of such programs could be substantial.
On a smaller scale, some communities have implemented economic development policies that have shown some promise. For example, several communities in Kansas—most recently the city of Marquette—have given away land on the condition that a new residence or business be erected on it. Although these efforts have worked well for the communities involved, their scale is much too small for them to be considered a macro policy for reversing depopulation trends throughout the Great Plains.
Communications technology (e.g., the Internet and the continued spread of broadband access into rural areas) potentially holds some promise for depopulating counties. Rural businesses hope that such technology will allow them to market their goods and services to customers well beyond the businesses' own county lines. However, such technology could become a bridge to these communities as well as the hoped-for bridge from them: urban businesses, including large banks, would have the means to reach into isolated rural communities, thus becoming a powerful new source of competition.
On the bank regulatory side, one effort that may assist rural community banks is the federal agencies' work in reducing the burden of federal banking regulations. A law known as the Economic Growth and Regulatory Paperwork Reduction Act of 1996 (EGRPRA) requires the federal financial regulatory agencies to identify outdated, unnecessary, or unduly burdensome statutory or regulatory requirements for possible elimination. These efforts could reduce the operating costs of financial institutions and could be of particular importance to small banks, which, because of their size, bear disproportionately high compliance costs.
Looking ahead, we foresee increasing bank consolidation in depopulating rural areas, potentially altering the number of institutions dramatically over the next 20 years. Community bank consolidation in these areas has yet to outpace consolidation elsewhere in the nation, but two factors are approaching a critical juncture. First, the large population of very elderly people in rural depopulating counties points to a future significant weakening of community bank customer bases. Second, in areas where the lack of a succession plan reflects a lack of younger, capable bank managers, many retiring bank owners could have no option but to sell their institutions.
In the meantime, the strategic options available to community banks in depopulating counties are limited. Over the short term, community bank success in rural areas could depend on management's willingness to take well-conceived risks, such as branching into more economically vibrant areas. However, many management teams may not have the expertise to do this without heightening their institutions' risk profiles. Another viable strategy may be for management to streamline their institutions, cutting costs wherever possible, to remain profitable despite the absence of local opportunities for growth.
Although the current economic prospects of the Great Plains rural counties remain bleak and bank consolidation may increase considerably over the next 20 years, the outlook for rural banking is by no means entirely discouraging. As discussed in this paper, many insightful bank managers have already crafted strategies to combat the demographic challenges and have been rewarded with strong profitability, asset growth, or both. Such managers will continue to do so, even if the number of rural banks continues to dwindle around them. The result could be that while there may in fact be far fewer rural banks in the future, the rural banking system still may be intact and strong.
* The authors are Regional Economist and Regional Manager, Kansas City Region in the Division of Insurance and Research at the Federal Deposit Insurance Corporation. Richard Cofer, Shelly Yeager, and Rae-Ann Miller of the Division of Insurance and Research contributed to this article.
1 To identify metropolitan counties, we used the U.S. Department of Agriculture's Rural-Urban Continuum Codes, a typology developed in the 1970s and updated after each decennial census. The most recent version of the codes was released in August 2003.
2 For the definition of the Great Plains Region, see Rowley (1998), 5.
3 McGranahan and Beale (2002), 2.
4 This definition of the Corn Belt Region is adapted from the USDA's Cost and Returns Regions for corn production, available at http://www.ers.usda.gov/Data/CostsAndReturns/oldregions.htm#corn.
5 This definition of the Delta-South Region was constructed from the distribution of declining counties in the 1970 and 2000 censuses.
6 Cosby et al. (1992), 47.
7 Ibid., 284.
8 This definition of the Appalachia-East Region was constructed from the definition of Appalachia appearing in Couto (1994), 5.
9 Global Insight Historical Labor Force Database.
10 U.S. Bureau of the Census (1996).
11 Williams (2002), 345; and Global Insight Historical Labor Force Database.
12 USDA (2001), 4.
13 McGranahan and Beale (2002), 2.
14 Ibid., 4.
15 McGranahan (1999), iii.
16 McGranahan and Beale (2002), 6.
17 U.S. Bureau of the Census (2003), table 1.
18 Huffman (1999), 1.
19 The aggregate statistics presented in figure 7 actually understate the degree of consolidation in U.S. agriculture, for they are based on the USDA's extremely broad definition of a farm as any operation with more than $1,000 in annual sales. Commercially viable farms are those with more than $100,000 in annual sales, and for them the proportional decline in number has been much greater.
20 Gardner (2002), 15.
21 Ibid., 11, 12, 19, 22, 24.
22 Wordie (2003), 80.
23 Drabenstott (1999), 66, 68.
24 Gardner (2002), 70.
25 William Roenigk, staff economist, National Chicken Council, telephone conversation with Jeffrey Walser, January 15, 2004.
26 Johnson (1999), 7.
27 Gardner (2002), 94.
28 Table 6 shows that, compared with the other regions, the Great Plains exhibits the highest rate of population decrease in both the declining and accelerated-declining categories. When this finding is combined with the finding from table 4 that the counties in the Great Plains are significantly less populated to begin with, the severity of the risk that that region's counties face from depopulation is evident.
29 Baines (2003), 116.
30 Albrecht and Murdock (1990), 153.
31 Johansen (1993), 59.
32 Moore and McGuiness (1999), 149.
33 Hendrik Van den Berg, Economic Growth and Development (New York: McGraw-Hill, 2001), 267.
34 Steve H. Murdock and David R. Ellis, Applied Demography—Introduction to Basic Concepts, Methods, and Data (Boulder, CO: Westview Press, 1991), 152.
35 Van den Berg (2001), 263-64.
36 U.S. Bureau of the Census (2003), table 30.
37 Becker (1991), 169.
38 Gardner (2002), 102.
39 Maximum populations were calculated using the decennial U.S. Censuses.
40 Rogers (1999), 1.
41 Van den Berg (2001), 270, 400.
42 Wirtz (2003), 1.
43 Ibid., 4.
44 Ibid., 2-3.
45 Wirtz (2003a), 2.
46 Feser and Sweeney (2003), 39.
47 Berry, Conkling, and Ray (1976), 228.
48 Gardner (2002), 125.
49 USDA, Economic Research Service (2001a), 19.
50 Gardner (2002), 125.
51 Stone (1998), 189.
52 Ibid., 199.
53 Basker (2002), 4.
54 Morrill (1970), 76.
55 Broomhall and King (n.d.), 2.
57 Wal-Mart stores have tended to be built in larger counties. Our analysis of 13 states shows that the 247 rural counties where Wal-Marts have been built since 1968 had an average population of 30,218 and an average population density of 27.9 as of the 2000 Census. By contrast, the rural counties in the same 13 states that did not have Wal-Marts averaged a population of 8,215 and a density of 6.9 people. (See Rand McNally Road Atlas with Wal-Mart and Sam's Club Store Directory, 2003 Edition. States included are Colorado, Idaho, Iowa, Kansas, Minnesota, Missouri, Montana, Nebraska, North Dakota, Oklahoma, South Dakota, Wisconsin, and Wyoming.)
58 On health care, see Rowley (1998), 4.
59 Drabenstott, Henry, and Gibson (1987), 41.
60 Ibid., 44.
61 To be sure, these institutions represent a very small percentage of total industry assets.
62 In this article, community banks are defined as banks and thrifts that hold less than $250 million in assets. We chose $250 million for two reasons: (1) The vast majority of institutions in the Great Plains—88 percent—have less than $250 million in assets; and (2) our analysis shows that for institutions under $250 million, most of the banking activity (in terms of location of bank offices) occurs in the same county where the bank is headquartered. In fact, as of June 30, 2003, Great Plains institutions with less than $250 million in assets had 70 percent of their banking offices located within the same county as the headquarters. By contrast, in institutions between $250 million and $1 billion the figure falls to 38 percent of banking offices. When bank performance is analyzed by its headquarters county, it is important for the bank's activity to be concentrated in that county to the greatest extent possible.
63 Between year-end 1984 and year-end 2003, 766 rural community banks were eliminated in the Great Plains; 720 of them were acquired by other institutions (149 of those acquisitions were failure related), and the other 46 failed or voluntarily liquidated.
64 The FDIC defines farm banks as institutions where at least 25 percent of total loans are made for production agriculture or are secured by farm real estate.
65 While the region's primary crops are heavily subsidized, cattle, another important product in the Great Plains, are not.
66 As of December 31, 2003, community banks in the nation reported that 69.3 percent of their assets were funded by core deposits. By contrast, larger institutions (those with over $1 billion in total assets) had core deposits totaling just 44.8 percent of total assets. Although both of these ratios have declined over time, the differential has been relatively steady.
67 Pretax ROA is used in lieu of after-tax ROA because some institutions have adopted Subchapter S status, in which they do not pay income taxes; these institutions therefore have much higher after-tax ROAs than non-Subchapter S institutions.
68 FDIC (1997), 281-82.
69 Much of this section is drawn from Walser (2002).
70 Abbott, Yarbrough, and Schmidt (2000), 220.
71 Kline (2000), 24.
72 Drabenstott, Henry, and Gibson (1987), 47.
73 Ibid., 51.
Abbott, Eric A. J., Paul Yarbrough, and Allan G. Schmidt. 2000. Farmers, Computers, and the Internet: How Structures and Roles Shape the Information Society. In Having All the Right Connections—Telecommunications and Rural Viability, edited by Peter F. Korsching, Patricia C. Hipple, and Eric A. Abbott, 201-26. Praeger Publishers.
Albrecht, Don E., and Steve H. Murdock. 1990. The Sociology of Agriculture—An Ecological Perspective. Iowa State University Press.
Baines, Dudley. 2003. Internal Migration. In The Oxford Encyclopedia of Economic History, edited by Joel Mokyr, 3:116-19. Oxford University Press.
Basker, Emek. 2002. Job Creation or Destruction? Labor Market Effects of Wal-Mart Expansion. Working Paper. Federal Reserve Bank of St. Louis. www.missouri.edu/~baskere/papers/ (accessed October 7, 2003).
Becker, Gary S. 1991. A Treatise on the Family. Harvard University Press.
Berry, Brian J. L., Edgar C. Conkling, and D. Michael Ray. 1976. The Geography of Economic Systems. Prentice-Hall.
Broomhall, David, and Eric King. n.d. Retail Sales Trends in Indiana Counties. Agricultural Economics EC-690. Purdue University Cooperative Extension Service. http://www.agecon.purdue.edu/AgCom/EC/EC-690.html (accessed June 20, 2003).
Cosby, Arthur C., Mitchell W. Brackin, T. David Mason, and Eunice R. McCulloch. 1992. A Social and Economic Portrait of the Mississippi Delta. Mississippi State University.
Couto, Richard A. 1994. An American Challenge—A Report on Economic Trends and Social Issues in Appalachia. Kendall Hunt Publishing.
Drabenstott, Mark. 1999. Consolidation in U.S. Agriculture: The New Rural Landscape and Public Policy. Federal Reserve Bank of Kansas City Economic Review 84, no. 1:63-71.
Drabenstott, Mark, Mark Henry, and Lynn Gibson. 1987. Rural Economic Policy Choice. Federal Reserve Bank of Kansas City Economic Review 72, no. 1:41-58.
Federal Deposit Insurance Corporation (FDIC). 1997. Banking and the Agricultural Problems of the 1980s. In History of the Eighties—Lessons for the Future. Vol. 1, An Examination of the Banking Crises of the 1980s and Early 1990s, 259-90. FDIC.
Feser, Edward, and Stuart Sweeney. 2003. Out-migration, Depopulation, and the Geography of U.S. Economic Distress. International Regional Science Review 26, no. 1:39-67.
Gardner, Bruce L. 2002. American Agriculture in the Twentieth Century—How It Flourished and What It Cost. Harvard University Press.
Global Insight Historical Labor Force Database. (A proprietary database published by Global Insight, an economic and financial information company headquartered in Waltham, Massachusetts.)
Huffman, Wallace E. 1999. The Labor Intensity and Technology of Agriculture: California vs. the Other States, 1960-1996. Presented at workshop "Immigration and the Changing Face of Rural California: Focus on the Sacramento Valley," Davis, CA (September 2-4), http://migration.ucdavis.edu/rmn/changingface/cf_sep1999/Huffman.html (accessed August 4, 2003).
Johansen, Harley E. 1993. The Small Town in Urbanized Society. In The Demography of Rural Life, edited by David L. Brown et al., 59. Cornell University Press.
Johnson, Kenneth. 1999. The Rural Rebound. Population Reference Bureau's Reports on America 1, no. 3:1-19.
Kline, Ronald R. 2000. Consumers in the Country: Technology and Social Change in Rural America. Johns Hopkins University Press.
McGranahan, David A. 1999. Natural Amenities Drive Rural Population Change. Agricultural Economic Report No. 781. U.S. Department of Agriculture.
McGranahan, David, and Calvin Beale. 2002. Understanding Rural Population Loss. U.S. Department of Agriculture Rural America 17, no. 4:2-11.
Moore, Eric G., and Donald L. McGuiness. 1999. Geographic Dimensions of Aging. In Migration and Restructuring in the United States—A Geographic Perspective, edited by Kavita Pandit and Suzanne Davies Withers, 149. Rowman and Littlefield.
Morrill, Richard L. 1970. The Spatial Organization of Society. Wadsworth Publishing.
Murdock, Steve H., and David R. Ellis. 1991. Applied Demography—An Introduction to Basic Concepts, Methods, and Data. Westview Press.
Rand McNally Road Atlas with Wal-Mart and Sam's Club Store Directory. 2003.
Rathge, Richard, and Paula Highman. 1998. Population Change in the Great Plains since 1950 and the Consequences of Selective Migration. Research in Rural Sociology and Development 7:71-89.
Rogers, Carolyn C. 1999. Changes in the Older Population and Implications for Rural Areas. Rural Development Research Report 90. Economic Research Service, U.S. Department of Agriculture.
Rowley, Thomas D. 1998. Sustaining the Great Plains. U.S. Department of Agriculture Rural Development Perspective 13, no. 1:2-6.
Stone, Kenneth. 1998. Impact of the Wal-Mart Phenomenon on Rural Communities. In Increasing Understanding of Public Problems and Policies—1997, 189-99. Farm Foundation.
U.S. Bureau of the Census. 1996. Population of States and Counties of the United States: 1790 to 1990 from the Twenty-One Decennial Censuses. Bureau of the Census.
____________. 2003. Statistical Abstract of the United States: 2002. Bureau of the Census.
U.S. Department of Agriculture. 2001. Cattle and Calves: Number by Class, State and United States. U.S. Department of Agriculture Cattle report (January): 4.
U.S. Department of Agriculture, Economic Research Service. 2001. Farms, the Internet, and E-Commerce: Adoption and Implications. Agricultural Outlook (November): 17-20.
Van den Berg, Hendrik. 2001. Economic Growth and Development. McGraw-Hill.
Walser, Jeffrey. 2002. The Information Superhighway: Panacea or Threat for Rural America? Regional Outlook (FDIC Kansas City Region) Q3:3-9.
Williams, John Alexander. 2002. Appalachia: A History. University of North Carolina Press.
Wirtz, Ronald. 2003a. Patterns of the Young and Restless. Federal Reserve Bank of Minneapolis Fedgazette (January): 1-4. http://minneapolisfed.org/pubs/fedgaz/03-01/young.cfm (accessed May 1, 2003).
____________. 2003b. Plugging the Brain Drain. Federal Reserve Bank of Minneapolis Fedgazette (January): 1-10. http://minneapolisfed.org/pubs/fedgaz/03-01/cover.cfm (accessed April 24, 2003).
Wordie, J. R. 2003. Agriculture: Technological Change. In The Oxford Encyclopedia of Economic History, edited by Joel Mokyr, 1:75-80. Oxford University Press.