When the concept that the lawn was essential to residential landscaping first gained traction in the U.S., synthetic fertilizers didn’t exist. Lawns were a mix of various “meadow grasses” and white clover, weeds were ignored or pulled by hand, and lawn mowers were powered by human muscle. In 1844, A. J. Downing, who did as much to popularize the lawn as anyone, recommended that lawns be top-dressed in early spring with compost “of any decayed vegetable or animal matter.” In 1900, such authorities as the Columbus [Ohio] Horticultural Society recommended spring fertilization, noting that the new chemical fertilizers could be used but that finely ground bone meal and cottonseed meal would work as well at lower cost. That year, synthetic fertilizer use in the U.S. totaled 2.2 million pounds.
In the familiar hockey-stick pattern of related increases such as atmospheric CO2 concentration, resource use, and population, total synthetic fertilizer use climbed slowly at first, reaching about 8.2 million tons in 1940. After World War II, marketing forces that shaped cultural norms combined with the “Green Revolution” to drive sharp increases in the U.S. Recommended lawn fertilization rose from one application per growing season to four. By 1981, when the Talking Heads were often performing their song in concert, total synthetic fertilizer use had climbed to 54 million tons. And lawns rapidly gained market share: according to the EPA, approximately 13.5 million tons of synthetic fertilizer were spread over American farmland in 2005 and 2006, land covering about one-eighth of the continental land mass, yet in 2004 about 70 million tons of fertilizer were used on U.S. lawns. And while agricultural use has declined somewhat, lawn use continues to grow: lawn acreage continues to expand, and inputs per acre are higher than on agricultural land.