Understanding Urban Agriculture – Part 1, The Present State in Historical Context
This is part one of a series of articles Angelo is writing, for part two please visit: http://permaculturenews.org/2016/10/12/understanding-urban-agriculture-part-2-productivity-potential-possibilities/
You have to know the past to understand the present.
Urban agriculture is a growing phenomenon worldwide, a movement driven by individuals and local communities interested in access to fresh, cheap, healthy, more nutritious and better-tasting food, increased self-reliance, local food security and sustainable food production without the unnecessary food miles and the health risks of pesticides and GMOs.
The rise of urban agriculture and its increasing importance in many countries around the world has been documented by the United Nations [1].
To some people urban agriculture is a radical new concept, and they question whether it can really work. Urban areas in industrial cities may be a relatively new concept, but ‘urban’ agriculture itself definitely is not; it is technically just localised agriculture, which has been the historical norm since the very advent of agriculture itself. Perhaps the rise of urban agriculture is better described as the reclamation of localised agriculture in post-industrial urban cities. Human settlements have historically always existed in close relationship to their food sources, as we shall see below.
Our relationship to food
A discussion on urban agriculture, or any form of agriculture for that matter, would be incomplete without an examination of the relationship between human societies and the food that sustains them. Despite the supposed scientific progress of modern society, many popularly held opinions are devoid of intellectual scrutiny, so what better way to start than with a scientific and historical analysis!
First, some perspective is necessary in order to eliminate the unquestioned assumptions we may have about the proper order of things. In the study of biology, we learn that all living organisms exist in habitats that supply their essential needs, including food and water. Humans are no different: the earliest human civilizations sprang up in fertile regions where food was abundant, regions which supported the hunter-gatherer lifestyle for the first 200,000 years or so. When humans progressed to agriculture, which they have been practising for the last 10,000 years, they chose fertile regions that supported the growth of their crops. Villages and cities were located close to productive land and sources of water as a matter of common sense, and early settlements emphasized the production of food. As cities grew and industrialized over the last 200 years or so, the demand for land and concerns about keeping farm animals gradually pushed farms further away from cities into the rural areas.
It is clear then that for around 99.9% of our existence, the human species operated just like every other biological organism, and lived in close proximity to its food. This is the normal state of being for all living things, including humans, and the last 200 years of industrialised society are only an aberration in the historical context, with no evidence that this state of existence, living away from one’s food, is either sustainable, permanent or even desirable.
This raises the next question: how did we become ‘disconnected’ from our food?
Setting the historical context
The industrialisation of society in itself did not prevent people from growing food on a small scale in cities. To understand how and why food was pushed out of cities we need to examine some critical cultural factors, which are curiously endemic to western culture.
Whether we like it or not, modern English speaking countries are heavily influenced by the relatively recent historical English feudal culture. Typically, feudal societies have a distinct class structure which includes an aristocracy and landowners who own the land and a peasant class who are forced to work the land and grow all the food.
What may surprise some people is how recent all this was – the feudal system disappeared from most of Western Europe around 1500, and was legally abolished in England in 1660. England, as a constitutional monarchy, saw its first prime minister take office in 1721, and only became a full democracy very late, with the Representation of the People Act in 1918. When we consider that the concepts of democracy and constitution were created in ancient Athens circa 508 BCE, we realise how long English culture existed with a strongly divided class structure. These are the seeds of our modern culture and the root of our current problem, as we shall soon see.
As Western society gradually democratised and industrialised, there was a gradual shift in the class structure, and a corresponding movement away from the peasant class and all things associated with it. Sure, many people moved into the cities and became ‘factory fodder’, a new unskilled labour peasant class, but as cities grew, the rural areas became inhabited by the new class of food producers, the farmers.
This is still not enough to create a culture of food-free cities, as this would not completely prevent people growing food on a small scale, even if they were all crammed into squalid little workers cottages in the soot and smoke polluted new industrialised cities. People indeed did grow food in such circumstances.
Throughout the 1700s and all the way through the industrialised era, we have seen the presence of allotment or community gardens all around the world: areas allotted to the public, sometimes at a cost (rented out), to allow people to grow food. The fact that food could be grown in significant quantity in private and public space in the industrial age is evident when we consider the phenomenon of ‘Victory Gardens’. During World War I and World War II, in the US, Canada, UK, Germany and Australia, private and public land was used by people to grow their own fruit, vegetables and herbs in an effort to reduce pressure on the public food supply.
To create a culture of food-free cities would require something more, something deeply ingrained in the minds of people that would cause them to act against their best interests and reject food from their environment. If we dig deeper, this time into the human psyche and mindset of the time, we find our true answers.
Wealth, affluence, food and ornamental gardening
Sometimes people do things simply because ‘they’ve always been done that way’ without ever questioning why, without understanding what prompted people to undertake a certain course of action in the first place. Why don’t we grow food in public spaces in our cities? How many people question that?
It is said that we can never understand history if we look at it with the mindset and perspective of the present. To understand the actions of the past, we need to understand what people were thinking at that time in history.
Let us cast ourselves back to the time of the downtrodden landless food growing peasants in historical Europe. The owners of the land benefited greatly from peasant labour – the peasants could keep some of the food they grew to feed themselves, and were permitted to live on the land they worked on, while everything else went to the landowners. The landowners would feed themselves, and sell food and livestock to make more money.
It therefore stands that land was very valuable for growing food for sale, or grazing animals for sale, as a means of increasing one’s wealth. But having lots of money stored away was not enough for the rich and wealthy, what good was wealth if they couldn’t flaunt it? The rich landowners needed a way to show off their wealth, and how better than with opulent shows of extravagant waste!
Imagine what an extravagant and ostentatious display of wealth it would be for a landowner to intentionally waste valuable land by using it for ornamental purposes, purely for showing off, losing all the money it could generate in the process. This is compounded further by the fact that valuable peasant labour, which would normally generate income, would be wastefully redirected to maintaining a purely aesthetic, non-productive, ornamental garden. That’s really making a statement of having so much wealth that one could literally throw it away!
What we’ve just described is a critical paradigm shift from agriculture to ornamental horticulture, and this is the critical point that most people miss. So what’s the difference? To use the Oxford dictionary definitions:
• Horticulture – the art or practice of garden cultivation and management
• Agriculture – the science or practice of farming, including cultivation of the soil for the growing of crops and the rearing of animals to provide food, wool, and other products.
Essentially it’s a shift in land use from land as productive space providing primary needs to land as art and play space.
This ushered in the idea of a lawn and an ornamental garden as a sign of extravagant wealth.
Before I go on, I’d better explain lawns too!
The concept of having a ‘lawn’ came to us from Europe. Originally pastures of grass were used to graze livestock, which kept the grass short, resulting in large expanses of land covered with close-cut grass. Europe’s mild weather with plenty of rainfall ideally supports the growth of grass across large, open spaces.
In the same vein as the ornamental gardens described earlier and as part of the same movement, in 17th century England we first saw the deliberate growing of trimmed grass by the wealthy and the aristocracy as a show of affluence. Recall that during those times, land was a valuable resource, as it was used to grow food, which provided a source of nutrition and a source of income. For a wealthy landowner to simply grow grass was a show of extravagance, flaunting the fact that they had land to waste as they pleased.
The magnitude of such a status symbol may not be immediately obvious to us in our current day and age until we realise that lawn mowers did not exist at the time! These lawns were cut by hand, by servants using scythes, sickles and shears. The amount of labour involved in maintaining a large lawn was considerable, and only the wealthiest in society could afford to pay people to carry out this unproductive work.
The first mechanical lawn mower was invented in 1827 and patented in 1830 by engineer, Edwin Beard Budding (1795-1846) from Stroud, Gloucestershire, England.
After the lawnmower was invented, having a lawn was no longer the mark of wealth and status that it once was; the status symbol became accessible to the masses. From that time onward, people just kept on growing lawns in their front yards, forgetting that lawns were once grown as a symbol of wealth. Just another one of those unquestioned traditions that people blindly follow without knowing why….
That in a nutshell is the history of how food-free ornamental horticulture became a symbol of wealth in places like England and France, the birthplaces of the concept of formal ornamental gardens with pruned rectilinear hedges and huge expanses of manicured lawns.
It is also important to realise that the concepts of formal gardens and lawns originate from countries with plenty of rainfall and an absence of harsh sun, allowing the gardens to maintain a pristine look (albeit with absurd amounts of labour) all year round. With that in mind, it should be evident that carrying out this style of gardening in hot, dry climates is a highly questionable practice. Ornamental gardening has its place in cities, as art and beauty are important, but not at the cost of our primary needs as a species!
Banishing food growing from cities – the rise of the urban food desert
Once the idea of a non-productive garden with a lawn became established as a symbol of wealth, common people wanted to emulate the wealthy. Unfortunately some things never change… With the rise of industrialised cities, as people rose above the restrictions of their former class boundaries, they desired to disassociate themselves from the peasantry of times past and take on the symbols of affluence that were once exclusive to the landowners.
Whether they did this consciously, or in a lemming-like fashion simply followed the people who first chose to emulate the wealthy without thinking why, the consequences were the same. Sure they were adopting empty symbols, but this satisfied some psychological need for the inhabitants of the new industrial cities. Perhaps it provided a new class identity, the symbols somehow reassured them that they were no longer peasants. The practices gradually became absorbed into modern culture, and people built ornamental gardens and lawns simply because everyone else before them did.
Such irrational cultural anomalies from 17th century England which centred on appearing wealthy were by no means unique to that time or place. It must have been as important to display social rank in England back then as it is to wear flashy designer clothes and accessories, and drive expensive cars, today. Vanity is vanity, and humans haven’t changed that much!
In Tudor England, at the time of Henry VIII, when sugar was first imported it was so expensive that only the wealthy could afford it, and they ate so much of it that they ended up with rotten teeth. Since their teeth went black from overindulging in sugar, a fashion arose amongst well-to-do ladies of blackening their teeth to show that they were able to afford large amounts of such a luxury item.
We shouldn’t really laugh at this; today we’re no better with the whole ‘suntan look’. In the past, wealthy people valued pale skin as a sign of ‘refinement’, since only the poor had suntans from labouring outdoors. Then one fateful day in the 1920s, fashion designer Coco Chanel accidentally got sunburnt while visiting the French Riviera. When she arrived home with a suntan, tanned skin became a trend, partly because of Coco’s status and partly because it showed that she had a wealthy leisure lifestyle that allowed her to spend lots of time lounging around in the sun. Fast forward to today, and the consequence of this fad is an increase in skin cancer and prematurely aged skin, all just to impress other people.
Yes, human cultural practices, fads and fashions are very often irrational, but we only realise that in retrospect; they always seem like a good idea at the time, and nearly everyone wants to jump on the bandwagon!
This may account for the trends and fashions that influence what people grow in their front and back yards, but what of public spaces? Surely landscape designers and horticulturalists must know the historical context of their art? When we consider that some front yards and many back yards do indeed grow productive trees and plants, we are left asking what happened to public space to make cities truly ‘food free zones’.
Well, local governments working with the landscape architecture profession are ultimately responsible for how our public spaces look; they will tell you that themselves and eagerly take credit for their presumably good work. When we consider that the profession of landscape architecture actually arose for the purpose of wasting food-growing land to produce artistic ‘designed landscapes’ for ostentatious aristocrats, we are no longer surprised about how things have turned out a few centuries later. In fact, these two parties are solely responsible for pushing food production out of cities, and continue to do so, often unapologetically. For those who are interested, I have discussed this topic at length in my previous article Cities – Food Free Zones? The Creation of the Urban Food Desert.
Now that food security, food sovereignty and health concerns are capturing public attention, there is an emerging worldwide push against such an unhealthy and ill-conceived cultural trend. Many are looking to restore urban agriculture to its rightful place amongst the people living within the cities.
Food sovereignty in the cities
Implicit in the urban agriculture movement is the concept of food sovereignty, the right of people to define their own food systems. Here, the people who produce, distribute and consume food are the ones making the decisions about their food, rather than large corporations and agribusiness whose primary concerns are financial profitability.
It should be self-evident that if people have the right to define their food systems, they will primarily be concerned with year-round access to fresh, tasty and nutritious quality food that is not a threat to their health, and that has been produced sustainably so as not to pollute the planet which they and their children inhabit.
If we compare this to the motivations of agribusiness and corporate food production, all those factors that are primary for consumers become secondary, and making money becomes the primary focus instead. With the corporate aim of market domination, where a few corporations control food production, the old adage of “the market decides” no longer applies, as there is no competition and consumers no longer have a choice in what they eat. It doesn’t take a genius to figure out what happens when corporate profits are put ahead of public health, food security, sustainability and care for the environment.
If people living in cities choose to grow their own food locally for the reasons discussed, then urban agriculture becomes a necessary solution.
The role of urban agriculture
Most people, around 80%, live in cities, and the role of urban agriculture is to put the food where the people are. That’s the way it’s always worked, that’s the way Nature works, and that’s the only sustainable way to do it.
If we put the last 200 years of industrial farming into perspective and are honest with ourselves, we can admit that it is a temporary state of affairs at best: totally unsustainable, a complete anomaly compared with how food has been grown for 98% of the 10,000-year existence of agriculture itself, and riding entirely on the availability of cheap energy – oil.
With a growing population, we need to supply more food; it’s that simple. The problem is not that Nature can’t provide enough food, the problem is the unequal distribution of food. With the western obesity epidemic rising, it’s now official: there are more overfed people in first world countries than starving people in third world countries. The number of starving people is still greater than ever; it’s just that overfed people now outnumber them.
The other problem which is an inconvenient truth is overpopulation. It’s not possible to keep increasing demand on the resources of an ecological system without causing it to eventually collapse – but that reality flies in the face of the secular pseudo-religious dogma of economics, with its nonsensical concept of continuous growth in a limited and finite system – if that isn’t blind faith in miracles, I don’t know what is.
Permaculture emphasises the principle of ‘collaboration not competition’. Urban agriculture can function as a stand-alone system or as a complementary system alongside the other two existing food production systems: peri-urban agriculture located at the city fringes, and rural agriculture located furthest from the cities.
If food production is so critical, why not employ good design to make food production systems more secure? The Permaculture design principle ‘every important function is supported by many elements’ addresses this very issue. One plus one plus one is three, which is one more than two! A decentralised system of food production is far more resilient, and by creating a food system that operates on three levels in parallel, urban, peri-urban and rural, greater productivity is possible than from a rural system alone. Additionally, this opens up the possibility for crop specialisation – growing crops that are optimal in each respective system. Growing high value produce such as berries which don’t ship well would be a better choice for urban areas than grains such as wheat for example, which require more space for lower value yields and are better suited to rural areas.
How much food does urban agriculture need to produce to make it viable? In many discussions around the role of urban agriculture, a common point of misunderstanding is the difference between self-reliance and self-sufficiency. The role of urban agriculture in modern cities is not to supply all the food required; that would be self-sufficiency. Producing some of the total food requirements is what is termed self-reliance. Increased self-reliance is an adequate goal for cities to achieve through urban agriculture. It’s not an all-or-none situation, as some people make it out to be. Why should it be?
The right mindset, the right questions
However we look at it, yields are proportional to the area available for production; if we want to grow more, then we need more space to do so. Sure, increases in efficiency can make some difference, up to a point. Given enough space, we can feed as many people as we like! With a limited area, we can provide for some of our food needs, which is far better than not being able to grow any food at all.
Many supposed issues with urban agriculture are really non-issues, and are more often the result of unclear thinking, which we see often. When the concepts of self-sufficiency and sustainability are not properly understood, erroneous thinking ensues. We’ve all heard the challenge “…but can you grow all the food you need to feed x number of people?”, be it in reference to a family, a community, a city or a nation. The simple answer is that this is the wrong question to be asking. It’s like asking how we can fit more one-litre containers into a 10-litre carton than it can hold. This is flawed reasoning!
Every ecological system has hard limits when it comes to supporting any living thing, including humans; we don’t get a special exemption because of our collective superiority complex as a species! The correct question to be asking is “what size population can a sustainable system support?”, treating the population number as the mathematical variable to be adjusted up or down, because the finite ecological resources are the fixed constant that we must work around. Currently governments are doing it the opposite way around and wondering why problems are emerging!
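The arithmetic behind this can be sketched in a few lines. This is a minimal illustration of treating the resource base as the fixed constant and population as the variable to solve for; the function name and all the figures below are hypothetical placeholders, not real agronomic data.

```python
def sustainable_population(arable_hectares: float,
                           yield_per_hectare_kcal: float,
                           kcal_needed_per_person: float) -> int:
    """Largest population a fixed resource base can feed sustainably.

    The resources (area and yield) are the constants; population is
    the output variable -- not the other way around.
    """
    total_supply = arable_hectares * yield_per_hectare_kcal
    # Floor division: we can only fully feed whole people.
    return int(total_supply // kcal_needed_per_person)


# Hypothetical region: 1,000 ha yielding 8 million kcal/ha/year,
# with each person needing roughly 900,000 kcal/year.
print(sustainable_population(1_000, 8_000_000, 900_000))  # → 8888
```

Asking the question the other way around, starting from a desired population and demanding the land somehow supply it, is exactly the inverted reasoning described above.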
Towards a more sustainable future
There is no divinely ordained decree in place that demands all food production efforts achieve self-sufficiency or cease to be! Growing food in urban areas makes cities more sustainable. An unreasonable all-or-none attitude does nothing more than denigrate the many small but significant efforts that collectively make a real difference overall.
People need to understand that humans are community-based social creatures, and self-sufficiency is a community endeavour, not an individual one, as “no man is an island” and being human isn’t all about doing everything alone!
Sustainability is quantitative, a matter of degree, and we can increase our level of sustainability by engaging in a greater number of positive, constructive and beneficial activities. If some activities are not possible, then others will be, and every effort to live more sustainably makes a difference in the grand scheme of things. Basically, what it really means is that we’re wasting less and polluting less.
1. Urban Environment Food publication – United Nations, www.un.org/ga/Istanbul+5/72.pdf
2. Australian Government Land & Water Fact Sheet – Change and Continuity in Peri-urban Australia, October 2008