
Lords of food

When food started to be listed on the stock exchange markets, it ceased to be a natural good nourishing the many and became a financial product enriching the few. We discussed the matter with Stefano Liberti, one of the first journalists to delve into this topic, to understand why turning food into a commodity could be the worst idea humankind has ever had. 

What’s happening in the food industry?

A small group of companies is taking over the whole food production, distribution, and sales process, laying the foundations for a monopoly. Over the last few years, as larger firms have bought up and absorbed small and medium companies, we’ve witnessed an astonishing market concentration, which leads to a dramatic reduction in biodiversity and a profoundly worrying standardization of production models. The ultimate consequence is that food is being commoditized: it is being transformed into a good which is produced in a standardized way, with a standardized taste, and sold at a price that is quite often set on actual stock exchanges.

The price of a commodity can be artificially manipulated, just like OPEC does with oil. Is the same true for food?

It’s a price manipulation, but it’s a taste manipulation, too. The price of an oil barrel is basically set at NYMEX and London’s ICE and applies to all oil barrels – no matter where they have been extracted. The same applies to the food industry: the ultimate consequence is that we will no longer have a tomato from Sicily, another from Tunisia or China, each with its own differences. We’ll just have a tomato that, having become an industrial good produced by cutting costs to the minimum and increasing crop yields, is sort of neutral. Something obtained by shifting production to wherever labor is cheaper, quality checks are less intrusive, and environmental laws are minimal.

When did this process begin?

It all began in the ‘90s, as a result of the proliferation of free trade treaties and commercial deregulation agreements on one side, and of the simultaneous withdrawal of public institutions which gave up any control over the food production and distribution systems, on the other. When tariffs were lowered or even eliminated, the processing industry found it convenient to seek raw materials wherever prices were lower, no matter how far away.

But food has always traveled long distances. What’s different this time? 

Food traveling long distances is nothing new: the ancient Romans got their grain supply from North Africa, and tomatoes, a staple ingredient of the Italian culinary tradition, actually come from South America. But today these long-range and unjustified shipments of foodstuffs take place with no public institution supervising, controlling, and regulating them. As a result, everything lies in the hands of private companies whose only goal is making profits. That’s the big difference. As I said, all this comes at a cost in terms of biodiversity and pollution: the tuna market is the most striking example.

Could you expand on it a bit?

Bluefin tuna is an extremely swift and highly migratory apex predator that can cross the Atlantic Ocean back and forth several times a year. Vast shoals of tuna cross the Strait of Gibraltar on their reproductive migration, and enormous numbers of them reach the Mediterranean Sea, attracted by its warmer waters, following patterns which haven’t changed at all over the centuries. After crossing the strait, they end up in the nets laid by fleets of fishing vessels, usually Italian and Spanish. But the tuna’s journey is not over yet. Its raw meat is particularly sought after for sushi and sashimi – so much so that around 95% of the tuna caught in the Mediterranean is sold to Japanese buyers. Once the tuna is caught, it is put in huge cages and fed special feed, such as soy, until it reaches the right weight. Then, under the buyers’ supervision, the animals are killed one by one, frozen, and sent to Japan, where they are sold at a very high price.

What happens is that “European” tuna is not consumed in Europe; it is barely known in our market. The tuna sold and eaten here is mainly the yellowfin kind, which comes from the Pacific Ocean. It’s a twofold, illogical transport of millions of tons of tuna, shipped from one continent to another at a very high environmental cost which, by the way, is not taken into consideration at all. All the pollution, all the damage inflicted on the marine ecosystem: these are just negative externalities no one cares about.

Who are the Lords of Food?

A tiny number of huge transnational companies control the food systems as far as commercialization and distribution are concerned, moving like a swarm from one place to another. I define them as locust-like firms, for they have the same predatory, extractive approach as those insects. They treat natural resources the same way miners treat a mine: these companies exploit the resources of a particular area until they are depleted, then move on to the next one. Another essential feature that characterizes them is their size: they can use their economies of scale to easily shove small, independent producers out of the market.

How many players control the food system?

A dozen companies own the majority of brands. In the US, around 75% of the pork market is in the hands of just four companies. The biggest of them controls 25% of the market and was recently bought by a Chinese firm, which thereby became the world’s biggest pork producer. The same applies to the soybean industry, which is a four-horse race. This oligopoly has a spillover effect on other markets, since the soybean plays a fundamental role in intensive farming too. These are probably the most striking examples, but the same concentration is present in almost every sector of the food industry, each dominated by vertically integrated companies created by a constant M&A process. These giant firms strictly control the entire chain, from production to distribution. More precisely, they control production indirectly – so as to avoid any capital risk – outsourcing it to farmers and imposing strict protocols and binding, exclusive contracts. Large-scale retail distribution went through the same concentration process.

One of the most interesting, although unsettling, changes brought by technology is food engineering. Have you dealt with it?

Not really, but I’ve studied how big corporations like Monsanto have started producing laboratory-engineered synthetic seeds. A large portion of the world’s agricultural production comes from seeds which are not harvested from fruit but created artificially in laboratories, where geneticists speed up an evolutionary process that would otherwise take centuries. The scientists’ goal is not to create the best plant in terms of flavor: they aim to create a plant tailored to the company’s needs and geared to large-scale industrial production. Let’s take tomatoes as an example again: the ones geneticists have been engineering aren’t particularly tasty, but they have a hard skin, to make mechanized harvesting easier, and a soft stalk, to ease the boxing and distribution processes.

Courtesy of Elliot films

What’s the situation in Europe, and what’s the role of the Common Agricultural Policy (CAP) in spreading such a model?

In Europe, we are witnessing the same trend, just on a smaller scale. Europe has become marginal, geopolitically speaking; therefore, the players operating there are not as big as those in Asia or in the US. The commoditization of the European food industry happened with the contribution of the CAP’s grants mechanism, which de facto penalizes small producers and benefits the bigger ones. The CAP also played a fundamental role in sustaining agriculture, a sector which, without incentives, would have been washed away by global competition.

The EU did it all at the expense of developing countries, didn’t it?

Exactly. I’ll use the tomato supply chain as an example once again, because I’ve studied it in detail, but the same process applies to many other products. Italians consume vast quantities of tomatoes, unaware that most of them come from China; more precisely, from the Xinjiang region, where this crop is heavily subsidized. The Chinese consume just a minimal share of what is harvested in their country: they ship most of it as tomato concentrate to the ports of Naples and Salerno. For Italian entrepreneurs it is cheaper to import tomatoes from China than to buy them from local producers: they take the paste, add water and salt, package it, and export it as an Italian product to several markets, further damaging and suffocating local production.

In Africa, farmers are no longer growing tomatoes, because those coming from China are cheaper than the ones grown a few kilometers from the local processing plant. All this can happen because China actively subsidizes production, while the EU subsidizes exports. The dairy industry works the same way, as Europe’s overproduction of milk literally floods other markets. Besides the pollution produced by hundreds of ships crossing the oceans to carry food which could have been grown locally, one has to consider another alarming effect, which quite often goes unnoticed: farmers leave their fields and move to big cities, whose suburbs keep growing bigger and bigger.

A phenomenon that can further destabilize underdeveloped countries.

Yes, it can cause food insecurity, as fewer and fewer people farm the land to feed an “unproductive” and growing population. Food has to be imported, and this costs money which would have been more useful if invested in other sectors. Another overlooked effect is that this unemployed workforce competing for a small number of jobs drives salaries down, thus stifling the rise of a domestic market which is essential to economic development. There are many ways these countries can end up in the so-called poverty trap: food is at the core of many of them.  

What’s the role of the financial industry in this process, if any?

After the financial crisis of 2008, when the usual investment channels like the stock and real estate markets became too risky, there was a massive relocation of financial capital, billions of dollars, towards anything related to the food system. It was considered the most secure investment: people need food and, given the growth of the world population, it will be an increasingly valuable asset. That logic led big funds to buy land and to buy into the share capital of the leading food sector players. This process deeply changed those companies: they gradually lost their distinctive traits, becoming firms driven merely by the logic of profit, with boards dominated by people with a financial background but no basic knowledge of the food system. That’s how food became a commodity, produced no one knows where or how.

Unchain the supply chain

If humankind wants a chance to dismantle its own climatic atomic bomb, it needs to seriously address the matter of food: what, where, and most of all, how it is produced. To many, it may not be clear why reforming our Food System is so crucial to averting climate change. But there are plenty of good reasons, since food is a pivotal issue. The 17 Sustainable Development Goals (SDGs) established by the United Nations reinforce this idea, as all of them are inescapably linked to the Agriculture, Food, and Beverage industries. So it is helpful to look at the SDGs, and particularly at their newly integrated framework, or systems view, called the wedding cake, devised by Johan Rockström, Executive Director of the Stockholm Resilience Centre and Chairman of the EAT Advisory Board.

The true cost of food

Today, the world is starting to realize that the way to solve the climate crisis is through food. There is, in general, greater awareness, more debate, and more action, all of which have been growing exponentially since the UN Paris Agreement in 2015. All of this is the consequence of a paradigm shift: we have started abandoning the silo mentality, embracing complexity, and understanding that when we think about systems, our hypotheses need to include the pivotal role of food.

Systems thinking brings out the real costs and impact of all things. Since the Seventies, we have been led to believe that those responsible for greenhouse gas emissions and climate change were the Automotive, Oil, Coal, and Gas industries. False. They are part of the problem but not the primary cause. To find it, we have to look at the Agriculture, Food, and Beverage industries. They are the greatest strain on natural resources and the health and wellbeing of everyone (and everything) on Earth. The majority of the food we grow first goes to feed cars, then to animals, and last (but by no means least) humanity. Our food is creating a pandemic of obesity, diabetes, asthma, cardiovascular disease, and other health problems. 

The fossil fuels and refrigerants we use to produce and transport these products are the biggest emitters of greenhouse gases – more so than the Oil, Coal, and Gas industries combined – and are keeping these industries in business. The packaging for food is causing biodiversity loss in our oceans and contamination on land. Globally, 30% of all food produced is wasted or thrown away before it is consumed. If we dispose of this waste by burying it in landfills, it comes back to bite us as methane, which is seventy times more powerful at trapping heat than carbon dioxide. Even if this waste is burned or dumped into the water, the long-term results are not much better.

A report from Trucost and KPMG showed that total environmental costs and impact – as a percentage of EBITDA (a measure of a company’s overall financial performance, used as an alternative to simple earnings or net income) – are entirely out of balance and not sustainable on a planet of finite resources. For the Agriculture, Food, and Beverage industries, total environmental costs amount to 224% of EBITDA.
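
To put that percentage in concrete terms (a back-of-the-envelope illustration, not a figure from the report): for a hypothetical producer with an EBITDA of $1 billion, the estimated external environmental cost would be
\[ 2.24 \times \$1\ \text{billion} = \$2.24\ \text{billion}, \]
more than double the operating profit that company actually reports.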

Business as usual for 12,000 years

This figure is staggering, but not surprising for those who have more than a vague idea of how these companies operate. The problem is that this market is markedly concentrated, and we still rely on just a few companies to provide our food. We need to empower ourselves with the know-how to resiliently fulfill our basic physiological and safety needs, without leaving it up to governments and a few major food producers to feed humanity. The Agriculture, Food, and Beverage sector is the oldest and longest-running economy, at 12,000 years old. It is also the most successful, yet in all that time it has witnessed only four major innovations besides automation and mechanization.

The entire industry is stuck in the Dark Ages, or at best the Industrial Age: it is the least digitized industry in the world. That is because the oligopolistic players dominating it are not so eager to welcome this change. Big Capital has tried to hijack the revolution and spin things for its own benefit, and in the long term it always comes back to bite them in their A$$ET$: they are being disrupted and losing billions of dollars in court battles, along with losing face and the respect of their customers – see Volkswagen, Bayer, and Monsanto, to name just a few examples. Impossible Foods, Beyond Meat, and Memphis Meats are all examples of meat alternatives disrupting the Meat industry. In the dairy industry, plant-based alternatives to milk, from companies such as Suja, Koia, and Malk, are bringing disruption. These are trillion-dollar industries that are experiencing major transformations, and where incumbent companies see their hegemony endangered.

Making money by saving the planet

Yet, by acting wisely, Food industry companies would also serve their own interests, economically speaking. In fact, Global Food Reform projects are also the most lucrative, guaranteeing both short-term and long-term returns. Numbers never lie. Humanity needs to eat multiple times every day, and I do not see this changing. The global Agriculture, Food, and Beverage industry is a $13 trillion market with a shadow economy of $3 trillion. In order to achieve the SDGs by 2030, we need to invest $94 trillion in Sustainable Development, which works out to more than $6 trillion a year had we started in 2015. Now, having broken these unfathomable sums down: food and nutrition are at the heart of all the SDGs, and for every dollar invested there is a $16 return. Data shows that embracing the Global Goals could generate $12 trillion of New Business Value a year, that is, $6 trillion more per year than we need to reach all 17 goals. This is equivalent to 10% of the global GDP forecast for 2030.
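
As a rough check of the figures quoted above (simple arithmetic on the numbers themselves, not an additional source):
\[ \frac{\$94\ \text{trillion}}{2030 - 2015} \approx \$6.3\ \text{trillion per year}, \qquad \$12\ \text{trillion} - \$6\ \text{trillion} \approx \$6\ \text{trillion per year}, \]
so the claimed new business value would roughly cover the annual investment required and leave an equal amount on top.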

A Food System Reform could address health inequalities as well, since more than 820 million people still lack sufficient food, and many more consume either low-quality diets or too much food. Unhealthy diets now pose a greater risk to morbidity and mortality than unsafe sex, alcohol, drug, and tobacco use combined. There is substantial scientific evidence linking diet with human health and environmental sustainability. Yet the lack of globally agreed scientific targets for healthy diets and sustainable food production has hindered large-scale and coordinated efforts to transform the global food system. The joint EAT-Lancet Commission is working on it: it convened 37 leading scientists from 16 countries, in disciplines including human health, agriculture, political science, and environmental sustainability, to develop global scientific targets for healthy diets and sustainable food production. It is the first attempt to set universal scientific targets for the food system that apply to all people and the planet. The Commission focuses on two “end-points” of the global food system: final consumption (healthy diets) and production (sustainable food production). The scientists acknowledged that food systems have environmental impacts along the entire supply chain, from production to processing and retail, and that they furthermore reach beyond human and environmental health by also affecting society, culture, the economy, and animal health and welfare.

Bring food home

A global food reform implies a cultural reform, too. For centuries, living and working were built around food, but food production has since moved outside cities and towns, into industrial parks, massive greenhouses, and farmlands. This decoupling of agriculture, food, and beverage production began with the Industrial Revolution. Given this, we have to reconsider not only the way we produce food, but also bring production back to where we live and work. Keeping it far away from where we are is an inefficient model: it has to be brought back into our communities, schools, businesses, homes, and apartments, the places where we spend our lives. Today, this can be done in many ways, such as with home gardens, vertical-farm home fridges, sprouting, dehydration, preserving, juicing, blending, jerking, baking, ambient water harvesting, and flash freezing. We need to use land and space more efficiently, going vertical and using closed-loop, circular economy systems.

But, as said, the way we produce is also important. Today’s methods are highly inefficient, and we are using more resources than we have: in 2019, the planet’s biocapacity, the share of productive land and sea available to sustain each human life, amounted to about 1.7 global hectares per person. Currently, our ecological footprint is 2.87 global hectares per person, which represents a deficit of 1.17 global hectares per person. If we continue with business as usual, human demand on the Earth’s ecosystems is projected to exceed what nature can regenerate by approximately 75% by 2020. This is the big problem, not the increase in the world’s population. We already have all the technology we need to nourish every human being. We just have to use it.
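
A quick check of those figures shows the scale of the overshoot:
\[ 2.87 - 1.7 = 1.17 \ \text{global hectares of deficit per person}, \qquad \frac{2.87}{1.7} \approx 1.7, \]
meaning that, at current consumption levels, humanity would need roughly 1.7 Earths to regenerate what it uses each year.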

The rise of Cleantech

It is time to get up to date with technology and processing by changing the way we produce: doing it without greenhouse gas emissions, food waste, packaging waste, and finite-resource waste, by using exponential and innovative technologies. Clean technology (Cleantech) has brought us some hope in this sense. Clean technology is any process, product, or service that reduces negative environmental impacts through significant energy efficiency improvements, the sustainable use of resources, or environmental protection activities. It includes a broad range of recycling technologies, renewable energy (wind energy, solar energy, hydropower, etc.), green transportation, electric motors, green chemistry, and DLT (distributed ledger technology) for food traceability. Then there are the technologies which enable higher efficiency in producing, cooking, and preserving food, like Ultraviolet Light processing, Greywater Recycling, Pulsed Electric Field food preservation, Food Waste Prevention technologies, and biodegradable or fully recyclable packaging.

The Food System’s revolution has already begun with Stockholm’s EAT Forum, the world’s leading science-based platform dedicated to transforming our global food system through sound science, impatient disruption, and novel partnerships. But where are the governments? They have to play a role in innovation (as they always do), and that role has to be recognized, as economists such as Mariana Mazzucato have pointed out (I suggest reading her book The Value of Everything). She emphasizes how important it is to build symbiotic partnerships that can create a form of growth that is more innovation-led, inclusive, and sustainable. That means reforming Capitalism as well.

We are two minutes to midnight: that’s what the Doomsday Clock of the Bulletin of the Atomic Scientists tells us. It can sound pretty cold when we hear it, but in fact humanity’s time on Earth has been one of exponential growth. If you place the whole history of our planet into a 24-hour model, human beings have been here for about two seconds. Yet our impact has grown so fast that it has ushered in the Anthropocene. We can turn this exponential dynamic to good use, applying the innovations and technologies we have and inventing new ones along the way. We still have some time left, but we need to act now.

The end of boundaries

Unwittingly, we have always drawn borders. Whether we think about those that our common ancestors used to define their villages or the more abstract ones deep inside the wonders of the human mind, in every era human beings have used the power and symbolic value of borders. Faced with the aberrant terror of climate change, however, we are witnessing the crumbling of physical borders and the reformulation of mental ones, which we must adapt to, in order to give a new meaning to our lives on planet Earth.

Borders are essential tools to shape and limit the world: a way to mark fixed, unmovable reference points, an attempt to impose our anthropocentric order on the environment that surrounds us. In a sense, our own body is a form of border: it separates what we conceive of as human from everything else — the inhuman, the natural, the inorganic. In the Anthropocene epoch, in which we are forced to reckon with the active role human beings are playing in the terrestrial ecosystem — threatening its future — the function of borders as we imagine and use them starts to crumble. The prerequisite for borders to work is that everything must remain unmoved, static: “Modernity is entirely founded on the principle of the immobility of individuals. It is no coincidence that States call themselves that way: they assume individuals who do not move,” explained Professor Franco Farinelli.

Climate change, however, is already putting a strain on a particular type of border that could be called geo-defined, meaning borders that use natural elements as reference points. This is the case of the Grafferner glacier, which defines part of the border between Italy and Austria. Researchers have noted that global warming has melted the glacier, introducing the risk that the border will shift. Here, however, it is not the border itself that disappears. In these cases, Farinelli explains, “the underlying ground is taken as a reference, meaning the land under the ice.” It was with the Peace of the Pyrenees, in fact, that we began to place borders on mountain ridge lines, with the goal of organizing modern territoriality.

A different example is the one concerning Arctic ice. According to estimates by the National Snow and Ice Data Center, the extent of Arctic ice has declined steeply since 2002. According to NASA’s Earth Observatory, many climate models predict that the Arctic will be free of ice by the end of the 21st century — or even, more dramatically, by mid-century. “In this case, it is not a matter of retouching borders; entirely new scenarios are opening up,” explains Farinelli. “New access routes and trade routes will become available. These are not proper borders, however. In the Arctic, in fact, borders are established on paper and are quite geometric, based on the idea of dividing parts of compact ice.” In these two cases the border itself does not change directly, but the idea on which it rests begins to falter: the immutability of the territory is no longer an invariable and defining characteristic of the world.

According to the recent special report by the Intergovernmental Panel on Climate Change, we can expect the oceans to rise between 26 and 77 centimeters by 2100, which could seriously affect most of the world’s coastal cities. Similarly, large swathes of territory will become uninhabitable or will be subject to changes so sudden as to undermine agricultural systems and thus induce millions of people to migrate. According to the UNHCR, limited natural resources, such as drinking water, are likely to become scarcer in many parts of the world. Crops and livestock struggle to survive where conditions become too harsh, threatening livelihoods and exacerbating food insecurity. Some estimates say that up to 200 million people could be displaced by climate change by 2050.

The foretold crisis of climate migrants effectively destroys the political function of borders. Several states risk becoming largely uninhabitable, resembling empty containers, and their inhabitants will be forced to move to survive. “When we think of migrants, there is a dimension that does not exist at all in the international debate, which is the one that evokes the mobility of subjects,” Farinelli emphasizes. Since modernity is entirely founded on the idea of borders that organize the points of contact between States, migrants undermine the very structure of the State form as we have known it so far, destroying the prerequisites for its existence and operation. “Faced with such catastrophic scenarios, it is clear that the dramatic problem, beyond the sustainability of the processes from the physical point of view, is also our difficulty in imagining what could happen,” explains Farinelli. We are bound by foundational assumptions of modernity that natural phenomena, currently underway, tend to question and abolish.

“The problem of borders is linked to their symbolic function and the production of meaning,” adds Farinelli. If we think of the management of migrants in the Italian case, the Professor continues, we realize that the border is not at all a static line: control and border lines are extended directly to the coasts of Africa, far away from Italy. We could reach an extreme scenario in which, according to Farinelli, in order to prevent the departure of unwanted subjects, people will be blocked in their place of origin, before their journey even begins. “This is a central problem from a cognitive point of view,” declares Farinelli. What is the meaning of the political border if it keeps moving away from its original position?

At the same time, another kind of cognitive crisis is looming over us: the one concerning our central role in the Anthropocene. We are at once the cause and one of the victims of climate change, and our desire for control over nature is gradually wavering. Planet Earth will probably survive even without us. Other beings, living and inorganic, will continue to inhabit it anyway. We are the ones who, facing rising temperatures, famines, and floods, risk extinction. Continuing to consider ourselves a subject outside the natural ecosystem, observing it from above and trying to use the map as a means of controlling the environment, will not ensure our salvation; quite the opposite. “The map was a formidable device for differentiating our relations with the environment, but now, evidently, it is no longer enough,” emphasizes Farinelli.

Seeing cartography as the only way to give shape and order to the environment will not be the way forward, according to Farinelli. “The real problem is that we are not able to model the world in adequate terms, with the schemes by which the world already works. Because the tragic aspect, in fact, is this: the world works, but we do not have the schemes to understand how. This also depends on the fact that our schemes are too tied to the cartographic mode of mediation,” explained Farinelli. We could start thinking about a map for the future that takes into account the destructive effects contained in the Anthropocene and repositions the human being on a new scale of relations with the natural world. The problem, according to Farinelli, however, does not lie in the search for a cartography of the future: “the problem is a new logic, which will not necessarily refer to the cartographic one that has so far allowed us to survive in an acceptably adequate way.” We will no longer have to think about the map but about the globe. “So far we have thought about the plane, that is, the map, but the sphere is not something topologically reducible to any map,” explained Farinelli. “We have to think about the globe; this is the challenge we face.” All of this could lead to shocks that are currently difficult to predict. What is certain, Farinelli emphasizes, is that this process should be constructed with extreme sagacity; unfortunately, it faces a dramatic lack of political will. “Politicians should construct the possibility of cognitive processes that enable humanity to cope with the dramatic changes towards which we are heading at a crazy speed. And this is not happening.”

We find ourselves dancing on an unexplored border that, inevitably, risks erasing us from the face of the Earth. Borders as a political tool are increasingly becoming obsolete objects. Furthermore, their original premises are no longer guaranteed: the environment around us has begun such a rapid and unstoppable mutation that, even if we managed to stay within the 1.5 °C limit highlighted in the special report of the Intergovernmental Panel on Climate Change (IPCC), we would be living on a radically different planet. At the same time, our cognitive borders are being put to the test: not only because we find ourselves staring into the abyss of our species’ extinction, but also because a total revolution of the concept of the human being is required to get out of it. We need to reformulate the meaning of our lives in this world and try to trace new, wider borders that also include nature in a new arrangement. The notion of borders is no longer sufficient to assign meaning to the world, and we, together with it, are destined to redesign ourselves. At the moment, however, the processes that will underlie this new design are only barely hinted at.

Our last chance

Climate change shouldn’t be our only concern. Humanity’s impact on Earth extends far beyond climate change, and involves biodiversity, water consumption, ocean acidification, land use, and more. Each of these areas is in a different situation: the loss of biodiversity is already at an alarming level, while ocean acidification, although increasing at a worrying pace, is in a less severe condition. These are the so-called Planetary Boundaries studied by Katherine Richardson.

“When climate experts claim that the increase in global warming should be held between 1.5 and 2 °C above pre-industrial levels, they are giving us a range. If we can stay within these two degrees we will be fine, but if we exceed this limit, we’ll have to face huge problems,” says Richardson. “It’s not just a matter of climate: we know that the human impact on the planet is also occurring in many other ways, which should be researched to define the maximum level if we do not want to damage the planet to the point that it stops sustaining us.”

In short, this is the definition of Planetary Boundaries: the boundaries that we must not cross, not only with regard to climate change but also concerning the entire terrestrial ecosystem. “We know that if we get the planet fully polluted, it will no longer be suitable for hosting human life. But this is the worst possible scenario: we need to understand what our safe operating space is, and that is what we are trying to do with our research.”

From this point of view, speaking of planetary boundaries is also a way to show that Earth operates as a single system. “We have to analyze the different components of the system and observe how each of them could affect its overall state if it were excessively modified,” confirms Richardson. “We have to understand how much we can push the boundaries without affecting the planetary activity to such an extent that it can’t sustain us anymore.”

Such safe spaces, however, must be identified as the world population, having reached 7.6 billion, keeps growing. Is it possible to combine population growth with environmental sustainability? “I think it is, as long as we don’t keep on behaving like nothing is happening,” claims Richardson. “Everything we do at a global scale to improve health conditions, fight poverty and hunger, make healthcare available to all, or give everyone access to drinking water, inevitably comes with a cost for the climate. The only solution is to become more cost-effective. When we talk about “cost,” we immediately think of money. But money is nothing more than a proxy for natural resources, with a crucial difference: none of us would spend more money than necessary to buy something, while we all use many more natural resources than required to maintain our lifestyle.”

The over-consumption caused by our way of life becomes apparent when it comes to food. “Currently, almost 40% of the Planet’s ice-free land is used for food production. An area as big as North and South America combined is dedicated to livestock farming and livestock feed production. 70% of the freshwater we consume worldwide is used for agriculture, which in turn is responsible for 80% of biodiversity loss, also because of deforestation. In addition to all this, we throw away a third of the food we produce while there are 1.5 billion people in the world suffering from malnutrition.”

How can we get out of this? “First of all, we should stop throwing food away, and instead use it to feed many more people. But the problem is also that we eat the wrong way: we consume an enormous amount of resources to eat meat so often that, as Westerners, we would be healthier if we reduced our consumption. Moreover, all this happens while in some countries people only eat meat twice a year because they can’t afford it.”

Responsible consumption, circular economy, more attention to waste, reduction of greenhouse gas emissions: these are some of the behaviors that would help preserve the Planet. “In a way, we don’t have to do anything different from what our ancestors did when they gave up nomadism and settled down to farm,” Richardson adds. “I am sure that in their early days they would throw their rubbish everywhere and freely take anything from nature; but when their settlements began growing, they had to set some rules, because they understood that such a system was not sustainable in the long run. We have to do the same, but the effort has to be global, because the behavior of each nation is inevitably reflected in the others.”

And it is also reflected in places that, perhaps, we thought were sheltered from human activity, such as the oceans. In recent years, there’s been a lot of talk about the plastic that’s invading the oceans and the damage it’s causing. The topic is being discussed as if it were a sudden emergency, but it’s not: “Since the ‘50s, we’ve been throwing plastic away knowing it’s a non-biodegradable material, that two-thirds of the Earth’s surface is covered with water, and that there is obviously no way out of the planet. Where did we think it would all end up? And waste in general, not just plastic, is a huge problem. That’s where the urge for a circular economy comes from.”

But that’s not all, because the oceans are also becoming increasingly acidic due to the rise in atmospheric CO2: “Oceans behave like a membrane: the more CO2 is pumped into the air, the more of it dissolves in water. The pH of seawater has already dropped by 0.1, which may not seem like much, but it has enormous effects. This, combined with the warming of the water, causes a reduction in oxygen, meaning a reduction in the life that oceans can sustain.”
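
The reason such a small number has enormous effects is that the pH scale is logarithmic: a drop of one unit means ten times more hydrogen ions, so a drop of 0.1 corresponds to
\[ 10^{0.1} \approx 1.26, \]
roughly 26% more hydrogen ions in surface seawater than in pre-industrial times.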

But is it possible to combine environmental sustainability with capitalism, a system that needs constant economic growth to exist? “GDP is definitely an awful indicator. It’s crucial to find new ways to define ‘growth’ and to measure our socioeconomic well-being: an indicator that needn’t be material, but one that measures the improvement of our conditions. Just think of trade agreements: if we were to consider not only monetary but also environmental costs, we would realize we already have many tools available for sustainable development.”

We’re running out of time. But our countdown is not being matched by suitably urgent political action. “I can’t know for sure, but I think we still have some time left, and I am not willing to give up. Let’s put it this way: I don’t think our planet will be emissions-free 10 years from now, but I think that within 10 years we will have succeeded in significantly reversing our course.”

Image by Kevin Krejci

What is nuclear fusion?

We need more energy, but we cannot continue to extract it from fossil fuels, because we’d be sawing off the branch we are sitting on; nor can we hope to fill this energy gap through renewable sources alone, since they fail to provide enough power. There will always be unmet demand for power, especially considering how fast some areas like Asia and Africa are developing in terms of population and infrastructure: their economies are growing at considerable speed, and so is their domestic consumption. Without alternatives, we’d find ourselves at a dead end. This is where nuclear fusion steps in.

Once upon a time, the fission reaction

The idea behind it is simple and crazy at the same time. Scientists found inspiration in the source of an almost perpetual, enormous, and steady flow of power: the Sun. This simple detail explains by itself why tomorrow’s atomic energy may have basically nothing in common with today’s, and why it could be potentially limitless and come with no ecological fallout.

The main difference lies in the reaction that produces energy. In today’s reactors, energy is released by a process known as fission, which consists in splitting an atom’s nucleus into smaller and lighter nuclei, producing free neutrons and gamma photons. This sets off a chain reaction in which the expelled neutrons split the nuclei of other U-235 (an isotope of uranium) atoms, releasing further neutrons, and so on. When this happens millions of times, it generates an astonishing amount of heat.
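
For illustration, one typical fission channel (one of many possible ways the uranium nucleus can split) is
\[ ^{235}\mathrm{U} + n \;\rightarrow\; ^{141}\mathrm{Ba} + {}^{92}\mathrm{Kr} + 3n + \sim\!200\ \mathrm{MeV}, \]
with the three freed neutrons going on to split further nuclei and sustain the chain reaction.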

This energy comes at a cost, though. The problem with fission is that its ingredients are not easy to find, since the chemical elements sustaining the reaction are isotopes of uranium and plutonium: i.e., nuclear fuel. Even more troublesome is that it leaves behind long-lived, highly toxic waste which is neither easy nor cheap to dispose of.

The new clean energy we need

Next-generation reactors may produce power through a completely different process, fusion, which – as its name reveals – operates by combining atomic nuclei, creating a new nucleus which is bigger and heavier than either of the original two, but lighter than their sum: the missing mass is released as energy.

There are several approaches to achieving such an outcome, depending on the atoms chosen to be combined. The most promising recipe, however, is the one fusing one atom of deuterium with one of tritium. Deuterium is an isotope of hydrogen that can be extracted from seawater, which means that one of the key ingredients needed to ignite the fusion reaction is also among the cheapest: water. The energy thus produced could drive the same generators now running on coal and gas, since it would create steam, which means this power could be ready to use without any disruption to the grid. Compared to oil derivatives, its energy density is outstanding: one liter of seawater could generate as much energy as 299 liters of gasoline, with almost no waste.
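
Written out, the deuterium-tritium recipe mentioned above is
\[ ^{2}\mathrm{H} + {}^{3}\mathrm{H} \;\rightarrow\; {}^{4}\mathrm{He}\ (3.5\ \mathrm{MeV}) + n\ (14.1\ \mathrm{MeV}), \]
about 17.6 MeV per reaction, most of it carried by the neutron, whose energy is what a power plant would ultimately turn into heat and then steam.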

Too good to be true, or at least too good to be easy. Scientists have been trying to replicate this fusion since the ‘60s, but until a few years ago progress was barely noticeable. Creating conditions resembling those of solar plasma has proved an extremely challenging goal. For the atoms to move at astonishing speed, colliding and fusing, unfathomable temperatures are needed, higher than those in the Sun, whose core reaches about 15 million degrees Celsius. For the deuterium-tritium combination to ignite the desired reaction, for example, the heat has to reach 39 million degrees. Other recipes need even higher temperatures, close to 100 million degrees Celsius. Pressure matters too: the Sun’s gravitational force is about 27 times that of the Earth, which is why solar fusion can occur at a much lower temperature.

A two-horse race

This solar-like plasma, a soup of gases at 100 to 200 million degrees Celsius, poses another major problem: how to contain such a substance without inflicting massive damage on the plant itself. This is the area of research fusion companies have been investing in the most. Among the projects being worked on around the world, two seem to stand head and shoulders above the rest. One is EAST, which stands for Experimental Advanced Superconducting Tokamak, a reactor built by China’s Institute of Plasma Physics. In November 2018, it became the first reactor ever to reach 100 million degrees Celsius.

Its main rival is ITER, the International Thermonuclear Experimental Reactor. Located in Cadarache, in Southern France, ITER is a €22 billion project which brings together scientists from 35 countries (including the EU member states, India, Japan, South Korea, and the United States), who have joined forces to achieve the first net-positive fusion energy. Initially it should have been switched on in 2018, but researchers were forced to postpone its launch to 2025; according to the latest known schedule, the first full-power fusion should be generated in 2035. By that time, however, smaller projects may already have delivered the first clean fusion energy, making ITER in many ways redundant.

Which is fine, because it is, of course, important to keep investing in and developing other viable alternatives that could be ready sooner, as suggested by the US National Academies of Sciences, Engineering, and Medicine. Last December, it delivered the results of a study commissioned by the US Secretary of Energy. The group urged the government to develop an American Fusion Strategy, and to do so by coupling investments in ITER (200 million dollars per year over the coming decades) with investments in smaller projects with a shorter learning and development cycle.

A sparkling alternative

In this, the National Academies of Sciences, Engineering, and Medicine were most likely referring to SPARC, a scaled-down version of the ARC (Affordable, Robust and Compact) reactor, a concept developed by Commonwealth Fusion Systems in collaboration with MIT’s Plasma Science and Fusion Center and funded by the Italian oil giant ENI. SPARC would be half ITER’s diameter, cheaper to build, and would have a lower Q: the figure that indicates the ratio between the energy gained from the reaction and the energy used to ignite and sustain it. ITER’s Q target is 10, meaning that scientists have to extract ten times the energy they put in before passing to the next operational phase, while the MIT reactor’s Q is just 3.
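
In other words, Q is simply the ratio between the fusion power the plasma releases and the heating power pumped in to sustain it:
\[ Q = \frac{P_{\text{fusion}}}{P_{\text{heating}}}, \]
so Q = 1 is breakeven, ITER aims to get ten times more power out than in, and SPARC is designed for a smaller but still clearly net-positive gain.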

SPARC has an innovative design too, but the most notable feature is the technology used to solve a longstanding problem for the next generation of fusion reactors: how to reduce heat before it causes structural damage to the plant. In fact, today’s reactors cannot work for more than a few seconds before overheating. The solution lies in high-temperature superconducting magnets, an invention originating in MIT’s research lab which led to designing a chamber that can be opened, thus allowing for replacement of its inner components.

Magnetic confinement, as designed in the SPARC reactor, is one of the three main techniques for handling the superheated plasma, the other two being inertial confinement and magnetized target fusion. What all of these projects, EAST, ITER and SPARC, have in common is the tokamak. This Russian acronym indicates a special donut-shaped (or torus-shaped, to use the correct geometrical term) magnetic confinement device developed to produce controlled thermonuclear fusion. It is surrounded by coils that generate a magnetic field which suspends and squeezes the plasma.

The last hope

Mistrust of atomic energy is still high, especially since the Fukushima accident in 2011, which reminded us how an incident can turn into a large-scale catastrophe when an atomic reactor is involved. Scared by the footage coming from the stricken plant, many countries acted accordingly: Germany is expected to shut down all its plants by 2022, while Italians ruled out new investments in this field in a referendum held a few months after the Japanese tragedy.

However understandable, this may prove to be a very short-sighted strategy, even according to the historically nuclear-skeptic Union of Concerned Scientists, which in 2019 was quoted by the MIT Technology Review as saying that all those shutdowns would achieve is a reduction in the amount of energy we have access to. The resulting increase in our dependence on natural gas would raise carbon emissions by 6%, when what we need is cleaner fuel to counter such an increase.

Not all hope is lost, however. Recent impressive technological advances in superconducting magnets, simulators, and reactor materials have paved the way for many private companies to join the research, creating inviting market niches which have attracted still more players. Nuclear technology has long been the preserve of a very small group of actors, which has now multiplied into a variety of public and private ones. This means more collaboration and more competition, which raises the bar of the research and speeds up the development of the technology we need, just as has happened in the space industry. Both developments are good news, for if we fail to prevent temperatures from rising above the fatal 2-degree threshold, sooner or later we could be forced to pack our things and search for a new planet to inhabit.

Death to the jackals

They say that, in Chinese, the word “crisis” contains a character meaning “opportunity.” This doesn’t really hold for the Mandarin language, but it does for the finance sector. It’s pretty rare for a crisis to produce a complete destruction of wealth: most of the time, wealth simply shifts from the losers to the winners. The looming climate catastrophe is no exception: even if the most apocalyptic scenarios actually happen, there will still be room for investment and enrichment. Indeed, those who have identified such opportunities have already placed their bets. This is climate profiteering.

At first sight, climate profiteers might seem to all belong to the same category of entrepreneurs. However, there is a fundamental distinction between those who aim to make money from the green revolution and those who foresee good business in the aftermath of the catastrophe and have no interest in preventing it. Both fronts are very actively supporting their own cause. For example, on the eve of the Katowice Climate Change Conference in December 2018, the representatives of 415 investment funds, managing a total of $32 trillion in assets, sent heads of government a letter asking for quick and concrete action to prevent the worst outcome of all.

Their equally determined opponents know that, in a devastated world, control over key resources such as water will guarantee invaluable economic (and therefore political) power. If salinity levels in the sea rise, rainfall diminishes, and famine and drought become more frequent, access to water will be a pivotal factor. In 2008, Dow Chemical Company’s CEO Andrew Liveris told the Economist that “water is the oil of the 21st century.”

All the giants of the banking and financial industry, from Goldman Sachs to UBS, from Blackstone to JP Morgan, have been buying up land. Michael Burry, founder and strategist of Scion Capital, was among the first to think so and act accordingly: he was the first hedge fund manager to forecast the subprime mortgage crisis, and he was so sure of his analysis that he bet 1.3 billion dollars against what was thought to be the safest market in the world. After collecting his winnings, he stepped back for a few years but returned to the game in 2013 with a new investment strategy focused on water and farmland.

Summit Global Management, de facto the first water-focused hedge fund, founded by former CIA analyst John Dickerson, is betting on blue gold too: the company buys billions of liters of water and bets on water crises in the Colorado River basin and in Australia’s Murray-Darling Basin. Dickerson is among those mentioned in Windfall, the book by McKenzie Funk which in 2014 brought climate profiteering and the commoditization of raw materials to the fore. In a world that fails to limit global warming to 1.5 degrees, water will be both a vital resource and a problem. Billions of people now live by the coast: the sea level rise caused by melting ice is a direct threat that turns them into potential customers. Companies specialized in building flood barriers could then become highly interesting to investors looking for safe profits; there would be plenty of work for them, and quite profitable work, too. According to Michael Cembalest of JP Morgan Asset Management, building a defensive barrier to protect New York and part of New Jersey would cost 2.7 million dollars per meter.

Climate profiteers, though, are not just evil entrepreneurs, but also ordinary citizens hoping for a new gold rush. That is the case for the people of Greenland, who are sitting on an invaluable mineral treasure which could make them independent from Denmark. Companies such as the American industrial corporation Alcoa, which planned huge investments in the extraction and processing of bauxite to be transformed into aluminum, already know this quite well. Under the Arctic ice also lies an estimated 25% of the world’s gas and oil reserves, so global warming is a blessing for oil giants such as Royal Dutch Shell, British Petroleum, Exxon Mobil Corp, and Rosneft, too. Among those cheering for the melting of the polar cap are also shipping companies: in a world with no ice, they could connect Asia and Western Europe without having to cross the Indian Ocean and the Suez Canal, saving up to $500,000 per trip.

But the planet’s surface is valuable even where it hides no mineral treasures or hydrocarbon deposits. In an overpopulated world dealing with desertification, the remaining arable land will acquire enormous value. That’s why, in recent years, the practice of land grabbing, once limited to Africa, has been expanding to South America and Europe. Large plots of land have been sold in Ukraine and Poland, but the real promised land is Canada. The first to have this intuition was Eric Sprott, a Toronto hedge fund manager who in 2005 wrote one of the first papers on climate investing.

In the paper, Sprott claimed water would become pivotal in the near future, but he also pointed out its weaknesses as an investment: it is difficult to transport from the areas where it is abundant to those where it is scarce, and the water utilities sector is strictly regulated. What to do, then? Sprott suggested buying vast plots of land in Canada, a country whose climate would become mild and suitable for agriculture, a sector that could attract huge amounts of investment. This analysis is supported by the Canadian Department of Agriculture, according to which cultivable land should increase from 26% to 40% of the country’s territory by 2040.

If Canada is too far away, one can still bet on the technologies enabling indoor agriculture and aquaculture, which is what many investment companies, such as the American firm Cambridge Associates, are doing. Nephila Capital is another player, dealing in insurance and reinsurance specialized in weather risk. When it comes to climate change profiteering, imagination is the only limit: companies such as Key Point Capital have even discovered demand in the short-term housing market, perfect for the evacuees of natural disasters (tornadoes, storms, floods, hurricanes).

And this is the best-case scenario, because many people will be displaced for longer periods, even forever. According to the World Bank’s estimates, by 2050 there will be 143 million climate refugees across Sub-Saharan Africa, South America, and South Asia. Most of them will travel without documents, making them de facto unidentifiable, which poses huge challenges for their management but is also seen as a threat to host countries’ security.

They will find Big Tech waiting to welcome them with open arms. Today, each one of us has a digital ID and leaves a trail of traces: pictures, passwords, emails, messages, geographic positions, and so on. In brief, data, which is the new oil, just as Big Tech is the new oil tycoon. Developing countries are of special interest to companies like Mastercard or Microsoft, to name a few, as shown by massive identification programs like Aadhaar in India and similar experiments under way in Kenya and Nigeria. In most cases, these biometric ID documents are also used as payment cards, and billions of unbanked people are being given a bank account. The stated purpose is to generate a more inclusive economy; the serious risk is to imprison billions of low-income people in a debt trap. Big Banking is happy, and so is Big Tech, which is interested in broad and meticulous profiling.

But even without following these Black Mirror-esque thoughts, Wall Street can profit from climate refugees in other ways. It’s interesting, in this regard, to read the statements given to Bloomberg by David Vogel, founder and CEO of Voloridge Investment Management, a quant fund focusing on sectors such as agriculture, insurance, and healthcare in view of a large-scale emergency. Vogel didn’t go into detail, but he admitted buying vast plots of land on the border between North Carolina and Tennessee, convinced that the population migrating from coastal areas will inflate inland land prices. For its part, Monsanto/Bayer is already working on a new generation of extreme-temperature-resistant seeds, but it also has many interests in anti-malaria products, malaria being a disease which, due to increasing temperatures, might spread to the hottest areas of the US. Cool Futures Group, instead, is a case unto itself. This archipelago of companies based in the Cayman Islands is betting on a reverse climate disaster: the threat, according to them, is not an increase but a decrease in temperatures. An unexpected outcome that would render vain one of the most important geopolitical phenomena of recent decades: the scramble for the Arctic, which involves the main world powers, from the US to China, from Russia to Japan and the United Kingdom. A quick glance at this new edition of the Great Game suggests that the outcome of the climate challenge has probably already been decided.

A brief history of the Anthropocene

The name Anthropocene combines anthropos (Ancient Greek: ἄνθρωπος), meaning human, and cene (Ancient Greek: καινός), meaning new or recent. An early precursor of the concept is Vladimir Vernadsky’s noosphere; in 1938 he wrote of “scientific thought as a geological force.” Scientists in the Soviet Union appear to have used the term Anthropocene to identify the most recent geological period. Ecologist Eugene F. Stoermer subsequently used Anthropocene with a different meaning in the 1980s. The term was widely popularized in 2000 by the Nobel Prize-winning atmospheric chemist Paul J. Crutzen, who argued that the influence of human activity on the Earth’s atmosphere is significant enough to mark a new geological epoch.

The noun Anthropocene entered the Oxford English Dictionary surprisingly late, in June 2014, along with selfie and upcycle, 15 years after it is generally agreed to have first been used in its now-popular sense. The beginning of this new era, in which humans have become a force capable of destroying nature itself and life on the planet, is debated: some say it began at 5:29 AM on July 16, 1945, when, with the Trinity Test, the U.S. detonated the first nuclear bomb ever; others contend that it dates back to the invention of agriculture; still others claim it all started much earlier, around 50,000 years ago, when Homo sapiens caused the extinction of a number of large animals.

There is no univocal answer: it is not easy to state what marked the beginning of an irreversible – and that is the keyword – impact of humankind on Earth. The question should be settled in 2021, when the 34 members of the international Anthropocene Working Group are expected to make an official proposal for the starting point of this geological era. What most of the community agrees on is that, regardless of when the Anthropocene began, the human impact on the planet spiked in the second half of the 20th century: this most recent period of the Anthropocene is known as “The Great Acceleration.”

We have bored 50 million kilometers of holes in our search for oil. We remove mountain tops to get at the coal they contain. Our oceans dance with billions of tiny plastic beads. Nuclear tests have dispersed artificial radionuclides across the globe. Rainforests, burned for monoculture production, send out killing smog-palls that settle into the sediment across entire countries. We have become titanic geological agents, our legacy legible for millennia to come.  

According to the astronomer Martin Rees, the dawn of the Anthropocene was an important moment. “The darkest prognosis for the next millennium is that bio, cyber or environmental catastrophes could foreclose humanity’s immense potential, leaving a depleted biosphere,” he said. It is surprising that the human and social sciences have avoided this issue for so long, given that it will determine the future of humanity. The question now is: how do we get out of this mess? What are the relationships between political economy, geography, the study of culture, and the environment?

Humanity is changing the world as a side effect of its behavior; there is no longer a nature that stands apart from human beings. There is no place or living thing that we haven’t changed, and this will have unintended territorial, structural, and climatic consequences. But will these consequences really lead to the end of the world? Our planet has been tested several times: natural catastrophes, epidemics, wars, nuclear disasters. We have always survived. Many times in the course of human history the end of the world has been predicted: Wikipedia has collected a list of more than 300 dates for apocalyptic events. But there are good reasons to be skeptical of the epitaphic impulse to declare “the end of the world.”

Lord Rees said that there is also cause for optimism. “Human societies could navigate these threats, achieve a sustainable future, and inaugurate eras of post-human evolution even more marvelous than what’s led to us. The dawn of the Anthropocene epoch would then mark a one-off transformation from a natural world to one where humans jumpstart the transition to electronic (and potentially immortal) entities that transcend our limitations and eventually spread their influence far beyond the Earth.”

Green activism

In April 2017, Patagonia launched a campaign on social and traditional media to draw attention to Donald Trump’s decision to scale back the protected natural areas of Bears Ears and Grand Staircase-Escalante, leaving the land open to drilling and hunting. The campaign generated around 200,000 public comments on Regulations.gov, more than 70,000 tweets to politicians, and 5,800 phone calls to lawmakers. It was a success, and a good example of what is known as corporate activism.

The idea of Corporate Social Responsibility (CSR) is not a novelty: in the early years of the 20th century, some entrepreneurs recognized for the first time that a company’s ultimate goal couldn’t be limited to making a profit, but had to take into account other factors such as the preservation of the planet and public welfare. The concept remained marginal for almost a century, until the 2008 worldwide economic crisis turned the tables. Simply speaking about “social concerns” wasn’t enough anymore: it became crystal clear that both big and small firms needed to reframe their business models, and ESG (Environmental, Social, and Governance) issues started to become a fundamental part of how companies and their products or services were perceived by clients.

In the last decade, new technologies, with social networks at the top of the list, have shaped a widening audience of well-informed and opinionated customers who make purchasing decisions based on their values. Companies are required to take part and act accordingly: if they betray people’s trust, they will be abandoned, and in some cases even boycotted. Transparency is fundamental, and it has proved to be the key to a different, more responsible way of doing business. Thirteen years ago, Apple published an annual supply chain report on the poor working conditions at its Chinese plants: the decision to be clear and open gave the brand economic momentum, and the report became a model for the industry.

Patagonia was responsible “before it was cool”: activism and sustainability have been inscribed in our DNA since 1973, the year of our foundation, and just recently our mission has shifted from not wanting to harm the planet to trying to save it. The climate crisis is no longer a forecast: for millions, it has become a frequent, difficult, even devastating reality, and every business is implicated. If we’re to keep the earth livable in the future, we must change our ways. Living our mission, our goal is to be carbon neutral across our entire business, including our supply chain, by 2025. This means that we will eliminate, capture, or otherwise mitigate all of the carbon emissions we create, including those from the factories that make our textiles and finished clothing and the farms that grow our natural fibers. We are on the market, but we don’t depend on it: our values always come first. That’s why we select our partners with the utmost attention, and why we leave them when they don’t match our views.

Courtesy of Patagonia

All of this is not a random experiment or a charity project, but an actual business model that exists and works. It shows that an alternative to the standard corporate model, with its stock listings and boards to report to, already exists: following a strictly ethical path, acting responsibly, and still generating profit is actually possible. Responsible businesses will grow not despite corporate activism, but thanks to it. It has nothing to do with a trend: it’s a profound shift in mentality that started with CEO activism, when CEOs began to talk openly about ESG issues, and that now involves organizations as a whole.

Millennials and Generation Z workers are different from their predecessors: when choosing a job, they look for companies that reflect their beliefs and offer not just financial stability, but a sense of belonging to a community that works to make the world a better place. Investors, too, are valuing social engagement more and more: a 2017 report by Edelman found that 76% of investors expect companies to take a stand on the environment, gender equality, diversity, and globalization. Of course, corporate activism must be handled with extreme care. Being able to react quickly is essential, but the real value of a project built on social responsibility emerges only in the long run. People are extremely demanding about continuity, and never fail to notice when an action or statement is just a marketing trick. Consistency speaks of authenticity, and customers reward it.

In the end, it’s always a matter of inclusion, of taking on responsibility without taking it away from individuals. We can’t just wait for the economy to change at scale: we need to start everywhere, whenever we can. Fully sustainable companies don’t exist: there are companies that decide to be responsible, and others that don’t.

It’s the end of the world

“The Post-Anthropocene is repeatedly claimed by scientists to be approaching within the next few decades, as overconsumption keeps destroying our planet’s vital resources. As such, the imminent mass extinction of all living beings on Earth is a possibility. In this frame, video games have been responding to the arrival of the Post-Anthropocene: in recent years, an increasing number of gamers are both fascinated and creeped out by a world without humans, forcing us to reconsider the central position of humans in video games. To tell the truth, video games have always been a ‘post-human’ medium; when gamers play, there is more than a mere exchange of inputs and outputs: they have to recalibrate the borders of their proprioception. VR games are a striking example of this but, as a matter of fact, any gaming experience makes our body boundaries blur. In the book A Play of Bodies, Brendan Keogh reflects on how players can feel ‘present’ in a virtual world without forgetting that they are actually touching a screen, and calls this mechanism an ‘embodied act.’”

Those are the words of Paolo Ruffino, professor of Media Studies at the University of Lincoln and president of DiGRA Italia (Digital Games Research Association): according to Ruffino, video games, as a constantly evolving medium, are able to respond rapidly to the changes humankind is making to the planet, acting, “as Marshall McLuhan said, as extensions of our sensory apparatus.” Video games have become a fully fledged medium that merges the main features of all means of communication. Therefore, Ruffino states, “it becomes natural that they contribute to our social, economic and psychic changes, addressing them in a flexible, plastic fashion, just like David Cronenberg’s eXistenZ controller.” At the foundation of gaming lies user interaction, which is not only a matter of action and reaction but also a creative space where we can enjoy an almost fully immersive experience.

In the last decade, games with dystopian, post-apocalyptic settings and plots have had the most success, but Ruffino reminds us that “the dystopian genre is a constant in pop culture and it’s nothing new, but there are variations that make contemporary dystopias different from those of the past. Mark Fisher noted how, after decades of neoliberalism, our imagination has reached a stalemate that makes us unable even to envisage alternatives, however utopian they may be. ‘It is easier to imagine an end to the world than an end to capitalism’: such a concept – made famous by Fisher, but formulated by Žižek and rephrased by many – is more relevant than ever.” The representation and exploration of potential future developments of our society, with humankind never managing to prevent its destruction or self-destruction and having to survive in chaos, puts players in a position to mitigate their own fears and worries about the future. Essentially, it helps them explore different, unknown futures and come to terms with the ancestral fear of their own extinction.

“The ecological anxieties of the era we live in are reflected in the narrative, but it’s not just that. The video games that explicitly leverage such fears in their plot (such as Horizon Zero Dawn) are probably the most trivial cases. Anxiety is exorcised,” Ruffino adds, “less explicitly and more perversely, and therefore in a more original way, in games where humans become marginal in the very act of playing. Think of No Man’s Sky, a game designed to be completed over 585 billion years, far longer than the age of our universe. No Man’s Sky is a video game meant to be played by other forms of life and in other geological eras.”

Post-apocalyptic video games use four main fictional timeframes to lead players toward the end of civilization: during the apocalypse itself, a few years after the catastrophe, centuries after the event (the most popular choice), or a very remote time in which our civilization has been completely forgotten. The causes of humanity’s destruction include nuclear disasters, pandemics, robots and AI, asteroids, alien invasions, and every sort of natural catastrophe. Thanks to their success, most of these video games are now part of mainstream culture.

Wasteland (Interplay, 1988) sits at the top of the list in the genre: developed in the heat of the Cold War, the game features a society that collapsed because of a nuclear war between the US and the USSR, exorcising the fears of a generation. Nearly a decade later, the same developers released the first chapter of the hugely successful Fallout saga (Interplay, 1997-1998 and Bethesda, 2008-2018): the game features a planet first ravaged by a worldwide conflict brought on by a global oil shortage, and then completely destroyed by a nuclear war. A hundred years later, the consequences are still more than visible: the protagonist was born into a new society that rose from the ashes of the old one. He is aware of the causes that almost led to human extinction and tries to survive the new dystopian social dynamics.

Another masterpiece of the category, this time set at the outbreak of the catastrophe, is The Last of Us (Naughty Dog, 2013). The game shows a society struggling to face a global epidemic, but its relevance, beyond the unquestionable quality of its graphics, lies in how it exposes the fragility of our technology-dependent society and shows how quickly our civilization might collapse, awakening the human race’s most feral instincts.

Another very successful video game saga is Metro (4A Games, 2010-2019). Based on the sci-fi novel of the same name by Dmitry Glukhovsky, it is set in the ruins of Moscow following a nuclear war. The survivors have found shelter in the subway tunnels and recreated a society there, leaving the surface in the hands of their mutant enemies. The plot reminds us how easily human beings forget history and the mistakes of the past, repeating the conflicts that led to the destruction of the surface and proving unable to find new ways to build a society.

An original video game, set in a very distant future where almost no trace of human civilization remains, nor any memory of its destruction, is Horizon Zero Dawn (Guerrilla Games, 2017). Here, machines have taken control of the planet, relegating humans to small tribal communities with no access to technology. The player is encouraged to investigate the reasons that led to such a situation, in which humans are surrounded by a flourishing and wild nature populated by mechanical creatures with the features of predators.

Each of these masterpieces puts nature, and the impact different kinds of catastrophes have had on planet Earth, at center stage. As Ruffino states, they feature “an ecological scenario where resources, although limited, are abundant and nature flourishes. This imagery is often paired with the return of a mythical antiquity or a post-apocalyptic future that leaves room for a renewed nature. I’m thinking of Red Dead Redemption 2, set in the United States in 1899, or Zelda: Breath of the Wild and Horizon Zero Dawn, set in post-apocalyptic scenarios. In all these games, nature is dangerous but prosperous, and the player can make use of it as an almost infinite resource.”

The storytelling common to all these video games challenges the very meaning of human existence on Earth, giving life to a new category that Ruffino defines as “Post-Anthropocene video games: video games that challenge the centrality of mankind, making it increasingly less essential and forcing players to think about their own responsibility and impact on the environment.” The latest edition of Civilization VI (Firaxis Games, 2019) gives players the chance to develop their own civilization from scratch and guide it through the millennia, from the first settlements onwards, managing its resources themselves. With climate change becoming an agent that kicks in during the late stages of a civilization, it is a great example of the role nature plays in video games: at first nature strikes back randomly, for no particular reason, but the further the civilization advances, the more the pollution, temperature, and sea-level parameters must be taken into account. The gaming experience becomes a learning opportunity. And even though the video game industry itself contributes heavily to global warming, with online game servers producing CO2 emissions “comparable to those of cars and industries” and consoles and computers consuming vast amounts of energy worldwide, “as with any other form of art,” argues Ruffino, “video games can challenge our imagination, force us to face unsolved issues and raise original questions, demanding a new ecological utopia.”
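To make that kind of feedback loop more concrete, here is a minimal, purely illustrative sketch in Python of a mechanic like the one described above, in which per-turn emissions accumulate into warming and discrete sea-level stages. It is not Civilization VI’s actual model; every name, rate, and threshold is invented for the example.

```python
# Hypothetical sketch of a climate-feedback loop like the one described above.
# All rates and thresholds are invented; this is not Civilization VI's actual model.

from dataclasses import dataclass


@dataclass
class ClimateState:
    co2: float = 0.0          # accumulated emissions (arbitrary units)
    temperature: float = 0.0  # degrees of warming above the baseline
    sea_level: int = 0        # discrete flooding stage


def advance_turn(state: ClimateState, emissions: float) -> ClimateState:
    """One turn: emissions raise CO2, CO2 raises temperature,
    and temperature thresholds raise the sea-level stage."""
    state.co2 += emissions
    state.temperature = state.co2 * 0.01             # warming proportional to CO2
    state.sea_level = int(state.temperature // 0.5)  # one flood stage per 0.5 degrees
    return state


# Early turns: low emissions, nature's setbacks feel random and minor.
# Late turns: industrialization forces players to manage the parameters.
state = ClimateState()
for emissions in [0.5] * 20 + [5.0] * 20:
    advance_turn(state, emissions)
print(f"warming: {state.temperature:.1f} degrees, flood stage: {state.sea_level}")
```

The point of the sketch is simply that, as in the game, the environmental variables stay negligible early on and become a binding constraint only once the civilization industrializes.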