The Shrouded, Sinister History Of The Bulldozer
By Joe Zadeh | NOEMA | February 20, 2025

According to an 1881 obituary in a Louisiana newspaper, the word “bulldozer” was coined by a German immigrant named Louis Albert Wagner, who later committed suicide by taking a hefty dose of opium dissolved in alcohol. Little else is recorded about Wagner, but his term became a viral sensation in late 1800s America, going from street slang to dictionary entry in just one year. It likely originated from a shortening of “bullwhip,” the braided tool used to intimidate and control cattle, combined with “dose,” as in quantity, with a “z” thrown in for good measure. To bulldoze was to unleash a dose of coercive violence.

If, like gods, we aspire to create machines in our own image, then it’s fitting that the original bulldozers were humans. Leading up to the corrupted U.S. election of 1876, as the Southern states were being reconstructed following the Civil War, terrorist gangs of predominantly white Democrats roamed about, threatening or attacking Black men who they thought might vote for the Republican Party. The men were the bulldozers, and the acts they carried out were bulldozing.

Wearing black masks or black face paint, and, on occasion, goggles, they brutally whipped, beat and sometimes murdered their victims. In June of that year, a Louisiana newspaper reported that bulldozers took a Black Republican voter named W. Y. Payne from his bed in the night and hanged him from a tree 2 miles away. Later that month, in nearby Port Hudson, a Black preacher named Minor Holmes was hanged from the wooden beams of a Baptist church by bulldozers, but they cut him down before he died.

“The good people have been cowed down, brow-beaten, insidiously threatened, forced to silence or worse, the countenancing of outrages, blackmailed and their contributions made the lever for future extortions, their tongues muzzled, their hands tied, their steps dogged, their business jeopardized and themselves living in continual fear of offending the ‘bulldozers,’” read an article in the New Orleans Republican in June. By the following year, the association of “bulldozer” with rampant voter suppression during the election made it a common term across the U.S. for any use of brutal force to intimidate or coerce a person into doing what the aggressor desired.

It’s hard to trace when the word first became a label for machines. For decades, it floated around the language tree, resting a while on branches where some instance of terrific violence needed a novel and evocative label. A handful of arms manufacturers marketed various “bulldozer” and “bulldog” pistols in these years. As the 19th century came to a close, it popped up in a Kentucky newspaper as a term for a towboat used to smash through heavy ice and in an Illinois court case to describe a manufacturing machine that had ripped off a worker’s left arm.

The bulldozer we know today took shape in the first quarter of the 20th century. In 1917, the Russell Grader Manufacturing Company advertised a bulldozer in their catalog: a huge metal blade pulled by mules that could cut into the earth and flatten the land. Other manufacturers like Holt, Caterpillar and R. G. LeTourneau were working on similar devices, technological descendants of scraping tools developed in the American West and associated with Mormon farmers. In time, animals were replaced with tractors (on wheels or continuous tracks) powered first by steam, then gasoline and eventually diesel. The word, which at first referred only to the blade itself, started to mean the entire machine, one that was unrivaled in its ability to rip, shift and level earth.

Many technological innovations have spiritual origins, and so it is with the bulldozer. The Vermont-born industrialist Robert G. LeTourneau, who had the greatest impact on its early development, was an eccentric evangelical Christian who believed that he created his machines in collaboration with God. “God,” he declared, “is the chairman of my board.” A gifted engineer, he was responsible for hundreds of innovative advances in bulldozer design. Thanks to his ideas, wrote William R. Haycraft in “Yellow Steel,” “the bulldozer blade would evolve from a simple plate of flat steel to the hydraulically controlled, scientifically curved, box-section-reinforced, and heat-treated steel structure in use today.” 

LeTourneau wore trilby hats and flew up and down the country in a private plane — clocking, according to TIME, around 200,000 air miles per year. He sought to spread the word of the Lord Almighty and the bulldozer in tandem, and would sometimes fly with a quartet of professional singers so they could deliver gospel performances in the communities he visited. His bulldozers aided in the frenetic transformation of the 20th-century American landscape and were deployed to major construction sites all over the country, from the Boulder Highway to the Hoover Dam, helping create the infrastructure that modern life has come to depend on. 

“Robert G. LeTourneau, who had the greatest impact on the development of the bulldozer, was an eccentric evangelical Christian who believed that he created his machines in collaboration with God. ‘God,’ he declared, ‘is the chairman of my board.’”

In 1952, before a shipment of LeTourneau equipment left the docks of Vicksburg, Mississippi, in the direction of West Africa, the Rev. Billy Graham — one of 20th-century America’s most influential pastors — said a prayer and blessed the machines. The locals called the ship “The Ark of LeTourneau.” It carried $500,000 worth of earthmoving machinery, a year’s worth of food and 500 copies of the New Testament. LeTourneau flew ahead on his plane and waited on a wild and sandy beach at Bafu Bay, Liberia, for his ark to appear on the horizon. 

The flame of 20th-century techno-utopianism burned inside him, and bulldozers were just one of the many solutions he provided for a world that he felt needed to be cleared away, rebuilt and illuminated by the light of Christ. As his influence grew, so did his machines, in both size and stature. “There are no big jobs; only small machines,” he wrote in his autobiography, “Mover of Men and Mountains.” An enormous device he called the “Tournalayer” could raise an entire two-bedroom concrete home in just over a day. He designed experimental offshore oil platforms for George H. W. Bush’s mysterious, CIA-linked Zapata companies and developed 750-pound bombs for the United States military. But razing the land was LeTourneau’s ultimate passion — he was a flat-earther in the most literal sense. As a staunch capitalist, he believed that free markets thrived on terrains graded and cleared for the onrush of development. 

Inevitably, almost magnetically, he was drawn toward that ultimate nemesis for those who wish to experience an epic showdown with Earth: the jungle. In exchange for helping to build a section of the Trans-Andean highway for the Peruvian government, LeTourneau was given a large sweep of rainforest stretching from the Andes to the mouth of the Amazon. Flying over the area in 1953, he thought the gently rolling hills looked like waves in a vast green sea and felt giddy with excitement. “The thought of tackling it even with my biggest machines was awesome,” he wrote. Following in the footsteps of Henry Ford, who tried to build a model colony in the Brazilian jungle called Fordlândia, LeTourneau embarked on the construction of his own capitalist utopia in the Peruvian jungle. It would be a Protestant metropolis where he could spread the gospel while also exploiting the local oil industry. He called it Tournavista. 

Soon after they were first introduced into the Amazonian rainforest, modern bulldozers spread like a virus far and wide in a fluster of flattening that included loggers, ranchers and colonists. The 1950s marked the dawn of the widespread machine-led deforestation that has blighted the rainforests of South America ever since.

In Tournavista, the jungle became a marketing arena where LeTourneau could both test and advertise the effectiveness of machines he claimed to be capable of demolishing over an acre of wilderness in half an hour, clearing in days what would have taken months for men with hand tools. Like any good foe, the rainforest challenged and inspired him. His dozers were like ants when confronted by 200-foot-tall shihuahuacos and other colossal trees. So he created bigger, more ferocious machines to fell the ancient hardwoods. In came the “jungle crusher,” a 74-foot-long and 280,000-pound behemoth that was similar to a steamroller. (The U.S. military later brought them to Vietnam to assist with the fight against the Viet Cong.) “The whole machine just rears up like a dinosaur, the back roller pushing and the front roller pressing forward with all its power and weight,” LeTourneau wrote. “Something has to give, and so far it has always been the tree. Slowly and grudgingly, and with a terrific crashing noise, but down it comes.”

Through LeTourneau’s eyes, the wild was devoid of purpose; his dozers and crushers were the paintbrushes used to give it form and meaning. “In the short span of six years we have proved that the jungle, unconquered for centuries, can be put to work, and its extravagant wastefulness turned into extravagant production,” he wrote. And wherever the bulldozers went, God followed. LeTourneau built a church and Bible school in Tournavista and helped translate the New Testament into Indigenous languages. Tournavista was quickly populated with people LeTourneau called “technical missionaries” — those who could “handle a bulldozer as well as a Bible.” He believed that “machinery in the hands of Christ-loving, twice-born men can help [Peruvians] to listen to the story of Jesus and His love.” 

The victory of any human over nature is almost always a momentary illusion. Much like Fordlândia — where relics of Ford’s machines now rust in the derelict shells of old factory buildings and only 2,000 or so impoverished residents reside today — Tournavista was a failure. Political instability, heavy rains and the rapid regrowth of vegetation on cleared lands dogged LeTourneau’s holy city. By the time of his death in 1969, his interest in Tournavista had faded. The town still exists but now distances itself from the industrialist, celebrating its founding year as 1984, long after his death. When I contacted the municipal authority to see if anyone would be willing to speak about its prehistory, no one replied. As for Tournata, a similar colony he tried to build next to that wild and sandy beach on Bafu Bay, it’s hard to find anything about it online today. According to a 2013 blog post by a Christian missionary, all that remained of it was an overgrown airstrip, an abandoned medical facility and the ruins of a church.

Though not the only visionary to take the bulldozer global, LeTourneau was indicative of a certain bulldozing state of mind that swept through American society following the machine’s rapid evolution during World War II. It’s not commonplace to associate bulldozers with war, and yet they were as important to the Allied victory as the aircraft engine, the radar or the atomic bomb. “Of all the weapons of war,” wrote Col. K. S. Andersson of the U.S. Army Corps of Engineers in 1944, “the bulldozer stands first. Airplanes and tanks may be more romantic, appeal more to the public imagination, but the Army’s advance depends on the unromantic, unsung hero who drives the ‘cat.’” The war was largely defined by control of the air, and airplanes needed airfields within operating range of their targets. If the planes were dispatched from seafaring aircraft carriers, then those ships needed docks and dry docks. And those airfields, docks and dry docks needed bases and road systems. In essence, for airplanes to stay mobile as the front shifted across the planet, an entire network of ordinarily immobile infrastructure had to become mobile too. Bulldozers move wars.

“The idea of the ‘Bulldozer Man’ was born: a hyper-masculine all-action American cowboy refitted for an era of technological modernity.”

In the U.S. armed forces, the troops most often operating bulldozers were in the Naval Construction Battalions — the Seabees. Many were engineers and construction workers who’d been trained to fight, and accounts of their wartime experiences were captured in detail in historian Francesca Russello Ammon’s seminal 2016 book, “Bulldozer.” Their machines roared onto islands all across the Pacific, transforming large swaths of tropical environments into concrete fortresses. The bulldozers were being shipped to islands in such large quantities that when one broke down, they simply bulldozed it into the ocean and continued working with a fresh one. Often, the Seabees didn’t even wait for the fighting to stop before setting to work, flattening land as bullets ricocheted around them.

In 1943, on the island of Mono in the South Pacific, Japanese soldiers inside a pillbox fired on a 28-year-old bulldozer operator named Aurelio Tassone. Under the orders of his lieutenant, Tassone drove his bulldozer toward the pillbox, lifted his blade in the air and then dropped it with such force that it almost stalled the machine. 

“The blade bit through the obstructions as if they were snowdrifts,” he recounted in an interview unearthed by Ammon. “The gun mount toppled over and chunks of logs and Jap bodies flew up in the air. Everybody and everything was crushed and buried underneath that rip-roaring machine,” said Tassone. Afterward, he methodically bladed earth over the wreckage, leaving behind a flat, smoothed surface. Investigators later found the remains of 12 people buried beneath. This may be the first recorded instance of a bulldozer being used to intentionally kill people.

Tassone was celebrated as a hero. But that same year, Theodore Sturgeon — an aspiring writer who would become a key figure in the 1950s golden age of science fiction, inspiring Stephen King and Ray Bradbury — began to see something inherently mysterious and sinister about the bulldozer. Stationed with the Navy in Puerto Rico, he was working 70-hour weeks in 120-degree weather clearing the land with a bulldozer to make way for an airfield, dry dock and shipyard. “I fell in love with that machine,” he later recounted, yet something about the experience unnerved him. It’s unclear whether he’d heard about the bulldozer attack in Mono. But when he returned home to New York the following year, he entered a nine-day writing frenzy. The novella he produced was the story of eight men alone on a remote island who were clearing the land to make way for an airstrip when one of the bulldozers became possessed by a demonic force and ran amok in a murderous rampage. Titled “Killdozer,” it was published in the sci-fi magazine Astounding. “The thing wrote itself!” he exclaimed in a letter to his father, “and after it I could write nothing else.”

As WWII progressed, stories of the conquests of bulldozers began to permeate Western media coverage, and operators came to be seen as contemporary icons. As Ammon noted, the idea of the “Bulldozer Man” was born: a hypermasculine all-action American cowboy refitted for an era of technological modernity. In the heroic stories that abounded, operators would leap into bulldozers and shove their dead comrades aside to keep the machines moving and pushing, scraping and grading. John Wayne, the quintessential cultural icon of the American West, appeared in the Hollywood war film, “The Fighting Seabees” (1944) as a military hero who saves the day on his mighty dozer. An illustrated Coca-Cola advertisement from 1945 portrayed muscular Seabees sitting around their muscular machine, sipping cokes and showing off to Pacific Islanders depicted in headdresses and carrying drums. 

In an article for Life about the occupation of Guam, a full-page portrait showed a topless U.S. military grunt grinning in the sunshine as he pulled the levers of his dozer. The article read: “Having taken the island from the Japs, they promptly started to demonstrate another of their military specialties: high-speed conversion of a quiet little island into a huge war base. Even before fighting had stopped the battered island shook to the pounding rhythm of rock crushers and heavy engines and echoed with the sound of tractor treads crunching on coral. … Cats and Macks and bulldozers puffed and backed and hacked, shaving away the jungle growth. Guam became alive and bustling with roads and road builders. The peanut-shaped piece of land, a thousand ocean miles from anywhere, began to glitter at night like a continental metropolis.”

“Bulldozers were as important to the Allied victory as the aircraft engine, the radar or the atomic bomb.”

The bulldozer in this view was a creator, not a destroyer. Yet the legacy of this period still scars Guam, where the rainforest has fallen silent. The ships that brought the machines during and after WWII may have also accidentally carried with them an invasive species: the brown tree snake. With no natural predators, its population exploded, turning Guam into one of the most snake-infested places on Earth. The snakes had wiped out 10 of its 12 native forest bird species by the 1980s and nearly erased the sound of wild birdsong. Those birds used to eat spiders, and now there are too many of them, too.

In 2012, U.S. government scientists attached tiny parachutes made of green tissue and cardboard to 2,000 dead mice laced with snake poison and dropped them from helicopters. The rodent corpses hung from branches throughout the forests like nightmarish festive decorations. The experiment failed to significantly impact the overall brown snake population. With too few birds to disperse the seeds, researchers estimate that new tree growth has declined as much as 92%, and the forests have thinned. Some of Guam’s native birds have been nurtured back from near-extinction on nearby snake-free islands, but Guam itself still hosts a heavy and expanding U.S. military presence, with billions of dollars in construction planned for the next few years.

For American machinery manufacturers and operators, places like Guam were essentially test runs for the frenzy of construction that would begin at home when everyone returned. Following the war’s conclusion, the U.S. got drunk on bulldozers: Many of its major cities, despite never once being attacked, began to take on a bizarre resemblance to the area-bombed ruins of Europe and Asia as large swathes of natural landscape and farmland were leveled in preparation for a new dawn of infrastructure and suburbia. During the late 1950s in California, an orange tree was bulldozed on average every 55 seconds.

Of course, it wasn’t just bulldozers used for this nationwide transformation — wrecking balls and cranes did just as much work — but this particular machine became the symbolic metaphor of the era. “Mother Earth is going to have her face lifted!” read one earthmoving equipment advertisement in 1944, complete with an illustration of a feminized planet rolled on its side and looking to the ground as a man drives a bulldozer over her face. 

In New York, urban planner Robert Moses famously oversaw the clearance of vast tracts of land for public and private development. In his famous book “All That Is Solid Melts Into Air,” philosopher Marshall Berman captured his own conflicted feelings as a New Yorker during this period. He was awestruck by what was taking place around him. “To oppose his [Moses’s] bridges, tunnels, expressways, housing developments, power dams, stadia, cultural centers, was — or so it seemed — to oppose history, progress, modernity itself. And few people, especially in New York, were prepared to do that,” he wrote. 

But his perspective changed in 1953 when the bulldozers arrived at his door to begin work on a new expressway. “At first we couldn’t believe it; it seemed to come from another world,” he wrote. “They surely couldn’t mean what the stories seemed to say: that the road would be blasted directly through a dozen solid, settled, densely populated neighborhoods like our own; that something like 60,000 working- and lower-middle-class people, mostly Jews, but with many Italians, Irish and Blacks thrown in, would be thrown out of their homes.” And yet it was so. Berman remembered visiting the construction sites after the evictions, sometimes to weep for what was being destroyed, sometimes to “marvel” at how quickly his “ordinary nice neighborhood” was being transformed into “sublime, spectacular ruins.” 

The image of a person weeping for the devastation wrought by the bulldozer while still maintaining awe at its capabilities remains to this day a poignant summation of the seemingly irreconcilable paradox of this machine, both destroyer and creator, and the disorienting speed of erasure it has enabled. Around the world, these scenes are as common today as they were during the demolition of Berman’s Bronx.


Unlike most architects, Fahad Zuberi dedicates himself to studying the built environment’s destruction as much as its construction. Zuberi, currently a scholar at MIT Architecture, remembers when the earthmoving machines first arrived in his neighborhood in the Indian city of Aligarh. 

“It was around 20 years ago, when I was a kid,” he told me over Zoom. “I remember one of those yellow JCBs arrived in our area. It was a very new thing. Some of us had never seen them before, and it was fascinating to see what they could do. We would go to construction sites just to see them in action. There was a running joke among young people: If there is a JCB digging something nearby, then no work is getting done today. Everyone would want to go and watch. I come from a Muslim ghetto, and we were always in need of something: better roads, better drains. There was great malnourishment with regard to infrastructure. When these big yellow machines arrived, we saw them as aspirational: They would build a better city for us. But there’s been a dramatic change over the last two to three years. Now, nobody in India looks at a JCB with the same eyes — nobody.”

For many marginalized groups around the world, heavy earthmoving equipment has often been the visible part of the faceless bureaucratic megamachine that runs roughshod through communities in the name of urban renewal, beautification or “slum” clearance. In India, the yellow JCB has become more than that — a catchall term for earthmoving equipment transformed into weapons of both literal and metaphoric power. For some, they symbolize unspeakable terror; for others, righteous justice. 

It began in 2017 when the chief minister of Uttar Pradesh, Yogi Adityanath — a member of India’s ruling right-wing Hindu nationalist Bharatiya Janata Party (BJP) — started threatening to destroy the homes and assets of criminals in the state. By 2020, he’d begun making good on his threats, including demolishing the home of one of India’s most notorious gangsters, Vikas Dubey, not long after police teams sent to apprehend him were ambushed by his gunmen. 

By 2022, Adityanath’s use of demolition as a form of extrajudicial punishment had become widespread and indiscriminate. Earthmoving machines began appearing outside the homes of people, mostly Muslims, who’d been accused of rioting or even of just attending protests. Many had never been found guilty or tried in a court of law for what they’d been accused of. The authorities conducting the demolitions would often insist that the dwellings had been “illegally constructed.”

But “illegality” in the built environment is extremely common in India. In Delhi alone, estimates suggest that anywhere between 30% and 80% of properties could be considered illegal. 

“For many marginalized groups around the world, heavy earthmoving equipment has often been the visible part of the faceless bureaucratic mega-machine that runs roughshod through communities in the name of urban renewal, beautification or ‘slum’ clearance.”

“It was clear that it was only Muslim houses being targeted,” Zuberi told me. In the wake of demolitions, BJP politicians publicly celebrated them as a form of “vigilante justice,” and the phenomenon became known as “bulldozer raj” (rule by bulldozer). The anti-Muslim overtones were clear: In a now-deleted Twitter post from 2022, a BJP spokesperson equated the letters JCB with “jihadi control board.”

Local press and members of opposition parties branded Adityanath “Baba Bulldozer” — Papa Bulldozer — continuing the genealogy of authoritarian leaders who have been nicknamed after the machine, including Israel’s Ariel Sharon and Tanzania’s John Magufuli. Adityanath embraced the criticism, and the JCB became a symbol of his 2022 reelection campaign. “We have a special machine which we are using for building expressways and highways,” he said during a speech. “At the same time, we are using it to crush the mafia who exploited people to build their properties.” Those who failed to vote for Adityanath, BJP politician Raja Singh warned, would be found and bulldozed. 

Around the country that year, widespread glorification of the machine engulfed certain sections of society. Processions of JCB vehicles started appearing at BJP political rallies, adorned in flowers and carrying people in their buckets, while crowds of onlookers waved toy bulldozers in the air. Pop songs about the machines racked up millions of hits on YouTube. Grooms rode to weddings atop earthmovers, and shops sold “JCB Gorilla” condoms — a further emphasis on the machine’s enduring associations with a particular notion of hypermasculine heroism. At a mass wedding in March, newlywed couples were given toy bulldozers as gifts to “symbolize the victory of good over evil and also order in life,” according to a guest at the event. 

In Gujarat, Assam, Delhi, Madhya Pradesh and Uttar Pradesh, scores of homes were flattened and their inhabitants were offered no alternative accommodation or compensation. BJP politicians who wished to be seen as no-nonsense strongmen began gravitating toward the iconography of the machine. 

In Madhya Pradesh, then-Chief Minister Shivraj Singh Chouhan became known as “Bulldozer Mama.” On April 11, Hasina Bi and her family were at their home in Khargone, where they had lived for 40 years, having redeveloped it from a mud house into a permanent structure. They were all asleep — it was Ramadan and they had been fasting. They awoke to the sound of bulldozers. 

“I had no reason to believe my house would be demolished,” Bi said in an interview with a researcher from Amnesty International. “The officials of the Municipal Corporation stood in front of my house and ordered the demolition of the houses. … I kept running around them with all my paperwork. I begged them to check my paperwork first. … They asked me to go somewhere else with all this and did not hear a single word I said. … I told them I won’t leave this house. ‘I am so poor, where will I go?’ I asked. I stood there steadfast until the police started beating me up with lathis [batons] and yelled, ‘Get out of here!’ I did not move. I said ‘Raze me down with this bulldozer. Take my dead body with you. Where will I go in this poverty?’ Then my son came to me and begged me to move: ‘Ammi, the authorities won’t even think twice before killing you.’ All my life’s earnings and memories were in that house. They did not even allow us to collect my belongings. Everything was razed down.”

Five days later, in Jahangirpuri, a relatively poor neighborhood in North Delhi with a large Muslim population living among a Hindu majority, there were religious parades to celebrate the Hindu festival of Hanuman Jayanti. The first two were authorized by police and went off without incident. But a third, unapproved procession in the evening took a different route past a mosque where local Muslims were honoring Ramadan.

It was a sweltering night — a heatwave had begun that would become one of India’s hottest in a century. Some members of the crowd were carrying knives, swords, baseball bats and guns, and they chanted and played loud music outside the mosque. Arguments broke out and violence ensued. Stones, bricks and bottles were thrown, and shots were fired. In footage online, fires can be seen raging in the street. Police arrived to control the incident, and eight officers were injured in the chaos, one by a bullet. The head of the BJP in Delhi at the time, who kept a toy bulldozer on his desk, called the local mayor of North Delhi to action: “I request you to act strictly and swiftly and mark the illegal structures of these rioters and use the bulldozer against them.”

Four days later, nine large earthmoving vehicles (several of which were JCB-branded) descended on the street where the mosque stood, accompanied by hundreds of police and paramilitary forces. Without warning, they began destroying homes and businesses. Crowds gathered and some owners tried to protect their properties from the looming, craned buckets and bulldozer blades. Dust clouds filled the air. A news anchor from the TV channel Aaj Tak climbed aboard one of the machines and broadcast from the cab as the operator jolted levers. “You are now watching live,” she said as the grim Ballardian spectacle unfolded, “as the crane destroys an illegal construction.” 

The demolitions only stopped when Brinda Karat, a 77-year-old politician from the Communist Party of India, emerged from the crowd and blocked one of the machines by standing in front of it while waving a physical copy of a Supreme Court order to stop the destruction. According to Amnesty International, at least 25 properties were destroyed that day, of which 23 were Muslim-owned. As before, the authorities insisted that the demolitions had taken place due to the illegality of the structures, but the timing was not lost on the locals. One man interviewed on Aaj Tak remarked: “This [shop] has been around for 50 years. … It suddenly became illegal after the Hindu-Muslim issue?”

Despite its role as the brand-du-jour of the bulldozer raj, JCB doesn’t actually make bulldozers. The company is better known for its backhoe loaders, excavators, telehandlers and forklifts. Yet among the general public in India, “bulldozer” is now a proxy word for any big machine used in these performative demolitions, of which JCBs have become the most visible. 

JCB, or J.C. Bamford Excavators Limited, is a U.K. company founded in 1945 by Joseph Cyril Bamford, who began by making agricultural machines from war surplus materials in a rented garage. Today, the company is among the world’s largest manufacturers of construction machinery. In the U.K., JCB’s iconic yellow machines are associated with construction, of course, but also with children’s toys and books, a whimsical theatrical show known as the “dancing diggers” and even a platinum-selling pop single from the 2000s (“JCB” by Nizlopi). 

The company is now run by Bamford’s son, Lord Anthony Bamford. 2024 was a good year for Bamford and his family, one of the wealthiest in the U.K.: They secured a $389 million payout from JCB — their biggest in nearly a decade — following the company’s 44% surge in profit over the previous year. India played a significant role in that growth; it is JCB’s biggest single market, with six manufacturing units and a network of more than 60 dealers and 700 outlets.

The Bamfords are a colorful industrial dynasty. Lord Bamford, worth around $8.4 billion, flies to and from the JCB headquarters in a white Sikorsky S-76 helicopter. Renowned as both a flamboyant socialite and political kingmaker, he’s among the top donors to the U.K. Conservative Party and is close friends with former Prime Minister Boris Johnson, whose wedding he hosted in 2022. Bamford has also been investigated for hundreds of millions of pounds in tax avoidance. He and his wife, Lady Carole Bamford, live in the Cotswolds, a quaint pastoral utopia for the U.K.’s rich and famous, in a Georgian mansion once owned by a former British governor-general of India. According to a profile of Lady Bamford in W Magazine, the Bamfords’ “extravagant parties have become nearly legendary. For an India-theme party, Bamford had elephants carry guests up the drive. … Almost any sit-down they throw can be a full-blown affair, with liveried footmen and flowing Krug.” 

“In India, the yellow JCB has become a catchall term for earthmoving equipment transformed into weapons of both literal and metaphoric power. For some, they symbolize unspeakable terror; for others, righteous justice.”

The day after the destruction in Jahangirpuri, Lord Bamford happened to be in India with Johnson, who was still the U.K. prime minister. They were inaugurating a new JCB factory in Gujarat, the home state of India’s prime minister, Narendra Modi. At one point, Johnson climbed into the cab of a machine and leaned out to wave at photographers. Neither he nor Bamford made any comments to the press about the violence in Delhi.

In February 2024, Amnesty International published two detailed reports on the human rights violations committed with JCB’s equipment, demanding that the company take action. “We’ve had a wall of silence. No acknowledgment at all of the issues we’ve raised,” Peter Frankental, a program director of economic affairs at Amnesty U.K., told me. “No concern being expressed that their equipment and brand is being used to violate human rights and strike fear into Muslim communities. For how much longer can they continue to escape scrutiny?”

In November 2024, India’s Supreme Court handed down a judgment that it hoped would finally bring an end to years of bulldozer raj: No person’s home could be demolished merely because they were accused or even convicted of an offense. “These legalities will protect people,” Zuberi, who helped advise legal experts on the Supreme Court guidelines, told me. “But I don’t think the cultural or political aspect of all this will be affected. This idea of collective punishment via the built environment has been socially accepted in India. The glorification of the demolition of homes is now mainstream.”

According to an investigation by Indian magazine Frontline, at least 7,407 houses were demolished in state-led eviction drives in 2024 alone, rendering over 41,000 people homeless. Addressing the years of destruction, jurist Kapil Sibal, currently the president of the India Supreme Court Bar Association, wrote: “My home is not just a brick-and-mortar structure. Its masonry and whitewashed walls do not even begin to tell the story. Within its womb lies all that I cherish. It saves me from the heat of the blazing sun, protects me from chilling winter nights, and holds the memories that live with me. The joys of my being are cradled within it. … A home where I am both alone and together [is] essentially a part of my very being. When you allow a bulldozer to wade through it, you don’t just destroy a structure, you destroy the essence of all I am. With it, all of me falls apart.”

For similar reasons, Palestinian poet Mahmoud Darwish once lamented that a home is not simply demolished; it is “murdered.”


“There is no parallel between bombs and bulldozers,” Bill Clinton said when questioned about Israeli-Palestinian tensions in 1997. In some sense, he was correct. The force delivered is incomparable. “Bombings are a cosmic phenomenon,” said French philosopher Paul Virilio, a survivor of WWII. “You don’t feel like a concrete person is doing this to you, it’s more like the apocalypse or a huge storm or the eruption of Vesuvius.” A bomb is a clear and blatant act of violence, but a bulldozer can appear banal and bureaucratic. The violence the machine enacts is slow, rumbling, grinding, drawn out. Not an instantaneous vaporization.

And yet the end result of both acts is largely the same: a home destroyed, a neighborhood flattened; in some cases, bodies beneath rubble. As philosopher and environmentalist Richard Sylvan wrote: “The Bomb and the Bulldozer are out of the same technological Pandora’s Box.” In the aftermath, it doesn’t much matter the cause of the destruction.

Headquartered in Texas, Caterpillar is the world’s largest construction equipment company and the foremost global producer of bulldozers. If you picture a bulldozer in your mind, it’s probably something like a Caterpillar D9. 

The company had a market capitalization of $165 billion as of May. In recent years, Caterpillar has become something of a style brand too. Its robust flip phone is one of the most sought-after in the “dumbphone boom,” and its boots, sneakers and collaborations with fashion designers are regularly featured on influential streetwear websites like Highsnobiety and Hypebeast. 

Caterpillar has been supplying heavy equipment to Israel since the 1950s. The most recognizable is the D9 bulldozer. The D9 is an impressive beast: As tall as a double-decker bus, it weighs around 54 tons, almost as heavy as an M1 Abrams tank. The blade alone weighs as much as an Asian elephant, and it can be augmented with a ripper on the back, a gigantic steel talon that can dig into earth and rock and carve out ditches. 

Once they arrive in Israel, bulldozers destined for the Israel Defense Forces (IDF) often undergo a range of modifications. Thick plates of armor are added, as well as bulletproof glass and additional slats to deflect rocket-propelled grenade rounds. Extra modifications have been known to include crew-operated machine guns, smoke projectors and grenade launchers. These modifications make the IDF’s D9 — nicknamed the “Doobi” (Hebrew for “teddy bear”) — 20 tons heavier, about the same weight as 45 mid-sized cars. Inside the cabin, an isolation system keeps the air conditioned and the noise to a maximum of 77 decibels. In 2024, the IDF became the world’s first army to deploy autonomous unmanned D9s, which have become known as “Pandas.”

“When you allow a bulldozer to wade through it [my home], you don’t just destroy a structure, you destroy the essence of all I am. With it, all of me falls apart.”
—Kapil Sibal

It’s difficult to set eyes on an IDF-modified D9 without traveling to the Palestinian Territories and witnessing them in action. The closest I could get was by ordering a highly realistic 1/35 scale model from China and assembling its 200-plus individual pieces myself. Gradually, I put it together using tweezers, nail files and extra-thin cement, fixing together the armor plates, the ripper, the gunner’s seat and the mounted machine gun. Slowly, a menacing object the size of a kitten took shape in my living room. Next to it, a human would be around the size of my little finger.

The D9 has become central to the actions of the Israeli government in the West Bank and Gaza. “In the hostilities, the omnipresent bulldozers have as much strategic importance as the tanks,” wrote Christian Salmon, a French observer who traveled across Ramallah, Rafah and Gaza in 2002. “Never has such an inoffensive machine struck me as being more of a harbinger of silent violence. … Geography, it is said, determines war. In Palestine it is war that has achieved the upper hand over geography.” 

In the latest war between Israel and Hamas, D9s have been involved in the destruction of homes, hospitals, factories, olive and orange groves, greenhouses, graveyards and archaeological sites. Home and infrastructure demolitions in the West Bank and East Jerusalem using bulldozers have also increased in prevalence. A recent report by The New York Times showed D9s rolling through Tulkarm and Jenin, destroying shops and businesses as well as ripping up roads and water and sewage pipes, blocking emergency vehicles and destroying the trees and shrubs on a decorative roundabout.

In an interview with Breaking the Silence, an Israeli NGO established by IDF veterans, a former officer explained the ruinous and somewhat surreal outcome of mass bulldozing campaigns in Gaza during Operation Protective Edge in 2014: 

I don’t know how they pulled it off, the D9 operators didn’t rest for a second. Nonstop, as if they were playing in a sandbox. Driving back and forth, back and forth, razing another house, another street. And at some point there was no trace left of that street. It was hard to imagine there even used to be a street there at all. It was like a sandbox, everything turned upside down. And they didn’t stop moving. Day and night, 24/7, they went back and forth, gathering up mounds, making embankments, flattening house after house. From time to time they would tell us about terrorists who had been killed. … I remember that the level of destruction looked insane to me. It looked like a movie set, it didn’t look real. Houses with crumbled balconies, animals everywhere, lots of dead chickens and lots of other dead animals. Every house had a hole in the wall or a balcony spilling off of it, no trace left of any streets at all. I knew there used to be a street there once, but there was no trace of it left to see. Everything was sand, sand, sand, piles of sand, piles of construction debris. You go into a house by walking up a sand dune and entering it through a hole in the second floor, and then you leave it through some hole in its basement. It’s a maze of holes and concrete.

Other accounts of bulldozing in Gaza are even more graphic. In October 2024, CNN published an article about Eliran Mizrahi, a 40-year-old father of four and construction manager who took his own life after serving as a bulldozer operator for the IDF during the most recent war. He spent 186 days at work until he was injured when a rocket-propelled grenade struck his vehicle. “He got out of Gaza, but Gaza did not get out of him. And he died after it, because of the post-trauma,” his mother told CNN. “He always said, ‘No one will understand what I saw,’” his sister added. What Mizrahi saw from the seat of his D9 was revealed by his co-operator, Guy Zaken. “We saw very, very, very difficult things,” Zaken told CNN. “Things that are difficult to accept.” In testimony to the Knesset last June, Zaken said that on many occasions, soldiers had to “run over terrorists, dead and alive, in the hundreds.” He explained that, “everything squirts out.”

“The Bomb and the Bulldozer are out of the same technological Pandora’s Box.”
—Richard Sylvan

While armored D9s are the IDF’s bulldozer of choice for military endeavors, other machines — including those manufactured by JCB, Hyundai and Volvo — are used for home demolitions across the Palestinian Territories. According to Jeff Halper, the chair of the Israeli Committee Against House Demolitions, more than 55,000 homes were destroyed in the territories from 1967 to 2021. 

There are two main forms of state-sanctioned home demolition: for “deterring terrorist attacks” and for lack of building permits. The latter is more common. It is extremely difficult for most Palestinians to build a home legally. In July 2023, an IDF official confirmed during a Foreign Affairs and Defense Committee meeting that, on average, more than 90% of Palestinian requests for permits in the West Bank were rejected for administrative issues and sometimes for “political reasons,” while approximately 60-70% of Israeli settler requests for permits there were approved. Desperate for housing, many Palestinians build their homes without proper permits. These buildings, once identified by the authorities, are then subject to demolition. Under military orders, Israeli authorities are able to bulldoze homes within 96 hours of issuing a removal order. Often, due to the heavy penalties incurred for building without a permit, Palestinians tear down their own homes if Israel issues a demolition order.

Over the years, Israel’s Caterpillar bulldozers have drifted in and out of Western media attention. In 2003, Rachel Corrie, a 23-year-old American human rights activist, traveled to Gaza to join a group of volunteers who had agreed to act as human shields to protect Palestinian homes from Israeli demolitions. Within a month or so of arriving, Corrie wrote to her mother to say she was having nightmares about the machines, dreams in which they arrived outside her family home in the U.S. while she and her mother hid inside.

On March 16, 2003, on an overcast spring afternoon, Corrie — dressed in an orange fluorescent jacket and armed with a megaphone — stood atop a mound of dirt in the Rafah refugee camp and faced off against a D9 as it attempted to demolish the home of the Nasrallah family. As the Nasrallah children watched through a crack in their garden wall, the operator drove the D9 over Corrie and then reversed back over her again, crushing her skull, ribs and vertebrae. An IDF report concluded that the operator did not see her. Corrie was rushed to a hospital but succumbed to her injuries and was declared dead that evening. The story of a young American woman being killed in broad daylight by an American-made machine, paid for and shipped to Israel by the American government, made international headlines and started what would become a flurry of lawsuits against bulldozer manufacturers. But few of the cases managed to lay a substantial glove on any of the companies involved or achieve legal remedy or compensation for those affected by the activities of these machines. 

In 2005, Corrie’s parents, along with four Palestinian families whose relatives had been killed by D9s, filed a federal lawsuit in the U.S. against Caterpillar, accusing the company of aiding and abetting war crimes by providing bulldozers to the Israeli military knowing they would be used unlawfully to demolish homes and endanger civilians. Ultimately, the court ruled that the export of Caterpillar bulldozers to Israel as part of the military sales program was a foreign policy decision made by the government, and the court did not have the authority to question it. In response to inquiries from Human Rights Watch before the lawsuit was filed, James Owens, Caterpillar’s CEO at the time, stated that the company did “not have the practical ability or legal right to determine how our products are used after they are sold.” 

Though the company claimed it could not monitor the use of its machines, spying on the families of people killed by the machines turned out to be perfectly feasible. In 2017, an investigation by The Guardian and The Bureau of Investigative Journalism revealed that after the conclusion of the Corrie case, Caterpillar hired the corporate espionage company C2i to spy on the family and report back on any subsequent actions they might be planning.

In March 2011, another lawsuit was filed against Caterpillar, this time against its Swiss subsidiary. The complaint was led by TRIAL International, which was assisting six Palestinian families whose homes had been illegally demolished in the West Bank city of Qalqilya. While the Office of the Attorney General of Switzerland described the Israeli military’s actions as “punitive demolitions,” the prosecutor ultimately closed the case on the grounds that the bulldozers in question were not weapons and that what the IDF did with them was not Caterpillar’s fault.

Criticism of Caterpillar has also come from within. Numerous shareholder proposals have asked the company to review its human rights policies due to the overwhelming evidence of Caterpillar bulldozers being weaponized against Palestinians, to no avail. As Doug Oberhelman, the company’s CEO from 2010 until his retirement in 2017, has said, “How our customers use [the bulldozers] is their business. We can’t stop them.” In 2022 and again in 2023, a nonprofit pension agency for retired Methodist clergy that is also a major Caterpillar shareholder requested that the company permit an independent third-party inquiry into potential human rights violations. Both times, the board voted against the request: “We believe we already deploy the right policies, processes and governance to ensure we make the right decisions about where and how we conduct our business aligned with our values.”

“Never has such an inoffensive machine struck me as being more of a harbinger of silent violence.”
—Christian Salmon

The question of whether a bulldozer can be considered a weapon, Frankental assured me, is fundamentally meaningless: “Whether or not you designed it for that purpose — it has been weaponized,” he said. The more pertinent inquiry lurking beneath all this — one that is broadly relevant in the 21st century as advances in industries like AI, social media and surveillance increasingly complicate our lives — is: How responsible are the manufacturers of a particular technology for the ways in which it is ultimately used? 

For some, the notion that the manufacturer of an earthmoving machine should be held responsible for what operators do with it on the streets of India or in distant wars in the Middle East seems unreasonable. Famously, gun manufacturers in the U.S. assume basically zero accountability for the deaths of people killed by their products. Roi Bachmutsky has heard this kind of argument a lot. Over Zoom, the international human rights lawyer told me: “The problem is, it assumes that these earthmoving companies are just distant manufacturers, far away from where these violations are committed. But they aren’t just manufacturers; most heavy machinery firms provide services to these machines long after they leave their hands. They certify them, provide maintenance, and they can tell where they are at any conceivable moment.”

These modern machines have evolved into what are essentially gigantic computers, equipped to both collect and transmit huge quantities of data. Telematic technology, as it is known, has become an industry standard in the earthmoving world. JCB’s is called LiveLink, and Caterpillar’s is called VisionLink. Almost every piece of Caterpillar or JCB equipment that leaves the factory now contains telematics. A machine can be tracked and monitored 24 hours a day, even when the engine is switched off. Fuel, oil and coolant can be monitored from anywhere in the world, and so too things as minute as the way a particular operator uses the clutch. Machines can be “geofenced” so that alerts go off whenever they enter or leave a predetermined area on a map. And they can be remotely immobilized to prevent any unauthorized use. Data continuously streams from machine to cloud, providing insights for anyone privy to all this information. In a recent interview, Caterpillar’s chief technology officer said: “We have over four million assets actively running around the world today, and 1.4 million of those are connected; connected to us, connected to our customers and our dealers.”

In the spring of 2022, Russian soldiers looted 27 machines worth nearly $5 million from a John Deere dealership in Melitopol, Ukraine, and shipped them 700 miles back to Russia. But when they tried to turn the machines on, they had been remotely “kill-switched” by the dealership. What enabled this remote disabling was a practice known as “VIN-locking,” which manufacturers use to prevent unauthorized repairs to their products, instead requiring a licensed or official company technician to do so. This controversial practice has been at the heart of the “right-to-repair” debate in the U.S. and has resulted in widespread “tractor hacking” by farmers who wish to mend their own equipment. As sci-fi author and tech journalist Cory Doctorow wrote in his analysis of the Melitopol story, “The technology was not invented to thwart Russian looters. … [I]t was invented to thwart American farmers.”

When it suits their purposes, then, the technology exists for heavy machinery companies to monitor, control and even disable products. Violence against Palestinians and Indians apparently doesn’t rise to the level of violating company values, let alone cause enough concern for the company to brick their machines. “They obviously generate all this data for improving their services and manufacturing. Why aren’t they generating it for human rights due diligence as well?” Frankental wondered. “Instead, they’ve taken a ‘see no evil, hear no evil’ approach.”

Bachmutsky painted a hypothetical picture of a knife manufacturing company: “What if that manufacturer knew that it was supplying those knives directly to people inclined to attack someone? Perhaps they knew one of their customers had a history of violent knife attacks, and sold them it anyway. What if they also provided that person with maintenance, like knife-sharpening services, despite that knowledge? What if they also had a technology that allowed them to see exactly where all of their knives are at every conceivable moment? Do you not think it’s starting to look a lot less strange that they would have some responsibility for the end use of those knives?”


In the 1850s, Ralph Waldo Emerson embarked on a lecture tour across the U.S. The lectures were later collected in a book of essays, “The Conduct of Life,” with chapters such as “Power,” “Wealth,” “Beauty” and “Behavior.” In one titled “Fate,” Emerson sought to address “the question of the times” — “How shall I live?” He wrote: “You have just dined, and, however scrupulously the slaughterhouse is concealed in the graceful distance of miles, there is complicity.” 

I remembered that quote on the gray and cold winter morning when I arrived at JCB’s global headquarters in the West Midlands. The enormous factory sat in front of three picturesque, human-made lakes lined with trees and populated with wildfowl introduced by the Bamford family, including mandarin ducks, great crested grebes and pochards. The yellow JCB logo was reflected in the rippling water. Across the road was the JCB Golf and Country Club, a private venue with a professional-level golf course and luxury accommodations. I passed a sinister, spider-like statue, 45 feet high, made from rusted old parts of earthmoving machinery, as well as a helipad where a white Sikorsky was parked. At the factory entrance stood another statue, bronze this time: five muscle-bound men shoveling dirt and carrying it uphill by the sackful. I assumed it was intended to remind visitors of how JCB had revolutionized such slow and laborious tasks. 

As I waited in the large white-marble reception hall for a public tour to begin, I glanced through a book about the history of the company. My eye was drawn to a photo of a chimpanzee driving a JCB and another of a just-married couple being carried away from a church in the mouth of a digger. Then the tour began. Quotes from Lord Bamford lined the stairwells — “Always looking for a better way,” “The power to change our world.” We saw a piece of machinery signed by Margaret Thatcher and a museum filled with early prototypes, advertising posters and a small set of models depicting JCB’s operational fleet of airplanes used to fly clients and management around the world. In the gift shop, JCB-branded USB cables, headlamps and champagne flutes were on sale.

On the factory floor, the tour came alive in a clamor of energetic industrial noise. Above our heads, enormous chassis hung on chains from conveyor belts in the ceiling, and huge hydraulic presses rose and fell. Forklifts glided around on indoor roads as workers in hoods and masks and breathing apparatuses welded, painted and assembled. Inside protective chambers, high-powered laser beams cut through thick metal plates. Something red glowed behind a heavy curtain. 

The tour’s final stop was the production line: a conveyor belt on the floor that moved almost imperceptibly past groups of workers who each had around 20 minutes to do their work. The process began with a raw and disembodied engine. Six hundred feet or so later, I watched as a worker climbed into the cab of a fully assembled backhoe loader and drove it out of the factory. 

There it was, destroyer and creator, molded in a process of bewitching efficiency. And despite everything I’d heard, learned and read over the previous three months of research and reporting, I couldn’t help but marvel at this sublime and spectacular machine.

Editor’s Note: This essay was updated after publication to clarify the percentages and locations of Palestinian and Israeli settler requests for building permits, as well as the case against Caterpillar’s Swiss subsidiary.

A Digital Twin Might Just Save Your Life https://www.noemamag.com/a-digital-twin-might-just-save-your-life Thu, 21 Mar 2024 16:45:11 +0000

On the morning of June 24, 1993, Yale University Professor David Gelernter arrived at his office on the fifth floor of the computer science department. He had just returned from vacation and was carrying a large stack of unopened mail. One book-shaped package was in a plastic ziploc — he thought it looked like a PhD dissertation. As he unzipped it, pungent white smoke poured out, followed by what he later described as a “terrific flash.” He never heard the bang. Shrapnel shot into his eyes, hands and torso, as well as the steel file cabinets around him. A fire ignited and triggered the sprinklers in the ceiling, which began to soak his books and papers.

Gelernter, as he later wrote in his memoir, had been “blown up.” He was the 14th person attacked by a serial killer who was still then at large and known only as the “Unabomber.” From 1978 to 1995, Theodore John Kaczynski, fueled by an ideology that sought to bring about the destruction of modern technological life and a return to primitive ways, murdered three people and injured 23 in a brutal mail-bombing campaign. Gelernter’s lungs and other internal organs were damaged, he lost the vision in his right eye and most of his right hand was destroyed. But he survived. 

Around two years later, another letter from Kaczynski arrived at the computer science department for Gelernter. No bomb this time — just a typewritten message in which Kaczynski explained the attack was provoked by Gelernter’s most recent book: a speculative work of nonfiction titled “Mirror Worlds.”

Back in 1991 when the book was first published, just over 1% of Americans were using the internet. But Gelernter claimed computing was about to revolutionize life on Earth. “This book describes an event that will happen someday soon,” he wrote in the opening line. “You will look into a computer screen and see reality. Some part of your world — the town you live in, the company you work for, your school system, the city hospital — will hang there in a sharp color image, abstract but recognizable, moving subtly in a thousand places.”

In essence, Gelernter believed that every aspect of life could soon be modeled in a parallel digital simulation. Everything happening in our lived reality would be tracked and monitored and fed into software “by a steady rush of new data pouring in through cables” to create a high-fidelity real-time digital representation of the world and all of its pulsing, swarming and sensuous qualities. This would be like Mark Zuckerberg’s metaverse on steroids: our exact world, our very lives, all digital. And you could view, manipulate, experience and interact with this mirror world, like a child with a dollhouse. A dashboard for reality.

These “high-tech voodoo dolls,” as Gelernter described them, “will mark a new era in mankind’s relationship to the man-made world. They change that relationship; for good.” It would be possible, he believed, to not just monitor what was happening around the world, but also to predict what could happen — endless simulations of possible future events would be running inside the mirror world. We could prepare ourselves for any outcome — any future — in the physical world because we would know what was coming. 

“You will look into a computer screen and see reality. Some part of your world … will hang there in a sharp color image, abstract but recognizable, moving subtly in a thousand places.”
—David Gelernter

In 2007, Technology Review called “Mirror Worlds” “one of the most influential books in computer science” and Jaron Lanier, a founding father of virtual reality, once hailed Gelernter as “a treasure in the world of computer science.” Reviewing the book in The New York Times the year it was published, Christopher Lehmann-Haupt wrote that it “tells you how Hamlet’s dream may be fulfilled: ‘I could be bounded in a nutshell, and count myself a king of infinite space.’” 

Gelernter believed mirror worlds might spring to life in the decade after his book was published, but nothing close came to pass. The dream was beyond the available technology. The buzz subsided.

There’s a phrase in the science fiction community known as “steam-engine time”: when lots of people have the same idea independently at the same time. It can be traced back to a line in Charles Fort’s 1931 book “Lo!”: “A tree cannot find out, as it were, how to blossom, until comes blossom-time. A social growth cannot find out the use of steam engines, until comes steam-engine time.”

In the mid-2000s, a manufacturing expert named Michael Grieves started to spitball ways to make factories more efficient. Instead of a manager peering down on the factory floor from above, trying to sense how things were going, Grieves thought there should be an exact virtual replica of every physical nook, cranny, machine, forklift and worker that the manager could analyze on a computer screen. Create an endless stream of data from a network of sensors and cameras that flowed from the real factory to the virtual one, and you’d get an ever-changing real-time representation of its brick-and-mortar counterpart. Anything that changed in the factory would change in the model, instantly: the physical and the virtual locked together in what the French philosopher Jean Baudrillard might have called an “ecstasy of communication.” 

What power this would give a factory manager! Perhaps they wanted to see how a change to one of the production lines would impact the entire operation. Run a simulation to see how it might play out. Should anything go wrong, rewind it back, find out what the hell happened. Heck, the manager wouldn’t even need to be anywhere near the physical factory — they could be off in a beach house on some idyllic island. 
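To make the idea a little more concrete, here is a minimal sketch, in Python, of the kind of loop Grieves was describing: sensor readings stream in from the physical plant, the virtual copy updates its picture of reality, and what-if questions are put to the copy instead of the factory. Everything in it (the sensor feed, the state fields, the projection rule) is invented for illustration and stands in for the far richer physics and data models a real digital twin platform would use.

```python
# A hypothetical, drastically simplified sketch of a factory digital twin:
# readings flow from the physical plant into a virtual state, every update is
# archived so the past can be "rewound," and simulations run on the copy only.
# All names and numbers are invented for illustration.

import random
import time


class FactoryTwin:
    def __init__(self):
        # The twin's current picture of the physical factory.
        self.state = {"conveyor_speed": 1.0, "units_per_hour": 120, "temperature_c": 21.0}
        self.history = []  # archived states, so "what the hell happened" can be replayed

    def ingest(self, reading):
        """Apply one batch of sensor readings to the virtual model, instantly."""
        self.history.append(dict(self.state))
        self.state.update(reading)

    def simulate(self, conveyor_speed, hours=8):
        """Estimate output under a hypothetical change, without touching the real line."""
        scaling = conveyor_speed / self.state["conveyor_speed"]
        return self.state["units_per_hour"] * scaling * hours


def fake_sensor_feed():
    """Stand-in for the real network of sensors and cameras."""
    return {
        "temperature_c": 21.0 + random.uniform(-0.5, 0.5),
        "units_per_hour": 120 + random.randint(-5, 5),
    }


if __name__ == "__main__":
    twin = FactoryTwin()
    for _ in range(3):  # three moments of the incoming data stream
        twin.ingest(fake_sensor_feed())
        time.sleep(0.1)
    # The manager, perhaps on that idyllic island, asks a what-if question:
    print("Projected shift output at 20% higher line speed:",
          round(twin.simulate(conveyor_speed=1.2)))
```

In a real deployment the state would be a physics model with millions of variables and the feed would never stop, but the basic shape, continuous ingestion on one side and consequence-free simulation on the other, is what Grieves and Vickers were proposing.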

Grieves, along with a NASA researcher named John Vickers who had been mulling over a very similar idea, called this a “digital twin.” “Not only the factory manager, but everyone associated with factory production could have that same virtual window to not only a single factory, but to all the factories across the globe,” he wrote in 2014. Here at last was Hamlet’s king of infinite space.

“Create an endless stream of data from a network of sensors and cameras that flowed from the real factory to the virtual one, and you’d get an ever-changing real-time representation of its brick-and-mortar counterpart.”

In the last decade, thanks to advances in AI, the internet of things, machine learning and sensor technologies, the fantasy of digital twins has taken off. BMW has created a digital twin of a production plant in Bavaria. Boeing is using digital twins to design airplanes. The World Economic Forum hailed digital twins as a key technology in the “fourth industrial revolution.” Tech giants like IBM, Nvidia, Amazon and Microsoft are just a few of the big players now providing digital twin capabilities to automotive, energy and infrastructure firms.

The inefficiencies of the physical world, so the sales pitch goes, can be ironed out in a virtual one and then reflected back onto reality. Test virtual planes in virtual wind tunnels, virtual tires on virtual roads. “Risk is removed,” reads a recent Microsoft advertorial in Wired, and “problems can be solved before they happen.”

All of a sudden, Dirk Helbing and Javier Argota Sánchez-Vaquerizo wrote in a 2022 paper, “it has become an attractive idea to create digital twins of everything.” Cars, trains, ships, buildings, airports, farms, power plants, oil fields and entire supply chains are all being cloned into high-fidelity mirror images made of bits and bytes. Attempts are being undertaken to twin beaches, forests, apple orchards, tomato plants, weapons and war zones. As beaches erode, forests grow and bombs explode, so too will their twins, watched closely by technicians for signals to improve outcomes in the real world.

The first city to begin the process of digitally twinning itself was Singapore, which deployed fleets of aircraft, drones and cars armed with lasers to scan the entire city-state from above and on the ground and then combine as much weather, demographic and movement data as possible. That twin will be used by the government to simulate construction projects, the effects of flooding and extreme heat, large-scale emergencies and more. The city is expanding the digital twin’s scope underground too, mapping a vast network of subsurface infrastructure. Tuvalu, a nation of low-lying islands and atolls in the Pacific, has also begun to digitally twin itself in the hope of preserving, at least in virtual reality, what may soon disappear entirely under the rising seas of physical reality. 

Digital twin projects are probably going on in your city or state too. Search YouTube and you’ll find countless experts in sharp attire giving giddy presentations, their slides filled with illustrations of physical objects reduced to luminescent geometric grids reminiscent of the original “Tron.” TED Talks abound — “How ‘Digital Twins’ Could Help Us Predict the Future,” “Digital Twin: Towards Next Generation Virtual World.” 

A skeptic might catch the faint rumble of a hype train and wince as a new multibillion-dollar industry intent on making spectral replicas of things that already exist comes to life. One has to wonder: How accurate can these things truly be? Whom do they serve? What damage to the environment will the necessarily massive amounts of computation do? Are we really interested in having yet more of our lives and shared public goods uploaded into digital realms and controlled by a tiny group of techno-capitalists?

Digital twin technology is still in its infancy, and many of these projects remain at varying stages of completion. But there is a sense that this is all now a matter of time. Reflections of ourselves and our worlds are enticing, beckoning us. Digital twins are already changing how people think across many academic disciplines and professional fields. You see, they aren’t just being made of objects, places and processes — they are being applied to living entities and organisms. And this is where things get a bit weird, because these twins might just save your life. In fact, some of the most scientifically advanced and potentially life-changing projects are coming from the world of healthcare, where there is an ongoing quest to make a digital twin of you. 

“These digital twins might just save your life.”

On the western outskirts of Barcelona, next to a park filled with orange and cypress trees and a pond populated by ducks and geese, sits a Catholic chapel built of pink sandstone in the 1940s. It fell out of use in the 1970s and was eventually deconsecrated and donated to the city. In the early 2000s, the Barcelona Supercomputing Center (BSC) chose it as the unlikely location for the construction of one of the world’s most powerful supercomputers.

The first iteration of the supercomputer, MareNostrum (“our sea” in Latin), was switched on in 2004 and could do over 42 trillion calculations per second. Every few years, it undergoes an upgrade. At its peak, the current one, MareNostrum 4, crunches numbers at around 11 quadrillion calculations per second. It has been used to do the complex math behind exploding stars, map ocean currents and develop potential AIDS vaccines; it provides complex climate forecasts and predicts the flow of lung-damaging dust through the atmosphere. Its nickname is “Spain’s brain.”

The computer is outgrowing the chapel like a plant pressing against the panes of a greenhouse: MareNostrum 5 is next door. In the final stages of testing, it will be able to compute in one hour what takes the current machine a whole year.

Recently, some of MareNostrum’s precious time has been devoted to the creation of digital twins. In 2018, a group of scientists from BSC started a spinoff company called ELEM Biotech. The company’s tagline is “the virtual humans factory,” and its ultimate mission is to create highly advanced digital humans that can be used for medical experiments. The digital humans are mathematical models, Mariano Vázquez, a computational scientist and co-founder of ELEM, told me. It’s all computer code.

“I read in a very good book a phrase that I like a lot: ‘We are looking for the mathematical roots of reality,’” Vázquez said. “We do it with the weather, we do it with supernovas and galaxy formation, we do it with volcanoes. Why not do it with a human being?”

It was January and I had come by train to visit Vázquez at his office in the BSC building, a modern construction that looks like a gigantic internet router and is joined by an elevated walkway to the chapel next door. The walkway had been designed as a subtle optical illusion, getting gradually narrower and smaller as you traversed it. At its end, through a thick security door, we entered the chapel. Its stone vaults and stained glass windows of glowing angels reading from scrolls were a different world entirely from BSC’s white walls, white ceilings, white strip lights and concrete pillars.

From the choir on its upper floor, we gazed down into the nave at MareNostrum. It sat within a square chamber made of glass and steel around half the size of a basketball court, glowing with white light. Inside the chamber were rows and rows of immaculate monolithic black blocks. Green lights blinked and flickered all across it, giving it a sense of aliveness. Thick braids of cables — yellow, red, green, blue and aqua, more than 4,000 of them — ran between racks and down into the floor. “Everything is clean and perfect; everything is well ordered,” said Vázquez. “Neatness is a very important thing.”

MareNostrum 4 (Barcelona Supercomputing Center)

In his late 50s, Vázquez wore a blue hoodie and tan trousers and carried under his arm a laptop with a sticker of a cartoon cat on it. He spoke fluent English with speedy enthusiasm, even when showing me a simulation of his own heart’s possible future — 25 or 30 years from now — riddled with the diseases of aging. “I hope not,” he said with a smile. 

Born and raised in Buenos Aires, Vázquez cut his engineering teeth in aeronautics and has published research on topics ranging from supersonic aircraft to cloud formation to the flow of water around whales. In the mid-2000s, he began to focus increasingly on biomedicine. “Equations are just a way of describing nature,” he said. “Air is a fluid and blood is a fluid, so the same equations that model the air around an aircraft are the ones used to model the blood inside your body.” 

We sat down on red-cushioned seats in the choir, and the dull thrum of the supercomputer’s cooling system reverberated off the stone walls around us as if we were in a cave listening to a waterfall. Vázquez told me the story of ELEM. In the beginning, they wanted to create a highly complex and hyperrealistic computer simulation of the average human heart. That involved doing a lot of physics, from the electromechanics that determine a heartbeat to the fluid dynamics that determine blood flow. After that, they collaborated with a local hospital to gather as much cardiac data as possible, and they took heart scans of everyone who worked at the company. Eventually, they had a highly realistic average model that could be tweaked and adjusted based on personal data to create a whole range of virtual hearts that mirrored the diversity of sizes, shapes, ages and health levels that one might find in a random group of real people. 

Using these disembodied virtual hearts, ELEM has been able to run virtual clinical trials (known as in silico trials) inside the supercomputer that might have traditionally been performed on animals or humans (in vivo).

“There are plenty of pharmaceutical products that are tested on, say, a mouse, then a dog, and then swine, and in that move from dog to swine you suddenly realize it won’t work, and you’ve already wasted millions,” Vázquez said. “And that’s aside from all the ethical problems of animal tests.” He wasn’t exaggerating: It takes around 12-15 years and approximately $2.5 billion to get a single drug to market; 90% of clinical drug trials fail. Researchers at Cruelty Free International estimate that more than 100 million animals — mice, frogs, dogs, rabbits, monkeys, fish, birds — are killed in laboratories around the world every year in search of solutions to human medical problems. When it comes to medicine, research and development and testing are slow, expensive and leave a long trail of blood. “We developed a computational tool for pharmacy companies to test the cardiac safety of their products in human beings — virtual human beings,” he emphasized. “But they are human beings.”

In the virtual world, the ethical and physical restrictions of science are nonexistent, for now at least. As Vázquez said: “There are some experiments you cannot do: You can’t go and measure the pressure around a supernova, and you can’t do certain experiments on people. It’s very difficult to run clinical trials on children, for instance. But virtual children? That’s just a mathematical model.” 


“I always knew cardiology was my thing,” said Jazmín Aguado-Sierra. “I remember playing with my friend when I was 5 or 6 years old, and there were these little apple-like fruits from a tree. I would dissect them and transplant the seeds from one fruit to another. Why? I have no idea whatsoever. But it always attracted my attention, this little thing that was in the middle of everything.”

Aguado-Sierra is a biomedical engineer at ELEM. When I met her, she was wearing oval glasses and enjoying a lollipop. Continuing a long tradition of scientists experimenting on themselves — a tradition littered with both success and horror — she had twinned her own heart. 

ELEM’s new ambition, now that the company has provided medical and drug companies a way to run tests on humans without the financial, time and ethical considerations of traditional trials, is to create precise digital twins of specific people’s hearts right down to the cellular level. Accomplishing this would mean opening up the possibility of a virtual heart accompanying you throughout your life, getting continually updated with whatever personal medical data is available from your doctors and from wearable devices and implants. Any medicines or therapies or devices (like a pacemaker) that you need to receive could be designed around your unique anatomy and then tested in your twin heart to gauge effectiveness and possible side effects. Surgeons, should you need them, could rehearse an operation in this risk-free zone. 

Aguado-Sierra’s digital heart is one of the closest attempts we have toward this ambition. Last year, she went in for MRI scans and an electrocardiogram (ECG) and gathered a trove of blood pressure data from her smartwatch. Then she blended all that with the complex physics ELEM had developed on heart functions. After a month of sorting and testing, she translated everything into a series of equations and then translated that into lines of code she fed to MareNostrum. The supercomputer needed nine hours to do the calculations. The virtual heart the machine produced beat three times.

Images from the digital twin of Jazmín Aguado-Sierra’s heart from the “Virtual Heart” display in the Engineers Gallery at the Science Museum, London. (Science Museum Group)
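For readers curious how such a pipeline hangs together in software, here is a heavily simplified and entirely hypothetical sketch in Python: personal measurements overwrite the values of a generic heart model, and the result is packaged into a batch job for a supercomputer to chew on. None of this is ELEM’s actual code; the parameter names and the "heart_solver" command are invented stand-ins for the real physics.

```python
# A hypothetical sketch of the personalization step described above: data from
# scans and wearables adjusts a generic heart model before the expensive
# simulation runs elsewhere. Parameter names and the solver are placeholders.

from dataclasses import dataclass


@dataclass
class HeartParameters:
    wall_thickness_mm: float   # e.g., from MRI
    qt_interval_ms: float      # e.g., from an ECG
    resting_bp_mmhg: float     # e.g., from wearable-device data


# An "average" heart, of the kind a population model might supply.
GENERIC_MODEL = HeartParameters(
    wall_thickness_mm=10.0, qt_interval_ms=400.0, resting_bp_mmhg=120.0
)


def personalize(generic: HeartParameters, measurements: dict) -> HeartParameters:
    """Overwrite the average model's values with whatever personal data exists."""
    return HeartParameters(
        wall_thickness_mm=measurements.get("wall_thickness_mm", generic.wall_thickness_mm),
        qt_interval_ms=measurements.get("qt_interval_ms", generic.qt_interval_ms),
        resting_bp_mmhg=measurements.get("resting_bp_mmhg", generic.resting_bp_mmhg),
    )


def write_job_script(params: HeartParameters, beats: int = 3) -> str:
    """Produce the text of a batch job a supercomputer scheduler might run.
    ("heart_solver" and its flags are placeholders, not a real program.)"""
    return (
        "#!/bin/bash\n"
        f"heart_solver --wall {params.wall_thickness_mm} "
        f"--qt {params.qt_interval_ms} --bp {params.resting_bp_mmhg} "
        f"--beats {beats}\n"
    )


if __name__ == "__main__":
    my_heart = personalize(
        GENERIC_MODEL, {"qt_interval_ms": 430.0, "resting_bp_mmhg": 112.0}
    )
    print(write_job_script(my_heart))
```

The hard part, the nine hours of number-crunching that produced those three beats, would happen entirely inside the solver, which is precisely the piece this sketch leaves out.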

“What was that like, watching your digital heart beat?”

“I was like: Oh my god, it’s really cool,” she said. “It had a good amount of torsion — the heart doesn’t just beat, it also twists.” She locked her closed fists together and turned them in the air. “It was twisting exactly as it should be. I could see that, anatomically, my heart was good. The other thing I’ve learned, and I’m still learning, is the arrhythmic risk of my heart. It appears that my heart is a bit sensitive to drugs.”

“What kind of drugs?”

“Any drug you take can produce a little prolongation; your ECG expands a little bit. Drugs like antibiotics and antihistamines can have a very strong prolongation that produces arrhythmias. You see, I’m currently pregnant, and I’ve been taking antihistamines that are good for the nausea, but I’ve now observed that they actually produce supraventricular tachycardia in my heart. It’s mostly noticeable at rest or at night. You wake up to your heart racing and going crazy for no reason — that’s drug-induced arrhythmia. How much of that risk is actually dangerous? That’s something we can test and simulate with my digital twin.”

The risk of a drug-induced arrhythmia is relatively low but differs from person to person, depending on the specifics of their body. But most of the medical treatment you will receive throughout your life won’t be based on you or your actual, unique body. Modern medicine tends to rely on a one-size-fits-all approach in which treatment is based on historical data of people who might be a bit like you, but might not be, and might have been in the same situation as you, but also might not have. The mesmerizing variety of human body types, characteristics and abilities is simplified into models of averages. And the data for those averages is gleaned from clinical trials that consist largely of white males. For everyone else, it’s a bit murky. That’s why treatments, diagnoses and medicines that work for other people won’t necessarily work for you. The majority of common medications have small to medium effects, leaving many people untouched but exposing a select few to harmful side effects. Each year, around 1.3 million Americans visit the emergency room due to adverse drug events.

The case of women and heart disease provides a powerful example, but similar stories can be told around age and ethnicity. Around 44% of women in the U.S. are living with some form of heart disease — it is the number-one cause of female deaths, more than all cancers combined. But women’s hearts remain something of a blindspot for modern medicine. For much of the 20th century, heart disease was viewed as a predominantly male affliction. The American Heart Association hosted a conference in the 1960s titled “How Can I Help My Husband Cope with Heart Disease?” The Multiple Risk Factor Intervention Trial — published in 1982, one of the first studies to identify a link between cholesterol and heart disease — consisted of 12,866 men and no women. The Physicians Health Study, completed in 1995, which identified the benefits of aspirin in reducing the risk of a heart attack, consisted of 22,071 men and no women.

Modern trials and public campaigns have aimed to rectify this and the knowledge gap has narrowed, but the historical bias still haunts women who are experiencing heart disease. Today, women are 50% more likely than men to receive a wrong diagnosis for a heart attack. They have worse outcomes than men in heart operations and surgical treatments. And there’s been a long-held assumption in medicine that females are essentially just “smaller versions” of males, and therefore the drugs that work for men will work for women, just in smaller doses. But when it comes to prescription drugs, women are 50-75% more likely to experience an adverse reaction than men.

“As a pregnant woman, when I go to the doctors with nausea, they say: Just take these pills, two every eight hours,” Aguado-Sierra said. “But if I take two every eight hours I will die, because it’s producing this effect on me. I’d rather take candy.” She pointed at the lollipop: mango and chili. “That’s why we need personalized medicine. I think it’s so important to actually make doctors and everyone realize that we need to be more aware of the personal, because we all respond completely differently.”

The dream of a digital twin is to bypass the averages and biases and develop a personalized and predictive form of healthcare that is built around a person’s specific physiology and pathology rather than vaguely representative historical data. “The idea [of the virtual human] starts from the premise that modern medicine isn’t really that modern in scientific terms,” Peter Coveney, a computer scientist and coauthor of “Virtual You” (2023), told me over Zoom. Coveney worked on a project to digitally twin the entire 60,000-mile-long circulatory system of a deceased South Korean woman named Yoon-Sun who had donated her body to science. Cross-sections were taken from her frozen cadaver to help trace the network of vessels, arteries, veins and capillaries. Once the network was mapped, the team created a digital simulation of how her blood flowed by feeding 200,000 lines of code into a supercomputer.

“It’s still debatable in some sense how far it’s a science,” Coveney went on. “A lot of decisions that are taken on how to treat people are based on the past: You look like someone we’ve dealt with in the past or have a similar condition to someone we’ve dealt with in the past, and this is how we treated that patient, so we are going to give you the same thing. That’s better than nothing. But ultimately it falls short of what we need, which is individuals being treated for what they are. We are using your data, not other people’s data, to tell you how you should be treated. That’s quite a compelling vision.”

According to Coveney and his coauthor, Roger Highfield, your digital doppelganger, should one ever become available, wouldn’t just spring into action when you’re ill, with data and diagnoses about what’s wrong and how to fix it. It would also assist in keeping you healthy, predicting the effects of diet and lifestyle the same way meteorologists predict when hurricanes will make landfall.

Their book attempted to capture in a single volume the global enterprise to build virtual humans, detailing the attempts to digitally twin livers, lungs, bones, guts, brains and more. In the final chapter, Coveney and Highfield envision “a lifelong, personalized clone that ages just like you, as it is constantly updated with each measurement, scan or medical examination that you have at the doctor’s, along with behavioral, environmental, genetic and a plethora of other data.” 

“The rise of the digital twin could and should give many people pause for thought about where this technology is taking us,” they conclude. “Some may welcome how digital twins allow them to take more responsibility for their destiny, others may condemn this as unnatural. We should not let the downsides deter us, however. The Janus-like nature of technology has been apparent for more than a million years: ever since we harnessed fire, we knew we could use it to stay warm and cook but also to burn down our neighbors’ houses and fields.”


Models have always been vital tools when it comes to creating representations of physical objects, phenomena, processes and systems to gain a deeper understanding of the world around us. Science is essentially the practice of model-making, and history is filled with replicas and representations, miniatures and prototypes. But all models are simulacra: simplified representations of the real thing. The trick is to make a model close enough to reality to be useful but not so close as to become as complex as the thing you are trying to understand. “What is simple is always false,” wrote the poet Paul Valéry in his 1942 book, “Bad Thoughts and Others.” “What is not is unusable.”

During the 20th century, models gave way to computer simulations, and the physical objects we’d relied on for centuries were usurped by superior and compelling virtual objects. “The molecular model built with balls and sticks gives way to an animated world that can be manipulated at a touch, rotated and flipped,” wrote Sherry Turkle, a sociologist at the Massachusetts Institute of Technology, in the 2009 book “Simulation and Its Discontents.” “The architect’s cardboard model becomes a photorealistic virtual reality that you can ‘fly through.’” During the creation of the thermonuclear bomb in the late 1940s and early 1950s, there were fierce debates about whether the simulated detonations created inside the vacuum tubes of the MANIAC were consequence-free worlds that perfectly replicated nature in all its complexity, or instead paltry simplifications that couldn’t be trusted as legitimate sources of scientific truth.

Regardless, simulation has a spellbinding allure. The historian Peter Galison interviewed a physicist who worked on the first H-bomb and who admitted that he couldn’t bear to look at the hardware of the bomb itself or the real-life explosions it created, but worked rigorously on its computer simulations. “The alternative world of simulation, even in its earliest days, held enough structure on its own to captivate its practitioners,” Galison wrote in a 2011 paper.

Already, there is something of a lie in the etymology of the digital twin. The words “model” and “simulation” both contain an undisguised wink toward that which they are intended to represent. Model comes from the Latin word “modulus,” meaning “measure.” Simulate comes from the Latin “simulo,” meaning “imitate.” But twin comes from the Anglo-Saxon “getwinne,” meaning “double.” It conjures centuries of cultural baggage — of twins as eerily identical, deeply connected and, on occasion, telepathic.

“While models are generally understood as an abstraction of reality, the strong focus on realism and comprehensiveness in the conceptualizations suggest that a Digital Twin aspires to move beyond being an abstraction and instead represent all functionalities of a concrete physical entity, and in some cases even suggest a kind of hyperrealism,” wrote the philosopher Paulan Korenhof and her colleagues in a 2021 paper.

Korenhof has been considering how digital twins of objects, places and processes might change our relationship with physical reality. In the paper, she and her coauthors give the example of a dairy farmer who is able to monitor and control milk production via a digital twin. No longer required to maintain proximity to the dairy or its animals, the farmer’s relationship with the farm is reduced to the occasional “confirmation check” to make sure the twin is functioning correctly. The human is optimized out of the picture. 

Could digital twins become yet another technology that fuels the odd and vague feeling of estrangement we feel from the natural world, from others, from ourselves — from all the things that used to feel close? And what if that estrangement begins to permeate our relationship with our own bodies?

When new technologies become omnipresent in our lives, we often end up relinquishing responsibility for the problems they miraculously solve. Our intimacy with our smartphones has contributed to a sense of memory loss known as “digital amnesia.” Our reliance on GPS apps like Google Maps has contributed to, among other things, a dearth of geographical knowledge. If our health were to be constantly monitored in real time by a twin that informs us of any signs of forthcoming illness or injury, how could that disrupt our interoception — our internal sense of our bodily functions and well-being?

“What is simple is always false. What is not is unusable.”
—Paul Valéry

Perhaps it could have the opposite effect. The people I know who’ve bought smartwatches that keep a constant record of their heart rates and blood pressures haven’t become disembodied by the experience. In fact, they are more obsessed with their bodies than ever before.

Other questions abound. If your digital twin communicates that you are on the brink of heart disease or at high risk of Alzheimer’s, are you already ill? Will you go to work tomorrow? If the digital twin makes predictions about forthcoming ill health, and you successfully make drastic changes in your life that improve the prognosis, how would you ever know if the twin was right? How can we build faith in its predictions? Or should we be more worried about how to build up suspicion?

More generally, how accessible will this technology actually become? It’s not hard to foresee a future in which personalized high-tech healthcare becomes the preserve of the rich. Billionaires could end up with their bodies digitized in supercomputers, an extra tool in the quest to prolong life, while the rest of us make do with the crumbling, overpriced healthcare system we have today.

And then there’s the data. In the U.S., people generally do not own their medical records: In all 50 states, medical providers, not patients, own medical data. In the U.K., the National Health Service recently signed a £330 million (about $419 million) contract with the U.S. spy-tech firm Palantir to build a new data platform, in turn granting it access to patient data and medical records. Those records might become available to health insurance companies, raising the cost of care when you need it. 

When I put some of these questions to Vázquez in the chapel, he conceded that we are at the frontier of this technology and that he didn’t have the answers. “In the very near future, all these sorts of discussions must be brought to the table. We need to integrate other people into the conversations — doctors, patient associations, philosophers and sociologists — to really analyze all the changes this might bring to society. It is very dangerous to leave the fate of the world to engineers.”

Matthias Braun, a professor of ethics and technology at the University of Bonn, who is leading a European Research Council-funded project on the ethics of digital twins, was keen to remind me that, while the potential downsides were abundant, digital twins might also contribute to human flourishing in fascinating ways. “When we talk to people with disabilities about digital twins, they say it would be so cool for them to have a kind of tool that lets them know, for example, when they might be about to experience a very severe phase. For example, dementia often develops in phases. If I could know when I have a bad phase coming, then I could plan to take my drugs then, so that I can still see my family, have a normal conversation, remember who they are — it could be life-changing. It’s like another life, another form of myself.”

But, he added: “It also confronts us with very interesting and fundamental questions about, for example, what does it mean to be human? What does it mean to have a physical body? Would a digital body part feel like a prosthesis or like an extension of the self?” 

Perhaps, he said, we should find a way to limit what they can tell us. There may be things we just don’t want to know, things we can’t un-see, un-hear or vanquish from our minds. Things that change how we feel about ourselves, our time and those around us. To make accurate predictions about the future may just keep us imprisoned by what-ifs, marooned from the present that unfolds around us. 


On the ground floor of the BSC building is an earth sciences department where the staff is working with the European Union and numerous other partners on a project called Destination Earth. It aims to create a digital twin of the whole world. The “full Earth replica,” reads the project’s website, will come online by 2030 and aims to produce “simulations which become indistinguishable from the reality.” As extreme weather becomes more frequent and the climate crisis more pronounced, the digital twin can “forecast these events with even greater accuracy, to predict their impact on the environment, life and property.”

“It’s a sort of crystal ball,” Francisco Doblas Reyes, the director of the department, told me. “But it’s a really expensive crystal ball, because every experiment we’ll be running will be an experiment with the most expensive climate models that can be run right now, producing huge amounts of data — petabytes of data per day, equivalent to the traffic of WhatsApp per day. … That’s why we need the machine like the one we have downstairs. The problem is, even that machine is not big enough for the problem we are dealing with: the whole Earth at an unprecedented resolution. But that’s what society really needs these days. If you want to be better prepared for what climate change is going to throw at us, you need to have the most reliable information you can produce, and that is what we are trying to do.”

When I finally left the BSC, I strode down the wide boulevards and narrow alleys of Gràcia toward Hibernian, a secondhand English-language bookshop, because I had become haunted, after months of researching and talking about digital twins and mirror worlds, by a Jorge Luis Borges short story that I couldn’t place. Borges had a perverse fascination with the reflected image. As a child, he had nightmares about discovering that his face was actually a mask. He was so terrified of his reflection that he feared the polished mahogany of the furniture in his bedroom should he glimpse his likeness, or something much worse. “Mirrors have something monstrous about them,” he wrote in 1940. This terror, as his biographer Edwin Williamson wrote, made him obsessed with “doubles, reproductions, copies, facsimiles, translations — with anything, indeed, that could undermine the uniqueness of an object or a person by dint of repeating it.”

When I got to Hibernian, they had the story I wanted on a shelf near the door: “The Aleph.” I bought it, sat down outside a nearby bar and started reading. In Borges’ tale, an acquaintance of a man (also called Borges) discovers an orb — an “Aleph” as he calls it — in his basement, into which he can peer and see everything that exists all at once. Imagining the man to be insane, Borges follows him into the basement and finds a “small iridescent sphere of almost unbearable brilliance.” Borges peers inside:

I saw the teeming sea; I saw daybreak and nightfall; I saw the multitudes of America; I saw a silvery cobweb in the center of a black pyramid; I saw a splintered labyrinth (it was London); I saw, close up, unending eyes watching themselves in me as in a mirror; I saw all the mirrors on earth and none of them reflected me; I saw in a backyard of Soler Street the same tiles that thirty years before I’d seen in the entrance of a house in Fray Bentos; I saw bunches of grapes, snow, tobacco, lodes of metal, steam; I saw convex equatorial deserts and each one of their grains of sand; I saw a woman in Inverness whom I shall never forget; I saw her tangled hair, her tall figure, I saw the cancer in her breast; I saw a ring of baked mud in a sidewalk, where before there had been a tree; … I saw the circulation of my own dark blood; I saw the coupling of love and the modification of death; I saw the Aleph from every point and angle, and in the Aleph I saw the earth and in the earth the Aleph and in the Aleph the earth; I saw my own face and my own bowels; I saw your face; and I felt dizzy and wept, for my eyes had seen that secret and conjectured object whose name is common to all men but which no man has looked upon — the unimaginable universe.

I felt infinite wonder, infinite pity. 

“Feeling pretty cockeyed, are you, after so much spying into places where you have no business?” said a hated and jovial voice. “Even if you were to rack your brains, you couldn’t pay me back in a hundred years for this revelation. One hell of an observatory, eh, Borges?”

The Secret History And Strange Future Of Charisma https://www.noemamag.com/the-secret-history-and-strange-future-of-charisma Wed, 24 May 2023 14:13:00 +0000

In 1929, one of Germany’s national newspapers ran a picture story featuring globally influential people who, the headline proclaimed, “have become legends.” It included the former U.S. President Woodrow Wilson, the Russian revolutionary Vladimir Lenin and India’s anti-colonialist leader Mahatma Gandhi. Alongside them was a picture of a long-since-forgotten German poet. His name was Stefan George, but to those under his influence he was known as “Master.”

George was 61 years old that year, had no fixed abode and very little was known of his personal life and past. But that didn’t matter to his followers; to them he was something more than human: “a cosmic ego,” “a mind brooding upon its own being.” Against the backdrop of Weimar Germany — traumatized by postwar humiliation and the collapse of faith in traditional political and cultural institutions — George preached an alternate reality through books of poetry. His words swam in oceans of irrationalism: of pagan gods, ancient destinies and a “spiritual empire” he called “Secret Germany” bubbling beneath the surface of normal life. In essence, George dreamed of that terribly persistent political fantasy: a future inspired by the past. He wanted to make Germany great again.

George dazzled Germans on all sides of the political spectrum (although many, with regret, would later distance themselves). Walter Benjamin loitered for hours around the parks of Heidelberg that he knew the poet frequented, hoping to catch sight of him. “I am converting to Stefan George,” wrote a young Bertolt Brecht in his diary. The economist Kurt Singer declared in a letter to the philosopher Martin Buber: “No man today embodies the divine more purely and creatively than George.”

Max Weber, one of the founding fathers of sociology, met Stefan George in 1910 and immediately became curious. He didn’t buy George’s message — he felt he served “other gods” — but was fascinated by the bizarre hold he seemed to have over his followers. At a conference in Frankfurt, he described the “cult” that was growing around him as a “modern religious sect” that was united by what he described as “artistic world feelings.” In June that year, he wrote a letter to one of his students in which he described George as having “the traits of true greatness with others that almost verge on the grotesque,” and rekindled a particularly rare word to capture what he was witnessing: charisma.

At the time, charisma was an obscure religious concept used mostly in the depths of Christian theology. It had featured almost 2,000 years earlier in the New Testament writings of Paul to describe figures like Jesus and Moses who’d been imbued with God’s power or grace. Paul had borrowed it from the Ancient Greek word “charis,” which more generally denoted someone blessed with the gift of grace. Weber thought charisma shouldn’t be restricted to the early days of Christianity, but rather was a concept that explained a far wider social phenomenon, and he would use it more than a thousand times in his writings. He saw charisma echoing throughout culture and politics, past and present, and especially loudly in the life of Stefan George.

“I knew: This man is doing me violence — but I was no longer strong enough. I kissed the hand he offered and with choking voice uttered: ‘Master, what shall I do?’”
— Ernst Glöckner

It certainly helped that George was striking to look at: eerily tall with pale blueish-white skin and a strong, bony face. His sunken eyes held deep blue irises and his hair, a big white mop, was always combed backward. He often dressed in long priest-like frock coats, and not one photo ever shows him smiling. At dimly lit and exclusive readings, he recited his poems in a chant-like style with a deep and commanding voice. He despised the democracy of Weimar Germany, cursed the rationality and soullessness of modernity and blamed capitalism for the destruction of social and private life. Instead, years before Adolf Hitler and the Nazis came to power, he foresaw a violent reckoning that would result in the rise of a messianic “fuhrer” and a “new reich.”

Many were immediately entranced by George, others unnerved. As the Notre Dame historian Robert Norton described in his book “Secret Germany,” Ernst Bertram was left haunted by their meeting — “a werewolf!” he wrote. Bertram’s partner, Ernst Glöckner, on the other hand, described his first encounter with George as “terrible, indescribable, blissful, vile … with many fine shivers of happiness, with as many glances into an infinite abyss.” Reflecting on how he was overcome by George’s force of personality, Glöckner wrote: “I knew: This man is doing me violence — but I was no longer strong enough. I kissed the hand he offered and with choking voice uttered: ‘Master, what shall I do?’”

As German democracy began to crumble under the pressure of rebellions and hyperinflation, George’s prophecy increased in potency. He became a craze among the educated youth, and a select few were chosen to join his inner circle of “disciples.” The George-Kreis or George Circle, as it came to be known, included eminent writers, poets and historians like Friedrich Gundolf, Ernst Kantorowicz, Max Kommerell, Ernst Morwitz and Friedrich Wolters; aristocrats like brothers Berthold, Alexander and Claus von Stauffenberg; and the pharmaceutical tycoon Robert Boehringer. These were some of the country’s most intellectually gifted young men. They were always young men, and attractive too — partly due to George’s misogynistic views, his homosexuality and his valorization of the male-bonding culture of Ancient Greece. 

Between 1916 and 1934, the George Circle published 18 books, many of which became national bestsellers. Most of them were carefully selected historical biographies of Germanic figures like Kaiser Frederick II, Goethe, Nietzsche and Leibniz, as well as others that George believed were part of the same spiritual empire: Shakespeare, Napoleon and Caesar. The books ditched the usual objectivity of historical biographies of the era in favor of scintillating depictions and ideological mythmaking. Their not-so-secret intention was to sculpt the future by peddling a revision of Germany’s history as one in which salvation and meaning were delivered to the people by the actions of heroic individuals.

In 1928, George published his final book of poetry, “Das Neue Reich” (“The New Reich”), and its vision established him as a kind of oracle for the German far right. Hitler and Heinrich Himmler pored over George Circle books, and Hermann Göring gave one as a present to Benito Mussolini. At book burnings, George’s work was cited as an example of literature worth holding onto; there was even talk of making him a poet laureate.

“Their not-so-secret intention was to sculpt the future by peddling a revision of Germany’s history as one in which salvation and meaning were delivered to the people by the actions of heroic individuals.”

Weber had died in 1920, before George truly reached the height of his powers (and before the wave of totalitarian dictatorships that would define much of the century), but he’d already seen enough to fatten his theory of charisma. At times of crisis, confusion and complexity, Weber thought, our faith in traditional and rational institutions collapses and we look for salvation and redemption in the irrational allure of certain individuals. These individuals break from the ordinary and challenge existing norms and values. Followers of charismatic figures come to view them as “extraordinary,” “superhuman” or even “supernatural” and thrust them to positions of power on a passionate wave of emotion. 

In Weber’s mind, this kind of charismatic power wasn’t just evidenced by accounts of history — of religions and societies formed around prophets, saints, shamans, war heroes, revolutionaries and radicals. It was also echoed in the very stories we tell ourselves — in the tales of mythical heroes like Achilles and Cú Chulainn. 

These charismatic explosions were usually short-lived and unstable — “every hour of its existence brings it nearer to this end,” wrote Weber — but the most potent ones could build worlds and leave behind a legacy of new traditions and values that then became enshrined in more traditional structures of power. In essence, Weber believed, all forms of power started and ended with charisma; it drove the volcanic eruptions of social upheaval. In this theory, he felt he’d uncovered “the creative revolutionary force” of history. 

Weber was not the first to think like this. Similar ideas had been floating around at least as far back as the mid-1700s, when the Scottish philosopher David Hume had written that in the battle between reason and passion, the latter would always win. And it murmured in the 1800s in Thomas Carlyle’s “Great Man Theory” and in Nietzsche’s idea of the “Übermensch.” But none would have quite the global impact of Weber, whose work on charisma would set it on a trajectory to leap the fence of religious studies and become one of the most overused yet least understood words in the English language.


Come the spring of 1968, the New York Times columnist Russell Baker was declaring that “the big thing in politics these days is charisma, pronounced ‘karizma,’” and that all the Kennedys had it. Since then, charisma has been used to explain everything from Marilyn Monroe to anticolonial uprisings, New Age gurus and corporate CEOs. When the Sunni jihadist preacher Anwar al-Awlaki — whose YouTube videos were linked to numerous terrorist attacks around the world — was executed by drone strike by the Obama administration in 2011, some observers suggested that his main threat had been his “charismatic character.”

Today, a Google Ngram of its usage in American English shows it to be still on a steep upward trend. And not just in American English: Charisma has migrated to Chinese in its Western pronunciation, to Japanese as “karisuma” and to Spanish, French and Italian as “carisma,” “charisme” and “carisma” respectively. The wholesale migration of the word in exact or close to its original form suggests that no equivalent previously existed in those languages to express its magnetic and mysterious quality. On TikTok, charisma has become a viral term; shortened to “rizz” or “unspoken rizz,” it refers to a person’s wordless ability to seduce a love interest with body gestures and facial expressions alone. The hashtag #rizz has over 13 billion views. 

A word survives and thrives because it continues to quench an explanatory thirst; it meets a need or desire. And any word carefully examined will reveal itself to be a wormhole — an ongoing exchange between the past and the present. The prevalence of charisma implies a widespread belief in the power of it, and also in the ability of extraordinary individuals to change history. Weber’s terms still echo: Something magical and dangerous, something unfathomable, is afoot when charisma is present. “The pertinent question,” pondered the cultural theorist John Potts, “is not whether charisma actually exists, but why it exists.”

Most of us will have experienced the allure of a charismatic individual in our lives. Few have experienced the feeling of being charismatic, where your desires, beliefs and actions are having a disproportionately powerful influence on those around you. But when people try to break down how it feels to experience it, they veer into cryptic comparisons. “When she [Elizabeth Holmes] speaks to you, she makes you feel like you are the most important person in her world in that moment,” Tyler Shultz, a whistleblower who worked at Theranos, told CBS News. “She almost has this reality distortion field around her that people can just get sucked into.” 

About a meeting with Leo Tolstoy, Maxim Gorky wrote: “I can not express in words what I felt rather than thought at that moment; in my soul there was joy and fear, and then everything blended in one happy thought: ‘I am not an orphan on the earth, so long as this man lives on it.’” Reflecting on her rare experiences of charisma across 25 years of interviewing notable figures, the newspaper columnist Maggie Alderson wrote: “I still don’t understand what creates the effect. … If not fame, beauty, power, wealth and glory then what? It must be innate. I find that quite thrilling.”

“Something magical and dangerous, something unfathomable, is afoot when charisma is present.”

It certainly seems to be a subjective and circumstantial spell: a “prophet” to some is a “werewolf” to others. Not all young men and boys are drawn toward the “charisma” of the misogynistic influencer Andrew Tate; not all financiers and experts who encountered Holmes and Theranos were convinced to invest in a technology that turned out to not exist. “We tend to think of charisma in a sinister register — a kind of regressive thing, where people are being affirmed in their prejudices,” the University of Chicago anthropologist William Mazzarella explained to me. “Yielding is the problem from this point of view. It’s viewed as submitting to domination, being taken for a ride and not being the master of your own destiny. But then there’s also the sense of yielding as being selfless and participating in something greater than yourself. It’s the thing that allows us to be our most magnificent as human beings.”

As Mazzarella reminded me, people also use charisma to talk about the most admired and inspiring figures in their lives and the charismatic teachers they’ve had. “There the implication is that this person helped me to become myself or transcend myself in a way that I wouldn’t otherwise have been able to do,” he said. “That’s what’s interesting about charisma: It touches the darkest fundamentals of human impulses while having the capacity to point to our highest potentials. Charisma has these two faces, and it’s the fact that we seem to not be able to have one without the other that is so uncanny and disturbing. Inspiring charismatic figures can become exploitative, manipulative or violent. Violence gives way to liberation, or liberation gives way to violence. The problem is not just that we have a hard time telling the good charisma from the bad charisma, but that one has a way of flipping into the other.”

Weber believed that whether we thought of ourselves as explicitly religious or not, humans had a fundamental need for mysticism. As the modern world was becoming increasingly secular, industrialized and rationalized — in his now famous term, “disenchanted” — and more faith was placed in a demystified scientific worldview rather than in gods or shamans, the irrational and mystical appeal of charismatic power wouldn’t just fade away; we would crave it even more. 

This is perhaps most evident in our political realm, where a longing for charisma prevails, and a lack of it is frequently commented on. In the U.K.’s left-leaning newspaper The Guardian this year, Andy Beckett bemoaned the Labour leader Keir Starmer’s lack of “messianic qualities” — unlike Tony Blair, he wrote, “Starmer can’t use personal charisma.” Meanwhile, in America’s conservative magazine National Review, Nate Hochman wrote that while Ron DeSantis might be focused and competent, Donald Trump “beats him in raw charisma.” In fact, wrote the American historian David Bell, “Trump’s base [is] tied to him by one of the most remarkable charismatic relationships in American history.” Last month, Vanity Fair reported a theory that Tucker Carlson’s departure from Fox News was linked to Rupert Murdoch’s distaste for Carlson’s “messianism” and Murdoch’s ex-fiancée’s belief that Carlson was “a messenger from God.”

“I’m convinced that the way we frame political discussions has far more of an impact on politics than we realize,” explained Tom Wright, a cultural historian at the University of Sussex. “If one of the terms of debate is that some people have a gift and others don’t, then that conditions the way we reflect on the political process and the kind of leadership we want, the kind of disruption that’s possible, the kind of people that can and don’t enter politics.” A good example of this was a 2007 campaign slogan for Gordon Brown in the U.K.: “Not flash, just Gordon.” The goal was to communicate that his blatant lack of charisma shouldn’t detract from his trustworthy competence as a political leader. Brown would go on to lose his first and only general election to the charismatic David Cameron.

“That’s what’s interesting about charisma: It touches the darkest fundamentals of human impulses while having the capacity to point to our highest potentials.”
— William Mazzarella

A scientifically sound or generally agreed-upon definition of charisma remains elusive even after all these years of investigation. Across sociology, anthropology, psychology, political science, history and theater studies, academics have wrestled with how exactly to explain, refine and apply it, as well as identify where it is located: in the powerful traits of a leader, in the susceptible mind of a follower, or perhaps somewhere between the two, like a magnetic field.

The Cambridge Dictionary defines charisma as “a special power that some people have naturally,” but this association of charisma with individual power has been criticized as just another tedious expression of the Great Man Theory, one that overlooks much interconnected complexity. In her book, “Charisma and the Fictions of Black Leadership,” Erica Edwards argued that this view has “privileged charismatic leaders, from Frederick Douglass to Martin Luther King Jr., over the arduous, undocumented efforts of ordinary women, men and children to remake their social reality.” This uncritical faith in charisma as a motor of history, she wrote, “ignores its limits as a model for social movements while showing us just how powerful a narrative force it is.”

As Wright explained to me, Weber himself would disagree with the individualized modern understanding of charisma. “He was actually using it in a far more sophisticated way,” he said. “It wasn’t about the power of the individual — it was about the reflection of that power by the audience, about whether they receive it. He saw it as a process of interaction. And he was as fascinated by crowds as he was by individuals.” In Weber’s words: “What is alone important is how the [charismatic] individual is actually regarded by those subject to charismatic authority, by his ‘followers’ or ‘disciples.’ … It is recognition on the part of those subject to authority which is decisive for the validity of charisma.”

Charisma, then, like love or beauty, may be in the eye of the beholder: intoxicating love and belief, enacted on a mass scale, during particular historical circumstances. Along these lines, the late American political scientist Cedric Robinson believed charisma to be a “psychosocial force” that symbolized the ultimate power of the people: the expression of the masses being focused into one chosen individual. Such an individual, he argued, is totally subordinate in the relationship: They must enact the will of the people or their charismatic appeal will vanish. “It is, in truth, the charismatic figure who has been selected by social circumstance, psychodynamic peculiarities and tradition, and not his followers by him.”

Charisma, he wrote, “becomes the most pure form of a people’s authority over themselves.” The charismatic leader, for better or worse, could be understood as a mere mirror or a charming marionette — the “collective projection of the charismatic mass, a projection out of its anguish, its myths, its visions, its history and its culture, in short its tradition and its oppression.” The reason they seem to read the minds of their followers is because they are the chosen embodiment of the group mind. In the leader they see themselves. 

As the Dutch socialist Pieter Jelles Troelstra once wrote, “At some point during my speeches, there often came a moment when I wondered who is speaking now, they or myself?” 


“I’d pretty much adamantly say that most of the research done [on charisma] until the last 10 years has been utterly useless,” said John Antonakis, a professor of organizational behavior at the University of Lausanne. “It’s just been extremely hypothetical — not really putting our fingers on what it is and then not being able to define it in a way where we can experimentally manipulate the behavior and do real scientific field experiments.” Antonakis isn’t a sociologist, historian or cultural theorist; he’s a psychologist and scholar of leadership with a background in math and statistics. He doesn’t believe charisma is a slippery concept at all. “I focus on what I believe to be the core elements of charisma,” he told me. “How one speaks, what one says and how one says it.”  

For over a decade, Antonakis has been experimenting with ways to break charisma down into its composite parts, therefore making it measurable and teachable. He believes it can be the great leveler in a world obsessed with physical appearance. His resulting definition is that charisma is “values-based, symbolic and emotion-laden leader signaling.” 

Along with a team of researchers, he boiled it all down to 12 “charismatic leadership tactics,” or CLTs for short. The CLTs include nine verbal techniques — like the use of metaphors, anecdotes, contrasts and rhetorical questions — as well as three nonverbal ones like facial expressions and gestures. Anyone trained in these CLTs, he said, can become more “influential, trustworthy and leaderlike in the eyes of others.” He and his team developed an artificial intelligence algorithm, which they trained on almost 100 TED talks, that can identify the charismatic quality of speeches. The algorithm is called “Deep Charisma” but Antonakis calls it his “charismometer.” 

In one experiment, they used the algorithm to show that a higher prevalence of CLTs in a TED talk correlated with higher YouTube views and higher ratings of inspiration reported by test subjects. Charisma, in other words, can equate to internet virality. We shared screens on our Zoom call and he opened up Deep Charisma for me to see. “Think of a famous speech and let’s put it in the machine,” he said. Having just started Malcolm X’s autobiography, I asked him to put in X’s 1964 speech “The Ballot or the Bullet.” 
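As a rough illustration of what “prevalence of CLTs” means in practice, here is a toy sketch of how a transcript might be scored by counting proxies for a few of the verbal tactics Antonakis lists. It is not the Deep Charisma model; the marker lists, the pattern for a “list of three” and the per-sentence normalization are all invented for the example.

```python
import re

# Invented proxies for three verbal tactics: contrasts, rhetorical
# questions and three-part lists. The real Deep Charisma system is a
# trained neural model, not a keyword counter.
CONTRAST_MARKERS = ["not ", "but ", "instead of", "rather than"]
LIST_OF_THREE = re.compile(r"\b\w+, \w+ and \w+\b")

def crude_clt_score(transcript: str) -> dict:
    """Count simple proxies for a few charismatic leadership tactics."""
    text = transcript.lower()
    sentences = [s for s in re.split(r"[.!?]", text) if s.strip()]
    n = max(len(sentences), 1)
    return {
        "rhetorical_questions_per_sentence": transcript.count("?") / n,
        "contrasts_per_sentence": sum(text.count(m) for m in CONTRAST_MARKERS) / n,
        "lists_of_three_per_sentence": len(LIST_OF_THREE.findall(text)) / n,
    }

print(crude_clt_score(
    "We do not choose this path because it is easy, but because it is right. "
    "Will we falter? No. We will work, build and endure."
))
```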

“Harder to explain is the allure of unconventional individuals who can draw us in against all rationality for a myriad of complex reasons, subconscious desires and historical circumstances.”

“Ah, this is already fantastic,” said Antonakis. “It’s already a metaphor.” He pasted a transcript into Deep Charisma and numbers began to fill the screen. Within a minute, we had a final score: 350. “That’s a very high score,” he told me. “You cannot fool my charismometer.”

Out of interest, I asked him if we could try putting in a speech created by ChatGPT, so he asked it to write one in the style of Winston Churchill. “My fellow citizens,” it began, “we stand here today at a pivotal moment in history.” He pasted the finished speech into Deep Charisma, and it began to analyze. “So now we’re using one artificial network to generate a speech and another one to code it for charisma,” he said. It conjured a calculation. “Oh shit,” he said, “it’s very good.” I asked him to test it again, but this time by asking ChatGPT to just write a “charismatic speech.” I wanted to see if it actually could determine what charisma was rather than simply emulating the style of a known charismatic speaker. A score appeared. “Yeah, this is average,” said Antonakis.

I admired his scientific formulation of charisma and the possibility of democratizing something that was previously thought to be innate and ineffable. But I couldn’t help but feel that to make charisma measurable, he’d had to redefine it, and in that process something integral to the phenomenon had been lost. Deep Charisma can identify the persuasive and uplifting habits of gifted orators and the characteristics of rousing speeches, but perhaps harder to explain is the allure of unconventional individuals who can draw us in against all rationality for a myriad of complex reasons, subconscious desires and historical circumstances.  

I thought about John de Ruiter, the shoemaker from rural Canada who started a religious movement in the 1980s that became a multimillion-dollar spiritual organization with thousands of followers. De Ruiter, who was recently charged with sexually assaulting several women, developed a charisma not through what he said or how he said it, but through what he didn’t say: His sermons were just long periods of complete silence, during which he stared at his followers for hours. Or the fact that Trump’s speeches, when read as transcripts, are often rambling and incoherent rather than great works of rhetoric and metaphor. The CLTs don’t seem to touch the deeper mystique. Deep Charisma, Antonakis told me, rated Trump as distinctly average. “He’s not that charismatic,” he said. Millions of Americans, I think, would disagree.

“Human relationships with technology have always been implicitly spiritual.”

The CLTs are adept at pushing human buttons that will make us feel engaged, inspired and impressed. For that reason, it’s no surprise that Antonakis’ work has been picked up by researchers working on artificial intelligence. Last December, a group of computer scientists published a paper titled “Computational Charisma — A Brick by Brick Blueprint for Building Charismatic Artificial Intelligence.” The abstract for the paper begins: “Charisma is considered as one’s ability to attract and potentially influence others. Clearly, there can be considerable interest from an artificial intelligence’s perspective to provide it with such skill,” before concluding with the provocation: “Will tomorrow’s influencers be artificial?” 

Björn Schuller, a professor of artificial intelligence at Imperial College London and the lead author of the paper, told me the most exciting avenue of this research is the voice. “We’re a long way from seeing and accepting visually rendered agents, but we don’t have those issues with the voice anymore,” he said. “We can render a voice from a few seconds of your voice and make a piece of audio that sounds just like you. So if people are just interacting with a voice interface, we’re less worried about the uncanny valley.” The aim is to create a charming and persuasive AI entity you could call up and converse with. “If you have a virtual doctor or mental health therapist, then a charismatic one would probably reach you better,” explained Schuller. “In other words, in human-computer interaction, it gives AI a huge leap forward in terms of acceptance and … I wouldn’t want to say obedience, but …”

Once an AI perfects this form of charisma through endless reinforcement and imitation learning, Schuller believes it could become far better at it than humans. “We lose our charisma now and then, because we have our temperament and only so much effort is available,” he said. “But an AI would have no problem controlling expression, tone of voice and linguistics all at the same time. Add that to the fact it’s constantly learning about your likes and dislikes.”

“At some point,” he concluded, “once the AI has established new approaches and achieved success with it, it might become charismatic in ways that humans haven’t even thought about. We might end up picking up charismatic behavior that has originated from an AI.”


The Eurocentric version of how Weber conceptualized charisma is that he took it from Christianity and transformed it into a theory for understanding Western culture and politics. In truth, it was also founded on numerous non-Western spiritual concepts that he’d discovered via the anthropological works of his day. In one of the less-quoted paragraphs of his 1920 book “The Sociology of Religion,” Weber wrote that his nascent formulation of charisma was inspired by mana (Polynesian), maga (Zoroastrian, and from which we get our word magic) and orenda (Native American). “In this moment,” Wright wrote in a research paper exploring this particular passage, “we see our modern political vocabulary taking shape before our eyes.”

Native American beliefs were of particular interest to Weber. On his only visit to America in 1904, he turned down an invitation from Theodore Roosevelt to visit the White House and headed to the Oklahoma plains in search of what remained of Indigenous communities there. Orenda is an Iroquois term for a spiritual energy that flows through everything in varying degrees of potency. Like charisma, possessors of orenda are said to be able to channel it to exert their will. “A shaman,” wrote the Native American scholar J.N.B. Hewitt, “is one whose orenda is great.” But unlike the Western use of charisma, orenda was said to be accessible to everything, animate and inanimate, from humans to animals and trees to stones. Even the weather could be said to have orenda. “A brewing storm,” wrote Hewitt, is said to be “preparing its orenda.” 

This diffuse element of orenda — the idea that it could be imbued in anything at all — has prefigured a more recent evolution in the Western conceptualization of charisma: that it is more than human. Archaeologists have begun to apply it to the powerful and active social role that certain objects have played throughout history. In environmentalism, Jamie Lorimer of Oxford University has written that charismatic species like lions and elephants “dominate the mediascapes that frame popular sensibilities toward wildlife” and feature “disproportionately in the databases and designations that perform conservation.” 

Compelling explorations of nonhuman charisma have also come from research on modern technology. Human relationships with technology have always been implicitly spiritual. In the 18th century, clockmakers became a metaphor for God and clockwork for the universe. Airplanes were described as “winged gospels.” The original iPhone was heralded, both seriously and mockingly, as “the Jesus phone.” As each new popular technology paints its own vision of a better world, we seek in these objects a sort of redemption, salvation or transcendence. Some deliver miracles, some just appear to, and others fail catastrophically. 

Today, something we view as exciting, terrifying and revolutionary, and have endowed with the ability to know our deepest beliefs, prejudices and desires, is not a populist politician, an internet influencer or a religious leader. It’s an algorithm. 

“The idea that charisma could be imbued in anything at all has prefigured a more recent evolution in the Western conceptualization of charisma: that it is more than human.”

These technologies now have the power to act in the world, to know things and to make things happen. In many instances, their impact is mundane: They arrange news feeds, suggest clothes to buy and calculate credit scores. But as we interact more and more with them on an increasingly intimate level, in the way we would ordinarily with other humans, we develop the capacity to form charismatic bonds. 

It’s now fairly commonplace for someone to remark that they “feel seen” by algorithms and chatbots. In a 2022 study of people who had formed deep and long-term friendships with the AI-powered program Replika, participants reported that they viewed it as “a part of themselves or as a mirror.” On apps like TikTok, more than any other social media platform, the user experience is almost entirely driven by an intimate relationship with the algorithm. Users are fed a stream of videos not from friends or chosen creators, but mostly from accounts they don’t follow and haven’t interacted with. The algorithm wants users to spend more time on the platform, and so through a series of computational procedures, it draws them down a rabbit hole built from mathematical inferences of their passions and desires.

Like crowds who feel a charismatic leader somehow understands their individual anguish and aspirations, many users of TikTok experience a computational process as akin to mind-reading. People speak of eerie revelations in which the curation of videos in their personal feed has triggered them to reconsider their sexuality (“TikTok’s algorithms knew I was bi before I did. I’m not the only one.”), radicalize their politics (“From transphobia to Ted Kaczynski: How TikTok’s algorithm enables far-right self-radicalization”), or reassess their mental health (“How do I go about bringing this up to my doctor? Because I feel like TikTok says I have ADHD will be laughed at.”)

Users are drawn to the algorithm on an emotional level, wrote Holly Avella, a professor of communication at Rutgers University, not because its gaze is genuinely insightful but because the impression of feeling seen is intoxicating. This, she wrote, works to create “cult-like” beliefs “about algorithms’ access to the unconscious self. … A sort of metaphysical understanding.”

The inability to understand quite how sophisticated algorithms exert their will on us (largely because such information is intentionally clouded), while nonetheless perceiving their power, enables them to become an authority in our lives. As the psychologist Donald McIntosh explained almost half a century ago, “The outstanding quality of charisma is its enormous power, resting on the intensity and strength of the forces which lie unconscious in every human psyche. … The ability to tap these forces lies behind everything that is creative and constructive in human action, but also behind the terrible destructiveness of which humans are capable. … In the social and political realm, there is no power to match that of the leader who is able to evoke and harness the unconscious resources of his followers.”

In an increasingly complex and divided society, in which partisanship has hindered the prospect of cooperation on everything from human rights to the climate crisis, the thirst for a charismatic leader or artificial intelligence that can move the masses in one direction is as seductive as it has ever been. But whether such a charismatic phenomenon would lead to good or bad, liberation or violence, salvation or destruction, is a conundrum that remains at the core of this two-faced phenomenon. “The false Messiah is as old as the hope for the true Messiah,” wrote Franz Rosenzweig. “He is the changing form of this changeless hope.”


By 1933, Hitler had risen to power and the violent and bloody cataclysm Stefan George had beckoned was alive on the streets. His dream of a Secret Germany that would rise to the surface and destroy the old order was afoot. And yet he stayed remarkably quiet and ambiguous. He took a long vacation to Switzerland, which some described as voluntary exile, and died there without ever explicitly revealing whether or not he supported the Nazi Party. At his funeral, younger followers were seen giving salutes, much to the horror of his Jewish followers. Walter Benjamin, now a critic of George, had fled to the Spanish island of Ibiza, from where he wrote in a letter to his friend Gershom Scholem: “[I]f ever God has punished a prophet by fulfilling his prophecy, then that is the case with George.”

The German army officer Claus von Stauffenberg was one of the many devoted George disciples who eventually joined the Nazi movement, and he took part in the invasion of Poland in 1939. But as he became aware of the atrocities being committed, he chose to join the German Resistance in an attempt to close the Pandora’s box that the George Circle had helped to open. 

On July 20, 1944, Stauffenberg walked into a briefing meeting attended by Hitler, shook him by the hand, placed a briefcase (filled with explosives) under the solid oak conference table, and then left the room to take a call. When the bomb exploded, it killed three officers and a stenographer, but Hitler survived, having been shielded from the blast by one of the table legs. 

Just after midnight, Stauffenberg and his co-conspirators were lined up against a wall, illuminated by the glaring headlights of a truck, and executed by firing squad. Before he was shot, he shouted his last words. “Es lebe das heilige Deutschland!” (“Long live sacred Germany!”) is typically what historians think he said. But some witnesses disagree, having heard “Es lebe unser geheimes Deutschland!” (“Long live our secret Germany!”)

The post The Secret History And Strange Future Of Charisma appeared first on NOEMA.

Concrete Built The Modern World. Now It’s Destroying It. https://www.noemamag.com/concrete-built-the-modern-world-now-its-destroying-it Tue, 06 Dec 2022 14:38:42 +0000 https://www.noemamag.com/concrete-built-the-modern-world-now-its-destroying-it The post Concrete Built The Modern World. Now It’s Destroying It. appeared first on NOEMA.

ZÜRICH — In the aftermath of World War II, the leaders of Switzerland decided that the country needed to urgently modernize, and concluded that a remote and picturesque valley high in the Alps could be developed for hydropower. Nearly 350 square miles of snow and ice covered the mountains there, and much of it turned to liquid in the spring and summer, a force of water that, if harnessed correctly, could turn turbines and create electricity. And so a plan was hatched to conquer this “white coal” by building the tallest concrete gravity dam the world had ever seen: the Grande Dixence. At nearly 1,000 feet high, it’d surpass the Hoover Dam and be only slightly shorter than the Empire State Building, then the tallest building in the world.

From 1951 onwards, around 3,000 geologists, hydrologists, surveyors, guides and laborers, outfitted with an assemblage of trucks, diggers, dumpsters and drills, advanced like an army into a mostly untouched area of the Alps. To paraphrase Leo Marx: The machines had entered the garden. 

Up there, the workers were met with freezing temperatures that seared the chest and burst the lips, a blazing sun that burned the skin, and a constant threat of avalanches. They lacked waterproof clothing and lived in makeshift shacks, at least until social services forced the construction of an accommodation block that they nicknamed “the Ritz.” The fine dust of pulverized rock coated their lungs, developing, for some, into a slow and deadly disease called silicosis. The site had its own chaplain, Pastor Pache, who was available to counsel the men about their confrontations with nature and death. 

Ultimately, the job these men performed, more than anything else, was pouring concrete, and at a nearly unimaginable scale: more than 200 million cubic feet of it, just about enough to build a wall five feet high and four inches wide around the equator. Cableways carried an endless procession of 880-pound buckets of cement (a primary ingredient in concrete alongside sand, gravel and water) up and down the mountains at a pace of 220 tons each hour. 
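The arithmetic behind that comparison roughly checks out. Taking the equator as about 131 million feet around, and a wall five feet high and four inches (one-third of a foot) thick:

\[
V \approx 5\,\mathrm{ft} \times \tfrac{1}{3}\,\mathrm{ft} \times 1.31\times10^{8}\,\mathrm{ft} \approx 2.2\times10^{8}\,\mathrm{ft}^{3},
\]

or a little over 200 million cubic feet.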

For more than a decade, through snow and rain and fog, the workers poured that thick grey mixture day after day after day, and gradually a monolith began to rise between the mountains.

One of the workers was 23-year-old Jean-Luc Godard, who would go on to become one of the most influential filmmakers in the modern era. After sweet-talking his way into a cushy job as a telephone operator, he borrowed a camera and began capturing images of the never-ending flow of concrete. In the film he eventually made (not one of his finest), workers were portrayed like little ants alongside grand machines, and a triumphant soundtrack of classical music played beneath a cheerful voiceover that celebrated the national importance of this monumental construction. Opération Béton, he named it — Operation Concrete — and the construction company bought it off him and rolled it out as an advertisement in cinemas across the nation.

“The modernist utopian dream that the speed and malleability of concrete might solve housing crises, revolutionize cities and birth new ways of living and being was already being shattered by a spiraling capitalist cycle of speculation, construction, deterioration and demolition.”

Godard had moved on by the time the dam was completed in 1961. The finished wall weighed 16.5 million tons and held back more than 14 billion cubic feet of water, and the completed complex now generates some 2 billion kWh of power per year and accounts for 20% of Switzerland’s energy storage capacity. A huge crowd gathered at the top to watch workers pour the final load of concrete, clapping and cheering. A mythology of man triumphing over nature spread through documentaries, books and tourist guides. One booklet described it as a “concrete temple enthroned in a mineral universe,” another as akin to the great pyramids of Egypt, except “useful.” It took on a divine air, like a modern cathedral. Raw material from the dam was even trucked to a nearby village to build a futuristic new concrete church.

It was the beginning of an era of rampant construction in Switzerland. In the 1950s and 60s, the Swiss poured more concrete per capita than any other country; before the century was out they would expand beyond their borders and become globally recognized concrete connoisseurs, building dams in Morocco and Kenya, housing projects in Iran and airports in Saudi Arabia, each with their own cement factories to provide material. But the Swiss were not alone: Across the Global North, concrete mania had taken hold. 

By the mid-1960s, Godard was in Paris making some of the early masterpieces of the French New Wave, but the earlier wondrous optimism he’d felt for concrete was now replaced by a horrified fascination. The modernist utopian dream that the speed and malleability of concrete might solve housing crises, revolutionize cities and birth new ways of living and being was already being shattered by a spiraling capitalist cycle of speculation, construction, deterioration and demolition. 

In “Alphaville” (1965), a tyrannical dystopia, Godard used the newly concreted areas of Paris as a backdrop. Two years later, in the opening scene of “Deux ou trois choses que je sais d’elle” (“Two or Three Things I Know About Her”), a wheelbarrow caked in concrete sat on a recently built motorway, surrounded by a deafening cacophony of traffic and construction. Everywhere the camera looked, Paris was full of holes and craters; cranes filled the sky and the new concrete tower blocks were portrayed as monuments of alienation and loneliness. Godard theorized that the city, like his female protagonist, had been forced to prostitute itself just to survive in an era of “progress.”

Concrete had been poured before World War II, but it was nothing compared to the scale of what was now taking place. In 1900, minerals associated with the production of cement accounted for only 15% of construction material; by the beginning of the 1970s, it was more than 60% and rising rapidly. The American architect Frank Lloyd Wright described the amount of construction afoot as an “amazing avalanche of material.” In Lagos, the arrival of some 20 million tons of imported cement caused a traffic jam of ships that paralyzed the port for almost a year. 

Godard focused so acutely on concrete because its transformation of the Earth’s surface was happening in front of him. But like anything that becomes ubiquitous, now we hardly notice it. Today, like a heartbeat, concrete is rarely acknowledged, even as our lives depend on it. 


Most of humanity now lives in cities made possible by concrete. The majority of buildings, from skyscrapers to social housing, are made of concrete or contain large amounts of it. Even buildings made from steel, stone, brick or timber are almost always resting on concrete foundations and are sometimes masking an unseen concrete frame. Inside, concrete is ceilings and floors. Outside, it is bridges and sidewalks, piers and parking lots, roads and tunnels and airport landing strips and subway systems. It is water pipes, sewers and storm drains. It is electricity: dams and power plants and the foundations of wind turbines. Concrete is the wall between Israel and Palestine and the Berlin Wall and most other walls. It is “almost anything,” wrote the architect Sarah Nichols in an essay this year, “almost anywhere.”

Concrete is modern, yet ancient. There’s a sense in which it was born in the bowels of volcanoes, formulated by the eruptions of the Earth. Around 100 B.C., Romans discovered that volcanic ash from the slopes of Mount Vesuvius could be mixed with lime and wetted to create a cement, to which they added aggregate. Roman concrete was used to build structures like the Pantheon and the Colosseum, original parts of which still stand today. The story goes that their recipe was lost until it was rediscovered in the ancient books of Vitruvius. What seems more likely is that the use of concrete became much rarer, but never completely died, and still circulated via artisanal builders and craftsmen, until engineers and scientists across Europe eventually understood and then industrialized it. 

To make concrete, you need cement. To make cement nowadays, kilns are heated to more than 1,400 degrees Celsius — similar to the temperature inside a volcano. Into the kilns goes a combination of crushed raw materials (mainly limestone and clay). The heat causes a chemical reaction that creates a new product, clinker, which is then ground down to create the grey powder you see in cement bags. This is then mixed with sand, gravel and water to create concrete.
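The chemical reaction at the heart of this process is calcination: from roughly 900 degrees Celsius upward, the calcium carbonate in limestone decomposes into calcium oxide, the reactive basis of clinker, and carbon dioxide:

\[
\mathrm{CaCO_3} \;\longrightarrow\; \mathrm{CaO} + \mathrm{CO_2}.
\]

This matters later in the story: the CO2 comes out of the rock itself, not only from the fuel burned to heat the kiln.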

Concrete is now the second-most consumed substance on Earth behind only water. Thirty-three billion tons of it are used each year, making it by far the most abundant human-made material in history. To make all that, we now devour around 4 billion tons of cement each year — more than in the entire first half of the 20th century, and over a billion tons more than the food we eat annually. 

Such a monstrous scale of production has monstrous consequences. Concrete has been like a nuclear bomb in man’s conquest of nature: redirecting great rivers (often away from the communities that had come to rely on them), reducing quarried mountains to mere hills, and contributing to biodiversity loss and mass flooding by effectively sealing large swathes of land in an impermeable grey crust. The other key ingredients all bring their own separate crises, from the destructive sand mining of riverbeds and beaches to the use of almost 2% of the world’s water.

“Concrete is now the second-most consumed substance on Earth behind only water. Thirty-three billion tons of it are used each year, making it by far the most abundant human-made material in history.”

But most significantly, the carbon-intensive nature of cement has been catastrophic for the atmosphere. The kilns used to heat limestone are commonly run on fossil fuels, which produce greenhouse gases, and as the limestone heats up, it releases more CO2 of its own. Every kilogram of cement created produces more than half a kilogram of CO2. The greenhouse gas emissions of the global aviation industry (2-3%) are dwarfed by those of the cement industry (around 8%). If concrete were a country, it would be the third-largest CO2 emitter, behind only the U.S. and China. In Chile, Quintero, the region that houses most of the country’s cement plants, has become so polluted that it was nicknamed “the sacrifice zone.”
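Put those two figures together and the scale becomes clear. As a back-of-the-envelope estimate rather than an official inventory, and taking “more than half a kilogram” as roughly 0.5 to 0.6:

\[
4\times10^{9}\,\mathrm{t\ cement} \times (0.5\ \text{to}\ 0.6) \approx 2\ \text{to}\ 2.4 \times 10^{9}\,\mathrm{t\ CO_2}\ \text{per year},
\]

before counting the trucks, mixers and the rest of the concrete supply chain that push the industry’s share toward that 8% figure.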

Sacrifice is a fitting word for this paradox: On the one hand, we have the destruction wrought by concrete, and on the other is our desperate need for it to exist. It’s been estimated that to keep up with global population growth, we need to build the urban equivalent of another Paris each week, another New York each month. “A lot of people say, ‘Oh, we shouldn’t use concrete. We should be using something else,’” Karen Scrivener, a leading scientist in the race to create lower-carbon concrete, said in 2012. “This is a totally meaningless comment because it is just not physically possible to produce any other material in such large quantities.” Tyler Ley, a professor of civil engineering at Oklahoma State University, told me: “We never complain about water, but producing freshwater has a massive carbon footprint. We think water is essential. Concrete is in that same vein.”

Concrete has become global because it is produced from some of the most abundant materials on Earth, which means it can usually be manufactured locally, almost anywhere. Building a basic concrete structure is usually easier than using other materials like wood or steel. And it’s cheap: Adjusted for inflation, the cost of cement in the U.S. has barely risen since the beginning of the 20th century. These factors mean concrete has been the great emancipator in poorer parts of the world, enabling low-cost construction of housing, schools and hospitals, even in communities neglected by their governments. “Production and consumption of cement alone,” wrote the anthropologist Cristián Simonetti, “is in almost perfect correlation with the World Bank’s development indicators.”

As the climate crisis accelerates and extreme weather events become more common, concrete will be more important than ever: It is waterproof, fireproof, strong enough to withstand powerful winds and will usually be sturdy for a lifetime or more. As seas rise, coastal walls are being built of concrete to protect urban areas — around 14% of the American coastline and 60% of the Chinese coastline are effectively concrete. On the coast of Nigeria, a 5-mile concrete barrier known as “the Great Wall of Lagos” is being constructed to protect the city’s more affluent neighborhoods from coastal erosion.

In architecture, discussions about concrete have become polarized: Talking about it, as the director of the Swiss Architecture Museum said in a panel discussion earlier this year, is like “talk[ing] about vaccination.” The realization of concrete’s destructive nature has fueled a growing movement to use wood instead, specifically what’s called mass timber: panels of wood that are glued and pressed so tightly together that they come close to the strength of concrete. Three years ago, the world’s tallest timber skyscraper, Mjøstårnet, opened in Norway. The Rocket&Tigerli building in Winterthur, Switzerland, will surpass it when it is completed in 2026. There are plans to emulate these approaches around the world, and there is an enthusiasm among architects and urban planners that we may be at the beginning of a new “skyscraper age” for wood similar to the one fueled by the capabilities of concrete.


On one of the 40 islets of the Enewetak Atoll in the Marshall Islands, there is a gigantic concrete dome known as “the Tomb.” Between 1946 and 1958, the U.S. military conducted 67 nuclear tests in the Marshall Islands, dropping the equivalent of about 1.5 Hiroshima bombs per day. A decade later, the Tomb was built to contain 110,000 cubic yards of highly radioactive soil. A few years ago, someone climbed up onto its surface, which sits just above ground level, and spray painted a message: “Nuclear Waste. Property of USA Government. Please Return to Sender.” 

Modern concrete usually lasts around 100 years before it starts to crumble and fall apart. The half-life of plutonium-239, one of the radioactive particles present in the Tomb, is around 24,000 years. There are already cracks around the edges of the Tomb, and toxic waste is seeping into the surrounding soil and ocean.

The mythical power, permanence and strength of concrete — its ability to protect us from what is dirty and dangerous — still lingers in the public imagination: a magic liquid rock that could be poured to create shapes and forms that were possible in no other material. Stone takes nature millions upon millions of years to create, but we do it in a few hours. Mankind, it seems, has harnessed the geological forces of deep time. 

Architects and builders once regarded concrete as permanent, everlasting. “Cement means concrete; concrete means stone; and stone spells eternity, so far as our finite minds can comprehend,” wrote Floyd Parsons in 1924. But, as Vyta Pivo, an assistant professor of architectural and urban history at the University of Michigan, told me over the phone, “Concrete is none of the things we were led to believe it is.” She explained: “Our addiction to concrete is not just a scientific or technological issue, it’s a deeply cultural issue. This idea of concrete as a miracle material was actively pursued by people in positions of power. In the U.S., for example, manufacturers and companies created movies, booklets and magazines. They trained people to go into rural areas and teach people how to work with cement and use concrete in all kinds of daily applications. It was a real education project to teach us how to accept concrete into our everyday environments. So it wasn’t that concrete was necessarily the best or most obvious or cheapest. There were certain actors who made it that way.” 

“The mythical power, permanence and strength of concrete — its ability to protect us from what is dirty and dangerous — still lingers in the public imagination.”

Throughout the 20th century, as the U.S. expanded its power abroad, a cult of concrete followed. Prior to the American occupation of the Philippines in 1898, not a single bag of cement had ever been shipped there. By 1913, wrote Diana Martinez in her book “Concrete Colonialism,” visitors were describing “Manila’s approaching horizon as a ‘huge mass of concrete.’” Likewise, wrote Pivo, “In Vietnam, concrete was deployed to create a material surface on which U.S. foreign policy unfolded.” 

But modern concrete does not operate on the deep time of rock. Its durability is severely limited. It is restless. “Reinforcement really is the only reason concrete is everywhere today,” said Lucia Allais, an associate professor of architecture at Columbia University. Experiments with reinforced concrete began in the mid-1800s as people sought to mask its weaknesses and make it do things it couldn’t. Reinforcement is needed because concrete has extremely high compressive strength: It’s really difficult to crush. Today’s strongest concretes can withstand pressure of more than 100 megapascals — “about the weight of an African bull elephant balanced on a coin,” as the historian Vaclav Smil put it. But concrete has low tensile strength: It’s easy to pull apart. Steel bars, it turned out, have pretty much the opposite qualities, so rebar (reinforcing bar) became commonly used to create a strengthening skeleton for concrete to be poured around. Almost all of the concrete you see today is reinforced.
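Smil’s image holds up under plausible assumptions (mine, not his): a bull African elephant of around 6,000 kilograms standing on a coin face of about 5 square centimeters works out to

\[
P = \frac{F}{A} \approx \frac{6{,}000\,\mathrm{kg} \times 9.8\,\mathrm{m/s^2}}{5\times10^{-4}\,\mathrm{m^2}} \approx 1.2\times10^{8}\,\mathrm{Pa} \approx 120\,\mathrm{MPa},
\]

in the same range as the compressive strength of the strongest modern concretes.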

“The problem with that is the process of carbonation,” said Allais. “There is carbon dioxide everywhere in the atmosphere, and any time concrete is exposed to carbon dioxide, it permeates its pores.” When the CO2 permeates, it triggers a chemical reaction in the concrete that causes the rebar to rust. “The steel expands because it’s rusting. And the concrete cracks and fails. … And what’s especially interesting is that the amount of CO2 in the atmosphere in the last 100 years has greatly expanded, due in no small part to the fact the concrete industry is emitting massive amounts of it into the atmosphere.”

Scientists are trying to figure out how long it takes reinforced concrete to degrade because of carbonation. The average result for a standard structure is 100 years, Allais said. “When you consider that reinforced concrete was invented around 100 years ago,” she went on, “you get this amazing image that the concrete all around the world is beginning to fail.”
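The reaction Allais is describing is well-established chemistry: CO2 that diffuses into the pores reacts with the calcium hydroxide in hardened cement,

\[
\mathrm{CO_2} + \mathrm{Ca(OH)_2} \;\longrightarrow\; \mathrm{CaCO_3} + \mathrm{H_2O},
\]

which lowers the alkalinity that normally keeps the embedded steel passive. Once that protection is gone, the rebar rusts, expands and cracks the concrete from within.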

“Modern concrete does not operate on the deep time of rock. Its durability is severely limited. It is restless.”

There have already been several high-profile fatal accidents with concrete infrastructure, like the Morandi Bridge in Italy that collapsed and killed 43 people in 2018, or Champlain Towers South, a 12-story building in Miami that fell and killed 98 people in 2021. But the rot is widespread: Last year’s Infrastructure Report Card for the U.S. graded much of the country’s concrete infrastructure — roads, dams, airports, stormwater systems, inland waterways — as a D, meaning poor, at-risk and exhibiting significant deterioration. In the U.K., a health minister disclosed to Parliament this year that 34 hospital buildings have concrete roofs that are in danger of sudden collapse. 

We find ourselves on a treadmill of dependency on a material that is slowly deteriorating from the moment it is first poured. While much of the Global South is embarking on a century of construction, the built environment of the Global North is destined for the monumental challenge of maintenance, demolition and, in the worst-case scenario, ruination. 

Demolished concrete buildings will go mostly into landfills. Concrete can, in theory, be recycled, but the process of separating rubble from rebar is expensive and time-consuming, and is therefore done at nowhere near the scale needed to make an impact. Observing landfills in the Lehigh Valley in Pennsylvania, Pivo noticed concrete was being dumped back in the huge craters of old limestone quarries that its production created. Mass deposits of crumbled concrete, academics have suggested, will become the stratigraphic marker of our age: a scar left by the great acceleration that really will last for eternity.


On the evening I arrived in Zürich, I walked past a bistro showing a documentary called “Dead End Concrete,” about concrete’s role in environmental degradation. Materials are really on the mind around here, most likely because of the influence of ETH Zürich, the city’s esteemed public research university. Nicknamed “the M.I.T. of Europe,” ETH is a hotbed for science, technology, engineering and mathematics, and boasts 22 Nobel laureates among its alumni, including Albert Einstein. 

I’d come by train to meet Philippe Block, the head of ETH’s Institute of Technology in Architecture. Block is quick to acknowledge concrete’s problems, but his mantra is “Do concrete right.” “Sustainability is not just about materials,” he said as we walked around one of the campuses. “It’s about what you do with materials. Just saying that concrete is bad and wood is good is just wrong. … In the developing world, we have to provide adequate dwellings, infrastructure and so on. There is no way around it. Now, if we were to build all that in timber — can you even imagine the deforestation and the monocultures required to grow all that wood? The amount of biodiversity destroyed? It would be like palm oil times a thousand. It would be a total disaster.”

There won’t be a single perfect solution to the problem of concrete. One potential advancement is green concrete, which lowers emissions through changes to the recipe, the production process or the material’s longevity. Another is to capture emitted carbon at cement plants and store or reuse it.

Block is most interested in drastically reducing the sheer amount of it that we pour into our built environment. His research focuses on the ways we can more intelligently design and build. “I want to show that this so-called worst material can be the opposite, if you do it properly.” Thinking of concrete as artificial stone, he claims, can help us recover the lost art forms of the master masons. “And the language of stone,” he said, “is the arch.” 

Block’s early research was focused on heritage architecture — vaulted cathedrals and other historic structures. “Our modern engineering tools are quite inadequate in explaining how safe these buildings are,” he said. Sometimes, he went on, digital engineering models of old buildings concluded that there should be no way they could still be standing, that they should’ve collapsed centuries ago. But there they were, solid, unmoving. Maybe, Block thought, there was something wrong with the models. 

“If we were to build all that in timber — can you even imagine the deforestation and the monocultures required to grow all that wood? The amount of biodiversity destroyed?”
— Philippe Block

Block was drawn in particular to vaults: self-supporting arched roofs or ceilings, different types of which have recurred in various architectural styles, from the muqarnas vaults across the Islamic world to the Nubian vaults in Sudan and Egypt to Gothic architecture in Europe. He went to visit King’s College Chapel in Cambridge, which was completed in 1515 and boasts a peculiarly English style of vault called a fan vault, because the pattern resembles spread fans. Constructed by the master mason John Wastell, the chapel vaults were designed with a flawless geometry that supported the entire ceiling through compression alone, without any mortar or cement. Staring up at that ceiling, both beautifully simple yet overwhelmingly complex, you feel like you’re beneath something organic, like a spider’s web. William Wordsworth called it a “glorious Work of fine intelligence”: “Where light and shade repose, where music dwells / Lingering — and wandering on as loth to die; / Like thoughts whose very sweetness yieldeth proof / That they were born for immortality.”

Block went for a walk on top of the vaults. “In thickness-to-span ratio, they were proportionally as thin as an eggshell. They were so thin I could feel them vibrating. When I jumped, I felt them bounce. And yet they were still strong and standing. When you have an experience like this, and really feel how exceptionally thin this totally unreinforced structure is, it really motivates you. You realize: Damn, we have forgotten something. We have forgotten this knowledge.” 

Block’s research group started to investigate various styles of vaults, and crunched their logic to create bridges, pavilions and shell structures that were exhibited around the world. He wanted to show the possibility of what he calls “strength through geometry.” But after being provoked by a colleague to demonstrate how these discoveries and ideas solved the major challenges faced by modern construction, Block’s team decided to focus on one of the most banal yet important structural elements of a building: floors. 

“We need to talk about floors,” he told me. To combat sprawl, Block acknowledges, we have to build a little higher. But for a mid-rise concrete building, say 20-40 stories, 75% of its mass is in the structure, which exists mainly to keep the building up, and nearly half the mass of the structure is floor. “In other words,” Block said, “providing a flat and horizontal surface for people to walk on is super materially intensive.” An estimated 2 trillion square feet of floor, most of which will consist of thick slabs of reinforced concrete, is expected to be constructed all over the world between now and 2050.
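Combining Block’s two figures, and taking “nearly half” as roughly one half, gives a rough sense of how much of such a building is, in effect, floor slab:

\[
0.75 \times 0.5 \approx 0.38,
\]

or something like 35% to 40% of the building’s entire mass.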

“When you feel how exceptionally thin this totally unreinforced structure is, it really motivates you. You realize: Damn, we have forgotten something. We have forgotten this knowledge.”
— Philippe Block

After 10 years of research and development, Block and his team devised what they call the Rippmann Floor System (RFS). It is an unreinforced concrete panel that is designed to redistribute the forces of compression around the floor using roughly the same logic as vaults. The final version of the panel is a secret for now, but Block showed me a small prototype. It was rectangular and composed of five interlocking pieces, and to my eye looked like a simplified version of the vaults you might see in a cathedral. 

Block is sometimes criticized for his ties to the cement industry — he is on the board at Holcim, Switzerland’s biggest cement company. But he is not apologetic about it. He hopes those relationships can steer architecture in a more sustainable direction. His floor panels would reduce the amount of concrete and steel used in the floors of an average high-rise by around 65% and 80% respectively, according to his team’s calculations. For an average 25-story concrete building, that would mean 1,200 concrete trucks that don’t need to come to the construction site. With no rebar, RFS doesn’t carbonate and deteriorate in the same way as common reinforced concrete. And because the panels are dry-assembled and held in place just by compression, they can be disassembled and reused when the building reaches the end of its life. The RFS floor is already being planned for several multi-story building projects, including one in Brussels and another in Zug.

One challenge Block faces is convincing people who don’t have a rich understanding of masonry and geometry that such a lightweight and fragile-looking floor panel, held in place without any binder, is actually as safe as a thick slab of concrete. Instinctively, for most of us, “light” seems like “weak.”

“We’re so used to thinking: If we add material, it will make things stronger,” Block said. “But geometry is so much more effective at giving you strength and structural stability.” He likes to give famous examples, like Grand Central Oyster Bar in New York, the delicate tile ceiling of which supports the vast Vanderbilt Hall above. Proportionally, Block said, those vaulted ceilings are even thinner than his floor plates. 


In the poorer suburbs of Inhambane in Mozambique, people are making blocks. Men, women, employed, unemployed — everyone is making blocks. Buy cement, mix it, make blocks, pile them high, build a concrete house. 

For some, this takes months; for others, years. The price of a bag of cement has become as familiar a reference point as the price of a beer or a pack of cigarettes. As the anthropologist and sociologist Julie Soleil Archambault wrote, people in Inhambane even think in terms of cement, assessing jobs in terms of how many bags they’ll be able to purchase with their wages. A joke in bars, she noted, is that a slow drinker must be making blocks.

After poured blocks have set, they must be cured with water to prevent them from cracking. Around neighborhoods, you’ll see people watering their blocks with a hose as if they were flowers. In 2016, one of the most popular songs in the city — “Uma cerveja, um bloco” (“One beer, one block”) by DJ Ardiles — reminded young people that they could be making concrete instead of drinking. Making blocks, Archambault wrote, isn’t just “making blocks” — “it’s a euphemism for forward-thinking.”

Concrete was once viewed as a colonial material in Mozambique. During Portuguese rule, which only ended in 1975, the capital was divided between the “City of Cement,” populated by the “colonial bourgeoisie” and filled with high-rises, apartment blocks and the state apparatus, and the “City of Reeds” of the suburbs, where a largely Black Mozambican population lived. The colonial regime prohibited people in the suburbs from building their houses out of anything other than reed, wood or zinc — concrete was banned. That way, wrote David Morton in his book “Age of Concrete,” precarious neighborhoods “could be easily demolished to make way for future expansions of the City of Cement.”

“We find ourselves on a treadmill of dependency on a material that is slowly deteriorating from the moment it is first poured.”

No longer viewed as colonial, concrete in Mozambique has now become symbolic of urbanization, success and — importantly in a place plagued by extreme weather exacerbated by the climate crisis — safety. This is part of a trend across the continent: Africa is the fastest-growing cement market in the world. It’s the continent’s “new oil,” Bloomberg called it. The continent’s richest man, Aliko Dangote, runs its biggest domestic cement company. Sub-Saharan Africa is projected to contribute more than half of the global population increase between now and 2050. By then, Africa is expected to have 120 cities of more than 1 million people, including several megacities. Homes will be built, roads will be built, infrastructure will be built. Concrete will likely need to be poured on a scale orders of magnitude greater than it was in the West in the last century. “In Europe, very few people know how to pour concrete,” the geographer Armelle Choplin said in an interview. “In Africa, everyone does.”

Some observers worry that concrete’s spread across the continent has eroded age-old artisanal craftsmanship and the use of local and more sustainable materials like mud, clay and wood. “Concrete in construction is quicker, uses less manpower and requires fewer workers involved in artisanal trades,” said Ola Uduku, a British-Nigerian architect and the head of the University of Liverpool’s School of Architecture. Skilled artisans, she told me, are becoming harder and harder to find. 

In response to this, some architects, primarily based in or originating from West Africa, have tried to challenge the rise of concrete by championing local knowledge and resources. Diébédo Francis Kéré’s elegant mud constructions — including clinics and schools — made him the first African architect to win the Pritzker Prize in 2022. There are many others emerging, like Clara Sawadogo in Burkina Faso, Mariam Kamara in Niger and Nzinga Mboup in Senegal. Their styles are radically different, but their general principle is the same: to eschew concrete as a universal solution in favor of what’s local and traditional — often mud, clay, stone or wood. These materials, they say, better suit the extreme and fluctuating conditions of their regional climates.

“The reality is something different,” Uduku cautioned. “In terms of the actual construction of low-cost housing in townships, people are still using cement blocks.” Concrete, as it always has been, is cheap, and it is growing ever more universal.

“There are some things that will always need concrete,” Uduku concluded. “But for the poorest parts of Africa, we need to be looking at our local materials. We’re still locked into a system that may have been fantastic in the 50s and 60s, but now we know about the maintenance concrete needs over time — and its role in the climate crisis. It just can’t be the everyday material anymore.”


Before I left Block in Zürich, I asked him what he thought of the Grande Dixence dam. He’d never heard of it. Block was born in Belgium, so being unaware of Switzerland’s biggest and most productive dam was excusable. But it felt indicative of how the gigantic concrete infrastructure of our lives can play such a hidden and secretive role. 

I decided to go see it for myself. I boarded a train headed south, and 24 hours later found myself on a public bus, empty aside from myself and the driver, hurtling around stomach-turning hairpin bends inches away from precipitous drops, ascending into the Alps. The dam and the road leading to it were already closed for the winter, so I got off the bus at the last possible stop, in a small village called Mâche, and began to walk. 

The guides I’d read had warned of challenging conditions so late in the year, recommending snowshoes and checking for avalanche warnings. But the weather was sunny and clear and unusually warm. Switzerland had experienced one of its hottest years on record; its glaciers had lost 6% of their volume due to melting, triple the amount usually seen in extreme years. The Alps in general are expected to lose 80% or more of their current frozen mass by the end of this century. 

“Mass deposits of crumbled concrete, academics have suggested, will become the stratigraphic marker of our age: a scar left by the great acceleration that really will last for eternity.”

Pine and spruce towered around me and cobwebs hung in the air, so thin that the pine needles caught in them appeared to be levitating. I paused at any sound from among the trees, hoping to see the ibex that are common to this area. But I saw none — just a lonely donkey in a fenced-off field.

At around 6,000 feet, the temperature finally began to dip, the leaves crunching frostily beneath my boots. I thought about the 10 cold winters the dam builders had endured, and the sheer hubris of even embarking on the construction of such a gigantic thing in this terrain. Then the forest ended and I was on a mountainside, high above a valley. Across it, I could see the dam: a perfectly straight horizontal line in a jagged, chaotic landscape.

I’d expected to be impressed, to be struck by the techno-industrial sublime of a gargantuan monolith. But straddled between two enormous snow-capped mountains, it looked exceptionally small, and I couldn’t help but feel underwhelmed.

The post Concrete Built The Modern World. Now It’s Destroying It. appeared first on NOEMA.

Rise Of The Plant Destroyer https://www.noemamag.com/rise-of-the-plant-destroyer Wed, 06 Jul 2022 15:57:01 +0000 https://www.noemamag.com/rise-of-the-plant-destroyer The post Rise Of The Plant Destroyer appeared first on NOEMA.

The first reports in Europe came from West Flanders, Belgium, although the disease almost certainly started elsewhere. It was August 1845, and the potatoes were dying. Dark flecks would develop on the leaves of one plant, and within days the entire field was destroyed. The stems seemed almost melted and the leaves hung dead, charred and black, as if an invisible fire had ripped through the land. When the potatoes themselves were dug up, they’d been reduced to a slimy ruin.

Within weeks, the disease had spread across Europe into the Netherlands, Germany, Switzerland, Poland, France and across the English Channel to the British Isles. When Miles Joseph Berkeley of England received infected leaf samples from a friend in Paris, he remarked that his own crops in Northamptonshire were “never more abundant or finer.” In less than a week, the potato fields around his county too were blackening.

Crop diseases had struck before but never with such terrifying speed. Panic took hold across Europe, and theories began to spread that the disease was caused by static electricity generated by the new steam trains, or vapors of “bad air.” Or perhaps it was simply a judgment from God.

Nobody knew what to do — leave the potatoes in the soil or dig them up? Expose them to light or keep them in the dark? Smother them in salt or soak them in chlorine? Also: Could these diseased spuds still be consumed? For three whole days, Monsieur Bonjean, a man from the French town of Chambéry, exclusively ate rotten potatoes and drank water boiled with them to see if it caused any ill effects. He survived but reported a disgusting taste and a “disagreeable heat oppressing the chest.”

In the preceding centuries, new trade routes had uprooted the potato from its native South America and shipped it to Europe. Just as it had supported the rise of the Incan Empire, it supported the rise of Europe, becoming a vital food staple from the 1700s onwards. It was a durable crop, easier to prepare than bread and highly nutritious. Monarchs promoted the potato to their populations as a solution to famine and food riots. And because it grew underground, it was rarely damaged or pillaged during Europe’s endless wars. 

In France, Louis XVI and Marie Antoinette even began wearing potato blossoms as accessories, and the royal gardens of Tuileries Palace were planted with potatoes instead of flowers. In Prussia, King Frederick II ordered his government to distribute free seed potatoes with growing instructions, and Nicholas I did the same in Russia. Populations across the continent boomed partly as a result of the potato’s rise, and the historian Charles Mann likened its impact on Europe to the invention of the steam engine. If archeologists arranged human history using organic rather than inorganic materials, this would have been called the Potato Age. 

Nowhere took to the potato quite like Ireland, where the cool, damp climate was especially suited for its proliferation. The population nearly doubled between 1800 and 1845. The Irish remained largely poor but healthy, and around a third were sustained on nothing but potatoes and milk. 

“Nobody knew what to do — leave the potatoes in the soil or dig them up? Expose them to light or keep them in the dark? Smother them in salt or soak them in chlorine?”

But then on Sept. 13, 1845, The Gardeners’ Chronicle in the U.K. announced the disease had “unequivocally declared itself in Ireland. The crops about Dublin are suddenly perishing.” Irish folk archives of the time report a dense fog that lasted days in the west, and a terrible stench that filled the fields. That year, the disease wiped out a third or more of Irish potato crops, but the following year, it was total devastation. 

“Wandering beggars, roadside deaths, rising crime rates, poorly attended burials, widespread panic about contagion and mass evictions were commonplace throughout most of the country,” wrote the historian Cormac Ó Gráda. “Evictions, by forcing migration, exacerbated the problems. … There were many such accounts of bodies left unburied; others described survivors dragging corpses unaided to cemeteries, and people not yet quite dead being lowered into communal burial pits.”

There’s no doubt that Britain’s neglectful colonial rule over Ireland during the famine years — fueled by the ideological rhetoric of Prime Minister John Russell, as well as leading figures like Charles Trevelyan and Charles Wood — worsened a crisis initiated by plant disease. Tens of thousands died across Northern Europe, but that was little compared to Ireland, where it is estimated that as many as 1.5 million died of disease, hunger and fever following the crop failures, most of them the country’s poorest. And between 1846 and 1855, 1-2 million more emigrated, a mass movement of people that, until then, was perhaps the fastest in human history. To this day, Ireland’s population has not returned to the level it was before the famine.

At the same time as potato crops were failing across Europe, halfway across the world, chestnut trees in the southeastern United States were dying in great numbers, so many that hillsides once covered in majestic greens, reds and yellows turned a dull gray. They were perishing too in western Italy, where locals noticed what they called “mal dell’inchiostro” (“ink disease”): foul-smelling wounds amid the trees’ ridged bark oozing a black liquid. 

Nobody knew it then, but the potato famine and what would become known as chestnut root rot were the result of closely related microorganisms. The world had shrunk, and devastating microbes were traveling to lands they had never visited before, attacking plant species that were unprepared to fight back. 

This genus of microbes contains many different species that behave in many different ways — the ones that attack potatoes and chestnut trees are only two of perhaps hundreds. But in time scientists would come to realize they were all related, and they grouped them together under one carefully named genus: Phytophthora. In Greek, it means “plant destroyer.”


A farmer in Baden-Württemberg, Germany, sprays his potato field with fungicide to protect it against Phytophthora infestans. 2019. (Thomas Warnack/picture-alliance/dpa/AP Images)

“Every aspect of human society and every part of the natural world is affected, for good or ill, by the activities of tiny unseen microbes — bacteria, viruses, fungi and protozoa,” wrote the British writer Bernard Dixon. Microscopic organisms are the most abundant form of life on Earth, filling the air we breathe, water we drink, soil we walk on and food we consume. They make life possible, and have done so since the very beginning. Many of them are also the waste disposers of our planet, without which we would be surrounded by ever-growing colossal mountains of the dead. 

At some point in their approximately four-billion-year existence, certain groups of microbes discovered the evolutionary niche of living on or inside other living things. Your body is a good example: A couple pounds of microbes, more or less, live inside your mouth, your eyes, your gut, on your skin, all carrying out their own independent tasks. You are a complex assemblage, more “we” than “I.” 

Most plants are no different. Microbes live on and in plants, and in the soil from which they grow, weaving their way into plant cellular structures to create an interface where they exchange everything from nutrients to water to signaling compounds.

“In an era of animal pandemics — tuberculosis, smallpox, COVID-19 — it’s easy to overlook the fact that plants also experience them, and they can have an equal or more devastating impact on human society.”

“We know from the oldest fossils that when plants arrived on land, they already had fungal structures inside their cells,” Sebastian Schornack told me, as I admired a colorful tank of shell-dwelling cichlid fish on the desk in his office. Schornack is a plant scientist at the Sainsbury Laboratory at the University of Cambridge. His research focuses on how plants and microbes live together, sometimes collaborating, other times competing. “Plants consider this so important they have maintained a whole set of genes to engage with symbiotic fungi. But if you have a whole system that allows someone into your home, the question is: Are there others exploiting it?”

In an era of animal pandemics — tuberculosis, smallpox, COVID-19 — it’s easy to overlook the fact that plants also experience them, and they can have an equal or more devastating impact on human society. Humans of course live in complete entanglement with plants, and our incessant struggle to manipulate and control them has defined modern civilization. Plants provide over 80% of all the food we eat and are the primary source of feed for our livestock. Their materials are in our fuels, clothes, medicines and the structures we build; even our hands are shaped by millions of years of climbing trees. Trees cover around 30% of the world’s land and produce much of the oxygen we breathe, not to mention store the carbon we need to remove from our overheating atmosphere. What affects plants, in other words, affects humans.

As the climate shifts and global trade quickens, plant diseases are becoming increasingly frequent, severe and widespread. Each year, pests and diseases rip through global food crops, causing losses of up to 30% of staple crop yields, and dramatically alter the Earth’s natural forest ecosystems. It was plant disease that made the once-dominant “Gros Michel” banana nearly commercially extinct (Fusarium wilt), turned the towering American chestnut into a mere shrub (Cryphonectria parasitica) and is currently threatening the future of coffee (Hemileia vastatrix). “There is basically always a plant disease pandemic ongoing,” said Schornack. “But most people don’t know it.”


For almost a century, scientists working on Phytophthora (pronounced “fi-toph-tho-ra”) thought they were dealing with a strange type of fungus. It looked like a fungus and behaved like a fungus, so it must have been a fungus. But as technology improved and they were able to look deeper into the structure and behavior of the pathogen, they realized they were dealing with something different. In the 1980s, one scientist went so far as to suggest renaming it “pseudofungi.” “Let’s put it this way,” David Cooke, a scientist at the James Hutton Institute, explained to me over Zoom, “it’s as closely related to a fungus as you and I are to a pine tree.” 

Phytophthora descend from an ancient lineage of microbes known as oomycetes — a group that absorbs its food either from the surrounding soil and water or by colonizing the body of another organism. Oomycetes are most closely related to algae but evolved to form a motley crew of parasites. Like all microbes, they complicate our taxonomies by blurring the lines between animals and plants: Some of them attack fish and amphibians, others attack insects, but most, like Phytophthora, attack plants. 

“Essentially, they are like algae that have lost the ability to photosynthesize,” Cooke said. “They worked out it was easier to hijack their food from plants that are doing the work for them.” In 2000, scientists knew of only 50 or 60 Phytophthora species. Today some 200 have been discovered, and the number is still climbing. 

“Evil fungus?” I proposed to Schornack, who grimaced. 

“From a human perspective, it seems dark because it involves stealing or tricking,” he said. “But in an ecosystem, we need these mechanisms, otherwise we wouldn’t have the constant circulation of resources, like minerals and nutrients. … Parasitism is a widespread phenomenon that evolves in all parts of life. The interesting thing about it is that it always evolves together with the host. You have a coevolution where parasites try to steal and hosts try to defend. Both parties adapt their mechanisms to steal or defend.”

“Over the last 200 years, sudden outbreaks of this globe-trotting fluff have damaged economies, influenced wars and sparked social unrest.”

He directed me down wood-paneled corridors into one of the labs where his team works. Machines whirred and surfaces shone white. There were microscopes on each table, and one in a room of its own. It was so powerful, Schornack told me, that you could look into individual cells and watch proteins move around. 

He ushered me toward a shelf where a small, sealed Petri dish sat unassumingly. Inside, pressed against the lid, was a white fluff. “This,” he said, “is Phytophthora infestans.” It looked as if someone had captured a tiny slice of cloud, or gathered a fragment of cotton candy.

Of the hundreds of species of Phytophthora, some float through the air in droplets of water and attack plants through their leaves, fruit or bark; others swim in waterways and creep through the moisture in soil to attack roots. Phytophthora infestans (P. infestans), which caused the crop failures that began the Great Irish Famine, is the poster boy of the oomycetes. It has been described as “the plant pathogen that has most greatly impacted humanity to date,” and remains among the biggest killers of potatoes and tomatoes worldwide. 

Over the last 200 years, sudden outbreaks of this globe-trotting fluff have damaged economies, influenced wars and sparked social unrest. For a while, humans even flirted with harnessing its destructive nature for our own uses. During and after World War II, Germany, Britain, France and America invested heavily in researching P. infestans as a tool of biological warfare that could destroy an enemy’s food supply. The appeal was its ability to spread far and wide, like radiation from a nuclear bomb, and when you look closer at how it works, you can see why warmongers desired it.


A forester inspects recently felled larch in Wentwood Forest, Wales, where Phytophthora ramorum was killing trees in 2013. (Geoff Caddick/PA Wire)

As a parasitic microbe, Phytophthora is formidably, elegantly designed to do what it does. It is able to have sex with itself, a reproductive process which eventually births ephemeral sporangia — imagine microscopic and colorless lemon-shaped spores — that float through the air, carried for miles by wind, rain and fog. Life continues for the sporangia if one lands on the leaves of a plant it understands, preferably a species of potato or tomato that hasn’t evolved disease resistance. 

Then, if the conditions are right — ideally damp and cool — the sporangium will, over the course of 30 minutes or so, swell up and burst open, allowing a horde of tiny spores (known as zoospores) to come scuttling out. The tiny zoospores hover for a moment, as if coming to terms with what they are and what they do, before speeding off. They are incredibly fast swimmers, almost sperm-like, and propel themselves using two hair-like threads that they whip and beat in tandem to navigate the moisture on a leaf’s surface. Under a microscope, the zoospore appears like a demonically possessed tennis ball wielding two pipe cleaners.

The zoospore then grows a long germ tube that slithers and probes around for a weak spot on the leaf before punching its way into the plant’s flesh. Into this microscopic wound, hyphae grow: long branching pipes that maraud through the plant’s insides via the corridors of air between its cells. The hyphae poke finger-like haustoria (from the Latin “haustor,” meaning “one who drains or drinks”) into cells, which begin sucking the nutrients from the plant. As the disease colonizes the entire plant, it secretes proteins that suppress the plant’s immune system and convince it that it is not actually dying. The plant, now nourishing the pathogen rather than itself, rapidly deteriorates and dies.

“As a parasitic microbe, Phytophthora is formidably, elegantly designed to do what it does.”

An observer would see black, brown or purple blotches on the leaves grow and spread, soon followed by white fringes of mold on the underside of the leaf. This mold is the pathogen’s final act. Under a microscope, the mold reveals itself to be a forest of thread-like filaments emerging from the pores, or stomata, on the underside of the leaf’s surface. Having sucked the life from the plant, these tiny trees grow and bear the fruit of a new generation of sporangia, ready to be carried away again, leaf to leaf, field to field, country to country. In certain conditions, P. infestans can also produce oospores — thick-walled stalwarts that can survive freezing winters and reinfect new crops the following year. 

The British botanist and novelist E.C. Large described this ordeal in gory detail in his 1940 opus on plant disease: “If a man could imagine his own plight, with growths of some weird and colorless seaweed issuing from his mouth and nostrils, from roots which were destroying and choking both his digestive system and his lungs, he would have a very crude and fabulous, but perhaps instructive idea of the condition of a potato plant when its leaves were moldy with” P. infestans.

This entire process takes 48 hours or so, and every square centimeter of an infected leaf’s surface produces around 20,000 more sporangia per day. “Amplify that to an industrial crop scale and there are billions of these spores being produced very quickly,” Cooke said. 


Since the late 1960s, widespread forest epidemics as a result of Phytophthora have increased exponentially. Among the most virulent tree killers is Phytophthora ramorum (P. ramorum), which most likely originated in East Asia and spread around the world via the global trade in rhododendron, one of the many plants it infects. Like P. infestans, it is primarily an airborne disease that moves quickly. It was discovered in California in the late 1990s — where it was given the name “sudden oak death” — and went on to kill around 50-70 million native oaks and tanoaks across the U.S. Then it turned up in Europe, and soon it had blown to Scotland.

On the outskirts of Dumfries, a town in southwest Scotland, I met with Alan Gale, the adaptation and resilience manager at Forestry and Land Scotland, a government agency responsible for conservation, timber production and managing and protecting a third of all the country’s forests. Gale was born and brought up in the area and has been working in forestry for 30 years. “All of my family work in forestry,” he told me. “We love trees.”

Unlike most fungi, Phytophthora has no mushroom or fruiting body to signal its presence. It’s ghostly, invisible to the naked eye, known only by the destruction it leaves behind. If scientists and foresters want to find it, they have to go looking for it. They can leave buckets out in the rain to collect water and analyze it for spores, or take samples of soil. Another method, which Gale retrieved from the trunk of his car, is a PCR test. Trees, just like humans, can take PCR tests. “It’s exactly the same as the ones we all now know,” he said. “You take your knife, take a little bit of bark off, put it in the little bottle of fluid, and give it a shake. We’ve been using these for five or 10 years.”

For the first 20 years of his career, Gale said, he had “no need for an interest in pests and diseases. It was something I thought about once or twice a year. … Now they’re having a massive impact.” As we drove through the countryside past fields of Highland cows, he pointed at vast swathes of brown forest rising up on the hills around us and even bigger stretches of dark stumps. “Dead,” he said, pointing at one section. “That’s dead,” he said, pointing at another. “In parts of south Scotland and west Scotland, it has been catastrophic.” 

In a world where the effects of climate change are sometimes dramatic, Scotland is mostly mellow: no severe droughts, wildfires or devastating storms. The impact is more subtle: Winters are a little warmer and a little wetter. But P. ramorum likes warm and wet winters. “Most larch are going to die in this area,” said Gale.

“We really don’t have a clue what’s coming at us.”
— Alan Gale

As is the case with all Phytophthora, there is no known cure for P. ramorum. And because infected trees become vectors of transmission, the official control method in Scotland is to cut down everything within about 800 feet of an infected tree. In the last 10 years, millions of larch across Scotland have been felled or put on a waiting list to be. In one area between the coast and Dumfries, the disease spread so quickly and so brutally that foresters, fellers and sawmills couldn’t cope. Felling and prevention tactics were simply abandoned, dead trees were left standing, and no woody materials were allowed in or out. 

Phytophthora have extremely high evolutionary potential, and are known for their ability to overcome host resistance and jump to new species if the conditions are right. Until 2009, P. ramorum had never been found in larch — it was called sudden oak death for a reason. Scientists were bewildered to find it in larch, and they will be again if (when) it jumps again. The front line is moving north, and they are already felling uninfected larch just as a preventative measure.

Gale and I pulled into Dalbeattie Forest: around 2,700 acres of trees on granite upland. Larch are not native to Scotland: They were first introduced in the early 1700s, and are grown and cut for timber used in construction. They are allowed to grow for almost half a century, and in that time create vibrant forest ecosystems and communities. Bike tracks and walking paths weave through the forest floor; goshawks and red squirrels and many other animals have colonized the trees. At the forest gate, a startled roe deer dashed away as we approached. Gale walked with a crutch, having hurt his leg, but he scampered into the crunchy undergrowth and snapped off a branch with a crack that echoed around us. “The needles should all be flushing a lush green right now, for spring,” he said. “But they’re not.” 

I realized we were completely surrounded by dead trees. Some larch can live for 300 years and grow up to 130 feet tall, but a microbe smaller than a needle point can kill one in months. A cool breeze cut through the forest, and if our eyes could have seen the sporangia, they may have been gliding through the air: a blizzard of poisonous snowflakes.

Gale told me that foresters like him get great data on how much warmer and wetter it’s going to be in the future as the climate warms, and how that’ll affect plant life and forest ecosystems. What they struggle to plan for are the pests and diseases, he said. “We really don’t have a clue what’s coming at us.”


A tanoak tree decimated by sudden oak death disease (Phytophthora ramorum) in southwest Oregon. 2001. (AP/Oregon State University)

“What makes a native plant or animal or fungus abandon its companionable habits to carve a path of destruction across the landscape?” asked the anthropologist Anna Tsing in a 2018 essay on invasive species, from rice-devouring insects to frog-killing fungus to ocean-dominating jellyfish. Her essay was inspired by her work on Feral Atlas, a scientific research project that proposed the concept of “feral” ecologies: the unintended consequences of human activity that spiraled out of our control. Viruses, bacteria, fungi and chemicals that thrive because of human disturbance, spread via global trade flows, proliferate in dramatically changing climates and become uncontainable. 

Most plants can defend themselves against most pathogens, especially those that have coevolved together; they tend to reach a state of close competition in which the pathogen can steal enough to survive and reproduce, and the plant can defend itself well enough to stay alive and spread. “The result is that both sides are continuously adapting and counter-adapting to each other,” wrote the biologist Andy Dyer in his 2014 book “Chasing the Red Queen.” “In such an ‘evolutionary arms race’ there is no winner, only a never-ending race without a finish line.” In biology, this is known as the “red queen hypothesis,” named after a passage in Lewis Carroll’s “Through the Looking Glass” in which the Red Queen explains to Alice that in Wonderland, a person must run very fast just to stay still.

Problems arise when the incessant beat of global trade brings feral pathogens to new territories where species have not evolved the defenses to protect themselves. Thomas Jung, a scientist at the Phytophthora Research Centre, recalled traveling across Asia to trace the origins of Phytophthora cinnamomi. “In Taiwan, Vietnam and all across Indonesia, you are basically walking on Phytophthora cinnamomi,” he said. “It was in almost every soil sample I isolated. But there was no damage to the trees, because of coevolution. … It’s definitely there, but it’s a benign pathogen.” By contrast, when the exact same disease found its way to southwest Australia, it became a threat to national biosecurity, attacking around 40% of all native plant species. Scientists and conservationists there call it the “biological bulldozer.”

Plants have been moved around the world in huge quantities since the dawn of European colonialism. Timber, grasses, potatoes, tobacco, tea, coffee, cacao, oil palms and many more species were uprooted from their native climes and transported across land and sea, bolstering the economies of colonial powers as they went. Colonialists, the people they’d enslaved and the livestock they transported spread diseases, causing unimaginable death among Indigenous populations — and the plants they took and traded did the same. The historian Alfred Crosby called this process “ecological imperialism.”

Phytophthora and other plant diseases have hitched a ride on these transportations, in soils, on leaves and in the flesh of plants themselves. The strain of P. infestans that triggered the Great Irish Famine came across the Atlantic, hidden aboard a ship of infected tubers heading for Belgium, like Dracula in a box of dirt. Scientists recently discovered that after it had swept Europe it followed British colonial trade routes to East Africa, China and eventually Australia and New Zealand. 

“Plants are moving around the world at an unprecedented rate, and plant diseases are swarming with them.”

The prolonged travel times of ships up until the 20th century helped prevent the transfer of all but the hardiest diseases. But now plants are moving around the world at an unprecedented rate, and plant diseases are swarming with them. Maritime transport, which accounts for 90% of all global trade, is faster than ever, and carries more than ever. Crops and timber, whole trees and shrubs, cut flowers — all of it uprooted and taken to new lands. The U.S. alone imported $2.5 billion worth of plants in 2020. Plant diseases have even been discovered on the International Space Station.

Phytosanitary regulations are in operation at borders around the world, but they are only effective at preventing what we know about. Many of the Phytophthora currently wreaking havoc around the world had not been discovered until they started causing devastation. Effectively unknown by science, they could not have been stopped or even detected wherever they entered new territory. In September 2021, Phytophthora pluvialis — a pathogen never before detected in Europe — was found infecting trees in rural England. 

“We are constantly throwing pathogens — and I mean constantly, every day, hundreds of thousands — from one biogeographic region into another one,” Jung told me. “Many of them will never find a host there. But some do.” The ones that do usually thrive because the climatic conditions are favorable to their survival and there is a large population of susceptible hosts for them to infect. This might be a native forest, but more often, plant diseases end up thriving and mutating in large manmade agricultural or forestry plantations of mostly identical species. 

Crowds of any one thing promote the spread of pests and diseases. And the dense way we have come to grow identical plants has been a scourge for centuries. The failure of Europe’s potatoes in the 1840s was largely the result of intensive farming of a very narrow range of potato varieties that, it transpired, were susceptible to P. infestans. In Ireland, it was the infamous lumper potato.

“Effectively unknown by science, many of the Phytophthora currently wreaking havoc around the world could not have been stopped or even detected wherever they entered new territory.”

Farming has transformed since then — post-WWII industrial agriculture is a thoroughly scientific domain, teeming with pesticides, herbicides, fungicides, resistance breeding and genetic modification. But the monoculture approach of growing vulnerable fields of identical crops still dominates industrial agriculture, and industrial agriculture dominates agriculture — 1% of the world’s farm owners control 70% of the world’s farmland. 

“Progress and Doom are two sides of the same medal,” wrote Hannah Arendt, and much like the antibiotic paradox, pesticides, herbicides and fungicides have created their own worst enemies. New chemicals are created to kill unwanted organisms, but the targeted organisms over time adapt and survive. It is a cycle that has come to be known as the “chemical dependency treadmill.” For the farmers who can’t afford the chemical weapons, crop losses tend to be unpredictable and devastating. 

The monoculture-like settings of the horticultural nursery trade play a similar role to industrial agriculture. Asymptomatic plant materials arrive in nurseries around the world carrying non-native plant diseases, which they pass on to other plants, which are then redistributed. The same goes for composts that are imported and sold. Cooke and his colleagues launched a project in 2016 called Phyto-threats that aimed in part to establish the amount of Phytophthora present in U.K. nurseries; in one experiment they tested a sample of water from a puddle on the ground in a nursery and found 10 different species of Phytophthora. So many Phytophthora in one place increase the chance of hybridization: new and more virulent species with wider host ranges. Cooke calls them “hopeful monsters.” 

Jung has made similar discoveries. “You buy these little pots with the basil and the coriander from the supermarket, and after a couple days they start looking bad, and they are wilting and they get spots,” he said. “We isolated one: it was carrying the same Phytophthora species that was killing the big beech trees outside: Phytophthora plurivora. So you buy these herbs, they die, you put them on the compost pile and then it spreads onto rhododendrons, then hedge rows and, eventually, it kills the big beech trees a few years later. It’s amazing — the implications are actually really huge.”

What makes a native Phytophthora abandon its companionable habits to carve a path of destruction across the world? The manmade infrastructure of capitalism has created a constant movement of organisms that displaces native pathogens and exposes vulnerable hosts, and fuels a climate crisis that creates new environments in which they can thrive. To maintain the growth of economies, we dismissed the growth of plants. “Endless expansion has unintended consequences,” Tsing said in a 2018 interview. “The new wild,” she called it.


Te Matua Ngahere (Father of the Forest), believed to be the second-largest kauri tree in New Zealand. (Wikimedia)

The steep coastal cliffs and expansive beaches of the Waitākere Ranges form one of the most iconic ecosystems in New Zealand, but it’s the ancient and colossal kauri trees that grow on the forested hills that are the area’s most spectacular feature. Kauri trees are some of the oldest in the world; the kauri genus, Agathis, has been around since dinosaurs roamed the Earth. They can grow to be 165 feet tall, with a girth sometimes exceeding 50 feet around, and can live for more than 2,000 years. They are a keystone species — critical to the survival of other plants and animals that live among them — and also form an integral part of Māori mythology. “They are what we call the kaitiaki, or protector of the forest,” Edward Ashby, a board member of the Te Kawerau ā Maki, the tribe (iwi) that is the historic guardian (mana whenua) of the ranges, told me over Zoom. “They are thought of as a living ancestor, a connection between people living now and the atua, or gods of the past.”

The forest started to die in the late 2000s. A survey in 2011 discovered that around 8% of the kauri trees were infected with Phytophthora agathidicida. By 2016, it had shot up to around 20%. “A whole valley in the forest that used to be green and full of birds was now just all these skeletons sticking up,” Ashby said. “It started to look like a graveyard.” He called it “stag horning” — without their leaves, the dead and dying kauri resembled antlers. 

Nobody knows how P. agathidicida arrived in New Zealand, whether it was brought to the country or had been dormant in the soil until climate conditions enabled it to go feral. In a way, it hardly matters how it got there, only that it was bringing 2,000-year-old gods to their knees. “These trees that have lived for essentially the entire human history of New Zealand — some of them are older than all of the tribes,” Ashby said. “Now, seeing them die — what that means for the future is devastating.”

To iwi like Te Kawerau ā Maki, the death of the forest was an existential threat. P. agathidicida, it turned out, isn’t airborne, but waterborne — it was creeping through the soil like a subterranean specter. And of course, it was being aided and abetted by humans: Close to a million people visit the Waitākere Ranges each year, and the disease traveled around the forest primarily on the soles of muddy shoes. Nearly 70% of the dying trees were within 50 meters (164 feet) of walking tracks. 

After consulting with scientists, the iwi made a simple and defiant decision. On Dec. 2, 2017, members of Te Kawerau ā Maki held a ceremony beside a 1,000-year-old kauri tree known as “Aunt Agatha,” in which they laid down a rāhui: a spiritual set of protocols that declares an area to be sacred. Human access to the Waitākere Ranges was thus banned. 

“It’s kind of like a quarantine, a spiritual quarantine,” Ashby said. “It essentially means to commune with the other realm. … In the opinion of the Māori, the environment is a lot older than all of us; it knows how to look after itself. You just need to get out, leave it alone and it will heal. That’s the idea of rāhui.” It was, and remains, one of the biggest and most complex rāhui declared in living memory. 

“In the opinion of the Māori, the environment is a lot older than all of us; it knows how to look after itself. You just need to get out, leave it alone and it will heal.”
— Edward Ashby

“When you walk through a mature kauri forest, it’s like walking into a cathedral,” said Lee Hill, a biosecurity specialist and one of a handful of scientists who has been given permission by the iwi to continue entering the forest for research. Unlike in Scotland, there is no chance that such culturally significant kauri trees can be felled, even if they are vectors of infection. So Hill and others have had to get resourceful. They spoke to experts in Australia who had been fighting a similar Phytophthora that was killing jarrah trees. There, they had implemented an unusual approach: They vaccinated their trees. The chemical compound known as phosphite seemed to be a safe biostimulant — it didn’t cure the disease, but it did boost the tree’s immune system and slow the infection. 

It seemed to work for the kauri too. In the last five years, Hill and his team have administered around 17,000 tree vaccinations. “When I go back to these trees four or five years later, they are still there, when they would have been dead,” he said. “But,” he admitted, “it’s got this Frankenstein feel to it.” To do the vaccination, first you remove the outer bark on a small patch of the trunk, then drill a hole and insert a syringe into it. A spring on the syringe forces the medicine into the tree. On hot and dry days, the tree will sometimes suck in the liquid itself. 

“It doesn’t look pretty,” Hill said. “You’ve got a god of the forest, then you’ve got me drilling holes in it and injecting it. So, you have to have that conversation with the public or mana whenua, and ask them: Are they okay with this? Having the tool doesn’t always mean we can use it.” A representative from Te Kawerau ā Maki accompanied Hill and his team into the forest and blessed the work as it was being done.

Hill isn’t the only one tackling the disease through cross-cultural collaboration. Monica Gerth, a microbiologist at Victoria University of Wellington, has been studying Phytophthora interactions with plants. Alongside a team of researchers, she used Māori mātauranga (traditional knowledge) of the forest to select four native medicinal plants and test whether they could produce any anti-Phytophthora compounds. They found that roots and extracts from the kānuka plant not only disabled P. agathidicida’s zoospores but also stopped them from germinating. 

“In an ideal world,” Gerth told me, “we’d be able to find something that can be planted near a tree or given to a tree as a medicine, that doesn’t involve trunk injections. They are a good solution for a dying tree. But when you inject a trunk, you damage it, and you need to inject again after four or five years. These kauri can live for thousands of years. Are we going to be injecting them for thousands of years?”

Last year, a few paths on the edge of the forest were reopened, but footwear disinfection points and elevated walkways were installed so that no dirt could be walked in or out. No human feet touched the sacred ground. 

Most of the forest remains closed. It is an act that has forced many New Zealanders to wrestle with questions around their relationship to nature, ones that more of us may have to face in the coming years. Does a forest have value if we are not allowed to look at it — or even go near it? “That question was raised a lot,” said Hill. “If people can’t go in there for five or six years, will they forget its value?” In other words: Do we value nature in and of itself, or just when we are extracting something from it — a resource or pleasurable experience? And what if one of the most promising solutions to our climate crisis is not the endless endeavor of science and technology and perhaps politics, but the eerie stillness of inaction? “In our forest,” Ashby said, “most things have stopped moving.”

“Do we value nature in and of itself, or just when we are extracting something from it — a resource or pleasurable experience?”

Surveillance of the disease goes on in the Waitākere Ranges. “The data has confirmed that the science is aligning with the custom,” Ashby said. “We’re only seeing the disease light up around the edges. … It isn’t showing up in the heart of the forest.” When I asked when the iwi will be lifting the rāhui, he paused. “The environment will tell us when it’s ready, when the mauri [life force] is balanced and the forest is healed,” he said. “Are we seeing more death, or are we seeing new kauri coming through? Are we hearing birds returning? Are we seeing lush green canopy again? There are signs we will look to, and that’s when the rāhui will lift. But ideally, what we would like to see is a sanctuary in the center of the forest: a place that isn’t anything to do with recreation or visitors. A forest just for its own sake, that will be there thousands of years into the future.”

To grieve the death of a tree is to see it as something other than a resource or an object of beauty. It’s something many people struggle with, myself included. It’s easier to grieve the death of an animal. Despite a multitude of forms, animals are essentially like us: Born of flesh and blood, they eat, sleep, talk, procreate, move around and die. 

Plants, by contrast, seem alien in their lifecycles, behaviors and forms. They are alive in a way that is utterly unlike us. Psychologically, we maintain a strange distance between our world and theirs, a sort of denial of the fact that our existence depends on them — a distance that was perhaps required and reinforced so that we could overlook their wholesale destruction for human purposes. Now more than ever, Indigenous knowledge systems with centuries of ecologically grounded thought are reminding us that a dying forest is not a dying resource, but a dying miracle — a miracle that includes and affects us.

During a series of impassioned debates in which Auckland City Council members voted on whether to endorse the rāhui (they eventually did), an elder in the iwi, Te Warena Taua, was asked what it would mean should the disease be allowed to continue without major intervention. He replied with a proverb: “Ko te mauri o te kauri.” It roughly translates to: “If the kauri dies, we die too.”


Potatoes are still farmed in the Andean region of South America where the strain of P. infestans that triggered the Great Irish Famine most likely originated. Wild native potatoes were first domesticated there some 8,000 years ago, and today they are grown in greater diversity than anywhere else in the world. 

In 2002, in the Sacred Valley of the Incas, five different Indigenous Quechua farming communities — around 6,500 people in total — came together to form the Parque de la Papa (Potato Park). Living and growing at an altitude that ranges from 10,000 to almost 16,000 feet above sea level, their farming methods are completely at odds with industrial agriculture. Land ownership is collective, not individual, and the economy is a mixture of monetary and non-monetary, in which barter markets and labor exchange play an important role. Across more than 22,000 acres of mountainous land, they cultivate almost 1,500 varieties of potato — plus beans, barley, quinoa and maize — in a wondrous mosaic of fields. Nature, after all, doesn’t abide by tidy rows.

“Farmers here don’t separate themselves from nature,” said Alejandro Argumedo, who was born to a native Quechua farming family and works as the principal advisor at the Potato Park. “They see themselves as part of this complex and unique system. They maintain their livelihoods and societies in a way that is in harmony with their surroundings. The human community has to work with the wild. … Conservation is a foreign word here, because they have been doing that forever. Agriculture and food production is done in a ritualistic way, in tune with those beliefs.”

Every June, farmers ascend the mountains during some of the coldest nights of the year to observe the Pleiades, a cluster of stars in the constellation Taurus, whose brightness they believe foretells the timing and quantity of rainfall that will come during the potato growing season. In 2002, a group of American meteorologists and climatologists investigated the scientific basis for this folk practice and found it to correspond with the presence of certain high, thin and almost undetectable clouds that increase during El Niño years in that area. They concluded that it had a higher forecast accuracy (around 65%) than scientific weather forecasts of the time (55-60%). 

“Farmers here don’t separate themselves from nature. They see themselves as part of this complex and unique system.”
— Alejandro Argumedo

The potatoes come in all colors — brown and yellow of course, but also purple, pink, red, orange and blue — and shapes, from long and curly to stout and knobby. There’s even a black variety, puma maki, that’s shaped like a puma’s claw. The potato is respected, inspirited and treated like family — different varieties are incorporated into marriage proposals, weddings, baptisms and funerals. Humans and potatoes live in companionship. 

Phytophthora infestans has been known since at least the 1500s here — it’s sometimes called “la rancha” in Spanish and “chuyu” in Quechua. But the cultivation of a huge variety of potato species on frequently rotated land plots — as well as the constant crossbreeding with nearby wild potato relatives that have coevolved a natural resistance — has stopped the disease from ever really getting a foothold; there’s never been a need in the park for intensive pesticides or genetic modification beyond crossbreeding. There is a sense that the mountains — some of which are viewed as sacred entities by the Quechua — have dictated the conditions of their society. “Diversity is at the core of farming in a mountain ecosystem,” said Argumedo. “One single variety of potato could never do it here, because every time you move a hundred meters, the eco-climatic conditions change completely.”

Over the last few decades, however, temperatures have been rising faster in high mountain areas due to a climate phenomenon known as elevation-dependent warming, and many of the once snow-covered peaks are now bare. As soil and air temperatures rise, greater populations of pests and diseases are hitting the Potato Park — Phytophthora infestans, but also weevils. “Pests and diseases are moving upwards, because their ecological niches are changing,” Argumedo said. “So are plant species.” Farmers have responded to this by moving crops higher too — more than 3,000 feet in the last 30 years.

Upwards and upwards: The farmers, the pests, the plants and the diseases chase each other ever higher. What happens when everyone reaches the top?

The post Rise Of The Plant Destroyer appeared first on NOEMA.

The Conscious Universe https://www.noemamag.com/the-conscious-universe Wed, 17 Nov 2021 17:26:14 +0000 https://www.noemamag.com/the-conscious-universe The post The Conscious Universe appeared first on NOEMA.

Credits

Joe Zadeh is a contributing writer for Noema based in Newcastle.

London was a crowded city in 1666. The streets were narrow, the air was polluted, and inhabitants lived on top of each other in small wooden houses. That’s why the plague spread so easily, as well as the Great Fire. So did gossip, and the talk of the town was Margaret Cavendish, the Duchess of Newcastle.

Cavendish was a fiery novelist, playwright, philosopher and public figure known for her dramatic manner and controversial beliefs. She made her own dresses and decorated them in ribbons and baubles, and once attended the theater in a topless gown with red paint on her nipples. In his diaries, Samuel Pepys described her as a “mad, conceited, ridiculous woman,” albeit one he was obsessed with: He diarized about her six times in one three-month spell.

The duchess drew public attention because she was a woman with ideas, lots of them, at a time when that was not welcome. Cavendish had grown up during the murderous hysteria of the English witch trials, and her sometimes contradictory proto-feminism was fueled by the belief that there was a parallel to be drawn between the way men treated women and the way men treated animals and nature. “The truth is,” she wrote, “we [women] Live like Bats or Owls, labour like Beasts and die like Worms.” 

In 1666, she released “The Blazing World,” a romantic and adventurous fantasy novel (as well as a satire of male intellectualism) in which a woman wanders through a portal at the North Pole and is transported to another world full of multicolored humans and anthropomorphic beasts, where she becomes an empress and builds a utopian society. It is now recognized as one of the first-ever works of science fiction. 

But this idea of a blazing world was not just fiction for Cavendish. It was a metaphor for her philosophical theories about the nature of reality. She believed that at a fundamental level, the entire universe was made of just one thing: matter. And that matter wasn’t mostly lifeless and inert, like most of her peers believed, but animate, aware, completely interconnected, at one with the stuff inside us. In essence, she envisioned that it wasn’t just humans that were conscious, but that consciousness, in some form, was present throughout nature, from animals to plants to rocks to atoms. The world, through her eyes, was blazing. 

Cavendish was not the only one to have thoughts like these at that time, but they were dangerous thoughts to have. In Amsterdam, the Jewish philosopher Baruch Spinoza wrote that every physical thing had its own mind, and those minds were at one with God’s mind; his books were banned by the church, he was attacked at knifepoint outside a synagogue, and eventually, he was excommunicated. Twenty-three years before Cavendish was born, the Italian Dominican friar and philosopher, Giordano Bruno — who believed the entire universe was made of a single universal substance that contained spirit or consciousness — was labeled a heretic, gagged, tied to a stake and burned alive in the center of Rome by the agents of the Inquisition. His ashes were dumped in the Tiber. 

If the dominant worldview of Christianity and the rising worldview of science could agree on anything, it was that matter was dead: Man was superior to nature. But Cavendish, Spinoza, Bruno and others had latched onto the coattails of an ancient yet radical idea, one that had been circulating philosophy in the East and West since theories of mind first began. Traces of it can be found in Hinduism, Buddhism, Taoism, Christian mysticism and the philosophy of ancient Greece, as well as many indigenous belief systems around the world. The idea has many forms and versions, but modern studies of it house them all inside one grand general theory: panpsychism. 

“If the panpsychists are right, it could cast doubt on the foundations of a worldview that has been deeply embedded in our psyche for hundreds of years: that humans are superior to everything around them.”

Derived from the Greek words pan (“all”) and psyche (“soul” or “mind”), panpsychism is the idea that consciousness — perhaps the most mysterious phenomenon we have yet come across — is not unique to the most complex organisms; it pervades the entire universe and is a fundamental feature of reality. “At a very basic level,” wrote the Canadian philosopher William Seager, “the world is awake.”

Plato and Aristotle had panpsychist beliefs, as did the Stoics. At the turn of the 12th century, the Christian mystic Saint Francis of Assisi was so convinced that everything was conscious that he tried speaking to flowers and preaching to birds. In fact, the history of thought is dotted with very clever people coming to this seemingly irrational conclusion. William James, the father of American psychology, was a panpsychist, as was the celebrated British mathematician Alfred North Whitehead; the Nobel Prize-winning physicist Max Planck once remarked in an interview, “I regard consciousness as fundamental.” Even the great inventor Thomas Edison had some panpsychist views, telling the poet George Parsons Lathrop: “It seems that every atom is possessed by a certain amount of primitive intelligence.”

But over the course of the 20th century, panpsychism came to be seen as absurd and incompatible in mainstream Western science and philosophy, just a reassuring delusion for New Age daydreamers. Karl Popper, one of the most influential philosophers of recent times, described it as “trivial” and “grossly misleading.” Another heavyweight, Ludwig Wittgenstein, waved away the theory: “Such image-mongery is of no interest to us.” As the American philosopher John Searle put it: “Consciousness cannot be spread across the universe like a thin veneer of jam.” 

Most philosophers and scientists with panpsychist beliefs kept them quiet for fear of public ridicule. Panpsychism used “to be laughed at insofar as it was thought of at all,” wrote the philosopher Philip Goff in his latest book, “Galileo’s Error: Foundations for a New Science of Consciousness.” But now, we are in the midst of a “full-blown panpsychist renaissance.” Goff is one of a rising tide of thinkers around the world who have found themselves drawn back to this ancient theory. Spurred on by scientific breakthroughs, a lost argument from the 1920s and the encouraging way panpsychism is able to bypass the “hard problem” of consciousness, they are beginning to rebuild and remodel its intellectual foundations, transforming it into a strong candidate for the ultimate theory of reality.

“According to panpsychism,” Goff told me when we met recently in the garden of a pub near Durham University, where he teaches, “consciousness pervades the universe and is a fundamental feature of it. So, it doesn’t necessarily mean everything is ‘conscious.’ The basic idea is that the fundamental building blocks of the universe, perhaps electrons and quarks, have incredibly simple forms of experience, and very complex experience — like that of a human brain — is somehow built up from these very simple and rudimentary forms of experience. … That doesn’t mean your chair is conscious. It means the tiny particles the chair is made up of have some kind of rudimentary experience.”

If the panpsychists are right, it could cast doubt on the foundations of a worldview that has been deeply embedded in our psyche for hundreds of years: that humans are superior to everything around them, disconnected from the insensate matter of nature, marooned on a crumbling planet in a cold and mechanical universe. Panpsychism re-enchants the world, embeds us profoundly within the climate crisis and places us on a continuum of consciousness with all that we see around us. 

“We have become used to the Copernican idea that we are not at the center of the universe but simply one planet among many,” Goff wrote in his first book, “Consciousness and Fundamental Reality.” “Perhaps it’s time for a Copernican revolution about our own consciousness.”


The notion of a world awake might seem unintuitive to most of us, but it is something we adopt naturally in childhood. In 1929, the Swiss psychologist Jean Piaget found that children between two and four years old are inclined to attribute consciousness to everything around them. A child can happily talk to a grasshopper and blame the pavement if they trip up, and it isn’t such an alien thought, at that age, to think a flower might feel the sunlight and perhaps even enjoy it. Fairy tales and children’s media are infused with animate worlds in which trees, animals and objects come to the aid or annoyance of a protagonist.

Most of us dismiss these notions as we mature. Gradually, we rein the concept of consciousness closer and closer in, until, at least in the West, we usually settle on the traditional view that consciousness is present only in the brains of humans and higher animals. 

Along with this goes the premise that consciousness must have sparked into existence from completely non-conscious matter quite recently, cosmically speaking — and only in a tiny corner of the universe. Perhaps a few hundred million years ago, a light bulb flickered on, and something somewhere felt reality for the first time. Before that miraculous spark, the great physicist Erwin Schrödinger wondered, was the universe “a play before empty benches, not existing for anybody, thus quite properly speaking not existing?” As for which higher animals have it and which don’t, there is no agreement, but we have a vague sense. Monkeys and dolphins, definitely conscious; cats and dogs, surely; worms, butterflies and Antarctic krill, probably not. 

But in the last 10 years or so, this understanding has been repeatedly disrupted by new scientific breakthroughs. We are now well versed in the playfulness and creativity of cephalopods, the intelligent communication between fungi and the interspecies sharing economy in forests. Honeybees recognize faces, use tools, make collective decisions, dance to communicate and appear to understand higher-order concepts like zero. Plants can feel you touching them. In fact, the evolutionary ecologist Monica Gagliano has suggested that pea plants can learn behavior, identify the sound of running water and grow towards it and communicate via clicking sounds. When you consider that plants account for around 80% of the total biomass on Earth (the biomass of humans is roughly equivalent to that of Antarctic krill), then extending consciousness to them would mean we are living on a vastly conscious planet.

Recent research into slime mold — a single-celled eukaryotic organism that has no brain, no nervous system and looks like a yellow puddle — found that it makes decisions, perceives its surroundings and can choose the most nutritious food from numerous options. As an experiment, researchers arranged oat flakes in the geographical pattern of cities around Tokyo, and the slime mold constructed nutrient channeling tubes that closely mimicked the painstakingly planned metropolitan railway system. At Columbia University, the biologist Martin Picard has discovered that mitochondria, the organelles found in the cells of almost every complex organism, “communicate with each other and with the cell nucleus, exhibit group formation and interdependence, synchronize their behaviors and functionally specialize to accomplish specific functions within the organism.” Nobody is concluding that mitochondria are conscious, but if an animal the size of a dog acted like this, would we intuitively ascribe to it some basic level of consciousness?

“I find it striking,” the neuroscientist Christof Koch told me on a video call, “that after 2,400 years, we are now back to panpsychism. This is like scientists discussing whether the Earth is actually flat, or if the heart is the seat of the soul. I mean …” 

“Panpsychism re-enchants the world, embeds us profoundly within the climate crisis and places us on a continuum of consciousness with all that we see around us.”

Koch, one of the world’s most renowned neuroscientists, is the chief scientist of the MindScope Program at the Allen Institute for Brain Science in Seattle. Throughout the 1990s and early 2000s, he worked closely with the late Nobel Laureate Francis Crick (who, alongside James Watson, Maurice Wilkins and Rosalind Franklin, discovered the double helix structure of DNA) to build the foundations for a neuroscience of consciousness.

In recent years, Koch has become a champion and staunch defender of Integrated Information Theory (IIT), a leading theory in the neuroscience of consciousness developed by the Italian neuroscientist and psychiatrist Giulio Tononi. IIT is concerned with developing a method to measure the amount of integrated information (which, it posits, represents consciousness) in a physical system. In other words, is something conscious and, if so, how conscious? One of Tononi and Koch’s many ambitions is to create a practical device, “a consciousness meter,” which could measure the level of consciousness in patients in a vegetative state. 
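Tononi’s actual measure, usually written Φ (“phi”), is mathematically involved, but a toy calculation can give a flavor of what it means to quantify how “integrated” a system is. The sketch below is not IIT’s phi, and the joint distribution in it is invented; it computes a much cruder quantity, total correlation, which is zero when a system’s parts are statistically independent and grows as their states become more intertwined.

```python
from math import log2

def entropy(dist):
    """Shannon entropy, in bits, of a distribution given as {state: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def total_correlation(joint):
    """Sum of the parts' entropies minus the whole system's entropy, in bits.

    Zero means the parts are statistically independent; larger values mean
    their states hang together more than independence would allow.
    (A crude stand-in for "integration" -- not Tononi's phi.)
    """
    n = len(next(iter(joint)))  # number of units in each joint state
    marginals = []
    for i in range(n):
        m = {}
        for state, p in joint.items():
            m[state[i]] = m.get(state[i], 0.0) + p
        marginals.append(m)
    return sum(entropy(m) for m in marginals) - entropy(joint)

# A made-up joint distribution over two binary units that tend to agree.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(f"{total_correlation(joint):.3f} bits")  # about 0.278: the pair is mildly integrated
```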

Tononi’s scientific research aligned with what some philosophers of consciousness believe about panpsychism: “It says consciousness is graded and much more widely distributed,” Koch said. “Giulio tried to downplay that at first because he felt people would then reject it out of hand, but I kept on pushing.” In 2015, they published a paper together titled “Consciousness: Here, There and Everywhere?” in which they stated that while IIT was not developed with panpsychism in mind, it did seem to share its central intuitions.

“A lot of people say that’s wacky: Any theory that predicts a microbe is conscious is crazy because it’s so different from my insight into the state of the world,” Koch told me. When people hear the word consciousness, he explained, they assume qualities like self-awareness, emotion, pleasure and pain, but here we are talking about a degree of consciousness that is far more basic: experience. “Clearly, the paramecium doesn’t have psychology; it doesn’t hear bees, it doesn’t worry about the weekend. But the claim is: It feels like something to be the paramecium — and once the cell membrane dissolves and it disintegrates and dies, it doesn’t feel like anything anymore.”

In 2013, Koch traveled to a Tibetan monastery in southern India for a symposium on physics, biology and brain science between Buddhist monk-scholars and Western scientists where he presented the contemporary Western consensus that only humans and some animals are blessed with consciousness. While there, Koch helped with teaching at the monastery. “We showed some movies of little bacteria moving around and asked them what they thought,” he told me. “They said they were clearly sentient. They had no trouble with that.”



Arthur Stanley Eddington was a pacifist Quaker from a family of farmers in the north of England. In 1913, at the age of 30, he started working as a professor of astronomy at Cambridge. Six years later, he and the Astronomer Royal Frank Watson Dyson became perhaps the first scientists to confirm Albert Einstein’s general theory of relativity, catapulting Einstein, who was then mostly unknown in the English-speaking world so soon after the end of World War I, to global fame. Eddington and Dyson organized expeditions to Principe and Brazil to observe a solar eclipse and record what Einstein had predicted: Gravity bends light. Newton was overthrown, and a new revolution in science began.

Upon his return from Principe, Eddington dove into the work of the mathematician Bertrand Russell. One of Russell’s observations, first developed in the 1920s and furthered in the 1950s, was this: “All that physics gives us is certain equations giving abstract properties of their changes. But as to what it is that changes, and what it changes from and to — as to this, physics is silent.” In other words, while physical science might appear to give us a nearly complete account of the nature of matter — what everything is made of — it really only provides a description of mathematical structures: the “causal skeleton” of reality. These descriptions are incredibly valuable and have led to many of humanity’s greatest scientific achievements, because they can be used to predict how matter will behave. But as Goff said to me: “Physical science only tells us what stuff does, not what stuff is. It’s not telling us the underlying nature of the stuff that is behaving in this way.”

Consider this crude breakdown of water. What is water? Water is a colorless, transparent, odorless chemical substance that fills our oceans, lakes, rivers and bodies. But what is it composed of? It is composed of sextillions and sextillions of tiny water molecules. What are they composed of? Well, each molecule contains three atoms: two hydrogen and one oxygen. And what are hydrogen and oxygen made of? Subatomic particles like protons, neutrons and electrons. What is an electron made of? An electron has mass and charge. And what are mass and charge? They are properties of the electron. But what is the electron?

“When I was a young physics student,” wrote the astrophysicist Adam Frank in a 2017 essay, “I once asked a professor: ‘What’s an electron?’ His answer stunned me. ‘An electron,’ he said, ‘is that to which we attribute the properties of the electron.’ That vague, circular response was a long way from the dream that drove me into physics, a dream of theories that perfectly described reality.” In a strange and convoluted way, there is a sense in which we don’t really know what water is. We don’t really know what anything is. As Stephen Hawking wrote in “A Brief History of Time”: “Even if there is only one possible unified theory, it is just a set of rules and equations. What is it that breathes fire into the equations and makes a universe for them to describe?”

Eddington, who by the late 1920s was seen as one of the greatest living scientists in the world, agreed with Russell: “The physicist cannot get behind structure,” he wrote in a glowing review of Russell’s 1927 book “The Analysis of Matter.” That same year, Eddington was invited to give the prestigious annual Gifford Lectures in Edinburgh, which he then turned into his own book, “The Nature of the Physical World.” Written for a wider audience, it contained echoes of Russell and became one of the most influential popular science books of the era, selling 72,000 copies in Britain alone. 

“Even if there is only one possible unified theory, it is just a set of rules and equations. What is it that breathes fire into the equations and makes a universe for them to describe?”
— Stephen Hawking

“The Victorian physicist felt that he knew just what he was talking about when he used such terms as matter and atoms,” wrote Eddington. “Atoms were tiny billiard balls, a crisp statement that was supposed to tell you all about their nature in a way which could never be achieved for transcendental things like consciousness, beauty or humor. But now we realize that science has nothing to say about the intrinsic nature of the atom. The physical atom is, like everything else in physics, a schedule of pointer readings” — measurements on a machine.

Eddington, like Russell before him, felt that the intrinsic nature of matter, the thing that has mathematical structure, could be integral to explaining consciousness. He wrote that there is one clump of matter that we know and experience directly, not through perceptions, equations or measuring devices: the matter that constitutes our brains. We know that the intrinsic nature of the matter in our brains must involve consciousness, because consciousness is precisely our rich and subjective moment-to-moment experience of reality.

Throughout the history of consciousness studies, the mind-body problem has caused theory after theory to collapse and falter. How does the matter of the brain, which we know and understand, give rise to the mystery of consciousness? Eddington’s panpsychist argument, said Goff, “turned the problem upside down.” In Eddington’s view, matter is the mystery; consciousness is the thing we understand better than anything else. The only matter we experience directly is the matter in our living brains, and we know that to be conscious. Therefore, we have good reason to believe that all matter is conscious. And while critics of panpsychism are quick to challenge its proponents to prove that consciousness is the intrinsic nature of matter, the panpsychists are equally poised to respond: Prove that it isn’t.

This argument, Goff wrote in “Galileo’s Error,” “is hard to really absorb” because “it is diametrically opposed to the way our culture thinks about science. But if we manage to do so, it becomes apparent that the simplest hypothesis concerning the intrinsic nature of matter outside of brains is that it is continuous with the intrinsic nature of matter inside of brains.” As Eddington wrote in the conclusion to his book: “The stuff of the world is mind-stuff.” Consciousness didn’t emerge or flicker into existence; it has always been there — the intrinsic nature of us and everything around us. This is what breathes fire into the equations.

Despite the popularity of Russell and Eddington’s work in the 1920s and 30s, these ideas were largely lost. Western academia was then being seized by a group known as the Vienna Circle and their logical positivism movement. Interwar Europe was being overrun by extremist violence, dogmatic ideologies and political propaganda, and the logical positivists railed against it all by calling for “exact thinking in demented times” — strict scientific standards and rigorous objectivity. Logical positivism swept through Anglo-American universities and caused a wholesale rejection of abstract metaphysical discussions. The scientific method relied on what could be observed; what went on inside the mind was largely ignored in favor of how humans behaved, ushering in the rise of B.F. Skinner and behavioral psychology. Eddington’s work in particular remained forgotten until Galen Strawson, one of the elder statesmen of modern British philosophy, found his book on the shelf of a holiday home in Scotland and used it to bolster his modern argument for panpsychism.

Since the 1930s, our scientific understanding of the fundamental building blocks of reality has become even weirder. Particles have been shown to behave like waves and waves like particles, depending on the experimental conditions. Particles no longer seem to be the fixed and knowable objects they once were, and different particle physicists will give you different answers to the question, “What is a particle?” Perhaps it is a quantum excitation of a field, vibrating strings or simply what we measure in detectors. “We say they are ‘fundamental,’” Xiao-Gang Wen, a theoretical physicist at the Massachusetts Institute of Technology, told Quanta Magazine. “But that’s just a [way to say] to students, ‘Don’t ask! I don’t know the answer. It’s fundamental; don’t ask anymore.’”

“Physical science only tells us what stuff does, not what stuff is. It’s not telling us the underlying nature of the stuff that is behaving in this way.”
— Philip Goff

In a landmark 2003 essay, Strawson wrote that this ambiguity only strengthened the argument for panpsychism. The idea of our brains as “lumpish, inert matter, dense or corpuscled, stuff that seems essentially alien to the phenomenon of consciousness,” he wrote, “has given way to fields of energy, essentially active diaphanous process-stuff that — intuitively — seems far less unlike the process of consciousness.” Physics doesn’t show our brain as a spongey blood-filled mass composed of tiny concrete particles, but as “an astonishingly (to us) insubstantial-seeming play of energy, an ethereally radiant vibrancy.”

Of course, panpsychism has its flaws. In October, the Journal of Consciousness Studies dedicated an entire issue to responding to Goff’s book. It featured both critical and supportive essays from philosophers, psychologists, neuroscientists, the bestselling author Annaka Harris and renowned physicists like Carlo Rovelli, Sean Carroll and Lee Smolin. “The real problem with panpsychism is not that it seems crazy,” wrote the British neuroscientist Anil Seth in his essay, “it is that it explains nothing and does not generate testable predictions.”

Physicists critical of it have said it is trying to add something new into our scientific picture of reality that wasn’t there before, and that would therefore require alterations to the proven laws of physics to accommodate it. But in one of the journal essays that addressed this point, the philosopher Luke Roelofs (of NYU’s Center for Mind, Brain and Consciousness) argued the opposite: Nothing is being added to our scientific picture of the world — this is just a different interpretation of that picture. “It rests on recognizing that the physical picture itself is just under-specified: It tells us how this thing called an ‘electron’ behaves, and how these properties called ‘charge’ affect that behavior, but never says (never could say) what any of this is in and of itself.” In other words: Consciousness is exactly what the physicists have been studying the structure and behavior of all along. “For a panpsychist,” Goff said, “the story of physics is the story of consciousness. All there is is forms of consciousness, and physics tracks what they do.”

The most notorious hurdle for panpsychism is known as the combination problem. It has been framed in various ways, but its essence is this: How can lots of tiny conscious entities, like fundamental particles, combine to create one big conscious entity, like the human mind? Goff has explored a possible solution via quantum entanglement. The Norwegian panpsychist philosopher Hedda Hassel Mørch has worked with Tononi and IIT to overcome it. And Roelofs has investigated split-brain cases — a radical procedure used for severe epilepsy in which a patient’s corpus callosum, which connects the two hemispheres of the brain, is severed — that seem to result in patients experiencing a split or partial split of consciousness. If consciousness can be split, then why couldn’t it be combined?

“Some people dismiss panpsychism simply because the combination problem has not yet been solved,” Goff wrote in “Galileo’s Error.” “To my mind, this is like someone in 1859 rejecting Darwinism on the basis that ‘On The Origin of Species’ did not contain a completely worked-out history of the evolution of the human eye.” Panpsychism as a research program, he told me, is only just getting started.


On Boxing Day 1966, in the depths of a freezing winter, the medieval historian Lynn Townsend White Jr. delivered a speech to the American Association for the Advancement of Science in Washington. Fifty-nine years old at the time, White was a rigorous and provocative historian, and also a devout Christian. But through his studies, he had found something that deeply troubled him, something he felt resonated with the time in which he lived. In 1960s America, the general public was becoming aware of the effects of pollution and global warming. Rachel Carson’s “Silent Spring” had been out for a few years, and the landmark Air Quality Act (1967) and the Clean Air Act (1970) were not far off. Just four weeks before White gave his speech in Washington, a toxic copper-colored cloud of acrid smog, so thick it could be wiped off car windscreens, smothered New York for three days, causing 168 deaths and adverse health conditions for hundreds of thousands of people.

How we view the world determines how we treat it, and White felt the roots of the climate crisis went deep into ancient history. Those roots had to be analyzed, he thought, because addressing the climate crisis means rethinking the fundamental mindset that caused it; otherwise, our solutions might create even more serious problems than the ones they solve. “What people do about their ecology,” he said during his speech, “depends on what they think about themselves in relation to things around them.” In his eyes, the causes weren’t solely scientific, technological, political or economic — they were ideological. Christianity in the West, he thought, was what first separated man from nature and established a relationship of superiority and exploitation with everything around us. 

Prior to Christianity, ancient paganism revolved around the idea of an animate world. “Before one cut a tree, mined a mountain or dammed a brook, it was important to placate the spirit in charge of that particular situation, and to keep it placated,” White wrote. But Christianity siphoned these spirits out of the Earth and placed them in heaven. “By destroying pagan animism, Christianity made it possible to exploit nature in a mood of indifference to the feelings of natural objects. … The spirits in natural objects, which formerly had protected nature from man, evaporated … and the old inhibitions to the exploitation of nature crumbled.” This led White to describe Christianity as the most “anthropocentric religion the world had seen.”

“Consciousness didn’t emerge or flicker into existence; it has always been there — the intrinsic nature of us and everything around us.”

Christianity didn’t create the ecological crisis, White asserted, but it laid the foundations for an abusive relationship between man and nature. This religious ideology infused the Scientific Revolution (whose key drivers were deeply religious Christians like Galileo, Descartes, Newton and Bacon) and ushered in an age of technology, capitalism and colonialism that thrived on exploiting the Earth. The universe came to be viewed not as organic and animate, but as a mindless machine, like a clock, the gears of which are governed by scientific laws. The wonder and unpredictability of nature were transformed into something stable, predictable, knowable and therefore controllable. Forests were there to be cleared, hills were there to be mined and animals were there to be slaughtered. This became known as the “mechanistic worldview.” As the science historian Carolyn Merchant wrote in a 1980 book: “Because it viewed nature as dead and matter as passive, mechanism could function as a subtle sanction for the exploitation and manipulation of nature and its resources.”

While we might think we are now living in a “post-Christian age,” this deeply entrenched mindset still haunts us. This “relation to nature,” White wrote, is “almost universally held not only by Christians and Neo-Christians but also by those who fondly regard themselves as post-Christians. Despite Copernicus, all the cosmos rotates around our little globe. Despite Darwin, we are not, in our hearts, part of the natural process.” Echoes of this story, Naomi Klein wrote in her 2014 book “This Changes Everything,” reverberate through a “cultural narrative that tells us that humans are ultimately in control of the Earth, and not the other way around. This is the same narrative that assures us that, however bad things get, we are going to be saved at the last minute — whether by the market, by philanthropic billionaires or by technological wizards.”

White’s paper caused shockwaves. Biblical scholars and theologians criticized him heavily, and he received torrents of hate mail accusing him of being the anti-Christ, a Kremlin agent and more. But he was a religious man, with no intention of turning his back on his God. In the final third of the paper, he recalled the forgotten panpsychists of Christianity who had implored people to recognize themselves as within nature, not above it. White called for his fellow Christians — roughly a quarter of the world’s population at the time — to look for ways forward in the example of a saint like Francis of Assisi, who spoke to his flowers and envisioned the Earth itself as something divine. 

Throughout the 1970s, environmentalists and political activists pored over White’s thesis, and it became a seminal text in universities and ecological studies. But his panpsychist conclusions went largely under the radar. His critique became mainstream, but his call to action, at least in the 1960s, largely failed.


The essence of what White, Merchant and then Klein are getting at is that our mechanistic worldview is not an objectively true portrayal of reality, but something that has been constructed. Therefore, it can be reconstructed. Panpsychism is one possible reconstruction. Our inability to fully comprehend the widespread decline of the natural world could be a consequence of our refusal to see ourselves as part of it. And the potential of panpsychism to put mind back into matter, reconnect us to nature, dispel human exceptionalism and revolutionize our ethics is, for many, a clear way forward. 

While analytic philosophers like Goff and Strawson are keen to emphasize that their argument for panpsychism is technical, logical and based on strict theoretical frameworks, the Australian philosopher Freya Mathews has reached similar conclusions via environmental philosophy, which is more concerned with questions about how we live in this world. In 1991, Mathews wrote “The Ecological Self,” a book that essentially described a panpsychist view without using the word — “I didn’t want to shoot myself in the foot by calling it that,” she told me. 

She went on: “I don’t think just accepting something like this at a theoretical level would change us. I don’t think that’s how psychology or motivation works. Reason can’t touch motivation; it doesn’t touch our deeper psyche. But if a theory like this gives us permission to experiment experientially with new ways of seeing and exploring the world — and looking for our purpose, as it were — then we might have a chance of discovering it.”

In the 1960s, the physicist and philosopher Thomas Kuhn analyzed what it takes to cause a paradigm shift in our scientific perspective. A paradigm shift happens when a dominant theory is overturned, suddenly or gradually, after anomalies the theory cannot explain keep accumulating until it enters a state of crisis. During this crisis, Kuhn wrote, we witness “the proliferation of competing articulations, the willingness to try anything, the expression of explicit discontent, the recourse to philosophy and to debate over fundamentals.” An example of a Kuhnian paradigm shift is the sudden change from Newton’s simple physics to Einstein’s far weirder theories, which, among many things, completely upended the understanding of time. 

Mathews thinks we are now on the verge of another paradigm shift, whether that is to panpsychism or some other worldview that sees nature as more than unfeeling matter. “Our current worldview is leading to the ecological collapse of the planet,” she said. “And it is completely pragmatically self-defeating to continue with it.”


Margaret Cavendish had a fairly robust set of environmental ethics that was rare in 1600s Europe. At a time when Descartes — who gave his dog the very human name Monsieur Grat (“Mr. Scratch”) — was arguing that animals were machine-like senseless automata that felt neither pain nor pleasure, Cavendish was trying to create a dialogue between man and nature.

In her poems “The Hunting of the Hare” and “The Hunting of the Stag,” she abandoned the human perspective to adopt that of the animal being killed. In another, she imagined a conversation between a man and the tree he is about to cut down. These views and others ostracized her from the 17th-century scientific community, and much of her work was either ignored or dismissed. When she became the first woman to visit the all-male scientific institution of the Royal Society in May 1667, the diarist Samuel Pepys’ account of her visit focused mostly on the offensiveness of her dress. She was viewed by many as insane and irrational; they labeled her “Mad Madge.”

But none of this dissuaded Cavendish, who, in her lifetime, published numerous books of philosophy, fiction, plays and poetry. “I had rather appear worse in singularity,” she said, “than better in the mode.” And if anyone was being irrational, thought Cavendish, it certainly wasn’t her. “Man is more irrational,” she wrote in 1664, “when he believes that all knowledge is not only confined to one sort of Creatures, but to one part of one particular Creature, as the head, or brain of man.”

The post The Conscious Universe appeared first on NOEMA.

The Tyranny Of Time https://www.noemamag.com/the-tyranny-of-time Thu, 03 Jun 2021 15:07:33 +0000 https://www.noemamag.com/the-tyranny-of-time The post The Tyranny Of Time appeared first on NOEMA.

Credits

Joe Zadeh is a contributing writer for Noema based in Newcastle.

On a damp and cloudy afternoon on February 15, 1894, a man walked through Greenwich Park in southeast London. His name was Martial Bourdin — French, 26 years of age, with slicked-back dark hair and a mustache. He wandered up the zigzagged path that led to the Royal Observatory, which just 10 years earlier had been established as the symbolic and scientific center of globally standardized clock time — Greenwich Mean Time — as well as the British Empire. In his left hand, Bourdin carried a bomb: a brown paper bag containing a metal case full of explosives. As he got closer to his target, he primed it with a bottle of sulfuric acid. But then, as he stood facing the Observatory, it exploded in his hands.

The detonation was sharp enough to get the attention of two workers inside. Rushing out, they saw a park warden and some schoolboys running towards a crouched figure on the ground. Bourdin was moaning and screaming, his legs were shattered, one arm was blown off and there was a hole in his stomach. He said nothing about his identity or his motives as he was carried to a nearby hospital, where he died 30 minutes later. 

Nobody knows for sure what Bourdin was trying to do that day. An investigation showed that he was closely linked to anarchist groups. Numerous theories circulated: that he was testing the bomb in the park for a future attack on a public place or was delivering it to someone else. But because he had primed the device and was walking the zigzagged path, many people — including the Home Office explosives expert, Vivian Dering Majendie, and the novelist Joseph Conrad, who loosely based his book “The Secret Agent” on the event — suspected that Bourdin had wanted to attack the Observatory.

Bourdin, so the story goes, was trying to bomb clock time, as a symbolic revolutionary act or under a naive pretense that it may actually disrupt the global measurement of time. He wasn’t the only one to attack clocks during this period: In Paris, rebels simultaneously destroyed public clocks across the city, and in Bombay, protestors shattered the famous Crawford Market clock with gunfire. 

Around the world, people were angry about time.

The destruction of clocks seems outlandish now. Contemporary society is obsessed with time — it is the most used noun in the English language. Since clocks with dials and hands first appeared on church towers and town halls, we have been bringing them closer toward us: into our workplaces and schools, our homes, onto our wrists and finally into the phone, laptop and television screens that we stare at for hours each day. 

We discipline our lives by the time on the clock. Our working lives and wages are determined by it, and often our “free time” is rigidly managed by it too. Broadly speaking, even our bodily functions are regulated by the clock: We usually eat our meals at appropriate clock times as opposed to whenever we are hungry, go to sleep at appropriate clock times as opposed to whenever we are tired and attribute more significance to the arresting tones of a clock alarm than to the apparent rising of the sun at the center of our solar system. The fact that there is a strange shame in eating lunch before noon is a testament to the ways in which we have internalized the logic of the clock. We are “time-binding” animals, as the American economist and social theorist Jeremy Rifkin put it in his 1987 book, “Time Wars.” “All of our perceptions of self and world are mediated by the way we imagine, explain, use and implement time.”

“The clock does not measure time; it produces it.”

During the COVID-19 pandemic, many people have reported that their experience of time has become warped and weird. Being trapped at home or laboring unusually excessive hours makes days feel like hours and hours like minutes, while some months feel endless and others pass almost without notice. It seems the time in our clocks and the time in our minds have drifted apart. 

Academic studies have explored how our emotions (such as pandemic-induced grief and anxiety) could be distorting our perception of time. Or maybe it is just because we aren’t moving around and experiencing much change. After all, time is change, as Aristotle thought — what is changeless is timeless. But rarely does the clock itself come into question — the very thing we use to measure time, the drumbeat against which we define “weird” distortions. The clock continues to log its rigid seconds, minutes and hours, utterly unaware of the global crisis that is taking place. It is stable, correct, neutral and absolute. 

But what makes us wrong and the clock right? “For most people, the last class they had devoted to clocks and time was early in primary school,” Kevin Birth, a professor of anthropology at the City University of New York who has been studying clocks for more than 30 years, told me recently. “There’s this thing that is central to our entire society, that’s built into all of our electronics. And we’re wandering around with an early primary school level of knowledge about it.”

Birth is one of a growing chorus of philosophers, social scientists, authors and artists who, for various reasons, are arguing that we need to urgently reassess our relationship with the clock. The clock, they say, does not measure time; it produces it. “Coordinated time is a mathematical construct, not the measure of a specific phenomenon,” Birth wrote in his book “Objects of Time.” That mathematical construct has been shaped over centuries by science, yes, but also power, religion, capitalism and colonialism. The clock is extremely useful as a social tool that helps us coordinate ourselves around the things we care about, but it is also deeply politically charged. And like anything political, it benefits some, marginalizes others and blinds us from a true understanding of what is really going on. 

The more we synchronize ourselves with the time in clocks, the more we fall out of sync with our own bodies and the world around us. Borrowing a term from the environmentalist Bill McKibben, Michelle Bastian, a senior lecturer at Edinburgh University and editor of the academic journal Time & Society, has argued that clocks have made us “fatally confused” about the nature of time. In the natural world, the movement of “hours” or “weeks” does not matter. Thus the build-up of greenhouse gases in the atmosphere, the sudden extinction of species that have lived on Earth for millions of years, the rapid spread of viruses, the pollution of our soil and water — the true impact of all of this is beyond our realm of understanding because of our devotion to a scale of time and activity relevant to nothing except humans.

During an era in which social constructs like race, gender and sexuality are being challenged and dismantled, the true nature of clock time has somehow escaped the attention of wider society. Like money, the clock has come to be seen as the thing it was only supposed to represent: The clock has become time itself.


Clock time is not what most people think it is. It is not a transparent reflection of some sort of true and absolute time that scientists are monitoring. It was created, and it is frequently altered and adjusted to fit social and political purposes. Daylight saving time, for instance, is an arbitrary thing we made up. So is the seven-day week. “People tend to think that somewhere there is some master clock, like the rod of platinum in the Bureau of Weights and Measures, that is the ‘uber clock,’” Birth told me. “There isn’t. It’s calculated. There is no clock on Earth that gives the correct time.”

What’s usually taught in Western schools is that the time in our clocks (and by extension, our calendars) is determined by the rotation of the Earth, and thus the movement of the sun across our sky. The Earth, we learn, completes an orbit of the sun in 365 days, which determines the length of our year, and it rotates on its axis once every 24 hours, which determines our day. Thus, an hour is 1/24 of this rotation, a minute is 1/60 of an hour and a second is 1/60 of a minute.

None of this is true. The Earth is not a perfect sphere with perfect movement; it’s a lumpy round mass that is squashed at both poles and wobbles. It does not rotate in exactly 24 hours each day or orbit the sun in exactly 365 days each year. It just kinda does. Perfection is a manmade concept; nature is irregular.
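To make the mismatch concrete, here is a minimal Python sketch. The schoolbook arithmetic gives a day of exactly 86,400 seconds, while the planet’s actual rotation runs a few milliseconds long or short of that; the offsets below are invented placeholders rather than measurements (real daily figures are published by bodies like the International Earth Rotation and Reference Systems Service).

```python
# The schoolbook day: 24 hours of 60 minutes of 60 seconds.
SCHOOLBOOK_DAY = 24 * 60 * 60  # 86,400 seconds

# Hypothetical excess length of day, in milliseconds, for a handful of days.
# These numbers are illustrative only -- the real values wander irregularly.
excess_ms = [1.2, 0.9, 1.7, 2.0, 1.4]

for day, ms in enumerate(excess_ms, start=1):
    actual = SCHOOLBOOK_DAY + ms / 1000.0
    print(f"day {day}: {actual:.4f} s (the clock assumes exactly {SCHOOLBOOK_DAY} s)")

# Millisecond wobbles accumulate; over months and years they add up to whole
# seconds, which is why official time occasionally has to be nudged back
# toward the planet's actual rotation.
print(f"accumulated drift over these days: {sum(excess_ms):.1f} ms")
```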

For thousands of years, most human societies have accepted and moved in harmony with the irregular rhythms of nature, using the sun, moon and stars to understand the passage of time. One of the most common early timekeeping devices, the sundial (or shadow clock), reflected this: The hours of the day were not of fixed 60-minute lengths, but variable. Hours grew longer or shorter as they waxed and waned with the seasons, making the days feel shorter in the winter and longer in the summer. These clocks didn’t determine the hours, minutes and seconds themselves; they simply mirrored their surrounding environment and told you where you were within the cyclical rhythms of nature. 
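As a rough illustration of how those variable hours worked, the sketch below divides the daylight between sunrise and sunset into twelve “temporal hours,” the way a sundial implicitly does; the sunrise and sunset times are invented examples for a northern winter day and summer day, not real observations.

```python
from datetime import datetime, timedelta

def temporal_hour(sunrise: datetime, sunset: datetime) -> timedelta:
    """One daytime 'hour' on a sundial: the day's light divided into 12 equal parts."""
    return (sunset - sunrise) / 12

# Invented sunrise/sunset times for one winter day and one summer day.
winter = temporal_hour(datetime(2021, 1, 1, 8, 5), datetime(2021, 1, 1, 16, 0))
summer = temporal_hour(datetime(2021, 7, 1, 4, 45), datetime(2021, 7, 1, 21, 30))

print(f"winter daytime hour: {winter}")  # about 40 minutes
print(f"summer daytime hour: {summer}")  # about 84 minutes
```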

But since the 14th century, we’ve gradually been turning our backs on nature and calculating our sense of time via manmade devices. It began in the monasteries of Northern and Central Europe, where pious monks built crude iron objects that unreliably but automatically struck intervals to help bellringers keep track of canonical hours of prayer. Like any machine, the logic of the mechanical clock was based upon regularity, the rigid ticking of an escapement. It brought with it a whole different way to view time, not as a rhythm determined by a combination of various observed natural phenomena, but as a homogenous series of perfectly identical intervals provided by one source. 

The religious fervor for rationing time and disciplining one’s life around it led the American historian Lewis Mumford to describe the Benedictine monks as “perhaps the original founders of modern capitalism.” It is one of the great ironies of Christianity that it set the wheels in motion for an ever-unfolding mania of scientific accuracy and precision around timekeeping that would eventually secularize time in the West and divorce God, the original clockmaker, from the picture entirely.

“The more we synchronize ourselves with the time in clocks, the more we fall out of sync with our own bodies and the world around us.”

By 1656, the Dutch scientist Christiaan Huygens had invented the first pendulum clock, which delivered homogenous and regular slices of a small unit of time: seconds. Unlike the inconsistent mechanical clocks that came before it, the clock time of pendulums was nearly perfect. In that same century, the British astronomer John Flamsteed and others developed “mean time,” an average calculation of the Earth’s rotation. Science had found a way around the Earth’s wobbly eccentricities, producing a quantifiable and consistent unit that became known as Greenwich Mean Time. 

Standardized time became vital for seafarers and irresistible to corporate interests, such was the ease it could offer trade, transport and electric communication. But it took longer to colonize the minds of the general public. During the British “railway mania” of the 1840s, around 6,000 miles of railway lines were constructed across the country. Investors (including Charles Darwin, John Stuart Mill and the Brontë sisters) climbed over each other to acquire rail company shares in a frenzy of freewheeling capitalism that caused one of the biggest economic bubbles in British history. Companies like Great Western Railway and Midland Railway began to enforce Greenwich Mean Time inside their stations and on their trains to make timetables run efficiently. 

Every city, town and village in Britain used to set its clocks to its own local solar time, which gave each locale a palpable sense of identity, time and place. If you lived in Newcastle, noon was when the sun was highest, no matter what the time in London was. But as the railways brought standardized timetables, local times were demonized and swept aside. By 1855, nearly all public clocks were set to GMT, or “London time,” and the country became one time zone. 

The rebellious city of Bristol was one of the last to agree to standardized time: The main town clock on the Corn Exchange building kept a third hand to denote “Bristol time” for the local population who refused to adjust. It remains there to this day. 

“Railway time” arrived in America too, splitting the country into four distinct time zones and causing protests to flare nationwide. The Boston Evening Transcript demanded, “Let us keep our own noon,” and The Cincinnati Commercial Gazette wrote, “Let the people of Cincinnati stick to the truth as it is written by the sun, moon and stars.”

The 1884 International Meridian Conference is often framed as the moment clock time took over the world. The globe was sliced into 24 time zones declaring different clock times, all synchronized to the time of the most powerful empire: Britain’s GMT. Nobody would decipher time from nature anymore — they would be told what time it was by a central authority. The author Clark Blaise has argued that once this was implemented, “It didn’t matter what the sun proclaimed at all. ‘Natural time’ was dead.”

“Clock time is not what most people think it is. It was created, and it is frequently altered and adjusted to fit social and political purposes.”

In reality, this process had already been taking place throughout the 1800s as a result of European colonialism, imperialism and oppression. Colonialism was not just a conquest of land, and therefore space, but also a conquest of time. From South Asia to Africa to Oceania, imperialists assaulted alternative forms of timekeeping. They saw any region without European-style clocks, watches and church bells as a land without time. 

“European global expansion in commerce, transport and communication was paralleled by, and premised upon, control over the manner in which societies abroad related to time,” the Australian historian Giordano Nanni wrote in his book, “The Colonization of Time.” “The project to incorporate the globe within a matrix of hours, minutes and seconds demands recognition as one of the most significant manifestations of Europe’s universalizing will.” In short, if the East India Company was the physical embodiment of British colonialism overseas, GMT was the metaphysical embodiment.

The Western separation of clock time from the rhythms of nature helped imperialists establish superiority over other cultures. When British colonizers swept into southeastern Australia in search of gold, they depicted the timekeeping practices of the Indigenous societies they encountered as irregular and unpredictable in contrast to the rational and linear nature of the clock. This was despite the fact that Indigenous societies in the region had advanced forms of timekeeping based on the moon, stars, rains, the blossoming of certain trees and shrubs and the flowing of tides, which they used to determine the availability of food and resources, distance and calendar dates.

“Nineteenth-century Europeans generally conceived of such closeness to nature as calling into question the very humanity of those who practiced it,” Nanni wrote. “This was partly determined by the fact that Enlightenment values and ideals had come to associate the idea of ‘humanness’ with man’s transcendence and domination over nature; and its corresponding opposite — savagery — as a mode of life that existed ‘closer to nature.’”

In Melbourne, churches and railway stations grew quickly on the horizon, bringing with them the hands, faces, bells and general cacophony of clock time. By 1861, a time ball was installed in the Williamstown Lighthouse and Melbourne was officially synchronized to Greenwich Mean Time. British colonizers attempted to integrate Indigenous peoples into their labor force, with little success, owing to the Indigenous peoples’ unwillingness to sacrifice their own forms of timekeeping. They did not believe in “meaningless toil” and “obedience to the clock,” wrote the Australian sociologist Mike Donaldson. “To them, time was not a tyrant.”

In some parts of Australia, the Indigenous resistance to Western clock time continued defiantly. In 1977, in the tiny town of Pukatja (then known as Ernabella), a giant, revolving, electronically operated clock was constructed near the town center for the local Pitjantjatjara people to coordinate their lives around. A decade later, a white construction worker at a town council meeting noted that the clock had been broken for months. Nobody had noticed, because nobody looked at it.

“Nineteenth-century Europeans generally conceived of such closeness to nature as calling into question the very humanity of those who practiced it.”
— Giordano Nanni

The movement toward standardized time reached its apex in the 1950s, when atomic clocks were judged to be better timekeepers than the Earth itself. The second, as a unit of time, was redefined not as a fraction of the Earth’s orbit around the sun, but as a specific number of oscillations of cesium atoms inside an atomic clock.
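That “specific number” is written into the SI definition of the second: 9,192,631,770 periods of the radiation from a particular hyperfine transition of cesium-133. Here is a trivial sketch of what an atomic clock is effectively doing with that count (the function and counting scenario are illustrative, not how any real clock’s electronics work):

```python
# The 1967 SI definition: one second is 9,192,631,770 periods of the
# cesium-133 hyperfine transition radiation.
CESIUM_PERIODS_PER_SECOND = 9_192_631_770

def elapsed_seconds(periods_counted: int) -> float:
    """Turn a raw count of cesium oscillations into elapsed SI seconds."""
    return periods_counted / CESIUM_PERIODS_PER_SECOND

# Counting periods for what the definition calls a "day" of 86,400 seconds:
print(elapsed_seconds(CESIUM_PERIODS_PER_SECOND * 86_400))  # 86400.0
```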

“When you look at precision timekeeping, it’s all about insulating and isolating these clocks from responding to anything that goes on around them,” Bastian told me via a video call from her home in Edinburgh. A poster with the words “A clock that falls asleep” hung on the wall behind her. “You have to keep them separate from temperature fluctuations, humidity, even quantum gravity effects. They can’t respond to anything.”

Over 400 atomic clocks in laboratories around the world count time using the atomic second as their standard. A weighted average of these times is used to create International Atomic Time, which forms the basis of Coordinated Universal Time (UTC). UTC isn’t completely non-responsive. Every few years, a leap second is added to it to keep it reasonably close to the rotation of the Earth. But in 2023, at the World Radiocommunication Conference, nations from around the world will discuss whether it is in our best interest to abolish leap seconds and permanently unmoor ourselves from the sun and moon in favor of time we manufacture ourselves. 
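The relationship between the atomic timescale and UTC can be sketched in a few lines. The leap-second table below is truncated and the date handling simplified (the authoritative list is maintained by the IERS), but the arithmetic is the real one: since the leap second added at the end of 2016, UTC has trailed International Atomic Time by 37 whole seconds.

```python
from datetime import datetime, timedelta

# Truncated table of TAI-minus-UTC offsets and when they took effect.
# (Illustrative excerpt; handling right at a leap-second boundary is simplified.)
TAI_MINUS_UTC = [
    (datetime(2015, 7, 1), 36),
    (datetime(2017, 1, 1), 37),
]

def tai_to_utc(tai: datetime) -> datetime:
    """Convert a TAI timestamp to UTC by subtracting the accumulated leap seconds."""
    offset = 0
    for effective_from, seconds in TAI_MINUS_UTC:
        if tai >= effective_from:
            offset = seconds
    return tai - timedelta(seconds=offset)

print(tai_to_utc(datetime(2021, 6, 3, 12, 0, 37)))  # 2021-06-03 12:00:00
```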


“It’s easier to imagine the end of the world than the end of capitalism,” wrote the literary critic Fredric Jameson. One of the hardest elements to imagine is what capitalism has done to our perception of time via clocks. It now seems embedded into our very psychology to view time as a commodity that can be spent or wasted. 

Capitalism did not create clock time or vice versa, but the scientific and religious division of time into identical units established a useful infrastructure for capitalism to coordinate the exploitation and conversion of bodies, labor and goods into value. Clock time, the British sociologist Barbara Adam has argued, connected time to money. “Time could become commodified, compressed and controlled,” she wrote in her book “Time.” “These economic practices could then be globalized and imposed as the norm the world over.”

Clock time, Adam goes on, is often “taken to be not only our natural experience of time” but “the ethical measure of our very existence.” Even the most natural of processes now must be expressed in clock time in order for them to be validated. 

Women in particular often find themselves at the wrong end of this arbitrary metric. Unpaid labor such as housework and childcare — which still disproportionately burdens women — seems to slip between the measurements of the clock, whereas the experience of pregnancy is very much under the scrutiny of clock time. Adam quotes a woman’s account of giving birth: “The woman in labor, forced by the intensity of the contractions to turn all her attention to them, loses her ordinary, intimate contact with clock time.” But in the hospital environment, where the natural process of childbirth has been evaluated and standardized in clock-time units, a woman is pressured to follow what Alys Einion-Waller, a professor of midwifery at Swansea University, has called a “medicalized birth script.” 

“It now seems embedded into our very psychology to view time as a commodity that can be spent or wasted.”

The firsthand experience and intuition of a woman giving birth are devalued in favor of timings and measurements related to the expected length of labor stages, the spacing of contractions, the progress of cervical dilation and other observations. Language such as “failure to progress” is common when a woman doesn’t perform to the expected curve, and deviation from the clock-time framework can be used to justify medical intervention. This is one of the reasons that the home-birthing movement has recently grown in popularity.

Likewise, new parents know that the baby itself becomes their clock, and any semblance of standardized time is preposterous. But in time, of course, the baby joins the rigid temporal hierarchy of school, with non-negotiable class and mealtimes, forcing biological rhythms to adhere to socially acceptable clock time.

As Birth put it to me: “The clock helps us with things that are uniform in duration. But anything that is not uniform, anything that varies, the clock screws up. … When you try to schedule a natural process, nature doesn’t cooperate.”


In 2002, scientists watched in amazement as Larsen B, an ice shelf on the Antarctic Peninsula 55 times bigger than Manhattan — which had been stable for 10,000 years — splintered and collapsed into hundreds of shards the size of skyscrapers. A glaciologist who flew overhead told Scientific American that he could see whales swimming in water where ice a thousand feet thick had been just days earlier. 

Virtually overnight, previous clock-time predictions around the mass loss of ice needed to be rewritten to acknowledge a 300% acceleration in the rate of change. In 2017, a piece of the nearby Larsen C ice shelf fell off, creating the world’s biggest iceberg — so big that maps had to be redrawn. The Intergovernmental Panel on Climate Change calls such abrupt events, which happen more often than you might think, “surprises.”

The climate crisis is a realm in which linear clock time frequently and fatally misfires. It frames the crisis as something that is measurable, quantifiable and predictable — something we can envisage in the same way as work hours, holidays, chores and projects. Warming temperatures, ocean acidification, ice melting and carbon dioxide levels in the atmosphere are constantly being translated into clock time to create tipping points, thresholds, roadmaps and sustainable development goals for us to beat or aspire to. When a “surprise” happens, time estimates crumble in the face of reality. Nature doesn’t cooperate.

The same is true of the deadlines we set for how much time is left to stop global warming. The Guardian launched a blog called “100 months to save the world” in July 2008 that used scientific research and predictions to make it “possible to estimate the length of time it will take to reach a tipping point.” That was 154 months ago. Are we 54 months into the end of the world? Perhaps. But one can’t help but wonder if the constant framing of the climate crisis in clock-time deadlines, which then pass without comment, has contributed to the inability and inertia of many to comprehend the seriousness of what is actually happening.

“It’s a privilege to live by clock time alone and ignore nature’s urgent temporalities.”

“We can’t say that clock time isn’t important,” Vijay Kolinjivadi, a researcher at the University of Antwerp’s Institute of Development Policy, told me. “There’s certain times when that metric makes a lot of sense, and we should use it. For instance, you and I decided to talk at 10 a.m. There’s no way to escape that. But when we are thinking about capitalism, social crisis and ecological breakdown, it gets problematic.” Clock time, he went on, “is always geared toward production, growth and all the things that created this ecological crisis in the first place.”

One of the most affecting myths of clock time is that we all experience time at the same steady pace. We don’t. “The future is already here,” the science-fiction author William Gibson famously said in 2003, “it’s just not very evenly distributed.” And framing the climate crisis as a ticking clock with only a certain amount of time “to avoid disaster” ignores those for whom disaster has already arrived. The reality is that it’s a privilege to live by clock time alone and ignore nature’s urgent temporalities.

Every few years, the American Midwest is ravaged by floods as the Missouri River swells from intense rainfall, upending the lives of millions. When the floods came during the summer of 1993, a New York Times journalist interviewed a resident about the night he was evacuated. “He remembers everything about the night the river forced him and his wife out of the house where they had lived for 27 years — except for this. ‘I can’t tell you what day it was. … All I can tell you is that the river stage was 26 [feet] when we left.’” The headline of the article was, “They Measure Time by Feet.”


In 1992, the astrophysicist turned author Alan Lightman published a novel called “Einstein’s Dreams” in which he fictionalizes a young Albert Einstein dreaming about the multitude of ways that different interpretations of time would play out in the lives of those around him. In one dream, Einstein sees a world where time is not measured — there are “no clocks, no calendars, no definite appointments. Events are triggered by other events, not by time. A house is begun when stone and lumber arrive at the building site. The stone quarry delivers stone when the quarryman needs money. … Trains leave the station at the Bahnhofplatz when the cars are filled with passengers.” In another, time is measured, but by “the rhythms of drowsiness and sleep, the recurrence of hunger, the menstrual cycles of women, the duration of loneliness.”

Recently, there have been many attempts in both art and literature to reimagine the clock and the role it plays in our lives. At the end of 2020, the artist David Horvitz exhibited a selection of clocks he had created, which included one that was synchronized to a heartbeat. Another artist, Scott Thrift, has developed a clock called “Today,” which simplifies the passage of time into dawn, noon, dusk and midnight as opposed to seconds, minutes and hours. It moves at half the speed of a regular clock, making one full rotation in a day. 

Bastian herself has proposed clocks that are more responsive to the temporalities of the climate crisis, like a clock synchronized with the population levels of endangered sea turtles, an animal that has lived in the Pacific Ocean for 150 million years but now faces extinction due to temperature changes. These and other proposals all have the same idea at their core: There are more ways to arrange and synchronize ourselves with the world around us than the abstract clock time we hold so dear.

“They have been trapped by their own inventiveness and audacity. And they must pay with their lives.”
— Alan Lightman

Clock time may have colonized the planet, but it did not completely destroy alternative traditions of timekeeping. Certain religions maintain a connection to time that is rooted in nature, like salat in Islam and zmanim in Judaism, in which prayer times are defined by natural phenomena like dawn, dusk and the positioning of stars. The timing of these events may be converted into clock time, but they are not determined by clocks. 

In places where globally standardized time is enforced, some still rebel, like in China, where the entire country is under a single time zone: China Standard Time, commonly known as Beijing time. In Xinjiang, nearly 2,000 miles west of Beijing, where the sun sometimes sets at midnight according to Beijing time, many Uighur communities use their own form of local solar time. 

And Indigenous communities around the world still use ecological calendars, which keep time through observations of seasonal changes. Native American tribes around Lake Oneida, for example, recognize a certain flower blooming as the time to start plowing and setting traps for animals emerging from hibernation. As opposed to a standardized clock and calendar format, these ecological calendars, by their very nature, reflect and respond to an ever-changing climate. 

In one of the last dreams in Lightman’s book, Einstein imagines a world not too dissimilar from our own, where one “Great Clock” determines the time for everyone. Every day, tens of thousands of people line up outside the “Temple of Time” where the Great Clock resides, waiting their turn to enter and bow before it. “They stand quietly,” wrote Lightman, “but secretly they seethe with their anger. For they must watch measured that which should not be measured. They must watch the precise passage of minutes and decades. They have been trapped by their own inventiveness and audacity. And they must pay with their lives.”

The post The Tyranny Of Time appeared first on NOEMA.
