A golden age of satire


In 2012, the People’s Daily reported that Kim Jong Un had been voted the sexiest man alive. Quoting its source, The Onion, a little too extensively, the press organ of the Chinese Communist Party accounted for the newly minted dictator’s triumph of beauty in a strange mix of terms: “With his devastatingly handsome, round face, his boyish charm, and his strong, sturdy frame, this Pyongyang-bred heart-throb is every woman’s dream come true.” This praise is redolent at once of a Mills & Boon hero and of the cheesier end of match.com profiles.

When a man who, at best, has the chubby good looks of a young Rosie O’Donnell beats Depp, Pitt and Clooney to such an august title, most readers, you would hope, would assume satire. Being generous, we could chalk up the assumption of fact to crossed cultural wires. Or perhaps, in the wake of Gangnam Style, to a vogue for haircuts that combine military precision with a party-boy attitude. More seriously and more significantly, though, The Onion can be seen as part of a flourishing satirical culture that finds spin-offs in The Daily Currant and a television equivalent in Stephen Colbert’s Bill O’Reilly-inspired persona. All draw on a tradition of satire that, in coming so close to its target, has the potential to be misinterpreted, at least by some, as fact.

The Kim Jong Un incident is only the most public instance of a much more widespread phenomenon, seen in the constant posting and sharing of Onion and Daily Currant articles on Facebook and in the litany of outraged comments that follow headlines like ‘Man Responsible For Olympic Ring Mishap Found Dead In Sochi’ (Daily Currant, 8/2/14). That headline is likely to incur more rage than its playful counterpart, ‘Winter Olympics Inspire Nation’s Youth To Try Sports Their Parents Can’t Afford’ (The Onion, 20/2/14), and is more likely to be taken as factual, I think, than a headline from last week’s Onion: ‘German Leaders Quietly Confident They Could Pull Off Another Holocaust If They Ever Really Wanted’ (21/2/14).

Daniel Defoe

One of the earliest instances of this type of satire was Daniel Defoe’s The Shortest Way with the Dissenters, first published in 1702. As Colbert does with Fox News, Defoe did with High Church clergymen and their Tory sympathisers: he impersonated their rhetoric, blaming Dissenters (members of Protestant sects outside the Church of England) for every upheaval, rebellion and revolution of the previous century. Defoe, one of the early exponents of the English novel, was himself a Dissenter, but The Shortest Way was published anonymously and so was taken by many to be a genuine contribution to contemporary debates on religious toleration. When Defoe was revealed as the author, the backlash was so severe that he was arrested for seditious libel. Looking back on this episode, scholars are preoccupied by questions of genre that have interesting implications for Defoe’s modern-day inheritors.

There is an argument that Defoe’s pamphlet was intended as a hoax. This supposes a wide-scale inability to appreciate irony, and, indeed, in Defoe’s time there was a marked reluctance by many to read the pamphlet as completely ironic. Yet the term ‘hoax’ also suggests an intention to dupe, and so denies that the effectiveness of such arguments lies in the moment they are revealed as satirical. A recent sociological study of Colbert’s viewers found that conservatives are more likely to think that he is only pretending to be joking, using humour to couch positions he actually endorses. So in our own time, too, when it comes to political humour we have difficulty processing what I will call, for lack of a widely accepted categorical term, deadpan ironic impersonation. There is no implication that Colbert is trying to commit a hoax, but there is, in some, an unwillingness to see him as purely ironic.

Unlike Jon Stewart’s The Daily Show or the Weekend Update segment of Saturday Night Live, which poke fun at news events, The Onion and Colbert, following Defoe, take to task the form of delivery. Defoe was writing in the first age of the press, when a fear of the spread of information was metaphorically realised as a plague-like contagion. In our own era the metaphor has shifted to viral spread, a means of describing the 24-hour news cycle and the proliferation of information on the internet. But in both eras, new media are met with a breed of satire that, in its propensity to be taken as fact, exposes the excesses of said media. And so, at the very least, we can thank Rupert Murdoch for another golden age of satire.

*Marc Mierowsky [2011] is doing a PhD in English.

A global battlefield


Lest we forget, this year marks the 100th anniversary of the war that shaped the 20th century. The first of three world wars (two hot and one cold), this conflict is remembered once a year as a lesson in human suffering, as a reminder that the war to end all wars was only the beginning of the human cost of the past century.

But do we really remember? Or do we merely pay lip service to memory, an absent-minded nod to the past while we continue to relive and rescript its greatest tragedies?

Wars make good stories, and the First World War in particular lends itself to a certain kind of narrative: the wastage of a generation, the death of optimism, Europe’s loss of innocence. But this war also marked the beginning of something, a spectre that would haunt the margins of the 20th century and dominate the narrative of the 21st. The First World War was also the first global war on terror.

‘Terror’ as a tactic has a long history, as all forms of war can also be seen as forms of terrorism. The breaking of an enemy’s morale through aggressive and violent offensives remains an integral aspect of military strategy to this day. It is not so much the tactic of terror as the concept of it that traces its roots to the early 20th century.

Total war

Political assassinations and public bombing campaigns by small networks of non-state actors matured as tactics of anti-governmental resistance in the second half of the 19th century. It is during this period that English-language sources really began to use the word ‘terrorism’, but there is a flexibility to its usage at this point, an uncertainty in its exact meaning and application. By the 1920s, however, this uncertainty had been replaced by the iron-clad conviction of administrators like Lord Lytton, the Governor General and Viceroy of India, that terrorism was “a thing entirely apart by itself, a danger that must be faced and got rid of because of its own intrinsic evil”.

How did these revolutionary networks, previously described in the language of sedition or anarchism, come to be redefined as existential threats to civilised society? It was partly a consequence of the global frame within which the First World War was fought. In the context of a global war, regional resistance movements took on a global significance. Financing and promoting unrest in the far-flung imperial possessions of Britain, France, and Russia became an important part of German strategy, with agents being despatched to places like Persia (Iran), Afghanistan, and North America.

At the same time, previously existing revolutionary movements took on new life as Irish and Indian radicals attempted to use the distraction of the war to overthrow their imperial governments. These movements were countered by the creation of transnational intelligence services and strict wartime legislation, as the magnitude of the war provided the opportunity for states (even those that prided themselves on civil liberties) to arm themselves against the threat of internal unrest.

The postwar settlement allowed sovereign states to (temporarily) suspend hostilities against each other, but no settlement was reached with the anti-colonial ‘terrorists’, whose regional grievances continued to be bolstered by transnational networks of arms, money and ideas. While sedition had previously been understood as a breach of the established law, terrorism came to be considered a thing apart from the law entirely, which could consequently be met only with extralegal powers of surveillance and detention, previously reserved for times of overt war.

The First World War was thus a total war, not only in its scope, but in its pervasiveness as well. Not because it was the war to end all wars, but because of the way that it extended war beyond geographical or temporal limitations, introducing a global battlefield on which everybody was a suspect and everybody was a target.

A battlefield that no one wants to commemorate.

 *Joseph McQuade [2013] is doing a PhD in History on the use of political violence in the early 20th century. Picture credit: dan and http://www.freedigitalphotos.net.

Architects need to address long-term climate change challenge


Flooding in the UK has caused huge problems this winter, damaging homes and possessions. In developing countries, from the Philippines to Haiti, extreme weather events are growing more frequent and more severe. Such events come on top of unpredictable changes in the world economy, the rapid and constant pace of technological change, and political and social upheaval, with developing countries on the frontline of such global shifts.

Such changing times have raised huge questions for architects. How do you create long-lasting buildings in a developing country? What is the future for our urban environments, our cities, our buildings amid all these changes? The dual challenge architects face today is how to create buildings with a low carbon footprint while at the same time ensuring that they are future-proofed to reduce their vulnerability to the effects of climate change and other potential shocks.

Developing countries are experiencing rapid economic growth, with most undergoing a construction boom. As they seek to meet the needs of a growing human population amid ever-dwindling resources, architects and design teams in these emerging economies are being asked to respond to pressure:

– to build low-carbon buildings which conserve energy and mitigate climate change;

– to design buildings that can adjust not only to ever-changing climate conditions, but to the effects of such conditions, including escalating flooding risks, overheating risks, strains on water resources and less stable ground conditions among other environmental hazards;

– to meet the aspirations of a fast-developing society, without compromising on energy consumption and attempts to cut emissions during the urbanisation process.

Future-proofing our buildings

The question of how we can guarantee a building can meet all these demands over decades is not just about designing for our current climate conditions. We need to design buildings which will meet the demands of tomorrow, given that a building could be in use for 50 to 100 years or more. Making rational design decisions for such an extended timescale in an uncertain future is daunting.

It is a future in which the factors that drive change, such as the economy, politics, society, technology and the environment, are characterised by randomness. Most of today’s buildings are not built with such change in mind. Architects need to take into account the various uncertainties that might arise in a building’s lifetime and manage the risks involved, even though these risks may not be clear during design and construction. They need to adopt a risk-based approach that aids in understanding, assessing and managing potential surprises.

A risk management approach involves mapping out and quantifying the physical, technological, environmental and socio-economic consequences of a whole range of uncertainties, and how these might affect building performance both now and in the future.
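The kind of risk mapping described above can be sketched in miniature with a Monte Carlo simulation. The model below is entirely hypothetical: the `performance_gap` function and the distributions for climate warming, energy prices and occupancy growth are illustrative inventions, not real building physics. It is meant only to show the structure of the approach, sampling uncertain drivers over a building’s lifetime and reading off a distribution of outcomes, with the plausible worst case widening the further out we project.

```python
import random

def performance_gap(climate_warming, energy_price_rise, occupancy_growth):
    """Toy model (hypothetical coefficients): percentage cost overrun of a
    building relative to its design-stage prediction, driven by three
    uncertain factors."""
    return 12 * climate_warming + 8 * energy_price_rise + 5 * occupancy_growth

def simulate(year_offset, n=10_000, seed=42):
    """Monte Carlo sampling of the uncertain drivers. The spread of each
    distribution widens the further we project into the building's lifetime,
    reflecting growing uncertainty."""
    rng = random.Random(seed)
    spread = 1 + year_offset / 50  # wider distributions further out
    samples = []
    for _ in range(n):
        warming = rng.gauss(0.02 * year_offset, 0.3 * spread)    # deg C above design assumption
        prices = rng.gauss(0.01 * year_offset, 0.2 * spread)     # relative energy price rise
        occupancy = rng.gauss(0.005 * year_offset, 0.1 * spread) # relative occupancy growth
        samples.append(performance_gap(warming, prices, occupancy))
    samples.sort()
    return {
        "median": samples[n // 2],
        "p95": samples[int(n * 0.95)],  # plausible worst case
    }

for year in (10, 50, 100):
    result = simulate(year)
    print(f"+{year}y: median overrun {result['median']:.1f}%, "
          f"95th percentile {result['p95']:.1f}%")
```

A real study would replace the toy function with a physics-based building simulation, but the structure, sampling the uncertain inputs, propagating them through the model, and comparing percentiles at each time horizon, is the same, and it is what allows interventions to be ranked along a time axis.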

Simulating change

My current research investigates how computer-aided building simulation can be used to model these complex and dynamic interactions over different timescales to produce accurate and reliable results. This could give us a chance to test which design interventions keep building performance robust in both the short and the long term. Moreover, it would be possible to assess and ultimately rank these strategies’ effectiveness and urgency along a time axis.

Ultimately, this would help architects to identify and assess whole-life building performance in specific locations. The results of such risk management work will undoubtedly help policymakers to target resources and identify areas of possible technological advancement. They would also help with the setting of building regulations, policy strategies and updated building codes which improve buildings’ effectiveness and ensure that they are future-proofed against runaway climate change.

*Linda Gichuyia [2011] is doing a PhD in Architecture at the University of Cambridge. Picture credit jiggoja and http://www.freedigitalphotos.net.

Ours to reason why: a reflection on GYSS 2014@one-north


My area of research often strikes me as very detached from application – I work in basic research, studying the fundamental physics of electrons interacting with metals. Recently, I attended the Global Young Scientists Summit 2014 (GYSS 2014@one-north) in Singapore. On my way to Singapore, two questions were at the forefront of my mind: What are good reasons for doing scientific research, particularly basic research? And is new research or new technology truly valuable for solving the problems facing us today?

I hoped GYSS 2014, a meeting of eminent scientists (e.g., Nobel Prize winners, Fields Medalists) and some 350 young scientists (e.g., PhD students, post-docs, and early-career scientists from industry, government agencies, and the academy) would offer new insight into why we do research (and why we fund research) in the sciences.

GYSS 2014 is the second meeting of its kind, held this year on the campus of the Nanyang Technological University (NTU). Modelled on the Lindau Nobel Laureate Meetings, GYSS is organised by the National Research Foundation of Singapore. The week-long event consists of plenary lectures and panel discussions by senior scientists, smaller group sessions with the plenary speakers, and site visits to various institutions and science-related endeavours in Singapore such as the National University of Singapore (NUS), the Building and Construction Authority’s Zero Energy Building, and the Gardens by the Bay [pictured].

As explained by the President of Singapore, Dr Tony Tan, in his remarks at the closing ceremony of GYSS 2014, GYSS is part of Singapore’s long-term strategy for investing in science and technology, a tremendous commitment at a time when many other governments are restricting research funding. The institutional origins of the invited young scientists suggest that invitations drew on existing relationships: universities like Cambridge, MIT, Hebrew University, Technion-Israel Institute of Technology and ETH Zurich were well represented and are collaborators on the CREATE project [Campus for Research Excellence and Technological Enterprise].

The motives behind GYSS are likely manifold: NUS faculty encouraged GYSS participants to apply for post-doctoral and faculty positions in Singapore (joining hundreds of applicants per position at these ‘Top 100’ universities). To my mind, GYSS serves to put Singapore on the map for young scientists, both for career options in the immediate future and for developing collaborations, and to build the general prestige of Singapore as a leader in science and technology. GYSS certainly enhances relationships between young scientists and Singapore, but to what extent did GYSS 2014 accomplish its goals, captured in the keywords “Excite | Engage | Enable”?

A better world?

The theme for GYSS 2014 was given as “Advancing Science, Creating Technologies for a Better World”. During the first few plenaries, this began to make me bristle, as it appeared that the implicit moral justification for scientific research was not going to be deconstructed. The theme suggests a direct connection between science and technology and implies that these will, without a doubt, yield a “Better World”. Yet basic research does not always, or at least does not immediately, create technologies, and existing and new technologies often do more harm than good.

These nuances became apparent during the week, but were never stated outright. Several of the speakers repeated platitudes in messages offered to attendees: “Follow your heart” and “Dare to take risks”. Is that really enough to be successful or to do good work? Is this new information for young scientists?

In his opening plenary, the Nobel Prize-winning biologist Aaron Ciechanover suggested obesity is a behavioural issue by showing a photo of an obese woman eating a large cake by herself. To me, this suggested that he was considering remedies for global health from within an incredibly narrow box. Scientific and medical research cannot stand in isolation from social science on this issue.

A strong argument for developing existing technologies rather than creating new ones was offered by biochemist and Nobel Laureate Hartmut Michel. In his lecture, he refuted the logic of biofuels and offered a vision for large-scale implementation of current solar energy technology in a Desertec-type approach. US chemist and another Nobel Laureate Robert Grubbs also offered a critical approach in suggesting uncertainty about the ultimate benefit of industrial use of his renowned catalyst in biofuel refining.

Yet at no point during the conference was there an explicit acknowledgement that science and technology might be separate and/or separable in terms of their moral justifications. During a panel discussion, one student asked about the potential negative impacts of selling a start-up to a large tech company, but was told resoundingly by the speakers that the sale of a start-up was the mark of success and should only be greeted with a glass of Champagne. Another conference participant later suggested that perhaps these speakers were simply from a different time and that their insights might be complemented well by advice from mid-career scientists facing today’s research and funding environment.


Several of the speakers whose work comprised basic research did offer answers to my original question: science is about truth; science is about curiosity; basic research is not-yet-applied research. Or, as one speaker put it, quoting testimony to the US Congress justifying funding for the Fermilab particle accelerator: science is part of our culture.

I am ultimately left with mixed feelings. I feel that my motivation for doing research has been revitalised, but GYSS contributed to this effect only passively. And I am not convinced that the intellectual return for participants sufficiently offset the massive carbon footprint of flying them all to Singapore.

At the same time, from considering the speakers’ narratives, I have renewed appreciation for the long time-scale and necessary patience, determination and vision required to do novel research. I am energised by the broad sweep of the many areas of science on display at GYSS 2014, giving real shape to research possibilities beyond the comfortable corner of my own training. I am also personally keenly aware of a need to make conscious and active choices about the applications of my research and moral justifications for the work I do. And I believe in the pursuit of these and many other questions through research and about research.

*Sean Collins [2012] is doing a PhD in Materials Science at the University of Cambridge. His research focuses on studying the optical properties of metal nanoparticles using high energy electron beams.