At the Brain Bar Budapest futurological congress, Etienne Augé, Senior Lecturer at the Erasmus School of History, Culture and Communication at Erasmus University Rotterdam and founding director of CHIFT (Community for Histories of the Future), spoke on “Why do we need science fiction? Recipes for crisis management.” Augé is one of the best exponents of a Three Days of the Condor-style think-tank approach to science fiction, in which the genre serves as a reservoir of scenarios for real-world crisis management. So if you want to know how to save the world, get reading, people.

Running through examples including 1984, Fahrenheit 451, Soylent Green, and Gattaca, Augé outlines how “we study what science fiction can tell us about crisis in the past or the future … What we do is study science fiction to see to what extent we can predict the future.” And he makes a case for science fiction as a discipline that exists both to actually invent the future and “to prevent things … to warn us against possible forms of the future … not to predict, but to invent, and prevent.” Citing H.G. Wells, who forecast the widespread use of submarines but also cautioned that “my imagination refuses to see any sort of submarine doing anything but suffocating its crew and foundering at sea,” he touches on Orwell’s prediction of a society “observing each other constantly” over social media, and Bradbury’s prognostication of a world that burns books because “books do not make people happy, they make people think.”

Augé notes that science fiction can “give us some solutions.” However, he also cites the example of his students at Erasmus University, whom he polled on The Matrix: some 30 percent wanted to take the blue pill and live in a fantasy existence cocooned from reality and its threats. Even now, maybe, the world needs all the tough-minded readers – and unsettling science fiction dystopias – it can swallow.

2 COMMENTS

  1. The one problem with this is that ordinary people tend to place too much emphasis on the negative aspects of the future depicted in science fiction. Let computer automation get smarter and smarter, and people’s first thought will be to worry about the killer robots from Terminator or The Matrix taking over the world. Let nuclear energy become more widespread, and people worry about the potential for accidents and mutations. As a noted SF author pointed out, most people are terrible at estimating risk, and the overall effect of all this SF is to throw such terrible scares into people that they actually slow down scientific progress.
