Future Fest 2018 told us to fear the future, rather than be hopeful for it
Note: It is now October 2020, two years later, and I’ve come to disagree with much of what I asserted here. To be blunt: my opinion has shifted. I now agree with much of what is said in the comments on this article by the excellent author Adam Greenfield. After Adam commented on this article, I wrote and rewrote responses to it, but I felt they all landed flat, so I said nothing.
Adam notes below that ‘the critic’s task has nothing to do with being “productive,” or constructive, or even polite.’ I agree with this, and I now find myself offering criticisms of technologies.
I still find myself frustrated and unfulfilled by the inaction that tends to follow tech criticism (especially my own). Perhaps this is why creating is the more popular pastime: it fulfills.
I encourage you to still read this article, but be sure to read some of my newer ones, my newsletter, or, better yet, Adam’s book.
I’d like to thank Adam for helping to set me on a course towards re-conceptualising our approach to technology! I’ve reprinted his comments directly below, with the original article beneath that.
But it isn’t the business of the critic to help develop “opportunities for innovative, collaborative solutions,” whatever that means. It may well be the business of the critic to point out that there are and can be no “solutions” to social issues, because the circumstances confronting us aren’t bounded problems susceptible to solution in the first place. (Really, between the contributions of Donella Meadows and Stafford Beer, this was all very well understood forty years ago, and it’s a pretty damning indictment of innovation discourse that it remains so ahistorical, so apolitical and so ignorant that it speaks of “solutions” in the first place.)
Beyond that, of course knowledge production in the field of innovation is going to be adversarial. The scientific method itself is adversarial. It proceeds by falsification. To the degree that the culture of technical development models itself on the culture of scientific discovery, it’s necessarily going to partake of that quality.
But more simply, you’re experiencing friction and pushback because we’re not all on the same team. We don’t want the same things, don’t necessarily imagine the same end states as desirable. Would it surprise you to learn that outside the hegemony of corporate innovation thought, there are those of us who find many of the solutions it deigns to propose repellent, when not simply impracticable, and its conception of human subjectivity, sociality and solidarity radically impoverished?
Under such circumstances, the critic’s task has nothing to do with being “productive,” or constructive, or even polite. It simply consists in observing when barbarism and moral atrocity are in the offing, and doing whatever can reasonably be done to prevent them.
At FutureFest 2018, the water dispensers were operated by fob.
People could move a small fob on a string to a highlighted area on a dispenser to fill up their water bottle. Four people could get water from each dispenser at once, since each bulky cubic dispenser had a fob on each of its four sides. At least four of these squat dispensers were placed throughout the event, clearly intended to show off some fancy future technology, albeit in a rather silly way.
By the end of the event, all four of the dispensers had broken down; only one fob on one side of a single dispenser still worked.
I couldn’t help but wonder if this was intentional, given the pessimistic tone towards technology that pervaded FutureFest.
Movers and shakers, futurists and artists of renown were all present at this established London festival. The aim of the event was to “put control back into the hands of the people” and to build “bold solutions to this era’s biggest challenges.”
But the theme at FutureFest was one of trepidation and cynicism. Indeed, the cynicism was about the now as much as the future. Data, more often than not, was seen as the enemy. The Big 5 were the invisible villains. They were invisible in that they were seen to be all-powerful and everywhere, even in areas of your life that you would neither expect nor sanction. But they were also invisible in that they had no representation whatsoever at the festival, which had the effect of making many of the debates somewhat dull. It also meant that finger-pointing, rather than collaboration, tended to be the order of the day.
The writer and speaker Douglas Rushkoff repeatedly slammed everything from artificial intelligence to quantification as a deeply inhuman enemy. In his polemics, however, there was a noticeable absence of concrete examples beyond his own anecdotes, which seemed less interesting than he perhaps imagined. “Do we really own our phones?!” he exclaimed, implying that we are bound by a bevy of privacy contracts. While this is true, it sidesteps far more interesting questions about how the concept of ownership is changing, and indeed whether ‘ownership’ has meaning at all in this day and age.
Evgeny Morozov attacked big data and AI as well, but in a more nuanced way, arguing that we should collectivise and pool our data, choosing who may access it and under what terms. Still, he offered little on how this change might come about; no actionable examples were given. There was also little to discuss, given that no one present could explain what the difficulties with his proposal might be.
Academics, too, seemed put off by the spectre of AI and big data, especially as instantiated by Google and Facebook. Much hand-wringing came from Professors Noel Sharkey and Rebecca Allen, yet their arguments were often poorly articulated; concerns ranging from unwanted physical augmentation to broad worries about AI were aired, but little in the way of thought-provoking solutions was posited. Brilliant people both, but their rather ambiguous, hand-wavy concerns did little to advance the conversation or provoke thought.
Surprisingly, Nick Clegg offered a perspective that seemed to mirror my own: he claimed that this ubiquitous doomsaying, coming from both the left and the right, prevented long-term solutions to the potential threats posed by technology and tech companies. A positive attitude towards technology, he argued, could help embed legislation and political programs to develop and harness it. A sensibility of fear, by contrast, made it far more likely that successive governments would overturn programs aimed at embracing technological development.
This dearth of solutions, combined with the absence of the invisible ‘other’, set the tone, and most talks were fairly predictable in both approach and content as a result.
One solution I did see came from Anab Jain, though hers was perhaps more a way of discovering solutions than a solution in itself. She and her agency, Superflux, promoted speculative design: the process by which ‘design fictions’ are articulated through provocative futuristic artefacts that elicit useful feedback from research participants. She explored this nicely with Mantis, her AI global-risk startup, which she revealed to be fake (a speculative design) after her presentation, much to the chagrin and interest of the audience.
But I think there is much to what Nick Clegg said about the political fear that now seems embedded in our discussion of technology. This fear is especially well articulated in a (good) book I am currently reading: New Dark Age by James Bridle. In it, he claims that it is nearly impossible to understand the vast, invisible computation that governs our society, and that new metaphors are needed to grasp, if not understand, these forces. While he makes many good points, his gloomy outlook predisposes us against agents and organisations that may take a positive view of technology, even though he claims he is not anti-technology.
But this attitude reflects the sharp divide in the discourse around technology. On one side are the critics, sharp-edged commentators on the dystopian possibilities of tech: Zeynep Tufekci, Adam Greenfield, Douglas Rushkoff and many others. On the other side are your Silicon Valley technologists: Mark Zuckerberg, Peter Thiel and any number of startup founders, as well as journalists such as Kevin Kelly.
This antagonistic divide does little to help us. Both sides have cogent arguments, but few people straddle the two. The critics tend to recognise technological advantages only begrudgingly, with an ever-present “but…” to follow, while the technologists tend to be tone-deaf, responding to humanistic problems with technology rather than anticipating them.
This is only exacerbated when, in places like FutureFest, the angle is slanted far more toward one side than the other. Pointing fingers at vague threats tends not to be a useful enterprise.
Ultimately, deliberate collaboration between technologists and critics is the only way we can smooth the bumpy present into a comfortable future.