An interview with Julia Bell

Published in Cultural Commentary

Fran Lock interviews Julia Bell

Background

Julia Bell is a writer and Reader in Creative Writing at Birkbeck, University of London, where she is the Course Director of the MA in Creative Writing. Her recent creative work includes poetry, lyric essays and short stories published in the Paris Review, Times Literary Supplement, The White Review, Mal Journal, Comma Press, and recorded for the BBC. She is the author of three novels with Macmillan in the UK (Simon & Schuster in the US) and is co-editor of the bestselling Creative Writing Coursebook (Macmillan), updated and re-issued in 2019.

She is interested in the intersection between the personal and the political, and believes that writing well takes courage, patience, attention and commitment. Radical Attention is Julia's latest book and is available from Peninsula Press.

*

FL: Thanks so much for agreeing to talk to me about your new book, Radical Attention. This essay is already garnering praise for its chilling and clear-sighted account of our collective internet addiction, and how this addiction is manipulated. The book makes an eloquent case for a sustained and tender regard in which to hold the world and each other, which stands counter to the instrumental indifference of our transactional economy. I wonder if you could start off by talking a little bit about the idea of 'radical attention', particularly in relation to Shoshana Zuboff's notion of 'radical indifference' as it applies to social media monopolies like Facebook and Twitter?

JB: It’s become quite clear to me that the interests of late-stage capitalism have diverged quite sharply and catastrophically from the interests of most humans and the planet. One of the most evident examples of this can be seen in the way that the social media monopolies have built their empires on the attention and the behavioural data of their users. Human attention and behaviour are now the product being sold. To begin with, I think, we used their platforms in good faith, as a vehicle for socialising. But over the years these platforms have also begun to socialise us. They trap us in echo chambers of the information the companies perceive is most likely to appeal to us, and adverts which have been microtargeted by companies who pay to have access to that information. We don’t choose what we see. The algorithms are built in such a way as to feed you more of what you want, so it doesn’t matter if it’s a picture of your cat or a suicide note – as long as you’re engaging, it will keep feeding you more of the same.

In such a context I wonder how much control we actually have over what we look at and think about. If you spend three or four hours a day on your smartphone, what are you actually doing with your time, and by extension your life? Was that leisure time or did it make you anxious, outraged, afraid? I suppose I’m taking an autoethnographic approach to consider how these changes have affected me and my friends, but also the social and political environment around me. Radical Attention was my attempt to step outside the Attention Industrial Complex to see what is actually going on. I want to encourage others to do the same.

FL: One of the things that occurred to me while reading was that the term 'machine' stands equally for the technologies we use and the systems that drive and deploy them. When you write about dazedly losing yourself, “zombified by the machine”, I find myself interpreting this in a couple of ways. Firstly the machine is the literal device, the screen that mediates our experience of the world and captures our attention. Secondly, it is also capitalism itself, the corporations and institutions that vie for this attention in order to keep us engaged, enraged, consuming and competing. As a person who experiences a great deal of unease about the enmeshment of social media and late-stage capitalism, I wonder if you see them as in any way separable? Or is exploitation itself part of the hardware?

JB: Steve Jobs said that technology should be a ‘bicycle for the mind’ and as an early adopter in the 90s I was thrilled by the potential of technology and the web – the possibilities of making publishing easier and cheaper for example, or breaking the monopolies of the music companies that kept such tight control over the copyright of artists while creaming off huge profits, etc. I’m not sorry that we have much easier ways of disseminating knowledge, music, film, writing, art – for people to have access to the means of production. It has improved diversity. It means so many more people can have a voice. And I think there is huge potential in tech to be put to use solving some of the pressing issues around the climate and so on. Smartphones are amazing inventions in many ways.

So, I’m not anti-technology at all, but I am anti the current enmeshment of tech companies with an increasingly dark version of libertarian capitalism. The way the companies have grown into these disruptive, monopolistic behemoths, with little or no regulation, which are now making eye-watering amounts of profit – especially the social media monopolies, which pretend to be a reflection of society when actually they are increasingly a means of socialising it into various new forms. Also, this has happened in a place where we have no jurisdiction, and yet this technology has an increasingly huge effect on the quality of my life. I remember thinking in the 90s when I first started using the net – What will all this be for? It seems the people with the capacity and the imaginations have made something very big and revolutionary out of it, but it has become way too centralised and ordinary people have become increasingly locked out of the conversation. There are us – the users – and then a very small elite who are the coders, and we have to live in the world they have built.

FL: I ask because the passages in Radical Attention about Silicon Valley cynicism really struck a chord with me. Nir Eyal writing that noxious book on how to manipulate others through technology, then later publishing a self-help manual for those wishing to take back control of their hijacked attention, felt particularly chilling. I recalled that at the start of the year I was at an arts and performance event in London where one of the participants had designed what was essentially a baffle for Alexa: a kind of cyberpunk face-mask that anonymised and distorted speech. I made myself wildly unpopular by suggesting that a simpler solution would be not to buy Alexa in the first place. I've always felt like capitalism's shtick is to break our legs then sell us crutches, so I was mentally cheering to see this feeling so incisively evidenced and articulated in your essay. In particular, you describe the growth of “mindfulness” and self-soothing industries originating from Silicon Valley as the flip side of endemic distraction. I wonder if you could speak a little bit about that, and share any thoughts you might have on the sudden explosion in popularity of online and app-based pseudo-therapies?

JB: I agree about Alexa – mine is unplugged in the shed after it started talking to us in the middle of the night. It was a gift, I might add, which very quickly became a sort of faded novelty. But it’s another example of the way in which tech becomes ubiquitous and then starts to spy on us. I think in time goods and internet services will need some kind of mark of quality, enforceable by law, which promises to protect your privacy.

The pseudo-therapies issue also interests me – it’s worth noting that the QAnon conspiracy spread through wellness communities. People feel very uneasy at the moment for quite obvious reasons and they want definitive answers for their unease. There is a lot of snake oil being peddled on the internet and again, I don’t think the companies are interested in whether your therapy works or not, as long as you're prepared to pay for advertising.

FL: I'm highly conscious that when I write critically about social media and digital technology, those platforms are often the sites of first reception for that very criticism, and that there's always a danger of coming across as hypocritical or judgemental. I think one of the most refreshing things about Radical Attention is its deep acknowledgement of your own implicatedness, a reckoning with which would seem to be the absolute prerequisite for any kind of meaningful resistance. Was this reckoning difficult for you?

JB: Yes, and it still is. I feel like, without a major publisher or what is left of UK mainstream media behind me, being able to disseminate this book on social media and be part of the conversation is important. I think social media is another arena where we are asked to perform versions of ourselves for profit. Late capitalism atomises us into individual units of consumption, parsed still further by all the data they have on us. So everyone is scrambling for the latest ‘hot take’; there is a sense of a frenzy, sometimes, of people shilling their ideas. I am of course one of them. I will share this interview on Twitter and FB. What else can I do?

The flip side of this is then controlling my own social media use, and so on. Just being aware of using it, rather than letting it use me. I think one of the key issues is around feeling. If I’m especially tired or vulnerable it’s very easy to slip into things like ‘hatescrolling’ or ‘doomscrolling’ where my feelings are suddenly amplified by seeing so many stories about the same thing. It’s always worth thinking – how does this make me feel? If half an hour on Twitter leaves you exhausted and despairing rather than informed, it’s surely worth asking what the hell it’s good for. Whenever I take extended breaks from social media it’s interesting how much less anxious I feel.

FL: Related to my previous question, do you feel that we are so saturated, even at the level of language, by the logics and rhetoric of capitalism, that some form of complicity is inevitable? And if that's the case, how do we meaningfully manifest any kind of resistance? For example, is going off-grid a useful strategy? Are the technologies we use and the ways we use them even susceptible to subversion?

JB: Of course I could go without it altogether, but it’s increasingly difficult to do that. People who don’t connect in this way do miss out, I think. It’s important for resistance too. There are some interesting versions of subversion – the K-Pop TikTok fans who bought tickets to the Trump rally and never showed, for example, or certain flashmobs. BLM emerged from the internet: the video of George Floyd spread at speed through the networks, sparking a huge moment of resistance. The problem is really that resistance often only works at scale, when everyone joins in. The pressure on the government to change course over free meals in the holidays is an interesting example of internet pressure paying off. What happens online becomes news and forces change in real life. So the desire to cancel certain speakers – I hesitate to call it ‘culture’ – comes, I think, from this impulse to see the results of online political pressure played out in real life.

FL: Sorry, that was quite a lot in one go, but these thoughts have been very much on my mind since lockdown. In Radical Attention you write about lockdown as a moment of illumination, one that demonstrated how interconnected we really are, and how much we need one another. I wonder to what extent you feel that it also exposed the paradox at the heart of our social media compulsions: that the very technology we use to escape our isolation is, in many subtle ways, damaging our ability to relate to one another in anything other than transactional or oppositional terms?

JB: The problem with ‘the machine’ (and you rightly point out I use the term interchangeably at times for the system as well as the smartphone and the software which runs on it) is that it runs on binaries – zeroes and ones – whereas humans are fractional. Humans live in grey areas which are not black and white.

Social media forces us to create and then perform versions of ourselves for profit, so we are always on display. ‘I’m like a cartoon of myself,’ Paris Hilton says somewhat tragically in a new documentary, which seems at the same time to be asking us to psychoanalyse her because she can’t do it for herself. Hers is an interesting example of a life stunted by its own performance. A cure for this endless exhausting narcissism surely has to be a kind of radical attention for something other than the black mirror of the smartphone screen.

FL: This question of relation is a recurrent theme across the book, and it seems to me to be at the heart of what radical attention is and does. You take great care throughout the text to highlight the physical impacts and consequences of the virtual realm. In places you describe a kind of slow persistent atrophy in the realm of the real: the slump, hunch and stare of bodies bent over phones; a skewing in our systems of perception so violent that it prevents us from recognising our Facebook 'friends' and online adversaries as fully human. One of the book's most significant challenges appears to be to this notion of 'transhumanism' as somehow utopian or liberating. You suggest that the opposite is true, that an unwillingness to acknowledge or attend to the bodies of others is a function of privilege. You state that “real bodies are problematic”. I wonder if you could elaborate on that, and the importance of remembering and attending to their complexities?

JB: Belief in transhumanism is a dodge, like planning colonies on Mars. It’s a bit like running away from the scene of the crime, rather than putting energy into the here and now. Developments in medical tech might well produce some kind of extraordinary cyborg, but this isn’t going to solve the issues that are in our face right now, which are biological, and by extension ecological. They are physical, embodied issues. The planet is trashed and dying. So are we. The question is, what are we going to do about it? I also think the pandemic reveals the limitations of the technology. It can never replace the physical presence of another person. And COVID has also put us in a situation where we are going to have to live with a great deal of uncertainty. For the privileged, this is a new and unwelcome reality, but for a lot of people it’s a familiar kind of instability.

I would say the last ten years have been about the mental zombification of a populace – the internet got mean, sinister. Donald Trump and Brexit didn’t come from nowhere; the social spaces were overwhelmed with bad actors – military-grade psy-ops, along with the amplification of outrageous actors like Hopkins and Farage. It’s worth asking who paid for those Leave adverts and what was going on behind the scenes, as journalists like Carole Cadwalladr are doing. Who does Brexit actually benefit and why did they spend so much money persuading us that a catastrophe was a good deal? I don’t think we’ve any clear answers to these questions and the whole situation was made murky and surreal by the proliferation of misinformation online.

FL: Following on from my previous question, one of the things that really stood out for me was your reading of Simone Weil who wrote that “attention, taken to its highest degree, is the same thing as prayer. It presupposes faith and love”. This struck me so forcibly because so much of my own reading and writing recently has been around ascetic practice, and the sustained, often painful attention to the suffering of others that such practices demand. There is a kind of fudged modern reading of ascetic practice that presupposes a withdrawal from the world and a turning in toward the self, whereas the opposite is true: the anchorite is asked, as Weil asks of us, to “renounce our imaginary position at the centre” and to  fully apprehend the 'other' without distraction, sentiment, or hope of reward. To write about faith, love and the soul in a contemporary essay has often felt like a risky move. What I sense from Radical Attention is that these terms themselves have great radical and resistive potential. I wonder if you have any thoughts on how we might approach and potentially reinvigorate words and concepts that so many view with suspicion, or that have been so effectively colonised by pseudo-spiritual industries and destructive religious hegemonies alike?

JB: We got rid of religion without thinking about the place it took in society as a space for moral and spiritual questions and, crucially, care. I’ve always had a problem with organised religion – in my view it’s always been on the wrong side of history in terms of money and sex. The church could be a space which enacts a kind of radical care and stops bothering about what consenting adults do in bed. But the C of E is too compromised by its allegiance to the state – after all, it was founded to allow Henry VIII to marry his next wife. That aside, we do ourselves a disservice as humans if we throw off the spiritual and philosophical questions humans have had for millennia, especially in relation to our aliveness and our place in the world. Denying that we are in some ways questioning, spiritual, even moral beings is at the core of a lot of anxiety. It’s not about having answers – this is quite clearly where madness lies – but acknowledging that we don’t know, and that even without answers the questions are still valid, fashionable or not.

I also think we need new (old) language to speak against what seems to be a new kind of moral barbarism. The level of lying in the political sphere makes a mockery of the very idea of public service. What does it really mean to be a good person? What does it mean to show courage or to love someone? Where are our examples of good people? We’re surrounded by man-babies who are busy trashing everything. Healing from the damage they are causing is going to take a huge rethink in terms of what we actually value as a society.

FL: One of the things that surprised me the most about Radical Attention was the image of humanity that emerges: not feckless or desensitized, but vulnerable and deeply wounded. It would seem that our devices simultaneously insulate us from the horrors of the world, and expose us to those horrors. We become trapped within a self-referential feedback loop of our own making, unable to connect to others; we are endangered both by our own obliviousness to our surroundings, and by our infinite accessibility to the forces of neoliberal surveillance. We are phone-jacked, or data-mined, or we selfie our way over cliff edges and into oncoming traffic. The selfie deaths really got to me: the fact that there's a Wiki page for that kind of blew my mind, as if even those deaths are sucked back up into an endlessly scrolling textureless meld of data. I wonder if you think that, living such disconnected and technologically mediated lives, we have lost or refused our sense of ourselves as mortal beings? How might the kind of radical attention you advocate help us to recapture that sense?

JB: This is the critical message of the book. I think our mortality – which is one of the key conundrums of being human – is cheapened by social media and is one of the issues I wanted to encourage the reader to address. The shadow of death passes over us nightly in the middle of a pandemic. It’s one of those clarifying events that reveals what is important. The difficult thing is getting in touch with our feelings about this and turning that into action.

FL: I'm aware that this has been a very long and quite dense set of questions, so I have one more, and then that's it. I notice that throughout the essay you draw upon and quote from various works of fiction.  Fiction requires of both writer and reader a bestowing of non-trivial attention. As a writer of fiction yourself, and as someone who teaches creative writing, how has technology shaped the writing practices of this current generation, and do you think there is anything to be learnt from the models of attention espoused by the writers of creative fiction?

JB: Good writers are good observers of the world – they pay attention. They walk around the world on high alert. It’s this practice that I want to teach students. It’s what I tried to do when I wrote this – to give my attention, for a concentrated period of time, to one question: what technology was doing to me. And then to use these observations as evidence for an argument. I’m coming at the subject not as an expert at all but as a writer in the world, an observer for whom attention is the most important part of the practice. The world was feeling unreal and weird and I wanted to figure out why.

As for fiction specifically, I think one of the reasons that the structures of social media seem so clear to me is that in writing classes we are always trying to work out how to create affect in the reader. How to place the character in relation to the reader to create the best experience. How will the story carry? What is the best way to provoke surprise? Horror? Fear? Storytellers understand the human need to make patterns from chaos. How far we can push language, structure, truth before the story breaks. These skills are useful, it seems, in decoding some of the fake news and deliberate outrages that have become part of our daily lives.


A people's algorithm? Facebook and the rise of surveillance capitalism

Tom Walker discusses how corporate profiteers are capturing a vital social resource – information about ourselves. How can we de-commodify our everyday lives and even our resistance? The accompanying image is 'Instruction', by Alix Emery.

‘You don’t get to 500 million friends without making a few enemies,’ as the Facebook movie The Social Network put it. Everyone the social media giant has ever crossed has joined the pile-on recently, as Facebook CEO Mark Zuckerberg faces calls to come to parliament and explain himself. (Keep your pies at the ready.)

But ‘Zuck’ is just the smirking face of a much wider issue: the way the web has been captured by corporate profiteers who make their money from selling a simple product: you – or, more precisely, your data. The biggest technology firms, responsible for an ever-growing share of the world’s billionaires, follow the Silicon Valley mantra that ‘data is the new oil’ – and the apps and websites you use every day are the extraction method. They are not monitoring you for state-style social control, but for profit: surveillance capitalism.

Surveillance capitalism, as academic Shoshana Zuboff has defined it, is ‘constituted by unexpected and often illegible mechanisms of extraction, commodification and control that effectively exile persons from their own behaviour while producing new markets of behavioural prediction and modification’. What does data extraction mean in practice? Let’s return to Facebook. Download your Facebook data and you’ll see that, behind the photo slideshows, cartoon smiles and birthday wishes, its app has been slurping every bit of data it can get its hands on, from the location of your phone to who you’ve been calling and for how long. After all, what’s a little data between friends?

But it’s not just about Facebook – and not just about targeted advertising, either. Google knows everything you search for, and when you stay signed in to Gmail, it knows who made the searches. That’s not so unexpected, but I was surprised to find recently that Google had been keeping track of everywhere I’ve been for the past five years. I must have said yes once when prompted by its Maps app with some explanation about ‘improving the experience’, and that was enough for it to graciously keep track of my every footstep from then on. 

Twitter, meanwhile, decided to make a list of every app I have installed on my phone, to show me ‘more relevant content’. Netflix builds preference profiles based on what you watch, and then uses this data in aggregate to create entire new original TV shows, right down to the cover images, micro-targeted at sections of its audience. Amazon keeps track not only of what you buy, but everything you search for and look at – it’s all grist to its marketing mill. Every company can use what it learns from millions or even billions of people not just to target ads but to make decisions that will let it grow faster than the competition – so over time, the most data-driven firms come to dominate.

As computer security expert Bruce Schneier writes, the smartphone is ‘probably the most intimate surveillance device ever invented. It tracks our location continuously, so it knows where we live, where we work, and where we spend our time. It’s the first and last thing we check in a day, so it knows when we wake up and when we go to sleep. We all have one, so it knows who we sleep with.’

These ubiquitous ‘devices’ – the telescreens in our trouser pockets – arrived not as a state-enforced requirement, barking orders at us in the manner of an outwardly oppressive apparatus, but as our ever-present assistants, always keen to help us, and to help themselves to a little more of our data so that they might give us ‘better recommendations’. In the future rapidly approaching, when you have an automated home, a self-driving car and a city full of internet-connected sensors, their makers will be watching you too, unless we can change the path we’re on.

The never-ending experiment

People have heard of ‘algorithms’, those annoying things that mean your social media news feed does its best never to appear in chronological order. But ‘algorithm’ is a soft term for what is really going on: machine learning – cutting-edge artificial intelligence – is being trained on all the data extracted from massive populations every day. Every time you scroll on Facebook, hit the heart button on Instagram or watch a video on YouTube, you are taking part in the latest round of a never-ending worldwide experiment.

You are like a rat in a maze, with a machine showing you a stimulus, noting your response (and everyone else’s response), and then showing you another. Oh, looks like that one made you feel angry! But the notification you got afterwards clearly stroked your ego. Interesting…

The machine is not attempting to make you happy – though, to be fair, it is not attempting to make you sad either. Its aim is what is called ‘engagement’: in other words, to keep you running around inside the maze for as long as possible each day, every day, for the rest of your life. Why bamboozle billions of people like so many rodents? Because every minute you spend ‘engaged’ racks up another few fractions of a cent in corporate profit.

Tristan Harris, a former Google employee and founder of the Center for Humane Technology, argues that apps’ feeds hook into the same parts of human psychology as gambling does: ‘When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got… If you want to maximise addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward.’ Such apps are essentially Skinner boxes.
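
To make that mechanism a little more concrete, here is a minimal, purely illustrative sketch in Python of the kind of loop described above: show an item, measure how long it holds attention, and keep serving more of whatever scores best, with a small element of chance thrown in. The topic names, the dwell times and the EngagementMaximiser class are all invented for illustration – real recommender systems are vastly more sophisticated – but the objective being optimised is the same: time spent ‘engaged’.

import random

# A toy, hypothetical "engagement maximiser" - not any real platform's code.
# It only ever sees one signal: how long each kind of content held attention.

TOPICS = ["cats", "outrage", "gossip", "news"]    # invented categories

class EngagementMaximiser:
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon                    # small chance of trying something new
        self.total_dwell = {t: 0.0 for t in TOPICS}
        self.shows = {t: 0 for t in TOPICS}

    def pick_topic(self):
        # Mostly exploit: serve the topic with the best average dwell time so far.
        if random.random() < self.epsilon or all(n == 0 for n in self.shows.values()):
            return random.choice(TOPICS)          # occasionally explore (the "variable" part)
        return max(TOPICS, key=lambda t: self.total_dwell[t] / max(self.shows[t], 1))

    def record(self, topic, dwell_seconds):
        # Engagement is the only metric; what the content actually is doesn't matter here.
        self.shows[topic] += 1
        self.total_dwell[topic] += dwell_seconds

def simulated_user(topic):
    # Invented stand-in for a person: outrage happens to hold their attention longest.
    mean_dwell = {"cats": 5, "outrage": 20, "gossip": 8, "news": 10}[topic]
    return random.expovariate(1 / mean_dwell)

feed = EngagementMaximiser()
for _ in range(1000):                             # the "never-ending experiment", truncated
    topic = feed.pick_topic()
    feed.record(topic, simulated_user(topic))

print({t: feed.shows[t] for t in TOPICS})         # the feed drifts towards whatever engages most

Run it and the counts skew heavily towards whichever topic keeps the simulated user on the page, without the code ever ‘knowing’ or caring what that topic is – which is exactly the indifference described above.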

The attention rat-race is what causes all of the presumably unintended consequences that are more visible on the surface, from Facebook fake news farms to the creepy, often auto-generated kids’ videos clogging up YouTube. The tech giants’ money-bots spray out audience traffic and ad cash in the direction of anyone who produces content that captures attention – what that content might be is at best a secondary concern.

What ‘your feeds’ show you, and in what order, is no longer under your control. That doesn’t make them some kind of mind-control machine, but it does mean that, over time, the decisions of a large population could be influenced in subtle ways – as Facebook found in a 2014 study where it was able to influence users’ emotions.

Cambridge Analytica, according to whistleblower Chris Wylie, used psychological profiling to ‘know what kinds of messaging you would be susceptible to’ – so, for example, ‘a person who’s more prone towards conspiratorial thinking’ might be shown ads that play to that mindset, helping an idea to spread by starting with a ‘group of people who are more prone to adopting that idea’. Without boarding the bandwagon of blaming Facebook for all political ills, it doesn’t seem so inconceivable that a large enough advertising campaign with that kind of targeting could influence an election – or, say, a referendum – by a crucial few percent. As the whistleblowers’ evidence highlights, such methods have been quietly in use in the global South for several years.

A people’s algorithm

But while Brexit and Trump – and the Cambridge Analytica affair’s contribution to the ever-growing web of connections between the two – have catalysed new interest in how our data is being collected and sifted, any potential solution has to start much further back, before the data was originally gathered. These data-mongers are no geniuses: they stumbled across Facebook’s data goldmine and filled their boots. The point is that such a motherlode should never have existed in the first place.

At its best, social media has provided an important platform for alternatives to the mainstream media, allowing people to spread the word about protests and grassroots events, giving a voice to people previously marginalised and ignored. This is surely one of the factors behind the emergence of Corbynism. The question is: how can we decommodify our everyday interactions – and even our resistance?

This article is republished from Red Pepper.