An immersive film installation combined with a social experiment, interrogating the rise of fake news and large-scale conspiracy theories such as QAnon, and how they can affect individuals and sometimes even alienate them from the person they used to be. Our piece focuses on how social media contributes to the breakdown of trust in science and mainstream news, and how people can find a false sense of community in online spaces, where like-minded ‘outsiders’ can express themselves freely.
Audience experience:
The audience would be led into a dark room and guided to their seats. All phones and communication with the outside world should be switched off. After a while, a vertical screen switches on in front of them, evoking the impression of a phone. The film plays, simulating the experience of scrolling online: an unseen person clicks on provocative articles, switching back and forth between them in a web browser. This makes the audience spectators to an individual’s online experience; the figure on screen acts as an audience surrogate, leading viewers to potentially identify with the “person using the phone”.
The idea is to plant breadcrumbs through provocative and contradictory articles in order to question where opinions come from, delving into how conspiracy theories originate and take hold. Through this, we would like to question the authenticity of opinion and whether such a thing truly exists. Are not all opinions reactions to reactions, to something around us that is either proven by science or still being researched, much like an argument in a Twitter thread? Questions of this kind are what we would like to put to the viewers after the screening finishes and the lights come up.
The doors will be barred shut and a microphone on a stand will be placed in front of the audience, letting them know that the piece is not over yet. The audience are then left to work out for themselves whether they are supposed to speak to each other and use the microphone. This will go on for 20 minutes regardless of whether anybody speaks or not. No one will know when the piece is supposed to end until the doors are opened again.
Approach:
Our piece uses moving image as a tool to create an immersive experience, without the audience even having to move from their seats. The simple design of a vertical screen, mimicking the shape of a phone, is instantly recognisable and creates a sense of familiarity: who hasn’t spent hours of their time scrolling in the dark?
There is almost a sense of invasiveness and voyeurism in getting a glimpse of how someone behaves on their phone in private. You can tell a lot about a person from their search history. One’s devices are usually a place of safety and privacy, but also of danger, as algorithms can lead users down dark paths before they even realise it.
This piece is in collaboration with Arwel Rees-Kay.