Hey so um 👉 👈 there’s this monkey meme and it um kinda ends up being a perfect allegory for life under surveillance capitalism. The diagram shows a monkey sitting in a ‘Restraint chair’, with a ‘Response bar’, ‘Juice reward mechanism’, ‘Stimulus screen’, and its head plugged into a ‘Recording electrode’. Read with the caption ‘how you look when you say your work gives free snacks’, the picture offers an image of the worker, and of the ways our mental and physical input is fed back to us in morsels of reward, distraction, survival, and sustenance.
Take, for example, the way a workplace emails a meeting agenda – first dot point: ‘R U OK? Day campaign’ – second dot point: ‘EOIs open: Second round voluntary redundancy program’.
The monkey in the restraint chair indexes something further about our everyday technologies: the totalising dataveillance embedded in our software, and the means of escaping it feeling steadily more unthinkable. We feel this on almost all platforms, whether their core function is the workplace or the social. You feed the platform your data and your behaviour patterns, which in turn produce your ‘feed’ and shape and reshape your behaviour. An endless generative feedback loop of controlled input and output. Does this monkey offer a possible vision of the framework of working under techno-capitalism, and the kinds of complicated binds of mental and physical labour we find ourselves labouring in and out of?
Let’s take an obvious example: Instagram discovery feeds and advertisements. Every few seconds the application sends signals back to its servers reporting exactly what you’re doing. It monitors how long you hover and gaze at a post before scrolling on, whether you tap to expand an image on the discovery feed, who you’re sliding into the DMs with, who they’re following and what content they’re looking at, what links you click on. It makes predictions based on all of this. Your data is disassembled, broken down, filtered, re-assembled, re-packaged and sold. As McKenzie Wark notes, when using technologies that are experienced as ‘free’, “it is us who are being sold.”
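The kind of client-side telemetry described above can be sketched in a few lines of Python. This is purely illustrative: the field names, the functions (`make_engagement_event`, `batch_events`), and the payload shape are my assumptions for the sake of example, not Instagram’s actual API or schema.

```python
import json
import time

def make_engagement_event(user_id, post_id, action, dwell_ms):
    """Bundle one interaction into the kind of payload a client
    might send back to a server (all field names hypothetical)."""
    return {
        "user": user_id,
        "post": post_id,
        "action": action,      # e.g. "impression", "expand", "dm_open"
        "dwell_ms": dwell_ms,  # how long the post stayed on screen
        "ts": int(time.time() * 1000),
    }

def batch_events(events):
    """Serialise a batch of events for periodic upload
    ('every few seconds')."""
    return json.dumps(events)

# A user pauses on a post for ~3 seconds, then taps to expand it:
events = [
    make_engagement_event("u1", "p42", "impression", 3100),
    make_engagement_event("u1", "p42", "expand", 0),
]
payload = batch_events(events)
```

Each scroll, pause, and tap becomes a row like this; the prediction and ad-targeting machinery runs over the aggregate.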
Digital surveillance is often pardoned under the rubric of an enhanced user experience; for example, an algorithmically generated playlist curated just for you. Technology is foundationally designed on a popular illusion of complete access to unlimited information, connection, and distribution, and on an obfuscation of its political imperatives as it presents itself as neutral. But the network is incredibly good at maintaining homeostasis, centralising power and control, and commodifying time, connection, and information flows.
Juice reward mechanism
Under this digital and self-surveillance, people are encouraged and incentivised to internalise behavioural standards and learn to order their bodies. James Bridle writes: “The ability to record every aspect of our daily lives settles ultimately onto the very surface of our bodies, persuading us that we too can be optimised and upgraded like our devices.” People become profiles, avatars, data doubles, building images of themselves, filling a demand to be legible for machine reading, legible for companies, for algorithms to boost them into the mainstream.
We can see this in the way influencer culture has exploded. Take for example Honeyhouse, an ‘adult TikTok house’ in which four couples live together. Their video ‘what we do for work’ introduces each resident: a YouTuber; a fitness trainer and actor; a mindset, meditation and motivation coach; a model; a brand strategist and e-com expert. Their freely available content shows the influencers performing challenges like voting on who does the dishes, who can reply to a text first, hopscotch jumping in rhythm, book page turning, and candle blowing competitions. As McKenzie Wark writes: “Reduced to nothing but users, and our actions forced into the commodity form, our collective work and play produces a world over and against us…collective human labor made a world for a ruling class that keeps making not only itself but us in its image.”
We aren’t solely regulated by the disciplinarity of old institutions like the school, the factory, and so on anymore. As Paul B. Preciado puts it, we are also regulated “by a set of biomolecular technologies that enter into the body by way of microprostheses and technologies of digital surveillance subtler and more insidious than anything Deleuze envisioned in his famous prognostications about the society of control.” Preciado reminds us how Hugh Hefner became a ‘horizontal worker’ on his rotating circular bed in the Playboy mansion, signalling a new kind of ultra-connected pharmacopornographic disciplinary subject. Preciado draws a connection between this horizontal worker and where many of us found ourselves in the current pandemic moment: under widespread surveillance and digital biopolitical management, completely entangled and interconnected confinement to the home or to the device.
The technologies we’re using already have leaky privacy conditions, listening devices, spying cameras, biometric recognition and other data-tracking processes. As with most popular technologies, the algorithmic processes are purposefully opaque: we’re denied the right to see into the machine, articulate its processes, and understand our data flows.
Platforms such as Microsoft Teams and Slack have already built in the ability for management to functionally snoop: to view private conversations and teams. It’s becoming increasingly commonplace for software to alert management to, or help uncover, ‘underperforming’ workers or unionising employees. We’re now seeing how automation has reshaped more than just work processes; it also creates conditions in which the space, time, and means of forming solidarity are foreclosed.
Uncanny in its speculative forecasting of facial analysis computation, us+ (2013) by Lauren Lee McCarthy and Kyle McDonald shows us a sociality implicated by algorithmic improvement and observation. The work imitates a promotional video for a new Google Hangouts app that analyses speech and facial expressions to optimise and improve conversation. The app serves pop-up notifications with suggestions for changing tone or mood, or takes direct action, such as auto-muting a user as they speak.
This kind of AI offers a form of objectivity, distancing, and standardisation that becomes attractive to platforms and to the workplace. Such ‘smart’ technology grows more appealing the more granular the information it can collect about our movements and behaviour, while we, meanwhile, are conditioned to self-monitor and change our actions.
This murky atmosphere of power, technology, and labour is brought to the forefront by Kynan Tan’s works Computer Learns Automation (2020) and Production (smart phone assembly) (2019). The more recent Computer Learns Automation uses generative AI and video to animate three scenes: a rideshare, a drone strike, and a robotic factory arm. The earlier Production shows looping simulated scenes of a smartphone assembly line worked by faceless figures. Read in tandem, these artworks speak to the tension between increasingly commonplace artificial intelligence and machine learning and the very real, abstracted and separated human labour power and resource extraction behind the scenes.
Kynan Tan, Production (smart phone assembly) (2019)
In each of these instances we see regimes of assessment, prediction, and analysis, where the world we inhabit is understood as a vast field of data awaiting abstraction and possession. So-called smart devices constantly watch us, record us, and harvest information; every activity captured and extracted is a direct extension and expansion of colonisation.
Tan’s scenes of drone strikes, robotics, and ride sharing transport us to a machinic vision that is prosthetic, distant, and mediated through screens and interfaces. This optical paradigm is reductive, gamified, simulated – revealing what is commonly rendered invisible or abstracted (military violence, the distribution of goods, and human labour). It reminds us how algorithms serve a political function that benefits those in positions of power: the state and the boss. This is precisely because of their ability to defer responsibility for the actions of algorithms, and because of the illusory belief that algorithmic processes are apolitical and objective.
Kynan Tan, Computer Learns Automation (2020), 3 artificial intelligence agents: 3ch HD video, 2ch sound, generative
Harney and Moten say that to work today is “to be asked, more and more, to do without thinking, to feel without emotion, to move without friction, to adapt without question, to translate without pause, to desire without purpose, to connect without interruption.” Capitalism and algorithmic surveillance intersect to create a system that is less visible, less understood, but more violent, and more totalising.
In danger of restraining ourselves to the ‘chair’ – or of thinking the job is 💯 because free snacks – we might consider ways to steal back time, ‘our groove, our pulse, our swing… that never belonged to anyone. It’s what we share.’
This piece was originally written in August 2020.
Wark, M., Capital is Dead: Is This Something Worse?, Verso, 2019
Bridle, J., New Dark Age: Technology and the End of the Future, Verso, 2018
Wark, M., Capital is Dead: Is This Something Worse?, Verso, 2019
Preciado, P. B., ‘Learning from the Virus’, Artforum, 2020, https://www.artforum.com/print/202005/paul-b-preciado-82823
Mbembe, A., ‘Thoughts on the Planetary: An Interview with Achille Mbembe’, New Frame, 2018, https://www.newframe.com/thoughts-on-the-planetary-an-interview-with-achille-mbembe/
Moten, F. and Harney, S., The Undercommons: Fugitive Planning & Black Study, Autonomedia, 2013
Audrey Pfister (sometimes-) writes, curates, edits, makes mixes, and works in the arts. They spend most of their time across Gadigal lands and waters, and grew up on Thawaral land. They've written for Runway Conversations, RunningDog, Overland, Circle Square Paper, Flower Books and Framework journal. Audrey holds a Bachelor of Art Theory from UNSW Art & Design, and is doing Arts Honours in Media, Culture, Technology. Audrey has collaboratively produced projects such as Precarity and Possibility; a forum on labour and the arts and collectivity, Fatal Crush reading series, Kudos Live: Intimate Circulations, and Anti-Annual: Trails. @eggeplant on insta @beanartillery on Discord/Twitter