This piece is a dream collaboration with Are.na Editorial, edited by the wonderful Meg Miller.
Subscribe to the Are.na Editorial newsletter here. In my experience, purchasing Are.na Annual and reading it over a cup of coffee with a pencil in hand is a powerful trans-algorithmic spell.
Maybe you were drawn in enough to click this headline because you feel it too: a latent unease with how the world is being fed to you, and how you are being fed back to it — a feeling that researcher Shagun Jhaver calls “algorithmic anxiety.” If I linger on this video of a power wash too long, will I be confused for a power washing enthusiast? Did I buy that skirt because it suits my form of self-expression, or because I was manipulated by effective ad targeting? Do I actually believe what I believe? Or am I just a product of what platforms show me?
The most commonly prescribed cure for algorithmic anxiety is control and abstinence. Reduce the number of hours you spend on Instagram each week. Create barriers to opening Instagram in the first place. Better yet, delete Instagram entirely.
If you've found that this strategy doesn't work for you, that receiving your weekly Screen Time report feels less like moderation and more like self-flagellation, there are good reasons for that. The first is that even if you delete Instagram, you still live in an Instagram world, in which, for example, the aesthetic possibilities, pathways to fame, and social dynamics are all profoundly influenced by the platform.
But the second is perhaps even more sobering. I write a newsletter, teach a course, and run workshops all called “escape the algorithm.” The implicit joke of the name’s particularity (not “escape algorithms” but “escape the algorithm”) is that living outside of algorithms isn't actually possible. An algorithm is simply a set of instructions that determines a specific result. The recommendation engine Spotify uses to nudge you toward certain music is a cultural sieve, but so were, in a way, the Billboard charts and radio gatekeepers that preceded it. There have always been centers of power, always been forces that exert gravitational pulls on our behavior.
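To make the point concrete: stripped of scale and surveillance, a cultural sieve can fit in a few lines. The sketch below is a toy collaborative filter, purely illustrative and nothing like Spotify's actual system; all of the names and data are invented. It recommends whatever is most common among "neighbors" whose libraries overlap with yours, which is exactly why such sieves pull toward the already-popular.

```python
# A toy "cultural sieve": rank candidate songs by how often they appear
# in the libraries of listeners who share at least one song with you.
# Purely illustrative -- real recommendation engines are far more complex.
from collections import Counter

def recommend(my_songs, other_libraries, n=3):
    # "Neighbors" are listeners whose libraries overlap with mine at all.
    neighbors = [lib for lib in other_libraries if set(lib) & set(my_songs)]
    # Count songs my neighbors have that I don't, most common first.
    counts = Counter(song for lib in neighbors for song in lib
                     if song not in my_songs)
    return [song for song, _ in counts.most_common(n)]

me = ["a", "b"]
others = [["a", "c"], ["b", "c", "d"], ["e"]]
print(recommend(me, others))
```

Notice what the sieve can never do: song "e" is invisible to you forever, because nobody who resembles you has it. That structural blind spot is the gap the exercises below try to pry open.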
The anxiety isn't determined by the presence or absence of code. It comes from a lack of transparency and control. You are susceptible whether or not TikTok exists, whether or not you delete it. Logging off is one tool, but it will not alone cure you.
Instead of withdrawing, I encourage my students to dive deeper, engaging with platforms as if they were close reading a work of literature. In doing so, I believe we can better understand not only a platform's ideological premises, but also the inevitable cracks in its rigid software logic that let the surprising, delightful messiness of humanity shine through. From there, we might move beyond the flight response towards a fight response. Or if it is a flight response, let it be a flight not just away from something, but towards something.
Do your research
The first step is to deeply understand the algorithms of the platform you’re trying to escape. Use the platform with a lens towards what makes its algorithms tick and why. Form your own hypotheses about these questions:
What are some of the contexts that might influence the creation of its algorithms?
How do its algorithms work?
What are some biases that its algorithms might hold?
How do those biases manifest on the platform?
What are some outcomes those manifestations might have on its users or culture at large?
Then, get to work researching. How does the platform make money? What is the mythology around the company’s founders and origins? What is its corporate culture? When have there been accusations of bias, privacy violations, dark design patterns, or censorship? Has the platform ever been under investigation? Does any documentation exist of its design philosophy or how it ranks certain pieces of content above others?
Make a map
Next, exhaustively document every possible path a user can take on the platform. It doesn’t matter how — you can use a sketchpad or folder full of screenshots. Pay extra attention to the user behaviors you usually ignore, and to these more oblique paths:
Settings page + advanced settings
About pages
Paths that you take to end up on the platform in the first place (for example, from a Google search)
Behaviors that you might take when first creating an account
Advanced search operators or advanced search pages
External tools for searching the platform
External tools for navigating the platform
Sitemaps
Ways that the platform might be navigated “randomly” (has anyone built a random navigator?)
The URLs of the pages you visit (does their structure reveal anything that can open doors to navigating by direct URL entry?)
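That last question, about "random" navigation, can itself become a small tool. The sketch below assumes a hypothetical platform whose URLs follow a simple numeric pattern; real platforms vary, and you should check the actual URL structure (and the site's terms of service) before probing anything. The point is only to show how little code a "random navigator" requires once a map reveals a guessable pattern.

```python
# Sketch of a "random navigator" for a platform with guessable numeric URLs.
# The template and ID range here are hypothetical placeholders -- substitute
# whatever pattern your own mapping exercise actually uncovered.
import random

def random_urls(template, id_range, n=5):
    """Fill the URL template with n randomly sampled IDs."""
    return [template.format(id=random.randrange(*id_range)) for _ in range(n)]

for url in random_urls("https://example.com/post/{id}", (1, 1_000_000)):
    print(url)
```

Most of the IDs you land on will be dead links or forgotten pages, and that is precisely the appeal: you are sampling the platform uniformly, rather than through its engagement-weighted front door.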
Write your own algorithms
Now the fun part! Using your research and your map, write your own algorithms for your platform. These should be a set of simple instructions for a user to follow in order to navigate the platform.
The goal should be to uncover some less trodden paths that will help guide your target user to places they wouldn’t otherwise discover. These detours might reveal:
Content outside of your usual location/interests/social network
Content with little engagement
Deleted content
Old, forgotten content
Glitches in the platform
Invisible labor that powers the platform
People misusing the platform
People vandalizing the platform
Anything that typically goes unnoticed
Once you've written your algorithms, have some friends test them out and keep a record of what they find. Iterate; try again. Document the things that feel most surprising or compelling.
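A user-facing algorithm like this can stay on paper, but it can also be encoded directly. The sketch below turns one hypothetical detour recipe ("search for a random everyday word plus an old year") into a query generator; the word list, year range, and example search URL are all invented for illustration, not drawn from any real platform.

```python
# One hypothetical detour algorithm made concrete: pair a randomly chosen
# everyday word with an early-platform-era year to surface old, forgotten,
# low-engagement content outside your usual feed. All values are illustrative.
import random
import urllib.parse

WORDS = ["lighthouse", "casserole", "accordion", "mildew", "carpool"]

def detour_query():
    word = random.choice(WORDS)
    year = random.randint(2006, 2012)  # early uploads are often forgotten
    return f"{word} {year}"

def search_url(base, query):
    """Append a percent-encoded query to a platform's search URL."""
    return base + urllib.parse.quote(query)

print(search_url("https://example.com/search?q=", detour_query()))
```

Handing a script like this to friends makes the testing step above repeatable: everyone runs the same instructions, but chance carries each person somewhere different.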
Release
Resisting the paths most traveled invites us to look at the platforms we use with a critical eye, leading us to new forms of critique, making visible parts of the world and culture that are out of our view, and inspiring entirely new ways of navigating the web.
Take Andrew Norman Wilson’s ScanOps, a collection of Google Books screenshots that include the hands of low-paid Google data entry workers, or Chia Amisola’s The Sound of Love, which curates evocative comments on YouTube songs. Then there’s Riley Walz’s Bop Spotter (a commentary on ShotSpotter, gunshot detection microphones often licensed by city governments), a constantly Shazam-ing Android phone hidden on a pole in the Mission District. Or consider my own experiment, IckTok, in which I trained a TikTok account to show me videos I would hate, in order to better understand what it means to confuse engagement with user interest. A similar spirit of inquiry led to The Man Cave Up in the Sky, where, entranced by photos of man caves unearthed by students in my workshop, I attempted to search for a liberatory gender politics within them. Exhausted by Spotify’s constant pandering to their existing tastes, another group of students designed an app specifically focused on discovering new genres of music. Other projects are as unserious as they are evocative, like EyeChat, a Chatroulette clone but only for eyes. There are also full standalone products like Marginalia, a search engine specifically focused on non-commercial content.
Not all of these projects follow the exact set of methods described, but they all represent a level of attention and intentionality that navigating the internet doesn’t usually afford.
If the most common metaphor for algorithms is following a recipe, then the goal shouldn’t be to stop making food, but to become so attuned to the shape of cooking that we can make substitutions and eventually, our own recipes. If we’re bold enough to question the maps we've been given, we may discover a world richer, stranger, and more alive than anything an algorithm could drop at our feet.
Header art by me, using images by Albert Bierstadt (The Met) and David Grandmougin