My first-hand account of training TikTok to steal my essence
Any sufficiently advanced algorithm is indistinguishable from magic
Algorithms guide many aspects of our lives, from our everyday choices (what movie to watch, what verb tense to use while writing) to the decisions that others make about us (who is fit to be paroled or who will be filtered out of a job search). But lately, one digital platform algorithm has received a lot of attention for what it's capable of, both for good and for ill.
Journalists are digging into what makes TikTok recommendations so powerful
Josh experiments to see if TikTok can "figure him out" from 2 hours of watching
Wilfred reflects on a notable analog recommendation engine: the school librarian
Open TikTok for the very first time, and within minutes the "For You" page is recommending short videos. The only navigational tools are the kind you might find in a dating app: swipe to skip to the next video, or tap and hold to choose "Not interested" and see fewer videos like the one you're watching.
Once uploaded, every video is automatically labeled with attributes. Once the app starts observing which videos you watch and how long you watch them, your watching habits can be matched to the attributes of the videos you watch. Very quickly, the algorithm can gain a spookily accurate perception of your interests, and maybe more than that. Both Reply All and the Wall Street Journal recently investigated TikTok's recommendation engine. Reply All focused on how the For You page can seemingly unearth thoughts from deep within one's psyche. New Yorker columnist Kyle Chayka has said, "I believe the For You page is the haven of our deepest secrets, the true algorithmically determined root of our identities, and thus should be kept private: a For You Only page."
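To make that attribute-matching idea a little more concrete, here's a purely illustrative Python sketch. The video labels, lengths, and watch times are all made up; TikTok's actual system is proprietary and certainly far more sophisticated, but the basic move of turning watch time into an interest profile could look something like this:

```python
from collections import defaultdict

# Hypothetical catalog: each video carries labels and a length in seconds.
videos = {
    "v1": {"labels": ["cats", "comedy"], "length_s": 30},
    "v2": {"labels": ["cooking", "hacks"], "length_s": 45},
    "v3": {"labels": ["cats", "art"], "length_s": 20},
}

# What the viewer actually did: (video_id, seconds watched).
watch_log = [("v1", 30), ("v2", 5), ("v3", 20)]

# Build an interest profile: credit each label by how much of the video was watched.
profile = defaultdict(float)
for video_id, seconds in watch_log:
    video = videos[video_id]
    completion = min(seconds / video["length_s"], 1.0)
    for label in video["labels"]:
        profile[label] += completion

# Rank new candidates by how well their labels match the profile.
candidates = {"v4": ["cats", "food"], "v5": ["pranks"]}
ranked = sorted(candidates,
                key=lambda vid: sum(profile[label] for label in candidates[vid]),
                reverse=True)
print(ranked)  # ['v4', 'v5']: the cat/food video wins for this viewer
```

Even a toy like this shows why watch time is such a powerful signal: you never have to tell the app anything. Lingering on a video is enough.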
On the podcast episode, a Reply All producer interviewed her sister, who has a unique biological condition: she can't burp normally. She had never met another person with the same issue. And yet somehow, the TikTok algorithm served her up a video by another woman who also could not burp. For the sister, the video made her feel truly seen and, for the first time, able to connect to others with the same concerns.
The WSJ took a different approach. They trained the algorithm with over a hundred bots that went down rabbit-holes and in some cases ended up in disturbing places. For example, a bot programmed to be interested in "sadness" and "depression" was served up more and more of that content, with no limits or reservations, until almost everything on the For You page was sad and depressing. And this process did not take as long as you might think: "TikTok fully learned many of our accounts' interests in less than two hours. Some it figured out in less than 40 minutes."
When reached for comment, a TikTok representative responded that real users have more diverse interests than programmed bots. I've never used TikTok, but I'm a real person with diverse interests. Could the app figure me out in two hours? And so, despite Chayka's warning, I started down a rabbit-hole of my own, chronicled below. Over the course of a few days, I spent the length of a feature film on the app. I did not list any interests, like any posts, or follow any accounts. As far as I know, all I gave the app to work with was my genuine attention, my IP address, and a lot of swiping.
Curiouser and curiouser!
First half hour:
I know from the WSJ's reporting to expect that, at first, the app will serve up a lot of very popular videos (millions of likes) that have been pre-screened by human moderators. My first impression is that there are a lot of ads! At first blush, it doesn't seem like there's a lot for me here. I see one video clearly from Maryland (I'm in D.C.), but that's the only sign that they're using my IP address. The first interesting thing I see is a Jack Black video. I've loved his band, Tenacious D, since high school. But more than likely, this is a coincidence: he's a movie star with a huge presence on TikTok. After a half hour, TikTok seems to know I like cats, art, and food (I have two cats and I've worked in art and food), but so do millions of other people in the world.
After an hour:
I decide to test the Not Interested button because I'm seeing a lot of prank videos that joke about violence, which I don't like. I use the button and right away that kind of thing seems to vanish, seemingly for good. Now I'm occasionally starting to see videos with fewer than 100,000 likes, and the videos are getting a bit weirder, in a good way. Towards an hour in, I'm finally getting away from constant cat videos. I'm surprised I haven't seen any drawing yet because I'm an artist, but maybe that's asking too much. I'm wondering if this experiment will work.
After an hour and a half:
A video begins and a man says, "so I started working with Mexicans today and…" I decide in the moment to use the Not Interested button again, and it's robust enough to keep me from seeing more stereotypical race-based humor, which is what I was going for. There are more art videos coming now, with a lot of painting. A lot of the early cooking videos I was getting were microwave hacks or instructions for making dips with whole blocks of cream cheese. Now I'm getting recipes for things like pickled broccoli stems, which is closer to the experimental home cooking I like to do. I also get my first video about urbanism, a topic I follow closely. But there are still some off signals: I'm getting a lot of ADHD and depression/anxiety content, which honestly is not right for me. I really enjoy a video about "chaos gardening." Surprisingly, the cats are completely gone, and I miss them a little. It's still not exactly "me," but it's way closer than before!
After two hours:
I'm getting a lot more art than before, including watercolor, which I paint, and even a video referencing Studio Ghibli (I love those movies). It's satisfying to be shown kinds of skilled art and craftsmanship I've never seen before. I see my first political video, about #StopLine3, an issue I know very little about but am interested in. I'm also getting tiny house and DIY content that I might not have picked, but if I'm being honest, I really enjoy it. There are videos showing the kinds of things I've seen recommended many times on Kottke, my favorite blog. I'm also now regularly seeing videos made by far less popular users, some with only a few hundred likes. There are still videos I'm not crazy about, like skits where teens do an impression of their mom, but overall I'm enjoying what I'm seeing a lot more. I'm somewhat surprised that some of my longest-held professional interests, like cartooning and cheesemaking, haven't surfaced, but I have no doubt that they would eventually.
At the end of this experiment, I am enjoying how much genuine creativity there seems to be on TikTok, as well as videos that I'm really happy to see or that make me laugh. But now there's one dominant feeling: I want to watch more. I think that's the important point: not only can TikTok be addictive, it may serve you more and more of whatever it thinks you want, even if that's terrible for you, like misinformation or self-harm. The further you get into your For You page, the further you are likely getting from human-moderated content. I think people have the right to watch whatever they want, and TikTok has the right to try to make money off it, but what moral responsibility does the platform have if it plays a role in a person's mental deterioration or radicalization? There is some question about whether one's online viewing habits can really change one's thinking, but for me, anyway, uncertainty is not an excuse. I want to see platforms experimenting with guardrails or other mitigating design elements for when they notice people deep-diving in potentially dangerous waters. This is an issue I'm looking forward to continuing to think and write about here.
So how well does TikTok know me after two hours (687 videos) of just watching? I'm definitely impressed. It's a little bit like taking a personality test or getting your fortune read: it's amazing to be told who you are and recognize some truth in it, even if you're the source of that information in the first place. I'm not yet altogether spooked, but I suspect that if I keep using the app, the moment I've heard about, when I'll wonder if there's magic or spycraft at work, is inevitable. What about you? If you use TikTok, I'd love to hear how you think the algorithm compares to other platforms. Is this the beginning of a new era of eerily personal platforms? Or is this just a tweak on the same formula we've seen for years now? Comment below and let me know.
– Josh Kramer
"And what is the use of a book,â thought Alice, âwithout pictures or conversations?â
The first "recommendation engine" I encountered was my public school librarian. Twice a week my classmates and I would get an hour to wander the library's aisles and pick books to take home. The librarian knew each of us by name and what we had been reading. I was in third grade and had just finished the first Harry Potter book; the sequels hadn't been published yet. I told her I wanted more. She suggested Brian Jacques' Redwall, a series about peaceful mice and rabbits overcoming warlike weasels and snakes. A year later, after I had devoured all of Jacques' novels, my librarian proposed The Lord of the Rings: she had loved it as a kid, and it had similar themes and heroes and villains. And the year after that, when I was through with those, she said I was ready for the most advanced novels in the school library: Philip Pullman's His Dark Materials, not only a good vs. evil story, but one that challenges the very concepts of good and evil.
The idea of a recommendation engine is simple: it collects data about the content we like and the things we do. Then it predicts what we might like by looking for content similar to the content we like (content-based filtering) and by comparing us to other users who act similarly to us (collaborative filtering). On a basic level, this is what my librarian was trained to do. It's quite easy to imagine that an AI would have been able to come up with the same recommendations she did. But unlike an algorithm, my librarian was invested in me. She wasn't just feeding me content, but participating in my growth. She shared a part of herself with me. She was my model for what it meant to be a reader. At New_ Public, we're wondering: who plays the role of the librarian on the internet? Do we need to create it? How do digital recommendation engines complicate (or complement) that role?
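For readers who like to peek under the hood, here's a toy sketch of both kinds of filtering in Python. The books, feature labels, and ratings are invented for illustration; real systems work with far richer signals and much cleverer math:

```python
import numpy as np

books = ["Harry Potter", "Redwall", "The Lord of the Rings",
         "His Dark Materials", "A Cookbook"]

# Content-based filtering: recommend items whose features resemble what the reader liked.
# Toy feature columns: [fantasy, animal protagonists, epic quest, cooking]
features = np.array([
    [1, 0, 1, 0],   # Harry Potter
    [1, 1, 1, 0],   # Redwall
    [1, 0, 1, 0],   # The Lord of the Rings
    [1, 0, 1, 0],   # His Dark Materials
    [0, 0, 0, 1],   # A Cookbook
])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

liked = 0  # the reader just finished Harry Potter
content_scores = [cosine(features[liked], features[i]) for i in range(len(books))]
best = max((i for i in range(len(books)) if i != liked), key=lambda i: content_scores[i])
print("Content-based pick:", books[best])

# Collaborative filtering: recommend what readers with similar taste enjoyed.
# Rows are readers, columns are books; 0 means "hasn't read it yet."
ratings = np.array([
    [5, 4, 0, 0, 1],   # our reader
    [5, 5, 4, 5, 0],   # a reader with similar taste
    [1, 0, 0, 0, 5],   # a reader with very different taste
])
similarity = [cosine(ratings[0], ratings[u]) for u in range(len(ratings))]
# Predict interest in unread books from taste-weighted neighbors' ratings.
predicted = sum(similarity[u] * ratings[u] for u in range(1, len(ratings)))
unread = [books[i] for i in np.argsort(-predicted) if ratings[0][i] == 0]
print("Collaborative picks:", unread)
```

The librarian, of course, was doing both at once, and a great deal more besides.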
– Wilfred Chan
Interacting with TikTok was prompting me to try to see myself as the app does, and by extension to reimagine myself in terms of the pleasures it presumed I was deriving from its content. Would I eventually actually want what it had to offer? Would I see that as my own desire? Or would I still just be desiring myself through the lens of those recommendations? Is there even any difference between those things? When I wrote about TikTok before, I had already primed myself to come to this sort of conclusion. What remains unimaginable to me, still, is that I would actually want to watch these videos for their own sake, without the algorithmic intrigue. So I remain convinced that the point of TikTok is to teach us to love algorithms over and above any content, and to prepare us to accept an increasing amount of AI intervention into our lives. It seems designed to program users with a form of subjectivity appropriate to algorithmic control: where coercion is merged with an experience of "convenience" as one's desires are inferred externally rather than needing to be articulated through our own conscious effort.
– Rob Horning, Internal exile
Flash Forward
Don't forget about the Flash Fiction Contest! Find more info here (scroll down a bit). Email us your stories at hello@newpublic.org with "Flash Fiction" in the subject line. Good luck!
Deadline: 9/1/21
Theme: Social Media
Word Limit: 500 words
Prize: Original illustration, publication in the newsletter
Next Week
We'll dig into the survey results and take a look at what has changed in a year. Plus, an introduction to an ongoing topic of interest.
Nobody's gonna know (they're gonna know),
Josh and Wilfred
Design by Josh
New_ Public is a partnership between the Center for Media Engagement at the University of Texas, Austin, and the National Conference on Citizenship, and was incubated by New America.