You’re on your phone and notice the same ad (for a product you viewed a couple of hours earlier) following you everywhere. Spotify makes a curated playlist, just for you. You double-tap one meme and suddenly your Instagram Explore page is full of more...memes. We’re all familiar with algorithms, and whether we’re conscious of it or not, our online experiences are dictated by the invisible tugs and prods of ever-more-sophisticated machine learning. With the latest iOS 14.5 update for iPhone users and its new “App Tracking Transparency” tool, the friction between individual privacy and data sharing is at an all-time high, and even Facebook is starting to sweat.
Like all things tech, there’s nothing black-and-white about algorithms themselves. Despite their dystopian potential, digital hyper-personalization has some genuinely intriguing and positive outcomes. But one thing is hard to deny: all pros aside, algorithms are making us lose our sense of online serendipity. It’s harder than ever to discover new ideas, people, and marketplaces in the digital sphere, and we’re left to ask: what do we do about that?
We aren’t always fans of being advertised to, but personalized ads and content, in theory, make a lot of sense. There’s value here for both consumers and brands: younger millennials, for instance, are far less likely to respond to an ad for backyard landscaping equipment or baby formula, no matter how strong the call to action (CTA) is. It makes sense that the right brands would want to get in front of the right audiences by using demographic and psychographic markers, like age, location, activities, and interests. And if you’re struggling to pick a flick for movie night, doesn’t Netflix make the decision-making process that much easier by telling you that there’s a “97% match”?
The truth is that hyper-targeted content, at a certain level of sophistication, can cut through the noise of the marketplace and show us exactly what we may want or be looking for. And when AI understands you well enough (like a very, very close friend), there’s a lot of potential for lateral discovery: if you’re an eco-conscious shopper, there’s a greater chance you’ll be served ads for other eco-conscious brands.
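If you’re curious what that kind of “lateral discovery” looks like under the hood, here’s a stripped-down sketch of a content-based recommender in Python. The brand names and attribute tags are invented for illustration, and real systems lean on far richer signals, but the core idea is the same: score what you haven’t seen by how similar it is to what you’ve already engaged with.

```python
# Toy sketch of "lateral discovery": recommend brands whose attribute
# profiles are closest to what a shopper has already engaged with.
# Brand names and tags are invented for illustration only.

from collections import Counter
import math

BRAND_TAGS = {
    "GreenThread":  ["eco", "apparel", "recycled"],
    "PlainBasics":  ["apparel", "budget"],
    "LeafBottle":   ["eco", "reusable", "recycled"],
    "MegaGadgets":  ["electronics", "budget"],
}

def vectorize(tags):
    """Turn an iterable of tags into a simple count vector."""
    return Counter(tags)

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(history, catalog, top_n=2):
    """Rank every unseen brand by similarity to the shopper's history."""
    profile = vectorize(tag for brand in history for tag in catalog[brand])
    scores = {
        brand: cosine(profile, vectorize(tags))
        for brand, tags in catalog.items()
        if brand not in history
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# A shopper who buys from one eco brand gets nudged toward another.
print(recommend(["GreenThread"], BRAND_TAGS))
```

Run it and the eco-friendly bottle brand comes out on top, while the gadget brand with zero overlap sinks to the bottom, which is exactly the “close friend who knows your taste” effect at work.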
Here’s the thing: you might think algorithms are “objective” because they involve math, but algorithmic bias is a very real and terrifying phenomenon. Whether it’s due to flawed or incomplete training data or data that already reflects discrimination, algorithms can easily perpetuate inequities that disproportionately harm certain groups, such as people of color. An alarming study published in Science in 2019 found that a widely used algorithm, which relied on past healthcare spending as a proxy for medical need, systematically underestimated how sick Black patients were; because the healthcare system has historically spent less on Black patients than on equally sick white patients, correcting that bias would have more than doubled the share of Black patients flagged for the extra care they needed. There are countless other examples of this in facial recognition technology and criminal justice.
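To see how a biased proxy can skew outcomes without anyone writing an explicitly discriminatory rule, here’s a deliberately simplified toy simulation (not the actual algorithm from the study, and the numbers are made up). Two groups have identical medical need, but the historical spending data the system ranks on runs lower for one group, so that group gets referred far less often.

```python
# Toy simulation of proxy-label bias (not the real study's algorithm):
# both groups have the same distribution of true medical need, but the
# proxy being ranked on -- historical spending -- runs lower for group B.
# Thresholding on the proxy then quietly under-refers group B.

import random

random.seed(0)

def simulate_patient(group):
    need = random.uniform(0, 10)                 # true medical need; same for both groups
    spending_gap = 0.6 if group == "B" else 1.0  # group B historically receives less care
    spending = need * spending_gap + random.gauss(0, 0.5)
    return {"group": group, "need": need, "spending": spending}

patients = [simulate_patient(g) for g in ("A", "B") for _ in range(5000)]

THRESHOLD = 5.0  # "refer for extra care" when the proxy score clears this bar

def referral_rate(group):
    members = [p for p in patients if p["group"] == group]
    return sum(p["spending"] > THRESHOLD for p in members) / len(members)

def high_need_missed(group):
    # Share of genuinely high-need patients the proxy rule fails to refer.
    high_need = [p for p in patients if p["group"] == group and p["need"] > THRESHOLD]
    return sum(p["spending"] <= THRESHOLD for p in high_need) / len(high_need)

for g in ("A", "B"):
    print(f"group {g}: referred {referral_rate(g):.1%}, high-need missed {high_need_missed(g):.1%}")
```

The printout shows the two groups referred at very different rates, and most of group B’s sickest patients never getting flagged at all, even though no one ever told the system to treat the groups differently.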
Even scrolling through your Facebook News Feed or typing a query into Google are heavily stage-managed acts, with behind-the-scenes algorithms selectively filtering information in ways we don’t even know about and don’t have much of a say in. In his 2011 TED Talk, author Eli Pariser pointed out the dangers of living in a “filter bubble,” a term that captures the reality of our digital experience. Facebook and Google, using the macro- and micro-data you automatically share the moment you go online, determine exactly what you see, often quietly removing news sources and search results, respectively, that don’t seem “relevant” to the avatar the algorithms have made you out to be. Spooky, right? With the average person spending more than six hours a day online, is there room left for spontaneous discovery?
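And the mechanics of a filter bubble are almost embarrassingly simple. Here’s a toy feed ranker in Python (the topics and posts are made up): it scores each post by how often you’ve clicked on that topic before, so anything unfamiliar quietly sinks out of view.

```python
# Minimal sketch of how a "filter bubble" can form: rank posts by overlap
# with topics the user has already clicked on. Topics are invented here.

from collections import Counter

click_history = ["memes", "memes", "fitness", "memes"]  # what you've engaged with
user_profile = Counter(click_history)

posts = [
    {"id": 1, "topic": "memes"},
    {"id": 2, "topic": "local news"},
    {"id": 3, "topic": "fitness"},
    {"id": 4, "topic": "science"},
]

def score(post):
    # Topics you've never clicked score zero, so unfamiliar sources
    # drift to the bottom of the feed and effectively disappear.
    return user_profile[post["topic"]]

ranked = sorted(posts, key=score, reverse=True)
print([p["topic"] for p in ranked])  # memes first, unclicked topics last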
We may spend a significant portion of our lives online, but there’s a lot you can do to “pop” the filter bubble and add serendipity back into your life. Here are some tips: