by: Zank, CEO of Bennett Data Science
I took this photo a couple of weeks ago. As a person who has spent years writing algorithms to help people, the back-window writing struck me as a very powerful and interesting statement: “Resist the Algorithms.”
I get it. I really do. At times, we should resist the algorithms. This week, I’m going to take a stance on a controversial subject: maximizing engagement. I’ll explain.
Often, data science is used to personalize offers at scale. That means large companies can interact with millions of customers in real time, with a personalized offer for each one. Amazon does it. Netflix does it. Many, many other companies do, too.
And the goal of all this personalization is to increase engagement. But what if we win? In other words, what if we write the perfect algorithm that makes a product so “sticky” that users can’t get enough? They’re permanently engaged. Picture a user holding her phone, 100% engaged with some app. For hours. As data scientists, that’s actually what we’re going for when we set out to maximize engagement.
Sounds as creepy as it does impossible. Yet that is exactly what apps like Facebook and Instagram (among others) have achieved. And they do it through carefully timed messages and close attention to algorithm-driven 1:1 personalization.
When founders start a new venture, they dream of huge engagement, whether it’s for social good, such as collecting donations after a natural disaster, or for pure entertainment, like getting users to hurl birds at rocks all day. More often than not, companies seek greater user engagement (because they know revenue will follow) more than anything else.
As data scientists, we have a moral obligation to understand that if we succeed in creating an app that users have trouble turning away from, we could be damaging society as a whole by taking people away from time with friends, family and self.
This is quite serious stuff. We’ve all got an opinion when we see a toddler transfixed at a restaurant by an iPad playing SpongeBob, oblivious to everything around them. But what about the 38-year-old woman in the checkout line who’s reading her Google News feed for the 23rd time that day, also oblivious to her surroundings? Or the throngs of teens who turn to Instagram for attention?
These apps were built to maximize engagement, and they use data to get better at it each day. Knowledge is power. Data scientists have an obligation to discuss these effects with stakeholders. And companies have an obligation to their customers and users to protect them from endless engagement.
One solution might be to let users choose the times they can receive hyper-personalized content. In other words, turn the algorithms off between certain hours.
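As a rough illustration of that idea, here’s a minimal sketch of a user-configured “quiet hours” check. Everything here is hypothetical (the function names, the preference keys, and the feed-selection logic are illustrative, not from any real product); the point is simply that the gate between a user and the recommendation engine can be a few lines of code:

```python
from datetime import datetime, time

def in_quiet_hours(now: time, start: time, end: time) -> bool:
    """Return True if `now` falls inside the quiet window.

    Handles windows that wrap past midnight (e.g. 21:00-07:00).
    """
    if start <= end:
        return start <= now < end
    return now >= start or now < end

def select_feed(user_prefs, personalized, chronological, now=None):
    """Serve a plain chronological feed during the user's quiet hours,
    and the algorithmically personalized feed otherwise."""
    now = now or datetime.now().time()
    if in_quiet_hours(now, user_prefs["quiet_start"], user_prefs["quiet_end"]):
        return chronological
    return personalized

# Example: a user who turns personalization off from 9 p.m. to 7 a.m.
prefs = {"quiet_start": time(21, 0), "quiet_end": time(7, 0)}
print(select_feed(prefs, "personalized feed", "chronological feed", now=time(23, 30)))
# 11:30 p.m. is inside quiet hours, so the chronological feed is served
```

The design choice worth noting: the default during quiet hours isn’t “no app,” it’s a non-optimized feed, so the user keeps the utility without the engagement-maximizing pull.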
Algorithms are wonderful for so many purposes. Let’s remember that and do our part to listen and respond morally to these and other issues as they arise.