When Business Goals ≠ User Goals

Twelve years ago, at a SXSW afterparty, Jeremy Keith, a fellow designer, first told me about “Twitter”. It was just a fledgling text message service at the time. At his encouragement, at 1:28 PM on March 10th, 2007, I sent a text message to 40404 (this still works, btw) which read simply “Eating lunch.”

Twitter has changed a lot since then. Social media is now officially “a thing” and people across the globe have become dependent on a variety of platforms to stay connected to friends, get their news, and pass idle time. All of that is changing, and it’s changing fast.

Faceb**k is all over the news for issues of privacy, security, and censorship, and for doing an all-around lousy job of removing bad actors from their platform. They, like all the other major platforms, need eyeballs on ads to deliver Wall Street results, so their needs and their users’ needs are in direct conflict.

The days of following friends, topics, and brands you like might be coming to an end. TikTok and other copycat platforms have a different strategy and approach, and their interfaces reflect that. Even though ByteDance operates in a communist economy, there are capitalist fingerprints all over their strategy. But it’s not about money; it’s about your attention.

The ‘Attention Economy’ is here, and it’s here to stay. And while Twitter, Faceb**k, Instagram and the like have been focused on Timelines, the Chinese company ByteDance has taken a decidedly different approach, and the potential effects on society are worth your attention.

The main difference is how these apps are designed to pull you into their worlds. It’s not about your declared friends or interests; it’s about your implied friends and interests. They don’t care about what you say you like. They’re watching your behavior and showing you things their AI thinks you’ll like.

“TikTok has stepped over the midpoint between the familiar self-directed feed and an experience based first on algorithmic observation and inference. The most obvious clue is right there when you open the app: the first thing you see isn’t a feed of your friends, but a page called “For You.” It’s an algorithmic feed based on videos you’ve interacted with, or even just watched. It never runs out of material. It is not, unless you train it to be, full of people you know, or things you’ve explicitly told it you want to see. It’s full of things that you seem to have demonstrated you want to watch, no matter what you actually say you want to watch.

“It is constantly learning from you and, over time, builds a presumably complex but opaque model of what you tend to watch, and shows you more of that, or things like that, or things related to that, or, honestly, who knows, but it seems to work. TikTok starts making assumptions the second you’ve opened the app, before you’ve really given it anything to work with. Imagine an Instagram centered entirely around its “Explore” tab, or a Twitter built around, I guess, trending topics or viral tweets, with “following” bolted onto the side.”
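To make that contrast concrete, here’s a deliberately simplified sketch in Python. All of the names (Post, timeline_feed, for_you_feed) and the watch-time lookup are made up for illustration; a real “For You” feed ranks with a learned model over far richer behavioral signals, not a dictionary of topics.

```python
# Hypothetical sketch: a follow-based timeline vs. a behavior-inferred "For You" feed.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    posted_at: int  # stand-in for a real timestamp

def timeline_feed(posts, follows):
    """Classic model: only accounts you explicitly follow, newest first."""
    return sorted(
        (p for p in posts if p.author in follows),
        key=lambda p: p.posted_at,
        reverse=True,
    )

def for_you_feed(posts, watch_seconds_by_topic):
    """'For You' model: rank everything by inferred interest.
    Here the crude proxy is how long you've watched each topic before."""
    return sorted(
        posts,
        key=lambda p: watch_seconds_by_topic.get(p.topic, 0),
        reverse=True,
    )

posts = [
    Post("friend_a", "cooking", 3),
    Post("stranger_b", "dance", 2),
    Post("friend_c", "news", 1),
]

# What you said you want: your friends, in order.
print([p.author for p in timeline_feed(posts, follows={"friend_a", "friend_c"})])

# What your behavior implies you want: dance videos from anyone.
print([p.author for p in for_you_feed(posts, {"dance": 540, "news": 30})])
```

The point of the toy example: the second feed never asks who you follow at all. Watching is the vote.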

Why am I writing about this? Because AI, and all the focus on it, has a huge influence on how we design experiences, and it’s time we thought about designing products and services differently.

The algorithms and bots aren’t going anywhere. It’s up to us to decide if they’ll be used to help us, or if we’ll submit to just being another set of eyeballs in the churn of the attention economy.

Full article at The New York Times.