Tim Kendall Helped Architect Addictive Social Loops. Here’s His Advice for Getting Unstuck.

by Stephen Gossett
September 9, 2020
Tim Kendall. | Photo: Moment

In one dramatization sequence in The Social Dilemma — a new documentary about the psychological and civic perils of social media, premiering September 9 on Netflix — a family desperate for some device-free IRL connection resorts to drastic measures, including tossing everyone’s phones inside a time-lock receptacle during dinnertime.

The sequence is inspired by Tim Kendall’s own real-life “brute force” attempt to sever the magnetic pull he felt to his phone, after old-fashioned willpower failed to suffice. The irony is not lost on Kendall, who is featured prominently in the doc: the former director of monetization at Facebook and onetime president of Pinterest was instrumental in developing strategies that mined gold from that same magnetic pull.

He just never imagined he, too, would be so personally susceptible.

Like several people interviewed in The Social Dilemma, Kendall is a reformed insider turned whistleblower, calling for greater corporate responsibility and more awareness about screen time and algorithmically juiced attention-grabbing. The film’s thesis will be familiar to tech-aware viewers. (The notion that certain modes of tech design can fuel the psychology of compulsion has been prominent at least since Natasha Dow Schüll published Addiction by Design in 2012.) But the documentary is notable as much for who’s making the case as it is for the case being made.

Alongside Kendall, the procession of interviewees includes Tristan Harris and Sandy Parakilas (Center for Humane Technology, ex-Google), Cathy O’Neil (author of Weapons of Math Destruction), Guillaume Chaslot (AlgoTransparency, ex-YouTube), Jaron Lanier (virtual reality pioneer, author of Ten Arguments for Deleting Your Social Media Accounts Right Now), Cynthia Wong (Twitter, ex-Human Rights Watch), Shoshana Zuboff (author of The Age of Surveillance Capitalism) and Renée DiResta (Stanford Internet Observatory).

Like many of his fellow interview subjects, Kendall recognizes another irony: that technology will be key in ameliorating some of tech’s problems. Kendall is now CEO of Moment, an app that helps users monitor device habits and reinforces positive screen-time behavior.

“I fundamentally think you can use technology and our intimate knowledge of human psychology to help people make better choices that are in their long-term best interests,” he told Built In.

To be sure, Kendall sounds less like a Luddite scold than someone who’s thought a lot about technology and well-being. In fact, Moment this week will release a new group function — essentially an accountability tool that allows users to share usage patterns with their social circles in order to be more deliberate about their relationships with devices.

The app’s coaching component is noticeably absent from the big tech firms’ tools, Kendall said. “Apple, Google and Facebook said: ‘We know everybody has a weight problem. Here’s a scale.’ But there’s no exercise regimen. There’s no point of view on what kind of workout might help. And we think that’s just a really incomplete approach.”

We spoke with Kendall about what, exactly, that regimen should include, what startups can do to avoid the mistakes of the past, what gives him hope, and more.

 

Skyler Gisondo as Ben in The Social Dilemma. | Image: Exposure Labs / Netflix

How has the pandemic affected your thoughts on screen time for kids? And did the pandemic change your own relationship to technology?

Well, usage is way up, which is not that surprising.

Technology with kids is tricky — and this is what’s tricky about the phone in general. It’s composed of some incredible things that can be incredibly helpful and educational. And it’s also composed of some things that are really not good for you. If I’m on a diet, I can control what’s in my pantry. But even if my phone is full of good things, I’m two taps away from getting something that’s not very good for me.

When it comes to kids, I think there’s going to be more tech use — and that’s probably appropriate. But where I would recommend the line be drawn is to [prioritize] tools that help reinforce relationships that your kids already have. At a basic level, texting or calling one another — those are neutral contexts. But a violent, single-player game or a social media experience that often forces me to question my self-worth is not so neutral. Apps that, at least as a byproduct, help nurture kids’ relationships with their peers — if I had to pick, obviously I would choose the apps that foster those relationships.

 

There’s a chorus of practical advice that plays during the film’s end credits — turn off all notifications, consider privacy-friendly search options, no social media for kids until high school. What are your own biggest pieces of advice?

In a family context, it’s important to establish windows of time where everybody’s offline. It could be after 9 p.m. It could be between 6 p.m. and 7 p.m., during dinner. But establishing those as norms that everybody sticks to. And another norm is having offline areas of the house. Say, as an example, we don’t bring phones into our bedrooms, period. Or we don’t bring phones to the dinner table.

Those are some basic things that we found families can actually get aligned on and stick to. And for most families, once they stick to it for a couple of weeks, it’s a relief — even if it is hard.

The second thing — which reinforces the first — is that any behavior change by an individual who’s part of a family will not sustain unless the whole family changes its behavior. This has been documented readily in children with diabetes. The way to get one kid to sustainably stop eating foods that exacerbate diabetes is to have the whole family agree to shift its lifestyle. Otherwise it’s a fool’s errand. And it’s just bad leadership as a parent to say, you’re gonna change but I’m gonna keep using my phone [without restriction].

 

Tristan Harris says in the documentary he’s most addicted to email. What apps are you most addicted to? And what steps do you personally take to counteract it?

At my worst, it’s YouTube and Instagram. And that’s not surprising. They have some incredibly powerful tools and algorithms to keep me sucked in. My best tricks for reducing it are, when I come home from work, I’ll leave my phone in my car and not get it until the next morning. That’s a little extreme, but it works really well in terms of keeping my attention focused on what I care about, which ultimately is my own psychological well-being and my relationship with my family.

 

Wouldn’t you want it handy in case of emergencies?

There are always [less extreme] ways to do what I’m talking about. You can use Apple Screen Time to shut everything off except for your phone. You could do it for shorter periods of time — so not a whole night, but a couple of hours. It’s always funny. I talk to journalists about these things, and journalism is the one profession where it’s probably a little impractical.

But when Screen Time first came out, a lot of people asked our thoughts, because we had a competing product. And I said: “Well, I think it’s a great start, that you can impose limits on apps or overall usage. The problem is that it’s really easy to flip off. It’s like putting Scotch tape on a box of cigarettes. What does that do?”

 

How would you advise Facebook if you were still there?

Well, I haven’t been there for 10 years, but I don’t believe there’s alignment at these companies that they’re doing real harm, and I think there’s a preponderance of evidence that they are. So I would first get people aligned around the data and the research about the impact.

And then I would say, look, we have to adapt and evolve. If you look at the evolution of the McDonald’s menu, it’s significantly healthier than it was 10 years ago. If you look at auto manufacturers decades ago, they didn’t want to put in seat belts. There was a huge fight about seat belts because the cost was going to really change automakers’ margins. That seems crazy today. And now it’s similar with electric vehicles: automakers love their combustion engines. They don’t want to go to an electric model, but they’re going to be forced. And so the best leaders of these auto manufacturers are pushing everyone to adapt to a new model.

That’s a really important place to start. How does Facebook do great while also looking out for the best interests of its users? And what comes from that? Maybe some sort of model where the consumer pays? It’s tricky. I don’t envy their position. The choreography of getting from where they are today to a business model that is congruent with users’ best interest is really tough.

 

At least compared to a few years ago, it seems like the public understands better that things like infinite scrolls and pull-to-refresh can have a psychologically addictive element — even if they don’t always know how that psychology works, exactly. Are there any emerging tech trends that give you a similar kind of pause as those once did? What about trends that give you hope?

Instagram in 2018 inserted its “You’re All Caught Up” flag in the feed. They didn’t get enough credit for that. That’s huge; they probably gave up billions in revenue to do that. And it was the right thing to do. And I bet it moved the needle in terms of usage and people’s well-being, though I’ve seen no data on it. That was sort of the anti-infinite scroll.

On the other hand, the sophisticated content and media companies are concerning to me. Netflix has said in a public earnings call that their biggest competition is sleep. That is, on one hand, funny. On the other hand, it’s really self-evident in terms of what they are designing the product to do, which is to whittle away at your health. What’s more important than sleep in terms of my fundamental health and well-being? And they are literally designing an engine that will manipulate people into sleeping less. [Note: In February, Netflix instituted a feature that allows users to turn off two autoplay functions that had been criticized for their seeming intent to keep viewers glued.]

 

In terms of data and content, aren’t they essentially just providing the audience with what it wants?

To me, it just means that the shows are going to be even more addictive. And people are going to have even less restraint when it comes to, at 10 p.m., saying maybe I should go to bed, or maybe I should binge and not go to bed until 4 a.m.

 

Speaking of persuasive technology, the documentary doesn’t paint the Stanford Persuasive Technology Lab in the most favorable light. What are your own thoughts on the work and teaching done there?

I don’t know BJ Fogg. I’ve read some of his stuff. It seems that he’s done some reckoning around what he’s advocated in the persuasive technology lab’s approach. What I have observed is that their approach is more considered and less mercenary.

I have worked in a different Stanford department, at the d.school. I’ve worked with them directly on their curriculum to help address this very issue. The issue is, how do you get the next generation of entrepreneurs to understand the decisions you are making today about the company you’re going to build tomorrow? That’s really important. They’ve already put together a series of courses that are oriented toward entrepreneurship but force people to run things through various exercises to help illustrate the ethical and moral dilemmas they may encounter.

 

Tristan Harris, Sandy Parakilas and Roger McNamee. | Photo: Exposure Labs / Netflix

You’ve talked about how, in the early Facebook days, the company was concerned about not getting crushed by Myspace — not the psychological or civic implications of its design choices, which seemed very remote. What would you tell a startup in survival mode that says it cannot afford to invest in a tech ethicist?

I would hope that we can teach the next generation of entrepreneurs that, like the last generation, it’s important to want to change the world. But a no-holds-barred approach to that can end up doing real damage.

 

Is there investment interest in this space? Do VCs care about ethical design?

Well, they’re investing a lot in mental health [tech], which is arguably a response to some of the negative byproducts of this. There are some big, very well-funded mental-health software upstarts that were doing really well even before COVID-19, because people are more anxious and depressed. That’s not only because of screen time and [social media] services, but these services certainly aren’t helping the cause.

So the answer is no, not directly, that I’m aware of. There are pockets. There are a couple of businesses out there focused on creating software for families. Ours is one, although ours is much lighter weight than some of the others. And there’s a company that I’m a large investor in, and on the board of, that makes a kind of dumb phone. It’s called the Light Phone. It’s basically just a really small phone, about the size of a credit card.

The world at scale does not want that yet. I think they might. It’s not really designed as a replacement. It’s designed more for the use case you’re talking about: I want to put this away, but I need people to be able to get a hold of me. You can achieve that with a Light Phone. You can also achieve it with an Apple Watch, and you do see people with that use case, who really want to be deliberate about how they manage their attention.

 

I saw you once recommend that people remove email from their phones. How would you respond to people who say the nature of their jobs makes that an unaffordable luxury?

What I mean by that is not to eliminate email. What I mean is, batch it. And it isn’t necessarily a phone thing per se; it’s that having something that will give you a variable reward turned on 24/7 is not going to help you — in your instance — write clear, good articles.

 

I feel seen right now.

If you take it off your phone, it forces a deliberate checking of email on the desktop. So much of our lives on the internet was desktop-oriented, which happened to be terrific for our attention because it was like: ‘OK, dedicated computer time. What are the tasks I need to do? OK, I’m done, full stop.’ Now the real world and the internet have become so conflated.

Getting rid of email on your phone is kind of an advanced tactic. And by the way, my email is on my phone right now. But the times I’ve taken it off for long periods of time, I’ve found that my attention span is better and I’m more present.

 

Jaron Lanier, who’s also featured in The Social Dilemma, has advocated for something akin to data ownership — a system that pays out micropayments whenever a company makes money off your data. What are your thoughts on such a set-up?

I think what you’re describing is probably where we’re gonna end up. It seems really likely that's going to be the eventual steady state.

 

Will that require regulation?

It seems more likely to me that — and maybe this gets to your earlier question — maybe that’s where Facebook heads, right? Maybe they end up becoming this marketplace of data. And that they kind of broker. They hold your data on your behalf, use that data to show ads to you, they get paid, and then they pay you some portion of that, if you want to opt in.

 

You can see a scenario where they institute that of their own volition?

I could, because regulation seems imminent. In 10 to 15 years, maybe today’s model is untenable. Because when you think about a model of extracting more and more attention, it doesn’t end in a very good place. And I think there’s a possibility that they see that and then decide they just have to adapt. Put it this way: If Mark Zuckerberg were running a car company in the ’80s, he would’ve been the first adopter of electric technology. Because — and that is one of his terrific talents — he can kind of see the curves. So I have pretty high confidence that if anyone sees the chess moves, he does.

I noticed they recently announced an independent partnership with someone who’s going to study their platform’s impact on politics and behavior. That would not have happened a year ago. That’s indicative of where they’re shifting: “We’re gonna make ourselves vulnerable here — we’re gonna let a third party in.” They probably have some controls on it, though, so it doesn’t just totally blow up in their faces. But assuming it’s a credible third party, they couldn’t have had the wool pulled totally over their eyes.

To me, it’s indicative of the direction they’re going to go, which is, they’re going to find out whether they have real problems. I think they will find they have real problems, and then I think they’re going to have to refactor the model. And certainly, that aspect of a data repository for the individual, with complete control over my identity and an opt-in for various services with some type of compensation: I could see a world where we have that.

 

What do you hope people get most from the movie?

I obviously would want them to understand the societal impact, but also understand the impact on the individual and family level, and hopefully be compelled to take a hard look at that as a part of the broader system.

There’s a scenario where you see the film and you just write your congressman. And that’s not what I want. If someone does write their congressman, that’s also probably a good outcome, but ideally, for most people, I think there needs to be a step around developing awareness of their own usage.
