Blocking is having a moment.
Last month, Tracy Chou announced that she had raised a little under $1.5 million to launch Block Party, an anti-harassment startup that helps people filter abuse out of their social media experiences. On Monday, Block the New York Times, a tool to block 800 Times reporters on Twitter at once, materialized via a satirical website.
Meanwhile, the unique blocking mechanics of the audio social network Clubhouse — and their enthusiastic use by the platform’s lead investors — have raised questions about whether people in positions of power should be blocking to avoid public scrutiny.
Behind all these stories, I think, is a kind of despair over what public social networks have done to public debate. Context collapse — the disorientation that comes from addressing infinite audiences online, each with their own norms and expectations for others’ behavior — is driving more and more conversation away from the public square. And while platforms have gradually begun paying more attention to safety issues, trolling and harassment remain a fact of life for too many people who use social networks.
Viewed in that light, aggressive use of the block button makes all the sense in the world. By shrinking the size of your audience, even if only by one person at a time, you can gradually rebuild context around the audience to whom you are speaking. And when trolls and abusers rear their heads, the block button remains one of the most effective tools we have for preventing them from harassing us in the future.
When it comes to Twitter — or to Facebook, or Instagram, or Snapchat — blocking is relatively uncontroversial. And Twitter has added features in recent years that offer users ways to fight context collapse without requiring them to, for example, block 800 reporters. Maybe you want to limit replies to people you follow. Or maybe you don’t want anyone to be able to reply at all. Tools like these remove what my friend Hunter Walk calls the “paper cuts” a product can inflict on a user: small, unpleasant interactions that erode your affection for a tool over time.
Clubhouse is a new kind of social network, though, and its approach to blocking has drawn some scrutiny. What sets it apart is that an audience member can effectively prevent other people from joining the audience for a public discussion, if at any point that member is made a speaker or moderator of the chat. (It’s common for small Clubhouse chats to invite everyone to become a speaker as soon as they enter the room.) If I’m listening to an interesting conversation, raise my hand to speak, and ask a question, anyone I have blocked will henceforth be unable to enter the room.
This phenomenon was most visible during Elon Musk’s January 31st appearance on the app. While the show he appeared on was created by its hosts, Sriram Krishnan and Aarthi Ramamurthy, Andreessen Horowitz co-founder Marc Andreessen was made a speaker — and, as a result, many journalists found themselves shut out of the room. For reasons he has never really explained, Andreessen has blocked most of the press corps on Twitter and now on Clubhouse, myself included.
“It is one thing to block people from sending you messages,” tweeted Jessica Lessin, founder of The Information, in a short thread about the issue, “but listening to public conversations?”
As someone who has interviewed Andreessen before, and really enjoyed it, I feel this pain acutely. I’d like to know what he’s tweeting about and saying in Clubhouse rooms. And I’d also probably like to listen in on some future Clubhouse conversations where he gets called on to be a speaker.
At the same time, an important aspect of blocking is that the person you block doesn’t get a say in it. If you’ve ever blocked someone yourself, you probably wouldn’t want them to have one. I wish Andreessen hadn’t blocked me, but I also don’t think I’m entitled to an explanation. Maybe not seeing my dumb tweets has improved his experience of Twitter. That’s fine!
I thought of this over the weekend when my friend Taylor Lorenz and other journalists entered a Clubhouse room in which the Andreessen Horowitz founders were discussing r/WallStreetBets, the forum at the center of the recent GameStop stock mania. Some members of that forum refer to themselves using an offensive word, which Ben Horowitz repeated in a question to a guest (“take us through the r* revolution”). Some people wrongly thought Andreessen had used the word; breakout rooms began to appear in Clubhouse to criticize him over it. Journalists tweeted about it, confusing Andreessen with Horowitz. Tweets were deleted; Lorenz apologized.
The whole incident, to me, makes the case for blocking. In offline life, when we worry that our public comments might be misconstrued, we limit the audience for those comments. In online life, when our potential audience is exponentially larger, it makes sense that we would want to limit that audience with extra care.
It also makes sense not to use offensive words when addressing an audience of thousands in a business context. But context collapse means that once you’re above a certain number of followers, there will always be an audience coming after you about something.
Ultimately, I think we should support strong blocking tools for the same reason we should support strong encryption: the more that we live our lives online, the more important that private spaces become. And yes, there are trade-offs: the internet makes it easier for bad actors to come together in private spaces, and platforms should take steps to mitigate the harms that they can cause. But a healthy democracy requires both public and private conversations, and platforms ought to facilitate the best of both.
Okay, sure. But isn’t there something unseemly about the rich and powerful building private spaces only to exclude journalists from scrutinizing them? I sympathize with Lessin on this point, particularly during a time when the media has been under assault from all quarters. (Andreessen Horowitz’s decision to shun the media extends far beyond social networks, as Eric Newcomer recently reported.) Some of the recent fervor in Silicon Valley to go “direct” — which is to say, around the media — has a Trumpist cast to it: reporters are the enemy of the people, and talking to them is beneath us.
Moreover, if several people you follow on Clubhouse have blocked an account, that profile will display a warning badge when you visit it. While this likely has some safety benefits, Lorenz told me that it can also be abused: people can mass-block the accounts of good citizens on the app just to exclude them from popular conversations.
All this makes me wonder whether Clubhouse could refine its blocking tools over time to enable private conversations while also being more inclusive.
I asked Block Party’s Tracy Chou. She reminded me that blocking means something different on every platform, giving each company a chance to reimagine how it might be most effective for its users.
“On Twitter, blocking someone means they can’t follow you, they can’t see your profile when they’re logged in, and it tells them that they’re blocked, they can’t respond to your tweets and have their replies show up,” Chou told me over direct message. “But there’s no actual reason that those all have to be properties of blocking someone. Clubhouse has designed blocking in a specific way with a different set of permissions and consequences. Just because Twitter block and Clubhouse block are both called ‘block’ doesn’t mean they’re the same at all.”
If it wanted to, Chou said, Clubhouse could develop other constructs beyond blocking and muting to offer users more fine-grained control. I’d like to see the app consider a “broadcast” mode that eliminated caps on how many people could attend — no more spillover rooms when an Elon Musk or Mark Zuckerberg shows up to chat. Perhaps such a mode could allow everyone to listen, unless the original moderators of the chat (and not someone who was later invited to speak) had blocked them.
When any new social network becomes popular, reporters will scrutinize how powerful people are using it. That scrutiny is necessary and good. But privacy is necessary and good, too. Blocking is a blunt tool to achieve that end, but it can be refined over time — complemented with tools that achieve similar ends with more inclusive means.
That could introduce complexity for both developers and for users. But then again, as Chou told me, “the users with the most need will probably try to figure it out.”