Clubhouse’s security and privacy lag behind its explosive growth

Clubhouse did not respond to a request from WIRED for comment by press time on its recent security stumbles. In a statement to researchers at the Stanford Internet Observatory, Clubhouse detailed specific changes it planned to make to strengthen its security, including cutting off pings to servers in China and strengthening its encryption. The company also said it would work with a third-party data security firm to help carry out the changes. In response to the unauthorized website that was rebroadcasting Clubhouse discussions, the company told media outlets it had permanently banned the user behind it and would add additional “safeguards” to prevent the situation from happening again.

While Clubhouse appears to take researchers’ feedback seriously, the company has not specified all of the security improvements it has implemented or plans to add. And given that the app doesn’t appear to offer end-to-end encryption to its users, researchers say there’s still a sense that Clubhouse hasn’t fully thought through its security posture. That’s before you even get to some of the core privacy issues the app raises.

When you start a new Clubhouse room, you can choose from three settings: an “open” room can be accessed by any user on the platform, a “social” room only admits people you follow, and a “closed” room restricts access to invited guests. Each comes with its own implied level of privacy, which Clubhouse could make more explicit.

“I think for public rooms, Clubhouse should give users the expectation that public means public to all users, since anyone can join and record, take notes, and so on,” says David Thiel, chief technology officer of the Stanford Internet Observatory. “For private rooms, they can convey that, as with any communication mechanism, an authorized member can record content and identities, so be sure to both set expectations and trust the participants.”

Like any major social network, Clubhouse has also struggled to cope with abuse on the platform. The app’s terms of service have prohibited hate speech, racism, and harassment since November, and the platform offers some moderation features, such as the ability to block users or flag a room as potentially abusive. But one of Clubhouse’s biggest draws is also a problem for anti-abuse efforts: people can use the platform without their contributions being automatically saved as posts. This can embolden some users to make abusive or derogatory remarks, believing that they won’t be recorded and won’t face any consequences.

Stanford’s Thiel says Clubhouse currently stores temporary recordings of conversations to review in the event of abuse complaints. If the company implemented end-to-end encryption for security, however, it would have an even harder time staying on top of abuse, because it couldn’t make those recordings as easily. Every social media platform faces a version of this tension, but security experts agree that, where applicable, the benefits of adding end-to-end encryption are worth the added challenge of developing more nuanced and creative anti-abuse solutions.

Even end-to-end encryption wouldn’t eliminate the possibility that a Clubhouse user could externally record the conversation they’re in. That’s not something Clubhouse can easily fix. But it can at least set expectations accordingly, no matter how friendly and informal a conversation feels.

“Clubhouse should just be clear on what it’s going to contribute to your privacy,” Potter says, “so you can decide what you’re going to talk about as a result.”

