Twitter: Musk nearly “disintegrated” Asia’s child safety team

The situation at Twitter has grown more chaotic with each passing day since Elon Musk took control of the company. Waves of layoffs around the world and Musk's decisions have weakened the social network and created space for rivals like Koo.

Several important sectors of the company appear to be collapsing. On Monday (28), Wired reported, citing anonymous sources, that only one employee remains on the team dedicated to removing child pornography from the network.

According to the outlet, four employees of this team, all based in Singapore, left the company and publicly announced their departures on LinkedIn. Their layoffs took place this month.

The Singapore-based team was responsible for reviewing potentially criminal content across the Asia-Pacific region. A single employee now covers a region that is home to about 60% of the world's population, roughly 4.3 billion people.

According to data aggregator Statista, Japan alone has 59 million Twitter accounts, second only to the United States. Like the rest of the global operation, the Singapore office was hit by mass layoffs and by the departures of those who rejected Elon Musk's new terms of work.

In an interview with Wired, Carolina Christofoletti, a child safety researcher at USP, said: "It's delusional to think that there will be no impact on the platform if people who used to deal with child safety on Twitter are fired or resign."

Integrated team

Although Twitter has its own employees dedicated to keeping children safe, it also works with organizations such as the Internet Watch Foundation (IWF) in the UK and the National Center for Missing and Exploited Children in the US. Notably, these organizations also support other online platforms.

IWF communications director Emma Hardy explained to Wired that the data the organization collects and sends to Twitter should be acted on automatically by the social network's systems rather than by humans. "This ensures that the blocking process is as efficient as possible," she says.

Christofoletti noted, however, that such organizations focus on the final product and have little access to Twitter's internal data.

According to the researcher, internal dashboards are critical: analyzing their metadata helps engineers write detection code that identifies content before it is shared. "The only people who can see it [metadata] are the ones within the platform," she says.

Difficult task

Twitter's effort to combat the exposure of non-consensual content involving children is made harder because the platform allows the sharing of consensual pornography, and the tools used to check for child abuse struggle to differentiate a consenting adult from a non-consenting child, according to Arda Gerkens, director of the Dutch EOKM foundation, which handles reports of online abuse.

"The technology isn't good enough yet," she says, adding that this is why having a team of human employees is so important.

Concern for the future of child safety

With all the changes and uncertainty at Twitter, researchers are nervous about how the company will handle issues in this area. Those concerns deepened when Musk asked Twitter users to reply to his post if they encountered any issues on the platform that needed fixing.

"This shouldn't be a Twitter topic," Christofoletti says. "It should be redirected to the child safety team that he fired. That's the contradiction."

With information from Wired

Featured image: Michael Vi/Shutterstock


Source: Olhar Digital
