Microsoft buying TikTok could lead to problems controlling social media content


If Microsoft were to complete a purchase of TikTok, it would get a company with a lot of potential for revenue growth.

But with such a purchase, Microsoft would also adopt a whole new slate of problems.

Microsoft announced on Aug. 2 that it was in talks to buy TikTok's service in the US, Australia and New Zealand, with a deadline to complete a deal by Sept. 15. TikTok is currently owned by the Chinese tech company ByteDance, and it has become a target of the Trump administration and other governments over privacy and security concerns. Trump also signed an executive order last week banning US companies from doing business with TikTok, though it is unclear how that order could affect a potential acquisition by Microsoft.

In the US, TikTok has grown to more than 100 million monthly users, many of whom are teenagers and young adults. Those users flock to TikTok to watch full-screen videos uploaded to the app by others. These videos often feature lip-syncing to songs, flashy video editing and striking augmented reality effects.

To say that TikTok is radically different from the enterprise software Microsoft specializes in would be an understatement.

For Microsoft, TikTok could become a powerhouse for advertising revenue, but that potential is not without risk. Like other social apps, TikTok is a target for all kinds of problematic content that needs to be addressed. That includes basic issues like spam and scams, but more complicated content could also become a headache for Microsoft.

This could include content such as misinformation, hoaxes, conspiracy theories, violence, prejudice and pornography, said Yuval Ben-Itzhak, CEO of Socialbakers, a social media marketing company.

“Microsoft will have to deal with all of this and will be accused and criticized for not doing so,” Ben-Itzhak said.

Microsoft declined to comment, and TikTok did not respond to a request for comment.

These challenges can be overcome, but they require big investments of money and technical talent, two things Microsoft is capable of delivering. And Microsoft does have some experience when it comes to moderating online communities.

In 2016, Microsoft bought LinkedIn for $26.2 billion, and though the career- and professional-focused service doesn't deal with the same content issues as its peers, it is still a social network. Microsoft has also run Xbox Live, the online gaming service, since its launch in 2002. Online gaming and social media are different beasts, but they share similarities.

“Fighting misinformation will have to be a mission-critical priority. Microsoft will be new here because it has no experience managing a high-profile social network at this scale,” said Daniel Elman, an analyst at Nucleus Research. “That said, if one company can acquire or develop the required skills and capabilities quickly, it’s Microsoft.”

But these are not small challenges, and these kinds of problems have become major headaches for TikTok's rivals.

Facebook, for example, was accused of not doing enough to prevent fake news and Russian misinformation ahead of the 2016 US election, and four years later the company still regularly faces criticism over whether it is doing enough to keep that kind of content off its services. In July, hundreds of advertisers boycotted Facebook over its failure to contain the spread of hate speech and misinformation.

Twitter, meanwhile, has at times lost key users, such as comedian Leslie Jones, after harassment ran rampant on its social network. Over the past few years, the company has built features to curb the amount of hateful content users have to deal with in their feeds.

These kinds of issues have already surfaced on TikTok. Far-right activists, white nationalists and neo-Nazis have previously been spotted on the app, according to reports by Motherboard and the Huffington Post, which found users who had already been banned by Facebook and Twitter.

However, TikTok's potential content issues may be most similar to those of Google-owned YouTube. Both services depend on user-generated video, and both rely heavily on algorithms that learn a user's behavior to determine what content to show them next.

“The problem with algorithm-based content feeds is that they generally degrade to the content that shows the highest engagement,” said Mike Jones, managing partner of Science, a Los Angeles venture capital firm. “There is no doubt that as creators further understand how to drive additional views and attention through algorithm manipulation, the content will increase in its salaciousness, and it will be a constant battle that every owner will have to deal with.”

Another similarity with YouTube is the amount of content available on TikTok that is aimed at minors. Although TikTok does not allow users under 13 to post on the app, many of its users are between 13 and 18 years old, and their content can be easily viewed by others.

For YouTube, the challenge of hosting content featuring minors became a major issue in February 2019, when Wired discovered a network of pedophiles who used the video site's recommendation features to find videos of minors who were exposed, such as in their underwear.

Given the number of young users on TikTok, it's not hard to imagine Microsoft ending up with a problem similar to Google's.

YouTube has also become a cesspool for conspiracy theories, such as the idea that the Earth is flat. That could become a problem on TikTok too, and there is already evidence of it: the conspiracy theory that Wayfair uses its furniture to traffic children gained traction on TikTok this year.

To address these issues, Microsoft would need to invest a tremendous amount of time and money into content moderation.

Facebook has addressed this problem with a two-pronged strategy. The company continuously invests in artificial intelligence capable of detecting bad content, such as pornography, violence or hate speech, and removing it from its services before other users ever see it.

For more complicated content, Facebook also relies on thousands of human moderators. These moderators often work for Facebook as contractors through third-party vendors, and they are tasked with reviewing thousands of pieces of content per day, in difficult working conditions that put them at risk of developing PTSD. Those working conditions have been criticized repeatedly and have created public relations headaches for Facebook.

If Microsoft purchased TikTok, it would likely need to develop similar AI technology and build out a network of human moderators, all while avoiding negative headlines over poor working conditions.

TikTok offers Microsoft an incredible amount of potential in digital advertising, but along with that potential come several new challenges and responsibilities that the company would have to take on.