TikTok has moved swiftly to combat misinformation in the wake of the recent conflict between Israel and Hamas, after the European Union (EU) raised concerns and issued a stern warning urging the company to intensify its efforts against false and misleading content related to the conflict.

In a letter to TikTok CEO Shou Zi Chew, the EU gave the platform a 24-hour deadline to explain how it is complying with European law.

The Israel-Palestine conflict has witnessed an influx of deceptive content, including manipulated images and mislabeled videos across social media platforms.

In response to the EU’s concerns, TikTok stated that it had taken immediate steps to remove “violative content and accounts.” The company asserted its commitment to preserving the safety of its community and the integrity of its platform.

Recognizing its popularity among young users, TikTok is especially concerned about shielding children and teenagers from violent content, terrorist propaganda, and potentially harmful challenges. 

The EU also issued similar warnings to other major social media platforms, including X (formerly Twitter), YouTube, and Meta, the parent company of Facebook and Instagram. All these platforms were given a 24-hour ultimatum to address the issue of misinformation.

TikTok, a subsidiary of Chinese firm ByteDance, outlined the measures it had taken on its website to combat misinformation and hateful content. 

These actions include establishing a command center, enhancing automated content detection systems to remove graphic and violent material, and recruiting more moderators proficient in Arabic and Hebrew.

TikTok says it does not tolerate content that promotes violence or supports hateful organizations and individuals, and enforces a strict policy against such material.

TikTok also expressed its dismay over the recent acts of terror in Israel and the worsening humanitarian situation in Gaza.

The EU's Digital Services Act (DSA), whose obligations for the largest online platforms took effect in August 2023, imposes regulations on the content allowed on online platforms.

The DSA requires very large online platforms, those with over 45 million EU users, to proactively remove “illegal content” and demonstrate the measures they have taken to do so upon request.

While the EU has not disclosed its next steps in these particular cases, the DSA gives it the authority to conduct interviews, inspections, and formal investigations where platforms fail to comply with the regulations.

Penalties, including fines and temporary bans from the EU, are possible outcomes for platforms that fail to address identified issues effectively and pose risks to their users.
