Israel-Palestine conflict: EU enforces regulation of social media platforms

European Union invokes Digital Services Act to counter fake news, inflammatory content 

The European Union has warned Meta, TikTok and X against spreading misinformation and distorted views of the Israeli-Palestinian conflict. – Abdul Razak Latif/Scoop pic, October 15, 2023

KUALA LUMPUR – The European Union’s Digital Services Act (DSA) is being enforced at an unprecedented pace to cool tensions stemming from the latest conflict in the Middle East.

Meta, TikTok and X (formerly known as Twitter) have been warned by the EU against spreading misinformation and distorted views of the Israeli-Palestinian conflict.

Hot on the heels of the violent attacks by Hamas on Israeli civilians and the subsequent deadly retaliation by Israel, governments are on alert, as both sides of the conflict have vocal supporters online.

Earlier today, the EU urged TikTok chief executive officer Shou Zi Chew to “urgently step up” efforts and to spell out “within the next 24 hours” how the platform is complying with European law, the BBC reports.

It said TikTok needed to be mindful of its popularity with young people.

“TikTok has a particular obligation to protect children & teenagers from violent content and terrorist propaganda as well as death challenges & potentially life-threatening content,” EU commissioner Thierry Breton said in a post on X.

Breton has also demanded that X and Meta prove how they have taken “timely, diligent and objective action”.

Social media firms have seen a surge in misinformation about the conflict – including doctored images and mislabelled videos, the report said.

The EU warned Elon Musk’s X on Monday (October 9) against allowing such content on its platform, and issued a similar warning yesterday to Meta, which owns Facebook and Instagram. X was given a 24-hour deadline on Tuesday (October 10).

X’s chief executive, Linda Yaccarino, said the platform had removed or flagged “tens of thousands of pieces of content” since Hamas attacked Israel.

She also said it had removed hundreds of accounts. She posted on X:

“Every day we’re reminded of our global responsibility to protect the public conversation by ensuring everyone has access to real-time information and safeguarding the platform for all our users.

“In response to the recent terrorist attack on Israel by Hamas, we’ve redistributed resources and refocused internal teams who are working around the clock to address this rapidly evolving situation.”

Meta has also been handed a similar warning about disinformation – and a 24-hour deadline – by the EU.

A European Commission spokesperson said “contacts are ongoing” with the company’s compliance teams.

A Meta spokesperson told the BBC: “After the terrorist attacks by Hamas on Israel on Saturday, we quickly established a special operations centre staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation.”

“Our teams are working around the clock to keep our platforms safe, take action on content that violates our policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation. We’ll continue this work as this conflict unfolds.”

In her letter to the EU, Yaccarino said X had responded to more than 80 requests in the bloc to remove content, and had added notes to certain posts to give them context.

“More than 700 unique notes related to the attacks and unfolding events are showing on X,” she wrote.

“These notes display on an additional 5,000+ posts that contain matching images or videos. This number grows automatically if the relevant images and videos are reused in new posts.”

The EU’s new rules, which began applying to the largest online platforms in August 2023, regulate the kind of content that is allowed online.

The DSA requires so-called “very large online platforms” to proactively remove “illegal content”, and show they have taken measures to do so if requested.

The DSA allows the EU to conduct interviews and inspections and, if it is unsatisfied, proceed to a formal investigation.

If it decides that a platform has not complied or is not addressing the problems it has identified, and risks harming users, the Commission can take more drastic steps.

This can include a fine of up to 6% of a platform’s global annual turnover and, as a last resort, a request that judges temporarily ban the platform from the EU. – October 15, 2023