Al Jazeera coverage of Trump’s Jerusalem move ‘promoting hatred’

Al Jazeera coverage of a protest in Gaza showed a demonstrator taking out two pistols in front of the camera. (Screengrab)
Updated 12 December 2017

LONDON: Al Jazeera’s coverage of President Donald Trump’s recent decision to move the US Embassy in Israel to Jerusalem has been criticized for “promoting hatred and furthering tensions.”
As Trump’s decision sparked global outrage, with world leaders denouncing the move, international media gave extensive attention to global demonstrations.
However, the Qatari-owned channel’s reporting of the issue has been described as irresponsible for giving airtime to extremist views.
“The concern with Al Jazeera Arabic’s coverage of Trump’s Jerusalem announcement is that it gives airtime to some very extreme and violent comments, including calls by the terrorist group Hamas,” Tom Wilson, media commentator and fellow at the Henry Jackson Society, told Arab News.
During its daily evening program, Al Jazeera aired a tweet by Hamas, which is designated as a terrorist organization by some countries, calling on Arab and Muslim nations to mark last Friday as a “day of anger against the occupation.”
“If news agencies publicize such views in an uncritical manner, without sufficiently challenging them, then this can risk promoting hatred and furthering tensions,” Wilson said.
The Arabic news channel also aired an interview with a demonstrator who said that Palestine will be liberated only by the “child who holds a knife, and by the martyr who sacrificed his life for Palestine.”
In another report, a protester tells an Al Jazeera reporter that the US president will “meet the jihad by Muslims and Arabs.”
In another segment, the channel broadcast a protest in Gaza during which a demonstrator took out two pistols in front of the camera.
“At such a volatile time in the region, channels like Al Jazeera Arabic should avoid the kind of coverage that further inflames feelings that might contribute to violence,” Wilson said.
Al Jazeera’s reporting has previously been criticized for inciting hate and giving a platform to extremists and terrorists. The channel featured the Muslim Brotherhood cleric Yusuf Qaradawi, who promoted anti-Semitism and infamously blessed suicide attacks in a 2013 interview.

Media experts accused Al Jazeera of misrepresenting information under the guise of freedom of expression and accuracy.
“All broadcasters have a responsibility to inform the public in a way that is fair and balanced and that does not involve any kind of incitement,” Wilson said.
Dalia Al-Aqidi, a media analyst and political talk-show host, said that the Qatari network had played a “dirty role” in regional conflicts.
“Manipulating the emotions of its viewers was one of the reasons behind the popularity of Al Jazeera TV, which played a dirty role in the Middle Eastern conflicts, starting with its coverage of the war in Iraq, insulting the people who were happy to get rid of the late President Saddam Hussein,” she said.
She said Al Jazeera had supported Osama bin Laden and other terrorists, who have “killed more innocent Muslims than what they call ‘the enemy’ — whoever the enemy is — spreading hatred and sectarianism by legitimizing violence under the pretext of liberating Palestine.”
She added: “We just need to watch its coverage about the violent demonstrations and type of speakers they host, to realize that it’s quite clear ... Al Jazeera is taking a firm stand against the United States and Saudi Arabia. (It is) using the suffering of the people to serve its political agenda.”
Abdellatif El-Menawy, an Egyptian media analyst, pointed to the dangers of media stoking violence.
“I fully respect the anger of the Palestinian people, Arab peoples and many sympathizers around the world,” he told Arab News.
“But the mistake is when some media deal with these positions for incitement that will not lead to a positive outcome but will complicate the situation even more.”
Al Jazeera did not respond to a request for comment.


Facebook still auto-generating Daesh, Al-Qaeda pages

Updated 19 September 2019

  • Facebook has been working to limit the spread of extremist material on its service, so far with mixed success
  • But as the report shows, plenty of material slips through the cracks, and some of it is auto-generated

WASHINGTON: In the face of criticism that Facebook is not doing enough to combat extremist messaging, the company likes to say that its automated systems remove the vast majority of prohibited content glorifying the Daesh group and Al-Qaeda before it’s reported.
But a whistleblower’s complaint shows that Facebook itself has inadvertently provided the two extremist groups with a networking and recruitment tool by producing dozens of pages in their names.
The social networking company appears to have made little progress on the issue in the four months since The Associated Press detailed how pages that Facebook auto-generates for businesses are aiding Middle East extremists and white supremacists in the United States.
On Wednesday, US senators on the Committee on Commerce, Science, and Transportation questioned representatives from social media companies, including Monika Bickert, who heads Facebook’s efforts to stem extremist messaging. Bickert did not address Facebook’s auto-generation during the hearing, but faced some skepticism that the company’s efforts were effectively countering extremists.
The new details come from an update of a complaint to the Securities and Exchange Commission that the National Whistleblower Center plans to file this week. The filing obtained by the AP identifies almost 200 auto-generated pages — some for businesses, others for schools or other categories — that directly reference the Daesh group and dozens more representing Al-Qaeda and other known groups. One page listed as a “political ideology” is titled “I love Islamic state.” It features an IS logo inside the outlines of Facebook’s famous thumbs-up icon.
In response to a request for comment, a Facebook spokesperson told the AP: “Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organizations to stay ahead of bad actors. Auto-generated pages are not like normal Facebook pages as people can’t comment or post on them and we remove any that violate our policies. While we cannot catch every one, we remain vigilant in this effort.”

Facebook has a number of functions that auto-generate pages from content posted by users. The updated complaint scrutinizes one function that is meant to help business networking. It scrapes employment information from users’ pages to create pages for businesses. In this case, it may be helping the extremist groups because it allows users to like the pages, potentially providing a list of sympathizers for recruiters.
The new filing also found that users’ pages promoting extremist groups remain easy to find with simple searches using their names. Researchers uncovered one page for “Mohammed Atta” with an iconic photo of the Al-Qaeda member who was one of the hijackers in the Sept. 11 attacks. The page lists the user’s work as “Al Qaidah” and education as “University Master Bin Laden” and “School Terrorist Afghanistan.”
Facebook has been working to limit the spread of extremist material on its service, so far with mixed success. In March, it expanded its definition of prohibited content to include US white nationalist and white separatist material as well as that from international extremist groups. It says it has banned 200 white supremacist organizations and removed 26 million pieces of content related to global extremist groups such as Daesh and Al-Qaeda.
It also expanded its definition of terrorism to include not just acts of violence intended to achieve a political or ideological aim, but also attempts at violence, especially when aimed at civilians with the intent to coerce and intimidate. It’s unclear, though, how well enforcement works if the company is still having trouble ridding its platform of well-known extremist organizations’ supporters.
But as the report shows, plenty of material slips through the cracks, and some of it is auto-generated.
The AP story in May highlighted the auto-generation problem, but the new content identified in the report suggests that Facebook has not solved it.
The report also says researchers found that many of the pages referenced in the AP story were not removed until June 25, more than six weeks later and the day before Bickert was questioned at another congressional hearing.
The issue was flagged in the initial SEC complaint filed by the center’s executive director, John Kostyack, which alleges the social media company has exaggerated its success in combating extremist messaging.
“Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content,” Kostyack said. “Yet those very same algorithms are auto-generating pages with titles like ‘I Love Islamic State,’ which are ideal for terrorists to use for networking and recruiting.”