Facebook must make public human rights audit about India

Facebook CEO Mark Zuckerberg

By Abdul Bari Masoud

New Delhi: At a webinar, activist groups and whistleblowers pushed social media behemoth Facebook to make public a human rights audit of its work in India that it has reportedly suppressed for more than six months.


The demand comes in the wake of leaked documents claiming that Facebook’s social media platforms were used to spread hatred and enabled authoritarian governments to control political debate.

According to the US newspaper The Washington Post, Meta, Facebook’s parent company, hired the law firm Foley Hoag in 2020 to conduct an independent audit of its impact in India, the company’s largest market.

The draft report was delivered to the company in the middle of last year by the civil rights firm. According to The Wall Street Journal, Meta has narrowed the scope of the draft report and is delaying its release.

Over 20 human rights organisations have demanded that the audit report be made public.

Whistleblower Sophie Zhang, a former data scientist at Facebook, spoke at an online briefing hosted by the Real Facebook Oversight Board and India Civil Watch International on Wednesday night, recounting her earlier revelations that “politicised decision-making (by Facebook) was most severe and largest in India.”

“When I called out bogus accounts in India corrupting politics, Facebook agreed to take them down as violations of their community standards, until we learned that the accounts were operated personally by a sitting BJP MP,” Zhang stated. “I recall radio silence and the company’s unwillingness to act as soon as the discovery was made.” She did not say who the MP was.

“Facebook has been aware for some time that the BJP was breaking their community guidelines. They were aware of the situation in India but chose to remain silent…. The HRA (human rights assessment) has been withheld by Facebook…. Given their track record in India, this comes as no surprise to me,” Zhang remarked.

“Up until my disclosures, every time activists raised concerns, Facebook was able to say that that is anecdotal evidence, that is not what is happening on the platform — because no one could see inside the company,” said Frances Haugen, a former data scientist at the company who last year leaked documents to the US stock exchange regulator showing Facebook’s failure to regulate hate content.

“We must strive for obligatory transparency because, as Facebook has proved, when they have voluntary openness and uncover findings they don’t like, they stonewall, delay, and make sure no one ever sees those results,” Haugen continued.

“Facebook isn’t participating because they know what’s in this report is horrible,” she continued. “We’ve already seen a lot of coverage about Facebook’s persistent awareness and negligence when it comes to India, and we need to expect a higher degree of care and accountability from the company in terms of how it treats its Indian users.”

In response to queries from The Telegraph, Meta’s director of human rights policy, Miranda Sissons, said in an email: “Given the complexity of this work, we want these assessments to be thorough. We will report annually on how we’re addressing human rights impacts, in line with our Human Rights Policy.”

Teesta Setalvad, an activist and journalist with Citizens for Justice and Peace, stated during the webinar: “What happens now if Facebook does not act? … In these circumstances (of rising intolerance and weakened democratic institutions), the formidable Facebook user base of 463 million (in India), plus the millions of users of WhatsApp and Instagram, platforms also owned by Facebook and used to disseminate hate, compounds the negative societal impact.”

“Social media platforms, particularly Facebook, share a lot of responsibility in making hate normal, popular, and accessible everywhere in the country,” said former Delhi Minorities Commission chairman Zafarul Islam Khan, who is credited with an investigation that accused BJP leaders of orchestrating the 2020 Delhi riots. “I am concerned that a genocide attempt could occur any time before the 2024 general election, and that social media platforms will be heavily employed in this crime.”

Concurring with the speakers, Gregory Stanton, the president of Genocide Watch, who foresaw the Rwandan genocide half a decade before it occurred, flagged the same concern about India.


