Facebook, Google, Microsoft and Twitter Team Up to Fight Terrorist Posts

Facebook, Twitter, Microsoft and Google are teaming up on a new plan to prevent the spread of terrorist content on their networks.

These companies are creating a shared database that will allow them to track the "digital fingerprints" of terrorist images and videos shared across their respective networks, making it easier to identify and remove the content.

Under the new partnership, when Facebook, Twitter, YouTube or Microsoft removes a photo or video that promotes terrorism, it will add a hash — what the companies describe as a "digital fingerprint" that makes that particular piece of content identifiable — to a shared database. This will make it easier for all the companies involved to spot the same content on their own sites and remove it.

Here's their description of how it will work:

Our companies will begin sharing hashes of the most extreme and egregious terrorist images and videos we have removed from our services — content most likely to violate all of our respective companies’ content policies. Participating companies can add hashes of terrorist images or videos that are identified on one of our platforms to the database. Other participating companies can then use those hashes to identify such content on their services, review against their respective policies and definitions, and remove matching content as appropriate.

The statement notes that content won't be removed automatically; each company will review flagged content against its own policies. Still, the system could help these companies, which operate the most far-reaching social networks, identify terrorist content more quickly.
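The workflow described above can be sketched in a few lines of code. This is a minimal illustration only: the companies have not disclosed their actual hashing scheme (production systems typically use perceptual hashes that survive re-encoding and cropping), so SHA-256 and all function names below are assumptions for the sake of the example.

```python
import hashlib

# Shared database of hashes of removed terrorist content (a simple set here;
# the real system would be a service shared across the participating companies).
shared_database: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Return a hex digest identifying this exact file (hypothetical helper)."""
    return hashlib.sha256(content).hexdigest()

def report_removed(content: bytes) -> None:
    """One platform removes a file and adds its hash to the shared database."""
    shared_database.add(fingerprint(content))

def matches_known_content(content: bytes) -> bool:
    """Another platform checks an upload against the shared database.
    Per the statement, a match flags the file for review under that
    platform's own policies; it does not trigger automatic removal."""
    return fingerprint(content) in shared_database

# Example flow: platform A removes a video, platform B later sees a copy.
video = b"...video bytes..."
report_removed(video)
print(matches_known_content(video))  # True: flag the copy for review
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files, which is one reason real systems favor perceptual hashing.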

"There is no place for content that promotes terrorism on our hosted consumer services," the companies said in a group statement. "We hope this collaboration will lead to greater efficiency as we continue to enforce our policies to help curb the pressing global issue of terrorist content online."
