Business News

Meta said it will share software in attempt to combat terrorism, human trafficking


(NEW YORK) — Meta, formerly known as Facebook, said it is opening up a piece of its technology to combat terrorism and human trafficking across the internet. The company said the move will allow other companies to share data and prevent the spread of violent images online.

This software will be shared in advance of Meta’s yearlong chairmanship of the Global Internet Forum to Counter Terrorism (GIFCT), which begins in January.

Meta’s Hasher Matcher Actioner will be a free, open-source content moderation software tool “that will help platforms identify copies of images or videos and take action against them en masse,” Meta President of Global Affairs Nick Clegg said in a release.

The Hasher Matcher Actioner allows companies to find duplicated images by looking at hashes, or digital fingerprints. Those fingerprints or hashes are created after images or videos are run through an algorithm developing a series of numbers or letters specific to that image, the company said.

The hash allows that data to be matched en masse, so images that violate a platform’s terms of service can be quickly identified and taken offline, a capability Meta said will be especially helpful to smaller tech platforms.
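The fingerprint-and-match workflow the article describes can be sketched in a few lines. This is a minimal illustration using a cryptographic hash (SHA-256) and hypothetical image bytes; Meta’s actual tool matches on perceptual fingerprints such as PDQ, which also catch near-duplicate copies rather than only exact ones, but the lookup logic is the same idea.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Run the media through a hash algorithm, producing a string
    of characters specific to that exact content."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical shared database of fingerprints for known violating images.
blocked_hashes = {
    fingerprint(b"known violating image bytes"),
}

def should_remove(image_bytes: bytes) -> bool:
    """Match an upload's fingerprint against the shared database;
    a hit means the upload is a copy of known violating content."""
    return fingerprint(image_bytes) in blocked_hashes

print(should_remove(b"known violating image bytes"))  # exact copy: True
print(should_remove(b"some harmless image bytes"))    # no match: False
```

Because only fingerprints are shared, platforms can cooperate on takedowns without exchanging the violating media itself.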

Meta said it spent $5 billion on safety and security in 2021 and had over 40,000 employees dedicated to the company’s efforts on online safety.

Meta is a founding member of GIFCT, which is a non-governmental organization that was created by tech companies in 2017 to combat extremist content online, including terrorism.

When a terrorist attack happens, the GIFCT works collaboratively to create a hash of any video posted online by a perpetrator or accomplice during the attack. That hash allows companies to take the images offline quickly. Companies in the GIFCT, including Microsoft, Airbnb, Amazon and current chair YouTube, often use a shared hash database to block videos and images that violate their terms of service from their platforms.

Matthew Schmidt, associate professor of national security, international affairs and political science at the University of New Haven, told ABC News that most organizing of terrorism events or human trafficking happens on the dark web.

He said releasing open-source software is critical in limiting the places where violating content can appear. However, he said it’s not clear how this will affect what happens on the dark web.

“The internet is infinite; there’s not going to be a good way to prevent this from continuing because they’ll just move somewhere else. Where the algorithm isn’t,” he said.

Schmidt said most efforts to prevent violent content sharing have come from the private sector, not the government, which has relied on social media companies for moderation.

He said the government has allowed “private companies to establish their own speech norms like we’ve been talking about with Twitter, and use those norms to prohibit behavior on their platforms.”

Companies work with law enforcement agencies to then prosecute what they believe to be criminal behavior.

Dina Hussein, Counterterrorism & Dangerous Organizations Policy Lead at Meta, told ABC News that by sharing Meta’s Hasher Matcher Actioner, the company hopes to help the internet as a whole.

“What we’re hoping to do is lift up our baseline best practices for the entire industry,” Hussein said.

She added, “as long as this kind of content exists in the world, it’ll manifest on the internet. And it’s only through collectively working together that we can really keep this content off the internet.”

Copyright © 2022, ABC Audio. All rights reserved.