WASHINGTON — Facebook, Twitter and YouTube officials disagreed Wednesday with a foreign policy expert who told a Senate committee that the social platforms take too long to identify extremist and terrorist content.

Testifying before the Senate Commerce, Science and Transportation Committee, Clint Watts, a fellow at the Foreign Policy Research Institute, said that terrorists and extremists are using social media for "audience infiltration," or pretending to be members of a group in order to influence that group. He added that Russian and other foreign agents tried to propagate the idea that the 2016 elections were rigged.

“It’s about destroying democratic institutions,” he said. 

But officials from the social media companies, Facebook’s Monika Bickert, YouTube’s Juniper Downs and Twitter’s Carlos Monje, said that their systems quickly and accurately target extremist propaganda.

Bickert said Facebook uses a combination of counterterrorism experts, law enforcement officials and engineers to “maximize free expression while keeping people safe.” She added that the company is using image matching and machine learning to remove dangerous content.  

Downs said that YouTube also uses image matching and machine learning to identify extremist content. She added that 98 percent of the content YouTube removed for promoting violent extremism was flagged by algorithms, and that machine learning has allowed human reviewers to remove five times more videos.

She said the company removes nearly 70 percent of extremist content within eight hours of upload and nearly half within two hours.

Monje said Twitter’s algorithms were responsible for identifying and removing 90 percent of terrorist accounts, and that 75 percent of those accounts were removed before they ever tweeted.

When questioned about bots, Monje said he believed that fewer than 5 percent of Twitter accounts were fake.

Asked about the amount of false information on social media about the Las Vegas shooter immediately after the incident, Bickert said Facebook has changed the way its crisis center works, is removing accounts that propagate fake news and is working with responsible publishers to make sure they know how to use Facebook’s tools.

Watts argued, however, that social media sites are only reacting to these problems. He said the companies have been good at identifying extremist patterns after the damage has been done, but that the platforms should focus more on working with threat analysts to anticipate what will happen next.

Asked how he would grade each of the companies on combating threats, Watts said, “Google [which owns YouTube] and Facebook have outpaced Twitter.”