The chair of the Senate Intelligence Committee said he worries the U.S. is now less prepared than it was in 2020 for foreign adversaries sowing disinformation in the presidential election.
“I am gravely concerned that as we go into the 2024 election, we will see foreign malign influence in terms of misinformation and disinformation, coming from Russia, coming from China, coming from Iran,” U.S. Sen. Mark Warner, D-Va., said.
Warner has closely monitored foreign efforts to meddle in U.S. elections since 2016, when it became clear Russia was using social media to spread disinformation designed to confuse American voters.
A bipartisan report by the Senate Intelligence Committee said the U.S. government at the time was not prepared for Russia’s tactics, and there have since been widespread efforts to improve the response.
But Warner said a case pending before the U.S. Supreme Court, along with a variety of other factors, may hinder efforts by the FBI and federal officials to share information about potential threats.
The case, Murthy v. Missouri, could decide what role government officials can play in communicating with social media companies as the country deals with foreign disinformation campaigns.
Warner said he remains concerned that not enough is being done as the campaign season gears up.
“Frankly, I don’t think we’re as prepared in 2024 as we were in 2020,” he said.
Earlier this month, Warner sent a letter to the head of the Cybersecurity and Infrastructure Security Agency, calling for CISA to continue efforts to alert elections officials to potential foreign interference.
“With the heightened possibility that the FBI may (through internal policy or court decision) be hamstrung in its ability to share threat information with impacted parties outside the federal government, it will be incumbent upon CISA to fill this vacuum,” Warner wrote.
He went on to say that CISA can be “an interlocutor between private sector entities, the intelligence community and law enforcement, and state and local officials.”
One continuing area of concern is the use of artificial intelligence to create “deepfakes” that spread misinformation.
This week, an AI-generated robocall mimicking the voice of President Joe Biden went out to some New Hampshire residents, urging them not to vote in Tuesday’s primary. The New Hampshire attorney general’s office said it has opened an investigation.