
Former Facebook Adviser On Fake Account Removal


Facebook announced yesterday that it had uncovered evidence of an effort on its platform to influence the midterm elections. The company said it removed 32 accounts or pages from Facebook and Instagram that were associated with hot-button social issues. More than 290,000 users followed at least one of the pages involved, according to the company. Dipayan Ghosh is with me now. He was a privacy and public policy adviser at Facebook. Good morning.

DIPAYAN GHOSH: Good morning.

KING: What do you make of how Facebook is managing this?

GHOSH: I think we've heard a sliver of the facts that the company knows. We know, as you said, that they've taken down a certain number of accounts and that those accounts had roughly 300,000 followers. I think there are a few things to take away here. First of all, the company must be very confident that these accounts were disinformation operators. We know that for a fact.

The issue here is that there are some accounts in here that the company must be tremendously confident about, but it's entirely possible that there are many others out there that the company may not be particularly confident about and for which it's not ready and willing to disclose that they are disinformation operators.

KING: The midterm elections are just a couple months away. And Facebook says that this campaign was intended to influence the midterm elections. I wonder, do you think they're moving fast enough?

GHOSH: Well, I think that there are some really difficult issues here to deal with. The scale of disinformation operations is so tremendous that it's very hard for any one company to really deal with it effectively, especially a platform as big as Facebook. This problem is not going away. Here, we've seen that 290,000 people were impacted - at least. These are just the followers of these accounts, not everybody who may have been exposed to what these accounts were putting out there. And they might have already been swung by the disinformation that they saw.

KING: Should we be worried by that?

GHOSH: Absolutely, because these particular people who were following these pages and who may have seen the content, they might be very critical swing voters. They might be people who will swing elections. So I think the industry and our national intelligence operations and our politicians all have to get much, much better at detecting this and being more forthcoming about the information that they have.

KING: Well, I want to ask you to that end about being forthcoming. In the company statement about these fake pages, Facebook officials do not say who did it. They note some similarities between the techniques that the Internet Research Agency, which has links to the Russian government, has used in the past and what is happening now. Do you think Facebook knows that it was Russia? Do you think that this was Russia?

GHOSH: I do think that this is Russia behind this effort, and Senator Warner does as well. I think that it's a very critical distinction that you're making here and a spot-on analysis that the company did not attribute this activity to any one particular actor, whether Russia or some other disinformation operator. And yet we have politicians who are ready and willing to say that. It shows that the company does not want to be behind this kind of big decision, this kind of a major attribution. They want to leave it to the government. They want to leave it to the intelligence operations. And that's all fair and well.

But I think what I would ask of any company in this industry going forward is that even if you are not ready and willing to share everything that you know or may know even if it's not at 99 percent confidence level but somewhat lower, please share it with the government. Please share it with intelligence operations. Share what you know because we need somebody who is working absolutely in the public interest to possess this information and to make decisions accordingly. I think this whole incident also shows that even Facebook, who has the - maybe the strongest corporate security team in the world, even Facebook's detection capacities are limited.

KING: Thirty-two accounts, yeah. I mean, that's - it sounds like a drop in the bucket of what you suggest may be going on.

GHOSH: It is absolutely only a drop in the bucket as to what is going on in the universe. Keep in mind, we are talking here about only the United States. It's hard to imagine that it was just 32 accounts and pages that have been active over the past year and a half. It's also hard to imagine that there aren't more disinformation operations that are in existence around the whole world. So it shows that detection capacities are limited, especially given the types of facts that they've revealed alongside this disclosure.

So what they have told us is that there were 290,000 followers, and that the accounts that were pushing this content were using VPNs and third-party payments, but that they still cannot make the final attribution back to the Internet Research Agency, which is based in St. Petersburg and tied to the Kremlin. They're not able to do that. That shows that their capacities are very limited - their technical capacities to investigate these matters. Or it means that they may know more at a certain confidence level and are sharing it only with the government. We don't know as the electorate. We absolutely just do not know.

KING: All right. We'll have to keep watching it. Dipayan Ghosh was a privacy and public policy adviser at Facebook. He's now a fellow at Harvard Kennedy School's Shorenstein Center. Thank you so much, Dipayan.

GHOSH: Thank you so much for having me. Transcript provided by NPR, Copyright NPR.