Facebook not comfortable with making choices for users: Ajit Mohan
Facebook India vice president and managing director Ajit Mohan, who recently completed 10 months at the company, says the social media giant wants to give its users the information to make choices on their own, and that the company is not comfortable making those choices for them.
The company, which has been at the centre of a debate on privacy and fake news, recently changed its advertising policies to exempt politicians from its third-party fact-checking programme. “We don’t believe, however, that it’s an appropriate role for us to referee political debates and prevent a politician’s speech from reaching its audience and being subject to public debate and scrutiny,” the company said recently. Mohan spoke to Fortune India about data privacy, free speech, and competition. Edited excerpts:
It has been about 10 months since you joined Facebook, how has the experience been? What was your brief when you joined?
There is a tremendous amount of excitement about India at Facebook. In the months leading up to my coming on board, many of the organisational changes the company made in India, in terms of bringing together different operating units in the country (something the company hasn’t done anywhere else), gave an indication that there was a willingness to look at India as a special territory or opportunity. In the last nine to 10 months, that is the one big thing that stands out for me.
The other one is, it is very clear that I have joined a company that is in the middle of a big transformation. Over the last three years, especially after 2016, it’s very clear that it’s not the same company as before, especially in its emphasis on privacy and on fundamentally reorienting itself with privacy at the core. That has implications across products, on how products are designed and how they are rolled out. Mark [Zuckerberg] and others have spoken about it a lot, and it’s fairly obvious to me that there is genuine sincerity about it. And in this conversation about what the new rules of the Internet should be, it is pretty clear that Facebook has embraced the opportunity not only to reinvent itself, but also to really learn from everything that has happened and establish leadership for the industry as a whole.
What’s happening with WhatsApp Pay? It is still in the beta stage. Is there an update on when it is expected to go live?
We are still waiting for the approvals. For me, the more interesting thing is when we get the approvals and launch, I am excited about how quickly the financial inclusion agenda can be accelerated. We have done a lot of hard work putting together the foundation of the new Internet economy in India, and it’s time that we find the multipliers that have a massive positive impact on the economy, which is very aligned with the government’s articulated agenda.
You joined the company at an interesting time for Facebook. It was in the middle of a massive debate over privacy after the Cambridge Analytica episode, and privacy norms around the world were becoming stricter. Where does Facebook stand on privacy, and how do you ensure that something like Cambridge Analytica doesn’t happen again?
In terms of what we have learnt, I think if we look back on the first 15 years of Facebook, there weren’t a lot of clear guidelines or rules for operating. One of the points we now articulate quite aggressively is that everyone will benefit from clearly articulated rules for how all of the different tech players should engage with consumers. We have been quite vocal that we invite regulation, and that we will all benefit from having greater clarity from governments around the world on what the new rules of the Internet should be.
There have been many lessons for us in the last two-three years in particular. Even the case of Cambridge Analytica is an interesting one to reflect on. It came from data that Facebook provided to an academic researcher. It was for research purposes, and it was done in a manner that was compliant with what users had told us at that time. And yet the researcher sold that data, violating our policies. We have been a lot more cautious about engagement with third parties when it comes to data since then. Secondly, [now] you have seen a lot more control being given to users, and those controls are a lot more transparently available to them. The recent settlement with the Federal Trade Commission (FTC), which is still awaiting court approval, is going to influence how we build every product. It is a clear articulation of looking at every new product launch, or change to a product, through the lens of what the impact on privacy is. It is a fundamental reorientation to put privacy at the core of everything we do and to bring that lens of privacy into every effort linked to core products... We invite regulation that clarifies what the rules should be.
The EU General Data Protection Regulation (GDPR), which came into force last year, has clear guidelines on the right to be informed, the right to erase data, etc. In India, too, the draft privacy law proposes some strict norms. How do you look at these regulations, and how do they affect you?
Even in the discussions in India, we have had the opportunity to engage—whether with the Srikrishna committee or with MeitY (Ministry of Electronics and Information Technology)... Whenever we’ve had the opportunity to engage in this—and there’s been a lot of it—we have appreciated the consultative nature of the process.
The government has been quite thoughtful in terms of getting inputs from stakeholders, and we have had the opportunity to articulate our points of view. The process has been fairly expansive. The only other point for us [is that if] more of these national privacy architectures are aligned—and that’s true for all companies that operate globally—it is obviously easier when there is a common framework.
But, fundamentally, the idea of regulation around privacy is something that we welcome.
Everyone is talking about the sudden rise of TikTok and how it presents formidable competition for social media giants. How does Facebook look at TikTok?
I think if I apply the India lens first, there are two things to call out. One, on the back of access to affordable mobile and broadband, we have seen the emergence of all kinds of apps and services as well as disruptive models. We made a minority investment in a company called Meesho, a social commerce platform, which is quite different from some of the traditional commerce models, and in many ways it’s a model that’s an innovation for India. With those foundations in place, I think it’s exciting to have different companies explore different models, create disruption, and grow dramatically.
The other lens is… WhatsApp with more than 400 million users, Facebook with more than 328 million users… If we apply the lens of which platforms have more scale and relevance in India, I feel comfortable that across our family of apps we are deeply embedded in India… and sometimes the conversation becomes about top-line numbers… But whether it is about [top-line] numbers, or the number of people using our apps, finding value in our services, creating businesses, and connecting with friends on Instagram, we definitely feel excited about the relevance of each one of these three apps and the opportunity we have for creating [an] impact in the country.
Recently, Facebook in the U.S., in the context of fact-checking political speech, said that it’s not an appropriate role for it to referee political debates. Does that mean politicians may say whatever they like in ads, even if their claims are false? Does that apply in India, too? Also, how are you tackling fake news?
Yes, the global principles apply in India as well. I think on fake news, the company has aggressively embraced the agenda of limiting misinformation. And the effort is on multiple fronts.
I’m expanding the question as we are going into the territory of addressing content from users on the platform. There’s one bucket where content that violates community guidelines is taken off. The good part is that automation has evolved to a point where we can automatically block, for instance, terrorist content, even before it is distributed.
One of the uber-ideas on fake news is that we should give users the information to make choices on their own. We are not comfortable, and we don’t think the world would be comfortable, with Facebook making choices for users. That is what gave rise to the third-party fact-checking programme. India, with eight fact-checking organisations working in 12 languages, has one of the largest deployments of the programme. The idea is that you rely on a set of organisations validated by the International Fact-Checking Network to call out something on the platform that they believe is misinformation. The minute that [calling out misinformation] happens, its distribution is weakened dramatically. Then there are visual callouts indicating that the content has been classified as misinformation.
The idea is to rely on those with the ability to check the facts, and then give the information to users so that they have the power to make the right choices for themselves.
As to political speeches, we definitely don’t want to be in the space of filtering them; and Mark [Zuckerberg] has been quite vocal about this… this is something where the rules should be chalked out by the government. This is an area where regulation should tell us what to do, but we don’t want to be in the business of limiting or blocking political speech.
What about instances where accounts have been taken off not because they had posted anything offensive, but because a particular set of people may not agree with them?
The idea is that you are balancing freedom of speech with ensuring that a particular piece of information doesn’t go viral. There are tough trade-offs involved with competing values. We are trying to figure out what that balance is, and we’re also saying that regulators should tell us what to do, while fully recognising that in a lot of cases these are not linear [decisions] and there are difficult choices one has to make.