Facebook will implement new tools to divert users away from harmful content, limit political content and give parents more control over teen Instagram accounts, the company’s vice president of global affairs, Nick Clegg, told several morning news shows Sunday.
Though Clegg did not elaborate on the specifics of the tools, he told ABC’s “This Week” that one measure would urge users who spend long stretches on Instagram, which Facebook owns, to “take a break.” Another feature will nudge teens who are viewing content harmful to their well-being toward something else, he said.
Clegg also said the company’s planned Instagram Kids, a service for children 13 and younger that the company recently paused, is part of the solution.
“We have no commercial incentive to do anything other than try and make sure that the experience is positive,” Clegg said. “We can’t change human nature. We always see bad things online. We can do everything we can to try to reduce and mitigate them.”
Clegg’s media appearances come in response to the Senate testimony of whistleblower Frances Haugen on Tuesday. Haugen, who leaked internal Facebook documents to The Wall Street Journal and Congress, told a Senate panel that the company consistently puts its own profits over users’ health and safety.
The leaked documents spurred a series of stories by the Journal revealing that the company is aware of several problems, including Instagram’s harm to teenagers’ mental health, but either ignores them or fails to resolve them.
The company will begin submitting the content data it publishes every 12 weeks to an independent audit, Clegg told ABC, because “we need to be held to account.”
As congressional leaders call for more transparency from the tech giant on user privacy, Clegg urged lawmakers to step in.
“We’re not saying this is somehow a substitution of our own responsibilities, but there are a whole bunch of things that only regulators and lawmakers can do,” he told NBC’s “Meet the Press.” “And at the end of the day, I don’t think anyone wants a private company to adjudicate on these really difficult trade-offs between free expression on one hand and moderating or removing content on the other.”
Haugen is scheduled to speak with Facebook’s Oversight Board in the coming weeks, the board announced Monday. The board was created in 2018 to review content after the company faced a series of scandals surrounding its handling of alleged Russian interference and other misuses of the platform.
The board said on its website that it is currently reviewing whether Facebook has “been fully forthcoming” in its responses about its cross-check system, which the Journal revealed allows celebrities, politicians and other users with large followings to circumvent some of Facebook’s rules.
“I have accepted the invitation to brief the Facebook Oversight Board about what I learned while working there,” Haugen wrote on Twitter. “Facebook has lied to the board repeatedly, and I am looking forward to sharing the truth with them.”
In response to accusations that Facebook fueled the spread of misinformation and hate speech ahead of the Jan. 6 Capitol riot, Clegg told CNN’s “State of the Union” that individuals were responsible for their own actions.
He said that removing the platform’s ranking algorithms would only promote more misinformation because they work as “giant spam filters.”
The company is also looking into ways to reduce the presence of politics on Facebook for some users, he said.
“Our job is to mitigate and reduce the bad and amplify the good, and I think those investments, that technology and some of that evidence of how little hate speech there is compared to a few years ago, shows that we are moving in the right direction,” he told “Meet the Press.”