CEO and co-founder of Facebook Mark Zuckerberg poses next to Facebook head of global policy communications and former UK deputy prime minister Nick Clegg (L) prior to a meeting with French President at the Elysee Palace in Paris, on May 10, 2019.
Yoan Valat | AFP | Getty Images
Facebook will implement new tools to divert users away from harmful content, limit political content and give parents more control over teens' Instagram accounts, the company's vice president of global affairs Nick Clegg told several morning news shows on Sunday.
Though Clegg did not elaborate on the specifics of the tools, he told ABC’s “This Week” that one measure would urge users who spend long periods on Instagram to “take a break.” Another feature will nudge teens looking at content harmful to their well-being to look at something else.
Clegg also indicated that Instagram Kids, a service for children 13 and younger that the company recently paused, is part of the solution.
“… We have no commercial incentive to do anything other than try and make sure that the experience is positive,” Clegg said. “We can’t change human nature. We always see bad things online. We can do everything we can to try to reduce and mitigate them.”
The appearance comes after whistleblower Frances Haugen, who leaked internal documents to both The Wall Street Journal and Congress, testified before a Senate panel earlier this month and said the company consistently puts its own profits over users’ health and safety.
Documents leaked by Haugen spurred a series of stories from the Journal that highlighted several issues the company is aware of but either ignores or does not resolve, including that it knows Instagram is detrimental to the mental health of teenagers.
The company will begin submitting data on the content it publishes to an independent audit every 12 weeks, a step Clegg told ABC it is taking because “we need to be held to account.” As congressional leaders call for more transparency from the tech giant surrounding user privacy, he also urged lawmakers to step in.
“We’re not saying this is somehow a substitution of our own responsibilities, but there are a whole bunch of things that only regulators and lawmakers can do,” he told “Meet the Press” on NBC. “And at the end of the day, I don’t think anyone wants a private company to adjudicate on these really difficult trade-offs between free expression on one hand and moderating or removing content on the other.”
In response to accusations that Facebook amplified the spread of misinformation and hate speech ahead of the Jan. 6 Capitol riot, Clegg told CNN’s “State of the Union” that individuals were responsible for their own actions.
Removing algorithms would only promote more misinformation because they work as “giant spam filters,” he added. The company is also looking into ways to reduce the presence of politics on Facebook for some users.
“Our job is to mitigate and reduce the bad and amplify the good and I think those investments, that technology and some of that evidence of how little hate speech there is compared to a few years ago, shows that we are moving in the right direction,” he told “Meet the Press.”