Written by Everett Kirkman
On Monday, Oct. 4, Facebook’s apps — which include Facebook, Instagram, Messenger, WhatsApp and Oculus — disappeared from the internet for nearly six hours. Users began reporting error messages across the family of apps around 10:40 a.m. Central time. The outage took out a vital communications platform used by billions of people worldwide and demonstrated how essential the services have become to daily life.
Facebook said the outage was not the result of a hack; the network crash was a self-inflicted problem. In an update Monday evening, the company said configuration changes to the backbone routers that coordinate network traffic between its data centers disrupted that communication, setting off a cascade of failures.
Facebook has been under fire in recent weeks as lawmakers and regulators have scrutinized the network’s far-reaching ability to polarize society and shape opinions. In mid-September, the Wall Street Journal published an exposé revealing that Facebook’s internal research had concluded its attention-seeking algorithms helped foster political divisiveness and contributed to mental health and emotional problems among teenagers, particularly girls.
In the 60 Minutes report “The Facebook Whistleblower,” which aired Oct. 3 on CBS, Frances Haugen revealed herself as the data scientist who had anonymously filed complaints with law enforcement alleging that Facebook’s own research shows how it magnifies hate and misinformation.
The internal documents Haugen shared have spurred more criticism from both regulators and the general public. The documents, among other things, indicated that the company knew Instagram was worsening teenagers’ body image issues, despite company executives publicly trying to minimize the downsides of the app.
“No one at Facebook is malevolent,” Haugen said during the interview. “But the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.”
Haugen said whenever there was a conflict between the public good and what benefited the company, Facebook would choose its own interests.
Haugen has filed at least eight complaints with U.S. securities regulators, alleging that Facebook has violated the law by withholding information about the risks posed by its network. If Facebook asserts that Haugen stole confidential information from the company, it could take legal action against her.
Haugen testified before Congress on Capitol Hill on Tuesday, Oct. 5. Senator Amy Klobuchar of Minnesota asked Haugen whether Facebook dedicated enough resources to removing coronavirus falsehoods, following YouTube’s pledge to ban all anti-vaccine misinformation from its platform.
“I do not believe Facebook, as currently structured, has the capability to stop vaccine misinformation,” Haugen said.
Lena Pietsch, Facebook’s director of policy communications, responded to the 60 Minutes report in a statement.
“Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” Pietsch said. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
Facebook Chief Executive Officer Mark Zuckerberg made his first public comments on the controversy in a Facebook post on Tuesday, resharing a memo sent to Facebook employees. Zuckerberg wrote that he had waited to address employees and the public until after the two recent congressional hearings.
Zuckerberg acknowledged the challenges surrounding how children use social media, reiterated his calls for further regulation of the industry and underscored the importance of the company’s research into difficult issues. He said Monday’s outage was the worst the company had experienced in years.
“We’ve spent the past 24 hours debriefing how we can strengthen our systems against this kind of failure,” Zuckerberg said. “This was also a reminder of how much our work matters to people. The deeper concern with an outage like this isn’t how many people switch to competitive services or how much money we lose, but what it means for the people who rely on our services to communicate with loved ones, run their businesses or support their communities.”
Following the congressional hearing Tuesday, Senator Richard Blumenthal (D-CT) and Senator Marsha Blackburn (R-TN) praised Haugen as a witness.
“I have rarely, if ever, seen or heard as credible or compelling a witness on an issue so difficult or challenging,” Blumenthal said. “Frances Haugen wants to fix Facebook, not burn it to the ground.”
Emmie Mercer, assistant professor of business, said she thinks Facebook and entities like it have the means to better regulate their content. She said she hopes the outage and the whistleblower’s disclosures will lead to necessary changes at social media sites. But Mercer said she does not see reform happening soon unless government regulations are imposed.
“I teach future data analysts and data scientists, and there may come a time when they encounter a similar conflict of interest where they must decide the ethical implications of the algorithms they’re building,” Mercer said. “If an algorithm causes harm to society because it’s optimized to spread hate and misinformation, or it enables violence, addictions and discontentment, but it brings astronomical profits, what will they choose?”