It's been a ropey period for Facebook, and with Chief Executive Officer Mark Zuckerberg now up before Congress in the US, things may get even ropier. Can the social media giant stave off a decline that many have predicted?
"Facebook is an idealistic and optimistic company," offered Mark Zuckerberg to the US senators lined up to question him during his testimony to the United States Congress on Tuesday.
The senators were skeptical, and with high-profile tech figures like Elon Musk amplifying calls to #deletefacebook, alongside grumbles from some investors who want Zuckerberg to step down as CEO, they are evidently not the only ones. The reputation of the world's biggest social media platform stands in jeopardy as Facebook defends its role in a data breach affecting millions.
Scheduled to speak before Congress both Tuesday and Wednesday, Zuckerberg is responding to allegations that Facebook played a major role in foreign meddling in the 2016 US presidential election. For Zuckerberg, the once geeky 19-year-old Harvard student who started Facebook from his dorm room, the stakes are high.
Now 33 years old, Zuckerberg is one of the richest men in the world. Last year Facebook exceeded $500 billion (€404 billion) in market value. However, the 18 percent plunge that Facebook stock took after the scandal broke clipped $80 billion off Facebook's value and an estimated $14 billion from Zuckerberg's personal net worth.
Why this matters
The number of people affected by this data leak is not yet known, but it is suspected to be more than 87 million users. The first social media platform to reach a billion users, Facebook remains the market leader with 2.2 billion monthly users. By comparison, the world's most populous country, China, has 1.4 billion people. Pew Research polls report that 62 percent of Americans get their news from social media, meaning Facebook is a thriving trove of data about what people like, dislike, will click on, will ignore and perhaps even how they are persuaded to vote one way or another.
In 2014 Facebook gave access to Cambridge Analytica without overseeing the data collection process or knowing how the data would be used or toward what end. Cambridge Analytica went on to exploit the data to advance the political agenda of its clients.
The story recently came to light when co-founder-turned-whistleblower Christopher Wylie released documentation showing that Cambridge Analytica had improperly collected Facebook data and used it to influence voters ahead of the 2016 US presidential election and the 2016 EU membership referendum in the UK.
According to Wylie, when Facebook was made aware of the data breach, it alerted neither the authorities nor the users whose information was obtained. The senators were not forgiving: "Don't you think you have an ethical obligation to notify 87 million users?" asked Senator Bill Nelson during Tuesday's testimony.
Chairman John Thune opened Tuesday's hearing with a cautionary tone: "Facebook's incredible reach is why we're here today. The story you created is the American dream. At the same time you have an obligation to ensure that dream does not become a privacy nightmare. America is listening, and quite possibly the world is listening."
Blowing the case wide open
"We considered this a closed case" — that was Zuckerberg's response when pressed for why no further action was taken in 2015 upon first learning about Cambridge Analytica's use of data. Yet this is not the first time Zuckerberg has been under fire for privacy issues.
In 2014 Zuckerberg issued an apology after Facebook admitted to manipulating the posts on more than half a million people's feeds to determine how different posts affected their levels of happiness or sadness.
Each time a user logs into Facebook, there are over 1,500 items — postings, photos or news articles — that one could potentially see, but Facebook's algorithm has the final say in which 300 items will actually appear in the feed. And because the algorithm favors content that elicits emotion, skeptics say the standing temptation to exploit or manipulate user preferences has been neither well researched nor well guarded against.
Just last year, Germany passed new data protection laws that would render Facebook's data transfer to Cambridge Analytica illegal, on the grounds that users did not have enough information to consent. The German Federal Cartel Office has accused Facebook of breaking data protection laws to support an unfair monopoly. Other charges have been brought in the EU, and Facebook says it is working to make its platform compliant with the EU's General Data Protection Regulation (GDPR).
Just last month Facebook was in talks with 10 hospitals about a research project that would combine the hospitals' patient data, including prescriptions and income levels, with Facebook's own user data to explore algorithms for matching patients with advanced treatment. Facebook recently admitted to stepping back from the project, saying it needs to secure better privacy practices first.
Did Princeton call this?
Ironically, a study released in 2014 by Princeton researchers predicted that Facebook would lose over 80 percent of its users by 2017. Comparing the rise and fall of Facebook to that of an infectious disease, the researchers found that user adoption would largely be driven by contact, in much the same way as an illness spreads.
Eventually interest would be lost and users would gain "immunity" and move on. Although it is 2018 and Facebook has definitely not lost 80 percent of its users, there's no denying that a shift has begun and that the need for governance and increased security is now widely recognized.
Facebook shares were up 3.6 percent while Zuckerberg was being questioned by Congress, a sign that investors aren't too sick of him just yet.