In the wake of the Cambridge Analytica scandal, users have grown wary of Facebook's data practices.
If you’ve been following the Facebook/Cambridge Analytica story, you know by now that up until Facebook shut down its Graph API, it was hemorrhaging data to third-party apps. One of those apps, designed by a Russian-American professor at Cambridge University on behalf of Cambridge Analytica, purportedly collected data on over 50 million user profiles under the guise of a personality quiz.
Now, it appears, the United States, from its citizens all the way up to its government, is beginning to lose faith in Facebook. Maybe “lose faith” is a bit too strong and a bit too abstract. The better way to put it is that people in the United States are rapidly losing trust in the social media giant and its ability to protect their data privacy.
With that in mind, Mozilla has created an extension for its Firefox browser that isolates Facebook’s ability to track you online. Let’s Hash it out.
How did Facebook and Cambridge Analytica get here?
As we discussed last week, Cambridge Analytica is a data analytics firm with a hard right slant. While the company was initially started by a group of individuals with a diverse set of political viewpoints, it got most of its funding from Robert Mercer, a Republican mega-donor who was looking for an edge as a political kingmaker.
Cambridge Analytica found the technology to harvest the kind of data it needed through a Russian-American professor at Cambridge University who knew about the work being done at the university’s Psychometrics Centre and was able to duplicate it for $800,000. The resulting technology was an app that gave users a personality quiz and then scraped data from their profiles, as well as those of their friends. The app, which purportedly gathered the data for academic reasons, connected through Facebook’s Graph API. All in all, it is reported to have grabbed data from over 50 million profiles.
The issue that most people have with Facebook is two-fold. For one, siphoning off massive amounts of data through the Graph API was not only comically easy, it was also perfectly acceptable as long as a legitimate reason for the collection was given. In Cambridge Analytica’s case, the stated purpose was “academic reasons.” Facebook never bothered to verify that’s what was actually happening, and a massive data leak occurred as a result.
Complicating matters even more was Facebook’s response, or lack thereof. Facebook has admitted that it knew about the issue as early as 2015, but it made no disclosure until its hand was forced recently. US Senator Ron Wyden, in particular, called on Facebook to answer questions about data privacy and Cambridge Analytica as far back as the spring of last year, but was met with consistent denials and a lack of transparency at every turn.
A big part of the issue comes down to semantics. Facebook is claiming that it’s following the letter of the law with regard to disclosures because this situation doesn’t technically constitute a breach. After all, Cambridge Analytica didn’t circumvent any of Facebook’s defenses. This wasn’t hacking. It was legally pulling data through one of Facebook’s APIs.
In this case the difference between a breach and a leak is fairly academic. The end result, the unauthorized compromise of millions of user profiles, is the same. But Facebook, without explicitly saying so, is operating as if the distinction between the two relieves it of some of its reporting responsibilities. Facebook may be following the letter of the law, but not its spirit.
And Facebook’s reticence also causes other problems. There were thousands of third-party apps that connected to Facebook through its Graph API. That means potentially thousands of apps, and the companies behind them, could have also been collecting data at this scale. Additionally, a handful of companies’ livelihoods depended entirely on that API; when it was shut down, those companies went down with it. One researcher compared the situation to a failed nuclear state in international affairs: what happens to the nukes? In this metaphor the stolen data is the nukes, and it’s unaccounted for. Facebook’s lack of transparency, and at times honesty, makes its claims that no further data has been leaked seem dubious at best.
What’s going to happen to Facebook?
If the spate of recent investigations into Facebook is any indication, it looks like Facebook and its founder, Mark Zuckerberg, are going to need to have an honest accounting of this whole debacle.
It’s been a rough couple of years for Facebook, starting with the social media giant’s role in the 2016 election, where misinformation and the cynical use of Facebook’s ad-targeting systems caused far more trouble than the company has yet admitted.
And unfortunately for Facebook, this Cambridge Analytica situation harkens back to the election. Cambridge Analytica used the data it had siphoned from Facebook to profile people and build intel, first for Texas Senator Ted Cruz’s failed presidential campaign and later for the campaign of future President Donald Trump.
While Facebook probably shouldn’t be penalized for what Cambridge Analytica did with the data, it does deserve some scrutiny for allowing all of that data to be so easily accessible in the first place. And then, of course, there was Facebook’s inaction once the leak was discovered. There are plenty of issues Facebook will need to sort through.
Trust in Facebook is plummeting
While it may still be too early to back it up with numbers, anecdotally at least Facebook is getting crushed in the court of public opinion. People are not happy that their information was potentially compromised, and while Facebook has tried to take some accountability for its mistakes, neither the company nor its founder has explicitly apologized.
This Mozilla extension is just more evidence of the country’s growing distrust of Facebook. Mozilla’s Facebook Container extension creates a blue browser tab that isolates your session from the rest of your browser activity. This prevents Facebook from tracking what other pages you’re visiting (a practice it uses to help target advertising to you).
The extension ensures that you’re not logged into Facebook on any other tabs. That does eliminate the ability to share straight to Facebook, but it also keeps the social media giant from tracking what other websites you’re visiting. Even clicking a link on the platform causes the page to open in a new tab outside the container, where Facebook can’t track it.
Mozilla already takes several steps to help limit Facebook’s ability to collect additional information about you, things like blocking ads and other trackers. This Firefox extension just adds another layer of protection.
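For the curious, Facebook Container is built on Firefox’s container tabs (“contextual identities”) feature, which gives each container its own isolated cookie store. An extension that uses containers has to request the relevant permissions in its manifest. Here’s a minimal sketch of what such a manifest might look like; the name, version, and script filename are placeholders for illustration, not Mozilla’s actual values:

```json
{
  "manifest_version": 2,
  "name": "example-container-extension",
  "version": "0.1",
  "permissions": [
    "contextualIdentities",
    "cookies",
    "tabs",
    "webRequest",
    "webRequestBlocking",
    "<all_urls>"
  ],
  "background": {
    "scripts": ["background.js"]
  }
}
```

With those permissions, a background script can create a container via `browser.contextualIdentities.create()` and intercept navigations to Facebook’s domains, reopening them in that container’s cookie store so Facebook’s cookies never mingle with the rest of your browsing.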
Of course you could always just delete Facebook, but then you would probably miss pictures of your friends’ breakfasts and having that political argument with the guy you haven’t spoken to since high school.
Click here to add the Facebook Container extension to Firefox.