Facebook privacy scandal shows lax data controls can have financial consequences
Social networks’ mishandling of user information is not just an ethical issue; it ultimately affects the bottom line
The recent data mining scandal at Facebook brings into focus the privacy concerns resulting from society’s increasingly tech-centric lifestyle. The public’s use of social media relies on a degree of trust being placed in such platforms. If that trust is compromised, social media companies can face massive financial consequences.
Following investigations by The Guardian and The New York Times, it was revealed that Cambridge Analytica – a political consulting firm – had obtained data from more than 50 million American Facebook users to create psychological profiles which were used to influence the 2016 US presidential election.
It is important to point out that this was not a data breach. There was no systemic hack of the company’s servers or theft of any data. The scandal happened as a result of the privacy policy Facebook had in place and its lax data controls.
The incident has resulted in demands for tougher data regulations, but regulation on its own may be insufficient in the absence of changes in consumer behaviour. “Regulations cannot prevent these kinds of breaches,” said Atefeh Mashatan, Assistant Professor at Ryerson University’s School of Information Technology Management. “They definitely guide the industry to adapt best practices, but they are not enough.”
Data mining
Facebook laid the groundwork for its data to be misused in this way in 2007, when it began allowing developers to create apps such as games and quizzes in order to increase user interactivity. To draw in developers in large numbers, Facebook had to give them something in return: data from its users.
In 2014, a Cambridge University researcher named Aleksandr Kogan created a commercial enterprise called Global Science Research (GSR), which teamed up with SCL – Cambridge Analytica’s parent company – to harvest Facebook data through a quiz called ‘thisismydigitallife’.
At the time, Facebook still allowed apps to pull in data not only from the individuals who installed them, but from their friends as well. This meant that for each of the 320,000 people who took the quiz, around 160 others also had their information pulled in, resulting in data on more than 50 million users being harvested.
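The article’s own figures are enough for a back-of-the-envelope check of that scale. The short sketch below simply multiplies them out; the numbers are the approximations reported above, not Facebook’s own accounting.

```python
# Back-of-the-envelope check of the harvesting scale, using the
# approximate figures reported above (illustrative only).
quiz_takers = 320_000        # people who installed 'thisismydigitallife'
friends_per_taker = 160      # rough number of friends also exposed per quiz taker

total_profiles = quiz_takers + quiz_takers * friends_per_taker
print(f"Approximate profiles harvested: {total_profiles:,}")  # ~51.5 million
```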
According to reports, Facebook was concerned about the volume of data being taken from the platform, but was reassured by GSR that it would only be used for academic purposes. That same year, Facebook changed its policy to stop apps from pulling in data from users’ friends unless those friends had also granted the app permission to do so.
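For readers unfamiliar with how an app could reach friends’ data at all, the sketch below is a rough, hypothetical reconstruction of the pre-2014 model (Graph API v1.0), not Kogan’s actual code. The endpoint and permission behaviour are assumptions based on how that API version was commonly documented, and ACCESS_TOKEN is a placeholder for the token granted when a user installed the quiz.

```python
import requests

GRAPH = "https://graph.facebook.com"   # behaviour as it was under Graph API v1.0, before the change
ACCESS_TOKEN = "USER_ACCESS_TOKEN"     # placeholder: token granted by ONE quiz taker

# Under the old model, an app authorised by a single user (with the
# now-removed friends_* permissions) could list that user's friends...
friends = requests.get(
    f"{GRAPH}/me/friends",
    params={"access_token": ACCESS_TOKEN},
).json().get("data", [])

# ...and then read permitted fields about each friend, even though the
# friends themselves never installed the app.
for friend in friends:
    profile = requests.get(
        f"{GRAPH}/{friend['id']}",
        params={"access_token": ACCESS_TOKEN, "fields": "name,likes"},
    ).json()
    # each profile would then be stored and fed into the psychological models

# From Graph API v2.0 onwards, /me/friends returns only friends who have
# also authorised the app, and the friends_* permissions were removed,
# closing this avenue.
```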
Facebook CEO Mark Zuckerberg said the company initially found out about the situation in 2015 after revelations from The Guardian, which reported Facebook data was being used to help US senator Ted Cruz’s campaign.
GSR had not been authorised to share the information it had gathered with Cambridge Analytica, and Facebook demanded legal certification from both parties confirming that they had destroyed the data. Facebook received assurances from both, but it now appears Cambridge Analytica never followed through on its promise.
The bigger story broke in mid-March of this year when Christopher Wylie – an integral member of Cambridge Analytica’s data operations – blew the whistle to The Observer. Wylie claims that Facebook made “zero effort to get the data back” after it found out what had happened.
Zuckerberg took conspicuously long to respond to the revelations, finally posting on his personal Facebook page five days later: “We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.”
Misleading users
Facebook has a long history of controversies regarding its protection of user data. In 2011, the company settled charges from the Federal Trade Commission (FTC) over complaints that it had failed to keep its users’ information private and to disclose how such information could be used.
Following an investigation in 2015, Facebook was fined €150,000 ($184,500) by France’s privacy regulator for not protecting user data and for tracking users’ internet activity through cookies, both on and off the site.
Last year, the EU fined the social media giant £94m ($132m) for a change in its privacy policy that allowed advertisers on Facebook and Instagram to use data derived from WhatsApp. The change was in direct violation of a pledge Facebook made in 2014 when it acquired the messaging app.
Compounding the fallout from the Cambridge Analytica affair, it recently came to light that Facebook collects records of phone calls and text messages from Android phones. Facebook defended the practice, saying it is an opt-in feature used to sync users’ contacts.
The collection of user data for the purpose of advertising is the bedrock of Facebook’s business model. However, many see the practice of giving data away to third parties as clashing with the company’s pledges to protect it. “I see it as fundamentally incompatible with privacy, but that is an absolutist position,” said Nora Rifon, Professor in the Department of Advertising, Public Relations and Retailing at Michigan State University. “[Government] privacy policy guidelines can be adhered to and consumers will still have their privacy violated because information will be collected and shared, often with third parties.”
According to Rifon, research shows that links to privacy policies or trust seals on a website tend to make users believe their information will not be shared. “We know that is not true in practice,” she said. “In my opinion, most social media owners take advantage of their users and use their information without their knowledge because the FTC privacy notice guidelines are meaningless and ineffective.”
Facing the consequences
The revelations have been a PR nightmare for the social network, and the backlash has been swift and severe. The company has faced condemnation from the public, the media and governments alike.
Facebook’s market value has dropped by more than $94bn since the scandal broke. On March 16, shares were priced at $185 each; by April 2, they were worth $159. The scandal has so far slashed the company’s market cap by almost 20 percent, putting it at its lowest level since mid-2017. Fears of tightening regulations in the wake of the scandal are making investors wary of buying back in, even though some believe the situation is only temporary and presents an opportunity to pick up cheap stock before it begins to rise again.
The response from the corporate sector and business leaders has been harsh. Fellow celebrity CEO Elon Musk took down the Facebook pages of Tesla and SpaceX, tweeting: “It’s not a political statement and I didn’t do this because someone dared me to do it. Just don’t like Facebook. Gives me the willies. Sorry.”
Brian Acton, Co-Founder of messaging app WhatsApp, who sold the company to Facebook for $16bn, also joined the chorus of those advocating for people to leave the site, tweeting simply, “It is time. #deletefacebook”.
Facebook has also come under pressure from lawmakers in both the US and the UK to explain why it allowed third parties to access information without users being made aware of how their data was being used. The company received a letter from the US House Committee on Energy and Commerce communicating its intention to hold a hearing on the issue in the near future. On March 26, the FTC also confirmed it had opened an investigation into Facebook’s data practices.
Broken trust
The site’s users know, to some extent, that by agreeing to use the platform they are consenting to Facebook collecting data on them. What sets this issue apart, however, is the lack of transparency on the part of the company. The fact that personal information was used in a way that users would not have agreed to – and that the company said little or nothing about it when it first found out in 2015 – may cause irreparable damage to the trust people put in the platform.
In reality, however, privacy fears may affect the choices of a small minority of users, but are unlikely to reach a critical mass. “I am a bit of a cynic regarding the propensity of consumers to give up benefits offered by social media to protect their privacy,” said Rifon. “The average user is unlikely to understand the scope of Facebook’s violations, and why or how these violations actually occurred.
“Without an understanding of how the data is collected and used, I suspect many users may dismiss this as a one-time accident rather than a result of the standard operating procedures that are used for data mining.
“Users may not fully appreciate the value of the personal information that is collected and compiled. In addition, if users believe that Cambridge Analytica is completely to blame, then they might want to forgive Facebook and hope that it will not happen again.”
In theory, Facebook shouldn’t need to share data with other companies in order to help them advertise; marketers could instead be given the simple option of choosing their intended target via the platform. However, giving third parties actual data attracts developers, prompting them to do more business on the platform. Ultimately, users are not Facebook’s customers, but instead its product, used to serve the site’s advertisers.
Regulation change
The unavoidable question raised by the scandal is: ‘what regulations can be implemented to prevent this from happening again?’ The challenge for Facebook will be weighing users’ privacy concerns against the company’s ability to do business, while the challenge for regulators lies in determining whether preserving the latter is worth infringing on the former.
“The FTC has always used a clear and conspicuous standard for disclosure of practices,” said Rifon. “For social media, ‘clear and conspicuous’ needs to be reassessed. In addition, the technical details of what happens to information after it is collected, disseminated and used need clearer explanation.
“I would like to see the US Government take a stronger position on this and try to meet the standards that are in place in the European Union. That would be a start, but we need more. It is reasonable to ask ‘will anything we do now matter?’
“An attorney at the US Federal Trade Commission told me recently that the barn door is open and the horses have been let out. We can’t get them back. This is a sorry situation. Not sure it can be fixed [sic].”
New regulations may be a necessary step, but they are not a complete solution. Companies can only misuse data that has been handed over to them in the first place, so the most effective remedy likely starts and ends at the individual level. “The biggest mistake is to believe that there is such a thing as online privacy,” said Mashatan. “There is no online privacy. Another mistake is to rely on privacy laws to protect you and your data. Users should take it more seriously and proactively take steps not to divulge their private information online.”
Facebook has had privacy scandals before, but this one seems different. People tend to overlook their data being misused by companies trying to sell them things – an act that is not ostensibly harmful. However, the possibility that data is being used to influence people’s behaviour in something as consequential as an election is not as easily forgiven. Social media companies need to take into account that protecting their consumers’ data is not just a question of business ethics, but a financial consideration.
Facebook must reconsider its business model and brace for the possibility of lower – or alternative – revenue streams if a slowdown in the data flow to third parties translates into less profit for the site. No other case in the tech space has been as indicative of the potential financial damage a company can suffer from lax data policies. Facebook’s haemorrhaging market value should be a warning, not just to other social media sites, but to any service that handles sensitive data on its consumers.