

The Cambridge Analytica Case

April 10, 2018

 Feature Article  

 

A catalyst for the future of data protection

A decade ago, in the wake of the financial crisis, it was the banks that were demonised as the banes of society. Now technology companies, once perceived as righteous forces for good, are viewed far less favourably by many. The latest event to reinforce this impression broke in March this year, when it was revealed that Facebook, a social media behemoth, along with other companies, was involved in a significant data breach.

 

On both sides of the Atlantic, regulators have sought to take a closer look at any potential wrongdoing. At the time of publication, Facebook’s CEO, Mark Zuckerberg, had just begun his testimony before a congressional committee. The UK’s information watchdog has also launched its own investigation.

 

The saga also encompasses politically charged allegations involving the election of Donald Trump and the Brexit referendum. For now, little is clear in this rather complex case, but in time regulators will make their determinations and lawmakers, and even users, will act accordingly.

 

The Facts of the Case

The story begins with Aleksander Kogan, a research associate in the psychology department of the University of Cambridge. After joining the institution in 2012, Kogan founded a company called Global Science Research (GSR). It was “founded to optimize marketing strategies with the power of big data and psychological sciences.” The company aims to empower clients “to understand their consumers, markets, and competitors more deeply and accurately than ever before.”

 

In the summer of 2014, the UK branch of the US-based political consulting company Cambridge Analytica (CA) sought the services of GSR for political campaigning work, asking GSR to conduct research on Facebook users. To do so, GSR requested permission from Facebook to offer an app on its platform for ‘research purposes’. Once permission was granted, around 270,000 Facebook users, enticed by a small payment from GSR, downloaded the app, called ‘This Is Your Digital Life’, which was essentially a personality survey.

 

Once downloaded, the app asked users to log in with their Facebook credentials and to grant it permission to access their Facebook data, which could include names, email addresses or even whole Facebook profiles. The app also collected information on a user’s friends, where those friends’ privacy settings allowed it. The app’s terms of service did convey that it would collect data from users themselves as well as from their friends. Since 2014, Facebook has asked users specifically whether they are willing to share their list of friends on the social network. Accordingly, if a friend’s privacy settings allowed the sharing of data with third-party developers like GSR, then those developers could collect that friend’s data. So, privacy settings permitting, GSR was able to obtain data from users of its app as well as from those users’ friends. This was consistent with Facebook’s policies at the time.
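To make the mechanics concrete, the sketch below shows roughly how a third-party app of that era could read an authorised user’s profile fields and friends list through Facebook’s Graph API. It is a minimal illustration, not GSR’s actual code: the token value is a placeholder, the behaviour assumed is that of the pre-2014 v1.0 API described above, and the fields requested are examples only.

```python
import requests

# Placeholder token: in reality this would be issued after the user logged
# in with their Facebook credentials and approved the app's permissions.
ACCESS_TOKEN = "USER_ACCESS_TOKEN"

# Assumption: Graph API v1.0, whose /me/friends edge returned a user's
# friends list (subject to each friend's privacy settings).
GRAPH = "https://graph.facebook.com/v1.0"

def fetch_profile_and_friends(token):
    """Read the app user's own profile fields, then their friends list."""
    profile = requests.get(
        f"{GRAPH}/me",
        params={"fields": "id,name,email", "access_token": token},
    ).json()
    friends = requests.get(
        f"{GRAPH}/me/friends",
        params={"access_token": token},
    ).json()
    return profile, friends

profile, friends = fetch_profile_and_friends(ACCESS_TOKEN)
print(profile.get("name"), len(friends.get("data", [])))
```

The point the sketch illustrates is that a single user’s login could open a door to data about people who never interacted with the app at all, provided their settings permitted it.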

 

As a result, GSR was able to collect data from what was first thought to be around 50 million Facebook users; at the time of publication, the figure was reported to be 87 million. The data were then passed on to CA. According to revelations from former employee Chris Wylie, CA combined this Facebook data with other sources (such as voter records and consumer profiles assembled by data brokers) to create profiles on those millions of users. These psychographic profiles classified people by personality type and were used to craft the targeted political messages judged most likely to influence each user.
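As a rough illustration of the kind of pipeline Wylie described, the sketch below joins invented survey-derived personality scores with an invented voter-file record and selects a message variant by dominant personality trait. Every name and value here is hypothetical; reports suggested the real models scored users on the ‘Big Five’ (OCEAN) traits, which the example borrows.

```python
# Illustrative, invented data: survey-derived Big Five scores, a voter-file
# entry, and message variants keyed by a user's dominant trait.
survey_scores = {
    "user_1": {"openness": 0.8, "conscientiousness": 0.3, "extraversion": 0.5,
               "agreeableness": 0.4, "neuroticism": 0.6},
}
voter_file = {"user_1": {"registered": True, "district": "TX-07"}}
messages = {
    "openness": "Imagine a different future...",
    "neuroticism": "Here is what you stand to lose...",
}

def dominant_trait(scores):
    """Return the personality trait with the highest score."""
    return max(scores, key=scores.get)

def target_message(user_id):
    """Join survey scores with the voter file, then pick a message variant."""
    scores = survey_scores.get(user_id)
    voter = voter_file.get(user_id)
    if not scores or not voter or not voter["registered"]:
        return None  # no profile, or not a registered voter
    return messages.get(dominant_trait(scores), "Generic campaign message")

print(target_message("user_1"))  # openness is dominant, so that variant wins
```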

 

CA would also have been able to use Facebook’s advertising tools to show tailored ads that nobody else could see, so-called ‘dark posts’. Thus, according to Wylie, CA was able to use Facebook data, along with other data, to predict the political preferences of users and try to influence voting intentions with messages personalised to their tendencies and preferences.

 

While the Trump campaign did hire CA in the run-up to Donald Trump’s election victory, it is unclear, for now, whether the psychographic profiles were used for the campaign. It is equally unclear how many of the affected users were registered US voters. Overall, whether the profiles and the subsequent targeted advertising conducted by CA were effective and actually swayed votes one way or another remains unknown.

 

In 2015, Facebook learned about the transfer of data from GSR to CA and said that this was a breach of its terms of service: GSR had wrongly passed on data from Facebook’s users to third parties, which is prohibited under its rules. Upon discovering the violation, Facebook removed GSR’s app from its platform and requested that GSR destroy the data it had collected. Kogan contends that this request was fulfilled. Facebook banned CA from its platform in March of this year.

 

Facebook also, according to chief operating officer Sheryl Sandberg, demanded that CA delete the data it had obtained from GSR. However, Sandberg admitted that Facebook did not follow up to ensure that this request was honoured, and so the social media company did not know at the time whether the data from GSR had been deleted or not.

 

CA has repeatedly said that it did not use, and does not hold, any Facebook data. When summoned to Parliament in February to give evidence to the Digital, Culture, Media and Sport Committee, chief executive Alexander Nix was asked whether CA had used any data provided by GSR.¹ Nix said GSR did some research for CA in 2014 but insisted that the research “proved to be fruitless.” He went on to firmly deny that any of CA’s datasets were based on data provided by GSR.

 

Calling in the Information Commissioner

After the story broke earlier in March, the UK’s information watchdog, the Information Commissioner’s Office (ICO), began an investigation into whether CA’s conduct with regard to the Facebook data it had obtained was unlawful. On 7 March, the ICO “issued a Demand for Access to records and data in the hands of Cambridge Analytica” but, receiving no reply from CA, the Office then sought “a warrant to obtain information and access to systems and evidence related to our investigation.”

 

The ICO is the first line of enforcement in cases such as these. Under section 51 of the Data Protection Act 1998 (DPA), it is the specific duty of the Commissioner, currently Elizabeth Denham, to promote good data processing practices amongst companies and others processing data, as well as to “promote the observance of the requirements of [the DPA].”

 

The initial action taken by the ICO in March was what is known as an ‘information notice’. Under section 43 of the DPA, the Commissioner may request “any information for the purpose of determining whether the data controller has complied or is complying with the data protection principles.” After this notice was ignored by CA, the Commissioner sought a search warrant to enter and examine CA’s premises and obtain any documents, materials or equipment the company might hold. Such warrants are regulated under Schedule 9 of the DPA, which requires the approval of a judge. The warrant in this case was premised on the serious breaches of data protection law allegedly committed by CA, and was eventually granted by the High Court in late March.

 

On 5 April, the Office stated that its investigation involves “looking at how data was collected from a third party app on Facebook and shared with Cambridge Analytica.” It has yet to be determined whether any unlawful activity took place at any point. Once the ICO’s investigation has concluded, it will then decide whether further action is necessary.

 

If the Office finds that CA, or others involved in the case, has contravened the provisions of the DPA, then an enforcement notice may be issued. Under section 40, the Commissioner can, with such a notice, restrict or put an end to the offending company’s processing of data, and can even call for the rectification or destruction of inaccurate or out-of-date data. In essence, an enforcement notice does what it says on the tin: it provides instructions to achieve compliance with the DPA.

 

Under the DPA

While the ICO has yet to confirm any illegality, the information already publicly available allows some consideration of the potential liability of CA and the other companies involved under the current framework of UK data protection law.

 

The DPA is designed to regulate “the processing of information relating to individuals, including the obtaining, holding, uses or disclosures of such information.” Derived from the EU’s Data Protection Directive (DPD), the DPA has been the flagship data protection legislation in the UK for 20 years. But the various developments in technology and data processing practices over that period have resulted in the General Data Protection Regulation (GDPR) coming into force in May to replace both the Directive and the Act. For the purposes of the ICO’s investigation, however, the legal liability of CA and others will be judged against the standards of the DPA.

 

Under the Act, section 1 defines personal data as data which “relates to a living individual who can be identified (a) from those data, or (b) from those data and other information which is in the possession of the data controller.” The provisions of the DPA regulate only this personal data, and thus the rights afforded to individuals (called ‘data subjects’ for the purposes of the Act) apply only to such data. Reference is also made to ‘sensitive personal data’, which includes information relating to race or ethnic origin, political opinions, religious beliefs and other equally confidential information. The personality app created by Kogan most likely produced data of this kind, whether personal or even sensitive. As such, the processing and supply of such data to CA would be regulated under the DPA.

 

The Act stipulates that the processing of data includes the “obtaining, recording, or holding of” such information, as well as organising, adapting, altering or disclosing (by transmission, dissemination or otherwise) such information. Thus, when Kogan’s app asked Facebook users to provide their information, it was obtaining their personal and possibly sensitive data. The supply of that data to CA constituted disclosure by transmission. CA in turn, by using such data in combination with other information to create psychographic profiles on users, was adapting that data for the purposes of the Act. This would also likely fall within the ECJ’s interpretation of ‘processing’ in Bodil Lindqvist (2004)², which gave the term a wide remit.

 

The obtaining and subsequent altering of Facebook user data is not necessarily unlawful. Schedule 2 of the Act provides a number of conditions under which the processing of data can be conducted lawfully, two of which are particularly relevant to this case. The first is where the data subject consents to the processing of their data. The second is where the processing is necessary either for the performance of a contract with the data subject or for steps taken at the request of the data subject with a view to entering into a contract.

 

In this particular case, there would not appear to be a problem with the data obtained from Facebook users via Kogan’s app. Users of the app were asked for permission to access their Facebook information. Equally, accessing the information of users’ friends would appear to be lawful: if their privacy settings were set to permit such access, then the collection of the data would be permitted under the DPA. This would appear to be the case even where a user’s friends consented to third-party access to their information merely by leaving an opt-out tick box unticked.

 

More controversial, however, is the transmission of the Facebook data from GSR to CA. This transfer did not appear to be consented to by users of Kogan’s app or their friends. It is this lack of transparency and protection that will perhaps prove most troublesome for GSR, Facebook and CA. That Facebook users were not made aware that their information would be used for the political campaigning purposes eventually pursued by CA exposes a potential flaw in the lawfulness of the data processing.

 

Transparency is one of the major concepts of the EU’s data protection framework. Recital 38 of the DPD emphasises this, stating that “the data subject must be…given accurate and full information” regarding the processing of their data (as is also stipulated under Schedule 1 of the Act). This includes the purposes for which the data are being processed. Of particular relevance to CA, Article 11 of the DPD requires those who obtain data other than directly from the data subjects to provide this information to those data subjects. This meshes with the requirement that consent given by a data subject for processing should be informed, as part of fair and legitimate processing. In order to be informed, data subjects ought to be told unequivocally what they are consenting to. In this case, it would appear that they were not.

 

Under the GDPR

The EU’s new data protection law on the horizon, the GDPR, will be given legal effect in the UK by the Data Protection Bill currently making its way through Parliament. While the Regulation will mirror much of the content of the soon-to-be-former Directive, it is more specific and strict in many areas.

One of those areas, and one crucial to this case, is consent. Article 4 of the Regulation defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes…by a statement or by a clear affirmative action” signifying permission to process personal data relating to the individual in question. This can be combined with Article 5, which stipulates that data must be “processed lawfully, fairly and in a transparent manner” as well as “collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes.”

 

Some elaboration on the kind of consent required is provided by the Article 29 Data Protection Working Party. In its Guidelines on Consent³, the group emphasised the need to obtain specific consent from data subjects. Essentially, the data subject must consent to each use of their data and, in doing so, should be provided with “a specific, explicit and legitimate purpose for the intended processing activity.” Thus, if CA had hypothetically been required to adhere to the provisions of the GDPR, it would have needed to inform Facebook users what their data was being used for and obtain their consent to use that data for the purposes indicated and nothing more. CA would also have been required to inform users of their rights under the Regulation, including the right to withdraw consent.
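To make the practical effect of these requirements concrete, the sketch below shows one way a data controller might record consent per purpose and check it before processing. The structure and field names are invented for illustration; nothing here is prescribed by the Regulation or the Working Party’s guidance.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    """One consent decision, captured per purpose rather than in bulk."""
    subject_id: str
    purpose: str  # e.g. a research survey vs. political advertising
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self):
        # GDPR Article 7(3): withdrawing consent must be as easy as giving it.
        self.withdrawn_at = datetime.now(timezone.utc)

def may_process(records: List[ConsentRecord], subject_id: str, purpose: str) -> bool:
    """Permit processing only under an active, purpose-specific consent."""
    return any(r.subject_id == subject_id and r.purpose == purpose
               and r.withdrawn_at is None for r in records)

records = [ConsentRecord("user_1", "personality research survey",
                         datetime.now(timezone.utc))]
print(may_process(records, "user_1", "personality research survey"))  # True
print(may_process(records, "user_1", "political advertising"))        # False
```

On this model, consent gathered for a personality survey simply would not authorise a later political-advertising use, which is the gap the Regulation’s purpose-limitation and specific-consent rules are designed to close.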

 

Such provisions are designed to enable greater transparency around the processing of personal data, so that data subjects know exactly what data of theirs is being collected and processed, and for what purposes. This will hopefully tackle the “invisible processing and profiling” that may be conducted by some companies, as Denham suggested at the Data Protection Practitioners’ Conference 2018.

 

A Watershed Moment?

One potential impact of the GDPR on data processing in Europe is to give greater bargaining power to data subjects. Currently, data subjects evidently lack the control and information needed to determine where their data goes and what it is used for. The Regulation will require companies like Facebook and CA to be less equivocal about their operations. It may also encourage such companies to make better offers to their users and to demonstrate that users’ data will be put to worthwhile use. As such, the true value of personal data may be able to shine through.

 

Ultimately, the GDPR should establish a fairer and less shadowy environment for data processing, one from which data subjects and companies like Facebook can still profit. But the GDPR aside, it must first be determined what unlawful activity, if any, took place under the DPA, a question the ICO will be examining closely. When those investigations have come to an end, there is little doubt that this case will be a catalyst for the future of data protection, in which greater trust and transparency are built into the way people interact with digital services that request their personal data.

 

Sources:

[1] The audio and visual recording of the session can be found here: https://www.parliamentlive.tv/Event/Index/28bd490e-e556-485f-bf1a-264b8a0b902e

 

[2] Bodil Lindqvist [2004] QB 1014 (ECJ)

 

[3] Article 29 Data Protection Working Party, ‘Guidelines on Consent under Regulation 2016/679’ (2017)

 

 
