Exploring Privacy By Design
How companies can tackle this traditionally equivocal concept
Privacy by design (PbD) may seem an elusive term even to those who are well-versed in data protection or privacy law. This is largely due to its historical existence as a theoretical concept adopted and promulgated by academics.¹ As a result, its perceived vagueness became a barrier to the practical implementation of PbD.
However, now with the advent of the GDPR, the traditional notions around PbD are changing. No longer can organisations dismiss this concept as an academic fantasy, for the EU has turned it into an identifiable legal obligation. Accordingly, greater efforts will need to be made to fulfil this new kind of duty.
Simply put, PbD is about integrating privacy principles into the design, production and development of goods, services or operations designed to process personal information. More specifically, it concerns the incorporation of privacy from the beginning and thus making it an integral part throughout the lifecycle of a product as opposed to integrating privacy after the fact.
Article 25 of the GDPR codifies this concept by splitting it into two separate obligations which fall under the umbrella of PbD: data protection by design (DPbD) and data protection by default (DPbDf). The former is about implementing the appropriate technical and organisational measures designed to implement data protection principles in an effective manner, with the necessary safeguards to meet the requirements of the GDPR.² The latter is about implementing the appropriate technical and organisational measures designed to ensure that only data necessary for the processing purposes are processed, taking into account the amount of data processed, the extent of that processing, the period of storage and their accessibility.³
Given the new, detailed legal language provided in the GDPR, it would be a mistake for organisations to continue to treat PbD as a novel concept with no practical relevance. The data protection authority (DPA) in Romania recently issued a fine against a bank for failing to comply with Article 25, after data leaks resulted from a lack of diligent organisational and technical measures relating to the processing of customer data.
Privacy Ex Ante
Article 25(1), which sets out the DPbD obligation, essentially reflects the ethos of the GDPR: for every processing operation an organisation pursues, it requires that the major facets of the Regulation, especially the data protection principles (Article 5) and the various rights of data subjects, are given effect and respected. This has the dual objective of strengthening the implementation of the provisions of the GDPR and easing compliance for organisations.
Two important points should be made regarding the nature of the DPbD. The first is that it entails a positive obligation imposed on the data controller (the entity determining the purposes and means of the data processing, as per Article 4(7)). That positive obligation consists of implementing the appropriate technical and organisational measures. More specifically, it is a positive obligation of result whereby the measures so implemented must achieve DPbD. The Regulation gives pseudonymisation as an example of an appropriate measure, but organisations will have to decide for themselves whether this measure, as well as others, would be suitable.
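To make the pseudonymisation example concrete, the sketch below shows one common approach: replacing a direct identifier with a keyed pseudonym, where the key is held separately from the dataset. The field names and key handling are purely illustrative, not drawn from any particular system, and a real deployment would manage the secret key in proper key storage.

```python
import hmac
import hashlib

def pseudonymise(record: dict, secret_key: bytes, id_field: str = "employee_id") -> dict:
    """Replace a direct identifier with a keyed pseudonym.

    The secret key must be stored separately from the pseudonymised
    dataset; without it, the pseudonym cannot be linked back to the
    individual, which is what distinguishes this from a plain hash.
    """
    pseudonym = hmac.new(secret_key, record[id_field].encode(), hashlib.sha256).hexdigest()
    out = dict(record)
    out[id_field] = pseudonym
    return out

# Illustrative only; in practice the key lives in a secrets manager.
key = b"held-by-the-controller-only"
record = {"employee_id": "E12345", "salary": 30000}
safe = pseudonymise(record, key)
```

Because the pseudonym is deterministic for a given key, records about the same individual can still be linked for analysis, while re-identification requires access to the separately held key.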
Whether the measures are appropriate depends on how well they avert or mitigate the identified risks to personal data during the processing operation. This leads to the second important point, which is that DPbD requires a risk-based approach. This will require the data controller to conduct the necessary assessments to identify those risks, including data protection impact assessments (DPIAs). Such assessments will identify the relevant risks, from which the controller should be able to determine what measures need to be put in place. This may involve consulting the necessary experts, such as a data protection officer (DPO). Having detailed records of the processing operations in place, as required under Article 30, would also aid this part of the process.
A critical part of the risk assessments will be the nature, scope and context of the processing, as this is what should be included in any DPIA. The nature means how the processing is carried out, which may include the use of automation or third parties such as cloud computing companies. The scope of the processing refers to the amount of personal data being used as well as the scale of the processing operation. The context of the processing means the circumstances around the processing operation, which largely depends on the purpose of the processing and the necessity of the processing to achieve that purpose. The nature, scope and context should be considered collectively to improve the effectiveness of the risk assessments.
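A risk-based approach of this kind is often operationalised as a risk register, where each identified risk is scored and measures are prioritised against the highest scores. The structure and the likelihood-times-severity scoring below are an illustrative convention, not something prescribed by the GDPR or by any DPIA standard.

```python
from dataclasses import dataclass

@dataclass
class ProcessingRisk:
    """One entry in a DPIA-style risk register (illustrative structure)."""
    description: str
    likelihood: int  # 1 (rare) to 5 (almost certain)
    severity: int    # 1 (negligible) to 5 (severe harm to data subjects)

    @property
    def score(self) -> int:
        # A simple likelihood-times-severity score; real methodologies vary.
        return self.likelihood * self.severity

risks = [
    ProcessingRisk("Payroll export copied by unauthorised staff", 2, 5),
    ProcessingRisk("Cloud provider stores data outside the EEA", 3, 3),
]

# Measures should be chosen against the highest-scoring risks first.
prioritised = sorted(risks, key=lambda r: r.score, reverse=True)
```

The nature, scope and context of the processing feed into the likelihood and severity estimates: automation and third-party involvement tend to raise likelihood, while the sensitivity and volume of the data raise severity.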
In addition, and quite pertinently for any business, the cost of such measures must also be considered. Since the GDPR does not specify a certain amount or percentage of investment that must go into implementing the appropriate measures under Article 25, the cost of the measures to be put in place should be determined by the risk assessments, which will reveal which measures are necessary. The nature, scope and context of the processing will be particularly useful in determining the cost.
The latter of the two obligations under Article 25 essentially requires the controller to implement the appropriate technical and organisational measures to ensure that, by default, only data which are necessary for the processing purposes are processed. These measures should be implemented whilst taking into account the amount of data collected, the extent of the processing, the period of storage and their accessibility. Thus, while Article 25(2) explicitly refers only to data minimisation, in reality the provision implicitly invokes the other data protection principles contained in the GDPR. For instance, consideration of the period of storage of the personal data requires reference to be made to the storage limitation principle, which specifies that data should not be kept for longer than is necessary for the processing purposes.
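In code terms, data minimisation and storage limitation might be enforced as a filter on what is stored and a check on how long it is kept. The field list and six-year retention period below are invented for illustration; the appropriate values depend on the controller's actual purposes.

```python
from datetime import datetime, timedelta, timezone

# Fields actually necessary for the stated purpose (illustrative list).
NECESSARY_FIELDS = {"name", "bank_account", "salary"}

# Retention period set by the controller for this purpose (illustrative).
RETENTION = timedelta(days=6 * 365)

def minimise(record: dict) -> dict:
    """Data minimisation: keep only fields necessary for the purpose."""
    return {k: v for k, v in record.items() if k in NECESSARY_FIELDS}

def expired(collected_at: datetime) -> bool:
    """Storage limitation: flag records held longer than the retention period."""
    return datetime.now(timezone.utc) - collected_at > RETENTION

raw = {"name": "A. Example", "salary": 30000,
       "religion": "none", "bank_account": "GB00 XXXX"}
stored = minimise(raw)
```

The point of the sketch is that the protective behaviour is the default path: data outside the necessary set never reaches storage, rather than being deleted after the fact.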
It should be noted that DPbDf is closely connected to DPbD. The latter ensures that the processing operation or system has, embedded in its design, mechanisms that give effect to the provisions of the GDPR, including the rights of the data subject. The former requires that those mechanisms are activated by default. In essence, the privacy settings designed from the inception of the processing system need to be in effect from the moment the system is first activated or put in place. Inappropriate default privacy settings concerning facial recognition technology were among the issues highlighted by the FTC when it issued its record $5 billion fine against Facebook in July 2019.
Accordingly, in practice, social media companies could comply with DPbDf by ensuring that users are provided with the appropriate settings when using the platform for the first time. For example, Instagram could set user profiles to private from the outset. However, as stated before with DPbD, the measures implemented need to be appropriate, which requires a risk-based approach.
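A minimal sketch of what "private by default" means in a settings model: the most protective value is the default for every option, so wider sharing only happens through an explicit, affirmative user choice. The setting names are hypothetical, not taken from any real platform.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Privacy settings for a new account.

    Every field defaults to its most protective value, so a freshly
    created account is private unless the user opts in to more sharing.
    """
    profile_public: bool = False
    searchable_by_email: bool = False
    facial_recognition: bool = False
    personalised_ads: bool = False

# A newly created account starts fully private, with no action needed.
new_user = AccountSettings()

# Wider sharing requires an explicit opt-in by the user.
new_user.profile_public = True
```

The design choice here is that the zero-configuration state is the compliant state; the user never has to find and disable an intrusive setting that was switched on for them.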
Furthermore, DPbDf requires the appropriate organisational as well as technical measures to be put in place. This means not just the use of suitable software or hardware, but also business strategies and practices. For example, the controller should ensure that there are appropriate rules around the access to personal data by employees, especially with sensitive data. This links closely with Article 24, which stipulates that for controllers to demonstrate compliance with the GDPR, they should implement appropriate data protection policies. Such documents should address certain internal issues such as incident-reporting protocols, the allocation of responsibilities, and the procedures or controls around data processing. Staff training would also play a critical role in ensuring that employees handling personal data are aware of their obligations.
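The organisational rules described above, such as who may access which categories of personal data, can be backed by a simple technical control: a role-based access check that also records every attempt for audit purposes. The roles, categories and policy below are invented for illustration.

```python
# Roles permitted to access each category of personal data (illustrative).
ACCESS_POLICY = {
    "payroll": {"hr", "finance"},
    "health": {"occupational_health"},
}

# Audit trail of access attempts: (employee, category, granted).
access_log: list[tuple[str, str, bool]] = []

def can_access(employee_role: str, employee_name: str, category: str) -> bool:
    """Check the access rule and record the attempt for incident reporting."""
    granted = employee_role in ACCESS_POLICY.get(category, set())
    access_log.append((employee_name, category, granted))
    return granted

can_access("hr", "Alice", "payroll")       # permitted, and logged
can_access("it_audit", "Bob", "health")    # refused, and logged
```

Logging refusals as well as grants supports the incident-reporting protocols and accountability documentation that Article 24 policies are meant to cover.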
The Morrisons Case
Last year, the Court of Appeal found against Morrisons, a chain of supermarkets in the UK, for a data breach affecting around 100,000 of its employees (more here). This case can be used to show some of the steps companies could take in order to avert similar kinds of breaches by implementing PbD measures.
The breach was caused by the company's senior internal IT auditor, Skelton, who took a copy of payroll data that was to be sent to KPMG for an audit and uploaded it to a file-sharing website from his home computer. He committed the breach as an act of vengeance against Morrisons, which had previously disciplined him for using the company's postal services for personal purposes. He obtained the personal data after a member of HR handed it to him on an encrypted USB memory stick. He plugged that stick into his computer at work, copied the data onto the machine, and then copied it onto another USB stick, which was eventually provided to the auditor.
Two points can be made in relation to PbD. Firstly, Morrisons could have simplified the process by providing the data to the auditor directly, without going through Skelton. It is not apparent from the facts of the case that there was any need for Skelton to pass on the data to KPMG, and thus the member of HR could have provided the USB stick to KPMG directly. This may have prevented Skelton from obtaining and copying the data to carry out the malicious leak.
Secondly, the personal data required for the audit, or the memory stick on which it was contained, could have been securely configured so that only certain individuals could access or copy it. It may therefore have been appropriate to configure the data so that only the auditor could access it and use it for the intended purpose. While the High Court noted that Morrisons had used software called PeopleSoft to limit the number of individuals with permitted access to the data, the company could have taken further steps to ensure that the data could not be copied without the requisite permissions. Thus, even if Skelton had obtained the memory stick containing the data, he would not have been able to copy it to his computer, keeping the data in a secure environment and preventing misuse.
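The idea of separating "may read" from "may copy" can be sketched as a dataset wrapper that only permits export by named principals. This is purely illustrative of the principle: in practice such a control would be enforced by the platform itself (for example through rights-management controls or recipient-keyed encryption), not by an in-process check, and all names below are hypothetical.

```python
class ProtectedDataset:
    """A dataset whose export (copying) is limited to named principals."""

    def __init__(self, rows: list, export_allowed: set):
        self._rows = rows
        self._export_allowed = set(export_allowed)

    def export(self, principal: str) -> list:
        """Return a copy of the data only for an authorised principal."""
        if principal not in self._export_allowed:
            raise PermissionError(f"{principal} may not copy this dataset")
        return list(self._rows)

# Hypothetical scenario: only the external auditor may export the payroll data.
payroll = ProtectedDataset(rows=[{"name": "A", "salary": 30000}],
                           export_allowed={"kpmg_auditor"})
```

Under such a scheme, an intermediary handling the physical medium would gain nothing from it, because possession alone does not confer the permission to copy.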
A combination of the aforementioned organisational and technical measures is one example of how companies might endeavour to implement PbD and thus reduce the chances of the kind of data breach that occurred in the Morrisons case. Ultimately, the key with PbD is minimising or eliminating risks wherever they can be identified. Such an approach will also make building a coherent compliance strategy easier.
¹ Ivo Emanuilov et al, ‘Navigating Law and Software Engineering Towards Privacy by Design: Stepping Stones for Bridging the Gap’ in Ronald Leenes et al (eds), Data Protection and Privacy (Hart Publishing 2018).
² Jasmontaite L et al, ‘Data Protection by Design and by Default: Framing Guiding Principles into Legal Obligations in the GDPR’ (2019) 4 European Data Protection Law Review 168.
Your Guide to Understanding and Operationalising the Privacy by Design Framework (IAPP Webinar)
Why Privacy Engineering is Critical with IAPP’s Senior Privacy Fellow Caitlin Fennessy (IAPP Webinar)