9 March 2020

Data protection by design and default – 5 of 6 Insights

Data protection by design and default checklist

Tamara Mackay-Temesy covers a variety of key practical privacy by design and default issues to consider during the design process.

The GDPR requires that organisations implement data protection by design and default (DPDD). This involves adopting appropriate technical and organisational measures to apply the requirements of the data protection principles and to safeguard personal data processing. These measures should take into account the state of the art, the cost of implementation, what data is being used and why, and the potential impact on individuals. In practice, this means that a proactive approach to applying the data protection principles and any specific safeguards must be embedded into an organisation's culture and processes.

DPDD is about more than just implementing privacy-enhancing technologies (PETs). It means proactively planning for and incorporating data protection compliance prior to starting a processing operation, and ensuring that privacy issues are considered not only from the inception of a new way of using personal data – whether a service or product, internal process, software or hardware – but also throughout the lifecycle of the data use.

This checklist covers a variety of key practical privacy by design and default issues to consider and, where relevant, integrate during development and design processes. Given the pervasive nature of DPDD requirements and possibilities, it is not comprehensive.

Proactive not reactive; preventative not remedial

Potential data protection and privacy issues should be considered in advance to help ensure compliance and then reviewed on an ongoing basis.

Data protection policy

  • Have we implemented a clear data protection policy document, setting out our organisation's ethos and overall approach to data protection and privacy?


  • Do we cover data protection by design and default in staff training, so individuals can understand and engage with any issues proactively, systematically, and innovatively?

Data Protection Impact Assessments (DPIAs)

  • Have we considered in advance whether any planned use of data involves technology in ways which are new, innovative, or which give rise to processing or events that might be unexpected, intrusive or could present higher risks of harm to individuals?
  • Where appropriate, have we conducted a DPIA (noting that in certain instances doing so is mandatory)? Keep a record of DPIA decisions.

Privacy by default

You must only process personal data necessary to achieve your specific purpose. In some cases, such as when dealing with children's data, you may need to apply maximum privacy settings by default.

Purpose and functionality evaluation

  • Do we have clearly defined, limited, relevant purpose(s) that we want to collect and use personal data for?
  • Do we tell individuals what these purposes are?

Collection ('must-have, or nice-to-have'?)

  • Can we achieve our goals without processing personal data at all?
  • Can we take steps to minimise the identifiability or linkability of data sets?
  • Is special category/sensitive data necessary and justified (eg medical information for a regulated health app)?

Data minimisation

  • Have we minimised the personal data we collect to only what we need for our purposes?

Purpose limitation

  • Can we ensure we only use the data we need for the purposes we have identified?

Putting the individual first

  • Do we set default profile or account settings in a way that is most friendly to the user? For example, where users can share profiles or content, do we start by automatically making accounts private instead of public by default?
  • Do we offer genuine, effective controls and options to individuals relating to the data we will collect and process, rather than providing an illusory choice?

Retention times

  • Do we need to retain the personal data for as long as planned? Can we delete, archive or aggregate it and, if so, what is the earliest stage at which we can do that?
  • Can the retention and deletion process be automated to any degree?
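On the second point, a retention sweep can often be automated. The sketch below is illustrative only, assuming Python, a hypothetical flat record structure and a placeholder retention period; real retention windows come from your documented retention schedule, and a real system would also log each deletion decision.

```python
from datetime import datetime, timedelta, timezone

# Placeholder retention period; substitute the period from your retention schedule.
RETENTION = timedelta(days=365)

def purge_expired(records, now=None):
    """Return only records still within the retention period.

    `records` is a list of dicts, each with a timezone-aware `created_at`
    datetime. Records older than RETENTION are dropped; in a real system
    they would be deleted or archived, with the action logged.
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["created_at"] <= RETENTION]
```

A sweep like this would typically run on a schedule (for example, a nightly job), so that expired data is removed without manual intervention.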

Privacy embedded into design

Data protection considerations should be embedded into business practices as an essential component, not as an afterthought.

Privacy settings and preferences

  • Have we created controls and/or documentation enabling individuals to review and revise their privacy settings and preferences? For example, an audit tool for users so that they can determine how their data is stored, protected and used, and decide if their rights are being adequately protected.


  • Have we created controls for opt-in and opt-out of sharing data by the user, detailing the benefits or consequences of doing so in a clear and objective manner, including any potential impact to product features or functionality?

Data erasure and destruction

  • Have we designed a process that enforces secure data erasure and/or destruction?
  • Do we have appropriate deletion methods in place for each category of personal data (eg overwriting, degaussing, shredding encryption keys, physical destruction etc)?
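One of the methods listed, shredding encryption keys (sometimes called "crypto-shredding"), works by destroying the key the data was encrypted under rather than overwriting the data itself. A minimal sketch of the idea, assuming Python; the XOR one-time pad here is a toy stand-in for real encryption, which in practice should come from a vetted cryptography library:

```python
import os

class KeyStore:
    """Toy per-user key store illustrating crypto-shredding.

    Each user's data is encrypted under a per-user key; deleting the key
    renders the stored ciphertext permanently unreadable. XOR with a
    random pad stands in for real encryption in this sketch.
    """

    def __init__(self):
        self._keys = {}

    def encrypt(self, user_id, data: bytes) -> bytes:
        key = os.urandom(len(data))  # one-time pad sized to the data
        self._keys[user_id] = key
        return bytes(a ^ b for a, b in zip(data, key))

    def decrypt(self, user_id, blob: bytes) -> bytes:
        key = self._keys[user_id]  # raises KeyError once shredded
        return bytes(a ^ b for a, b in zip(blob, key))

    def shred(self, user_id):
        """Erase the key; the ciphertext is now unrecoverable."""
        del self._keys[user_id]
```

The practical advantage is that a single small key deletion can "erase" data spread across backups and replicas that would be slow or impossible to overwrite individually.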

Pseudonymisation and anonymisation

  • Can we pseudonymise the data (so that data subjects cannot be re-identified unless that data is combined with additional information)?
  • Can we anonymise and aggregate the data (so there is no chance that data subjects can be re-identified)? Can we use one-way hashing instead of raw data?
  • If delivering a product/service requires the data to be identifiable, can any secondary uses (eg analytics, R&D, reporting etc) use aggregated or pseudonymised data?
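The one-way hashing mentioned above is commonly implemented as a keyed hash (HMAC): the same identifier always maps to the same pseudonym, so records stay linkable for analytics, but reversing the mapping requires the secret key, which can be held separately from the data set. A sketch using Python's standard library (the key shown is a placeholder; in production it would live in a secrets manager):

```python
import hashlib
import hmac

# Placeholder key; store the real key separately from the pseudonymised data.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(identifier: str) -> str:
    """Map an identifier to a stable, non-reversible pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()
```

Note that under the GDPR, keyed-hash output remains personal data so long as the key exists, because combining the two allows re-identification; destroying the key is what moves the data towards anonymisation.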

Full functionality – positive-sum, not zero-sum

Full functionality

  • Users should have full functionality regardless of their privacy settings, except where it is not feasible to provide the service without their data (eg map apps requiring location data, or an online shop providing fit recommendations requiring user clothing size data).
  • Have we ensured that features don't require non-necessary personal data in order to access or use them?


  • Have we created controls for granular data sharing user preferences (eg opt-in/opt-out), detailing the benefits or consequences of doing so in a clear and objective manner, including any potential impact to product features or functionality?

End-to-end security – lifecycle protection

Personal data must be kept secure.

Certification and existing evidence

  • Have we considered obtaining a security certification (like ISO/IEC 27001), if appropriate?

Authentication and access control

  • Do we have appropriate user access controls in place, including appropriate logical access controls, and procedures for deleting old user IDs?

Remote working

  • Do we have protocols for remote access control including the use of two-factor authentication, one-time passwords and/or virtual private networks?

Wireless networks and firewalls

  • Do we have appropriate controls in place for wireless networks, including ring-fencing different networks, and access logs?
  • Do we have firewalls for external or separate internal networks?
  • Do we have processes to block higher-risk websites/platforms which might pose a risk to personal data (eg file-sharing sites, personal email)?
  • Have we ensured processes are in place for flagging, quarantining or deleting suspicious email?


  • Have we ensured processes are in place for encrypting data where appropriate? For example: hard drives and solid state drives on laptops and desktops, web traffic on any websites (from the device to the backend service), and Bluetooth connections transmitting sensitive information.

Incident response plan

  • Have we created an incident response plan during the process of designing a new product/service, and considered what security measures may be needed in case of an incident (for example, an access breach, a virus, or physical server damage)?

Data back-up and recovery

  • Have we made sure we have appropriate data back-up and recovery systems in place (for example, if there is a data breach or a natural disaster)?
  • Do we follow a business continuity plan, and test it regularly?

Security and privacy risk assessments

  • Have we implemented protocols to assess, and secure guarantees from, our data processors as to the sufficiency of the technical and organisational safeguards they apply when processing personal data on our behalf?

Updates, patches and vulnerability testing

  • Do we have anti-virus/anti-malware programmes in place?
  • Do we have processes in place for penetration testing of company infrastructure at regular intervals?
  • Do we have appropriate updating and patching procedures in place, including verifying patch sources and package integrity?
  • Have we ensured that our devices and software are subject to security development lifecycle testing (including regression testing and threat modelling)?


  • Do we have protections in place for all systems to prevent personal data being copied to removable media (CD/DVDs, external hard disks, USB memory sticks etc)?

Visibility and transparency

Privacy information should be concise, transparent, intelligible, and in an easily accessible form which uses clear and plain language. Take a user-centric approach to user privacy.

Privacy policy changes

  • Do we have a privacy policy/notice in place that clearly provides all of the required information?
  • Do we update it regularly, or when we do something new?
  • Do we have a process for disclosing and explaining significant changes?


  • Do we have a cookie banner and cookie notice/policy in place?

Respect for user privacy

Data subjects have a variety of rights, depending on the circumstances, and services/products may be designed to accommodate and automate these rights.

Right to be informed

  • Do we fulfil individuals' rights to be informed about the data we hold about them?

Right of access

  • Do our systems facilitate individuals' right to request access to data the company holds about them?

Right to rectification

  • Do our systems facilitate individuals' right to correct the data we hold about them?

Right to erasure

  • Do our systems facilitate individuals' right to delete the data we hold about them?

Right to restrict processing

  • Are we able to freeze/quarantine data we hold about an individual?

Right to data portability

  • Can we provide individuals with their data in a commonly used and machine readable format?
  • Can we transmit that information to another organisation if required to?
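"Commonly used and machine-readable" is typically satisfied by formats such as JSON or CSV. A minimal sketch of a JSON export for a portability request, assuming a hypothetical flat record structure:

```python
import json

def export_user_data(record: dict) -> str:
    """Serialise a user's data to readable JSON for a portability request."""
    return json.dumps(record, indent=2, sort_keys=True, default=str)
```

Because the output is standard JSON, another organisation's systems can import it without any knowledge of the exporting system's internals, which is the point of the portability right.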

Right to object

  • Do we have procedures in place to enable data subjects to object to how we're using their information, particularly in relation to any direct marketing or higher risk uses?


  • Remember that not every right will be applicable in all situations; it will depend on the type of data being processed, and the legal basis for the processing.
