30th January 2018 · 10 minute read
As the 25th May looms large on the horizon, numerous industry bodies are predicting how GDPR will change the landscape of UK business. In this piece, we’ve rounded up 9 of the ways in which GDPR is forecast to enhance the control individuals have over their data.
Every individual will have the right to be given information about how their data is being processed and why, both when they first give their consent and later. These details might include: the identity of the data controller and data protection officer; where the data originates from; any automated decision making or profiling that’s been involved; how those decisions were reached and their significance and consequences; and whether the data has been transferred to other recipients.
GDPR emphasises strongly that the information supplied must be concise, transparent and easily accessible, written in clear plain language, and free of charge.
Individuals will have the right to access their data and confirm that it’s being processed lawfully. Organisations must provide a copy of the information free of charge (though the Information Commissioner’s Office notes that they are permitted to charge a ‘reasonable fee when a request is manifestly unfounded, excessive or repetitive’).
Normally the information must be provided within a month, but if an individual’s requests are complex or numerous the organisation may extend this by a further two months (though the individual must be told why within the first month). Organisations cannot refuse to comply just because the amount of data requested is large; they can only refuse a request on the grounds that it is, again, manifestly unfounded or excessive.
The recommended best practice is for organisations to provide remote access to a secure self-service system that provides individuals with direct access to their information.
Otherwise known as ‘the right to be forgotten’, this allows individuals to request the deletion or removal of personal data ‘where there is no compelling reason for its continued processing’. This includes situations when personal data is no longer necessary in relation to the purpose for which it was originally collected or processed, or when the individual withdraws consent. If the data has been disclosed to a third party, that third party must also be informed of the erasure. GDPR particularly has its eyes on the online environment here, requiring organisations that make personal data public to inform those who process that data to erase links to, copies of, or replications of the personal data in question.
One of the notable differences between GDPR and the Data Protection Act it replaces is that previously individuals had to make the case that the holding and processing of their data caused ‘unwarranted and substantial damage or distress’. So while ‘no compelling reason’ may sound like a fairly loose instruction, it still gives the public more power.
When the right to be forgotten first surfaced in GDPR discussions, there was nervousness in the media that it might be used as a way of silencing stories some individuals might prefer not to be in the public domain. However, this is where the exemptions around freedom of expression take hold. The example the ICO gives is where a search engine notifies a media publisher that it is delisting search results linking to a news report as the result of a request for erasure from an individual. “If publication of the article is protected by the freedom of expression exemption, then the publisher is not required to erase the article”, affirms the ICO.
Individuals have the right to obtain and reuse their personal data for their own purposes across different services. For example, transferring transactional data from their bank account to a price comparison website to gain a clearer understanding of their spending habits or find a better banking deal. They should be able to move, copy or transfer personal data easily from one IT environment to another, and organisations must make that data available for free, in a ‘structured, commonly used and machine readable form’.
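As an illustration of what ‘structured, commonly used and machine readable’ might look like in practice, here is a minimal sketch that exports a hypothetical customer record as JSON and its transactions as CSV. The record, field names and values are purely illustrative assumptions, not any particular bank’s format.

```python
import csv
import io
import json

# Hypothetical customer record; all names and values are illustrative only.
record = {
    "account_id": "ACC-001",
    "name": "Jane Example",
    "transactions": [
        {"date": "2018-01-02", "amount": -24.99, "merchant": "Grocer"},
        {"date": "2018-01-05", "amount": -9.50, "merchant": "Cafe"},
    ],
}

# JSON: structured, machine readable, and widely supported across services.
json_export = json.dumps(record, indent=2)

# CSV: a flat, commonly used format suited to tabular data such as transactions.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["date", "amount", "merchant"])
writer.writeheader()
writer.writerows(record["transactions"])
csv_export = buffer.getvalue()

print(json_export)
print(csv_export)
```

Either export could be ingested directly by another service, such as the price comparison site in the example above, without manual re-keying, which is the point of the portability requirement.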
Individuals have the right to have personal data rectified, for example when it is inaccurate or incomplete. If organisations have disclosed that data to third parties, they must both tell those parties of the rectification and let the individual know who those third parties are (and must respond within one month).
Individuals can object to direct marketing (including profiling) and processing for statistical purposes. Organisations must inform people of their right to object in privacy notices and expressly bring it to their attention at the point of first communication.
Organisations must stop processing personal data for direct marketing purposes as soon as they receive an objection. There are no exemptions allowing them to refuse (though legal or research issues around public interest do have some additional protection under the rules).
Individuals have the right to block or suppress the processing of personal data where they contest its accuracy, when processing is unlawful, or in certain instances connected with legal claims. When processing is restricted, organisations may still store the personal data, but effectively cannot use it.
OK, that’s our title, but essentially there is a parcel of rights around automated decision making and profiling meant to safeguard individuals against the risk of a potentially damaging decision made without human intervention. However, this isn’t blanket coverage for all AI-related activity. Specifically, individuals have the right not to be subject to a decision when it is based on automated processing and produces a legal or similarly significant effect (with exemptions around contract completion or anything authorised by law).
Profiling is defined as automated processes meant to evaluate certain personal aspects of an individual – performance at work, health, personal preferences, reliability, movements etc. Here, organisations must ensure processing is fair and transparent, with a clear and meaningful logic to it. They also need to have considered the significance and envisaged the consequences.
GDPR contains specific rules to boost the protection of children’s data and govern the way in which online services can gain consent to use it. Under GDPR, the default age at which a person is considered a child is under 16, although this is one area where member states can lower that threshold, to no younger than 13.
Data controllers cannot seek consent from a child under that age, but must obtain it from a parent or guardian to process the data. (Preventative or counselling services offered directly to children are excluded from this.)
Again, as with the general consent rules, other lawful bases for processing still apply. However, marketing and the creation of online profiles do not fall within them; these are areas where GDPR has a particularly steely focus on raising the protection around children’s personal information.
For more information around GDPR and other predicted trends around customers’ attitude to their personal data, read our January edition of Intelligence here.