Profiling: The Challenges of the GDPR

The European General Data Protection Regulation (GDPR) aims to harmonise data protection law within the European Union. It has been in force for some time now as directly applicable law in the EU Member States. Compared with national data protection law, for example in Germany (Federal Data Protection Act, FDPA), it contains only a few genuine innovations; much has merely been restructured.

Substantial changes regarding profiling under the GDPR: for the first time, profiling is explicitly defined in the GDPR, and a dedicated article regulates the admissibility requirements as well as the limits of profiling in the context of automated procedures. However, there are also provisions with clear similarities to previous basic decisions of the (German) FDPA, so companies should be able to determine on a case-by-case basis which practices can be maintained and which may need to be changed.

Profiling as a special case of automated processing of personal data

According to Article 4 No. 4 GDPR, profiling is any kind of automated processing of personal data carried out in order to evaluate certain personal aspects of a natural person on the basis of that data. These personal aspects are used, for example, to analyse a person's work performance, economic situation, personal preferences and the like, and to make corresponding predictions. For this purpose, comprehensive (user) profiles are regularly created – above all in the context of web-based systems.

In this context, Article 22 GDPR should be mentioned, too. It prohibits the data controller from subjecting a data subject to a decision based solely on automated processing where that decision produces legal effects concerning the data subject or similarly significantly affects them. Article 22 GDPR explicitly names profiling as a special case of such processing.

Below, you can find a brief example of such an automated but inadmissible decision based on a previously created profile:

Over months, an intra-corporate system collects data about an employee – especially performance data. Using this information, the system creates a profile and evaluates the data by the date of the next feedback session. It then determines whether the employee is granted a salary increase or not. The persons conducting the feedback interview are merely entitled to communicate the result to the employee, without having any influence on the decision-making process.
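
To make the scenario more concrete, the following minimal sketch shows what such a purely machine-made decision could look like: the outcome depends only on the stored profile, and no human being can alter it before it is communicated. All names, fields and thresholds are hypothetical and serve only as an illustration.

```python
# Hypothetical sketch of the fully automated decision described above.
# The data model and the threshold values are invented for illustration.

from dataclasses import dataclass

@dataclass
class PerformanceProfile:
    employee_id: str
    avg_monthly_score: float   # performance data aggregated over months
    projects_completed: int

def decide_salary_increase(profile: PerformanceProfile) -> bool:
    """The decision is taken solely by the system; the interviewers merely
    announce the result - which is why Article 22 GDPR is triggered."""
    return profile.avg_monthly_score >= 0.8 and profile.projects_completed >= 3

profile = PerformanceProfile("emp-0042", avg_monthly_score=0.85, projects_completed=4)
print("Salary increase granted:", decide_salary_increase(profile))
```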

The German Federal Data Protection Act also makes provisions for scoring

The explicit designation as profiling and accordingly also the definition in the General Data Protection Regulation are new and did not exist in this form in the former German Federal Data Protection Act.

The “new” FDPA, adapted to the General Data Protection Regulation, supplements Article 22 GDPR insofar as it regulates a subarea of profiling, namely scoring, in Sec. 31 FDPA (formerly Sec. 28b FDPA). Scoring involves the use of a probability value about a certain future behaviour of a natural person for the purpose of deciding whether to establish, perform or terminate a contractual relationship with that person. Sec. 31 FDPA determines the conditions under which such scoring is admissible. A classic example is the calculation of a score by credit bureaus. However, it should be noted that this score is provided to interested companies only on request, without directly exposing the data subject to any legal effect. In these cases, there is no automated individual decision, as the requesting company makes its own decision on the basis of the score value. However, there are also cases in which a score value is automatically calculated, evaluated and used as the basis of a decision by a computer program – without any human intervention.
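
The difference between the two situations can be illustrated with a small sketch; the score formula, the input values and the threshold are purely invented and do not reflect how any real credit bureau calculates its scores.

```python
# Illustrative sketch only; the formula and threshold are invented.

def credit_score(payment_defaults: int, income: float, account_age_years: float) -> float:
    """A probability-style value about future payment behaviour (scoring)."""
    score = 0.5
    score -= 0.1 * payment_defaults
    score += 0.2 * min(income / 50_000, 1.0)
    score += 0.05 * min(account_age_years / 10, 1.0)
    return max(0.0, min(1.0, score))

score = credit_score(payment_defaults=1, income=42_000, account_age_years=6)

# Variant 1: the score is merely handed to the requesting company, which then
# takes its own (human) decision -> no automated individual decision.
print(f"Score reported to the requesting company: {score:.2f}")

# Variant 2: a program accepts or refuses the contract itself, without human
# intervention -> automated individual decision within the meaning of Art. 22 GDPR.
contract_accepted = score >= 0.6
print("Contract automatically accepted:", contract_accepted)
```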

In addition, Sec. 37 FDPA provides another exemption from the prohibition of automated individual decisions. It is specifically intended to take the concerns of the insurance industry into account. Sec. 37 is meant to allow, in particular, the automated settlement of insurance benefits in private health insurance. However, its scope is not limited to a specific branch of insurance.

Worth mentioning: the provisions of the former Sec. 6a of the “old” FDPA are now, to a certain extent, reflected in Article 22 GDPR. Sec. 6a was therefore deleted.

Profiling is not prohibited by the GDPR per se

As a reminder: profiling is inadmissible if the processing of personal data takes place solely automatically and the decision based thereon produces a legal effect for the data subject or significantly affects them in a similar manner. In view of the protective purpose, a broad interpretation of these terms is to be assumed. In spite of this broad interpretation, personalised advertising, for example, is not covered by the prohibition, because it has no legal effect and does not significantly affect the person concerned in any other way.

By way of exception, according to Article 22 (2) GDPR, such profiling is admissible only in the following three cases:

Conclusion or performance of a contract: The (automated) decision is necessary for the conclusion or the performance of a contract between the data subject and the data controller. Whether this is the case depends significantly on the purpose pursued by the contract and must therefore always be determined individually.
In general, this refers to cases in which the conclusion or performance of the contract corresponds to the will of the person affected, who therefore sees no violation of their rights and interests in the fully automated processing and decision. The term “necessary” is accordingly not to be understood strictly, as meaning that fully automated data processing is indispensable for the conclusion or performance of the contract (such as in the field of e-commerce); it is sufficient that it serves, for instance, the interest in reducing costs or in a swifter conclusion of the contract, which in turn may have a positive effect on the purchase price.

Legal regulation: A national or Union law (statute, regulation, etc.) explicitly provides for one or more types of automated decisions, while providing for suitable measures to protect the rights and freedoms of data subjects.

Consent: The decision is made with the explicit consent of the person affected. In this case, compliance with Article 4 No. 11 GDPR (consent must be freely given, specific, informed and unambiguous) and with Article 7 GDPR (demonstrability, clear and plain language, etc.) is essential.
It should also be mentioned that in the first and third cases the data controller must take appropriate measures to safeguard the rights, freedoms and legitimate interests of the persons concerned. Affected persons must at least have the opportunity to contest the decision, to present their own point of view and to obtain human intervention on the part of the controller.

In addition, recital 71 sets out some specific requirements for profiling: data controllers should use appropriate mathematical or statistical procedures. Technical and organisational measures must be taken to ensure that inaccurate personal data are corrected and that the risk of errors is minimised in general. It should also be ensured that the profiling has no discriminatory effect. Such decisions should not concern children at all.
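
How such a technical safeguard against discriminatory effects might look in practice is sketched below. The logging format, the group labels and the 80 % ratio threshold are assumptions chosen for illustration; the GDPR itself does not prescribe any particular statistical test.

```python
# Hypothetical fairness check, assuming decisions and group membership are logged.
# The 80 % ratio used as a trigger is only one possible convention.

from collections import defaultdict

def approval_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (group label, positive decision?) pairs from the profiling system."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        positives[group] += approved
    return {group: positives[group] / totals[group] for group in totals}

def flags_possible_discrimination(rates: dict[str, float], ratio_threshold: float = 0.8) -> bool:
    """Flag the system for human review if one group's approval rate falls far below another's."""
    lowest, highest = min(rates.values()), max(rates.values())
    return highest > 0 and (lowest / highest) < ratio_threshold

rates = approval_rates([("A", True), ("A", True), ("A", False),
                        ("B", True), ("B", False), ("B", False)])
print(rates, flags_possible_discrimination(rates))
```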

The processing of special categories of personal data pursuant to Article 9 (1) GDPR (such as health data) is only admissible in this context if a legal provision permits it or if explicit consent has been given.

A violation of Article 22 GDPR bears the risk of high fines

Article 22 GDPR itself does not specify any legal consequence for a violation. However, Article 83 (5) (b) GDPR provides that a fine of up to 20 million euros or up to 4 % of the total worldwide annual turnover, whichever is higher, may be imposed in the event of a breach of Article 22. A supervisory authority can thus make full use of the increased fine framework introduced by the General Data Protection Regulation.
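
For illustration, the upper limit works out as the higher of the two amounts; the turnover figures below are arbitrary examples.

```python
# Simple illustration of the upper fine limit under Article 83 (5) GDPR:
# the higher of EUR 20 million or 4 % of total worldwide annual turnover.

def maximum_fine_eur(worldwide_annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

print(maximum_fine_eur(100_000_000))    # 20,000,000 EUR (4 % would only be 4 million)
print(maximum_fine_eur(1_000_000_000))  # 40,000,000 EUR (4 % exceeds the 20 million floor)
```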

In addition, supervisory measures are conceivable. Article 58 GDPR provides the supervisory authorities with extensive powers. Taking into account the purpose of the rule (prevention of fully automated individual decisions), it seems reasonable from a supervisory point of view to instruct the infringing company to stop the respective processing and to undo processing already carried out or decisions based thereon. It can be assumed that the competent authorities will use these tools to ensure the protection of personal data.

Recommended action for companies

Companies do not have to change their entire profiling practice. As described, many of the basic ideas of the former Sec. 6a FDPA were adopted by the General Data Protection Regulation. However, in view of the changes outlined above, companies acting as data controllers are advised, on pain of a fine, to review their current practice at least once and as soon as possible in order to determine whether it still complies with the legal requirements.

Some reassurance: the GDPR does not entail a paradigm shift or anything of the kind in the field of profiling. However, given the high degree of accountability, companies should be able to provide comprehensive information on their data processing, especially to data subjects, but also to the supervisory authority.
