
AI in healthcare: with data protection as key

In the context of digitisation, the healthcare industry has already recognised the benefits of artificial intelligence – and quite rightly so! AI can assist in research and treatment, simplify everyday hospital routines, and health insurers already offer their members AI-driven digital services via apps. It is even predicted that AI will revolutionise medicine – and much seems to support this. That many healthcare executives nevertheless remain reluctant to adopt AI-driven systems, despite all the advantages and good prognoses, is partly due to the challenges these systems bring with them. In addition to ethical concerns, there are also major data protection concerns.

Both the development and the deployment of AI generate large amounts of personal data, the processing of which is governed by the General Data Protection Regulation (GDPR) and the German Federal Data Protection Act (BDSG). The use of artificial intelligence makes compliance with these legal requirements harder, because automated data processing is subject to heightened requirements. Moreover, the use of AI in the healthcare industry does not merely raise AI-specific data protection challenges: the healthcare sector itself is already subject to special data protection requirements. Special challenges therefore demand special solutions – and such solutions have been found!

The scope of AI in healthcare

The scope of AI in healthcare can be summed up briefly – everywhere! Naturally, it would be ethically dubious to exclude people from complex decisions concerning physical and mental well-being and to let algorithms take over that work completely. But this perception of AI does not correspond to reality either: it can and will only ever be used in a supporting role. Final decisions on medical issues will still be made by people, such as doctors.

For example, thanks to deep learning, it is possible to help doctors analyse X-rays. AI is already advancing tumour research today by enabling neural networks to identify complex tumour structures. Programs have recently been developed which can recognise depression based on a person’s speech, and for some time now, robots equipped with cameras and monitors have been assisting in surgeries or relieving nursing staff of some of their duties.

In addition to ethical considerations, it is often the high financial investment that prevents hospital management from investing in new technology. However, a new study shows that AI can actually save costs. For example, in breast cancer screening, a very costly field for the healthcare sector, diagnoses can be made faster than ever before using AI, resulting in huge cost savings. This example can also be applied to other costly areas.

And it is not just with treatment and diagnosis that AI can help. Hospital staff today face an ever-increasing administrative burden, which deprives them of precious working time which could be devoted to patient care. Routine processes can be handed over to AI-controlled software. Here, programs can help to create work schedules or perform other organisational tasks. In the same way, medical records can be created digitally and examined by software for indications which point to diseases. Voice assistants in hospital rooms can help patients with everyday tasks, such as closing the shutters, thereby relieving the nursing staff of menial tasks.

The key question: the compatibility of AI and data protection in healthcare

As already stated, AI is only possible with the help of big data – and this also includes personal data. On the basis of the applicable statutory regulations – not only the GDPR and the German Federal Data Protection Act, but potentially also the German state data protection acts, German state hospital acts, social security statutes and the German Genetic Diagnostics Act – the following solutions show how data protection in healthcare and AI can be reconciled.

In particular, the GDPR stipulates that technical and organisational measures must be taken in order to implement the data protection provisions and to guarantee data security. Your legal advisers can work with you to develop these measures and integrate them into an appropriate innovation and AI-friendly data strategy which incorporates the following key points.

1. Artificial intelligence and health data: finding the right legal framework!

The GDPR and the German Federal Data Protection Act regulate the fully or partially automated processing of personal data. The concept of “processing” is very broad: it includes, among other things, the collection, recording, organisation and storage of personal data. The processing of personal data is permitted only if there is a legal basis for it, for example where consent has been obtained or another statute permits the processing (prohibition subject to permission).

However, specific provisions for healthcare are also laid down in the GDPR itself. If AI is used in the health sector, health data will logically be needed in many cases. In particular during the development of AI, the system must be fed with enormous quantities of data so that it can be trained by deep learning. The “normal” rules for the processing of personal data apply only to a limited extent to health data: because of its particular sensitivity, health data falls under a special category of personal data, and stricter requirements are consequently placed on its processing.

Health data is understood as personal data related to the physical or mental health of a natural person (a patient), including the provision of healthcare services, and which reveals information about their health status.

In addition, AI frequently involves automated decision-making, i.e. a person is subjected to a decision based exclusively on automated data processing. This could be the case, for example, with an app that measures health status and automatically adjusts the insurance rate when certain values are reached. Such decisions, where they produce legal effects or similarly significantly affect the data subject, are subject to separate requirements under Article 22 of the GDPR. Combined with health data, which is already considered particularly worthy of protection, the use of AI in the healthcare sector thus produces a sensitive combination. The processing of health data within the scope of automated decisions is therefore only permitted on the basis of express consent or on the basis of legislation serving a substantial public interest. In practice, however, only consent will be relevant.

Because of this combination of AI and health data processing, it is therefore always advisable to consider an anonymisation concept. Such a concept can determine whether and which data may be collected and used anonymously. The GDPR and the German Federal Data Protection Act do not apply to anonymous data, with the result that such data can be processed without the restrictions described above – a considerable relief, especially in the development of AI, if anonymous training data can be used. Data is anonymous when the information makes it impossible for anyone to identify the person. The use of anonymous data can be difficult in the health sector, especially where medical histories are concerned, as these are highly individual. In other areas, however, the use of anonymous data is quite conceivable.
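
To make the idea more concrete, below is a minimal sketch in Python of what such an anonymisation step for training data might look like; all field names are invented for illustration. Direct identifiers are dropped entirely and quasi-identifiers such as age and postcode are coarsened. A real anonymisation concept would additionally have to verify that re-identification is impossible for anyone, e.g. by means of a k-anonymity analysis.

    from typing import Any, Dict

    def anonymise_record(record: Dict[str, Any]) -> Dict[str, Any]:
        """Drop direct identifiers and generalise quasi-identifiers.

        Name, insurance number and exact birth date are omitted entirely;
        age is banded and the postcode truncated to a region prefix, while
        the medical payload remains usable as training data.
        """
        decade = (record["age"] // 10) * 10
        return {
            "age_band": f"{decade}-{decade + 9}",
            "postcode_region": record["postcode"][:2],
            "diagnosis_code": record["diagnosis_code"],
        }

    print(anonymise_record({
        "name": "Jane Doe", "insurance_no": "X-123", "age": 47,
        "postcode": "80331", "diagnosis_code": "C50.9",
    }))
    # {'age_band': '40-49', 'postcode_region': '80', 'diagnosis_code': 'C50.9'}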

2. AI and rights of data subjects: use pseudonymisation!

An innovation and AI-friendly data strategy can also be used to develop solutions and guidelines for dealing with the rights of data subjects under the GDPR and the German Federal Data Protection Act. For AI users processing personal data or health data, these rights can mean a great deal of work if no proper and meaningful concept has been developed. The pseudonymisation of data as a technical measure should be considered here!

If AI companies succeed in pseudonymising the data, this has the advantage that data subject requests may no longer need to be fulfilled, avoiding the high costs and workload always associated with them. Pseudonymous data is information that can only be attributed to a person by someone with access to separately stored and protected information. Under Article 11(2) of the GDPR, data subjects can no longer exercise their rights if the controller is unable to identify them. This may be the case with pseudonymised data if the controller lacks access to the separately stored information, e.g. because only the data subject in question holds it.
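
A minimal sketch of such a pseudonymisation, assuming a keyed hash is used as the pseudonym; the key plays the role of the “separately stored and protected information”, so whoever holds the dataset alone cannot re-identify the patient:

    import hashlib
    import hmac

    # The key is the separately stored information: kept e.g. in a vault the
    # AI team has no access to, it alone allows the mapping to be re-created.
    SECRET_KEY = b"stored-separately-from-the-dataset"

    def pseudonymise(patient_id: str) -> str:
        """Derive a stable pseudonym from an identifier via a keyed hash."""
        return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

    record = {"patient": pseudonymise("patient-4711"), "blood_pressure": "128/82"}
    print(record["patient"][:16], "...")  # same input always yields the same pseudonym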

On the one hand, extensive transparency and information obligations exist with respect to the patient, which must be observed by the controller. The patient must be informed when their data is processed. They are also entitled to information on all personal data concerning them, which must be provided without delay in written, electronic or oral form. The same applies to the rights to rectification, erasure and restriction of processing.

This poses challenges for AI users in particular, as data subjects affected by automated decisions are granted additional special rights. A data subject is thus also entitled to meaningful information about the logic involved, as well as the significance and the envisaged consequences of the automated processing for them.

In addition, the specified rights are accompanied by documentation and retention requirements about the data processing processes (accountability). This is more complicated with automated processes because they are harder to understand and prove. In particular, in the context of deep learning, AI users often cannot even assess how their system is developing, as it involves models that change and adapt themselves (keyword: black box). For AI specifically, this means that when artificially intelligent systems replace human decisions, decision-making must also be explainable, just as with humans.

3. Data protection impact assessment: use RPAs!

A further privilege of pseudonymised data arises within the scope of the data protection impact assessment. This is a risk analysis and assessment of the data processing to be carried out by the controller. It does not always have to be performed, but it is mandatory in particular for automated data processing operations as well as for the processing of health data – so AI users in the healthcare sector will not be able to avoid it.

However, if the data is in pseudonymous form, the assessed risk of the data processing may turn out lower, which favours the controller. It is important, however, that pseudonymisation is carried out before the data processing begins, i.e. already during the development of the AI.

In addition, a record of processing activities (RPA) must be kept for the processing of health data as well as for automated processing, which in turn documents the implementation of the data protection impact assessment. It lists much of the information relevant to the data protection impact assessment, which can then be reused for that purpose. For the same reason, an RPA also serves to fulfil accountability obligations.
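
By way of illustration only, an RPA entry can be modelled as a simple data structure, loosely following the content required by Art. 30 GDPR; the field names below are our own:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ProcessingActivity:
        """One entry in a record of processing activities (RPA)."""
        name: str                    # e.g. "AI-assisted X-ray analysis"
        purpose: str                 # purpose of the processing
        data_categories: List[str]   # e.g. health data, patient master data
        data_subjects: List[str]     # e.g. patients
        recipients: List[str]        # internal and external recipients
        retention_period: str        # erasure deadlines
        security_measures: List[str] = field(default_factory=list)  # TOMs; feeds the DPIA

    rpa_entry = ProcessingActivity(
        name="AI-assisted X-ray analysis",
        purpose="Diagnostic support for radiologists",
        data_categories=["health data (imaging)", "patient master data"],
        data_subjects=["patients"],
        recipients=["radiology department"],
        retention_period="10 years (statutory medical retention period)",
        security_measures=["pseudonymisation", "role-based access control"],
    )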

It is therefore advisable to bear in mind other obligations to be fulfilled when creating an RPA. In addition, the data protection impact assessment should not be seen as a tiresome obligation, but as an opportunity to obtain a good overview of the data processing. Furthermore, when establishing a data protection strategy, it should be noted that the data protection impact assessment is not a one-off occurrence, but an ongoing process.

Conclusion: AI in healthcare – with data protection as key

Within the framework of data protection law, there are many opportunities for harmonising AI and healthcare without creating obstacles to technical and medical progress. It is true that AI and health data each pose a data protection challenge in themselves, and the combination does not make things any easier.

However, a good data protection strategy, which includes suitable technical and organisational measures such as anonymisation and pseudonymisation, can also make data processing possible here. AI in healthcare is economically and medically desirable and also feasible according to data protection law.

Digitisation of retail: Is the GDPR a brake on innovation?

Technological change is gaining ever more momentum and is having a huge impact not only on online retail but also, and above all, on brick-and-mortar retail. Retailers are relying on greater digitisation of their offers in order to appeal to customers successfully and directly at the point of sale with the aid of personal data. Geofencing and omnichannel marketing are considered important aids. Customer loyalty programmes are also experiencing a boom: physical customer cards and apps give brick-and-mortar retail the opportunity to comprehensively evaluate customer data and adapt its offers.

However, the GDPR has been in force since May 2018, and many see it as a brake on innovation or even an innovation killer. The EU is now also considered the most powerful regulator of the US tech industry. And, in fact, many companies are overwhelmed by the implementation of the GDPR, as many open questions remain unresolved. This legal uncertainty, combined with the threat of high fines, deters many companies from using new technologies.

But what about the horror scenarios spread by opponents of data protection? Doesn’t the GDPR perhaps also offer opportunities? Could it even drive innovation? And in particular: how can retailers continue to use and improve data-driven customer communication without being paralysed by a fear of fines? The following article focuses on geofencing and omnichannel marketing.

The digital transformation of retail

What was once the Industrial Revolution is now digitisation. The digitisation of retail, in particular, is progressing rapidly. Alongside retailers, customers are also among the winners. Seamless shopping – a consistent shopping experience across all channels – has become a matter of course for them. Retailers must meet this challenge with appropriate omnichannel solutions, i.e. a cross-channel business model – one of the most important trends in the retail sector. In practice, this means interlocking purchasing processes: buy online, collect offline; order on site and have it delivered.

Retailers may use this interlocking to optimise their offers by extending their marketing strategy across all channels and addressing customers wherever they happen to be. For this to be possible, all customer activities must be recorded and analysed – the result is big data. In this context, retailers must above all be aware of their data protection obligations under Articles 13 and 14 of the GDPR and must inform customers of the relevant processing operations in a precise, transparent, comprehensible and easily accessible form, in clear and plain language. It should also be verified whether a data protection impact assessment under Article 35 of the GDPR is required.

Many retailers also accept a variety of mobile payment options (“mobile payment”) from which the buyer may choose. The issue of data security plays a major role here, as bank and credit card information is sensitive and therefore data worthy of particular protection.

In-house shopping apps and electronic shelf labels (ESL) have now become standard. In addition to the current price, the latter also offer important additional information which may make the purchase decision easier. By networking the tills, retailers are also able to measure their flow of goods in real time, adjusting their orders accordingly.

Another impressive example of the digitisation of retail is smart mirrors. The US perfume and cosmetics giant Coty has recently launched a Magic Mirror which enables customers to try out new hair colours. This is made possible by augmented reality – a technology which recently caused a stir from a data protection viewpoint when Google Glass was to be introduced. The main problem: third parties would also be affected by the data collection.

Individual customer contact via geofencing and legal considerations

Personalisation in retail is also becoming more and more relevant. Many customers like it when they are approached personally online. Tailored product recommendations and other personalised content are also welcome due to their usefulness. This is all possible due to the collection and evaluation of data relating to the customer.

Tools are also used on site, at the point of sale or point of interest, with which customers can be individually approached. For example, so-called “beacons” (small Bluetooth transmitters and receivers) enable the customer to be located in the store so that relevant offers can be sent directly to their device. Today, however, localisation or tracking via the WiFi module’s (MAC) address is more promising, as smartphone users have WiFi enabled far more often than Bluetooth.

Particularly popular in this context is geofencing, i.e. location-based marketing. For example, a potential customer receives offers via push message as soon as they enter a virtually defined area (usually in the immediate vicinity of a shop). The technical prerequisites are that the customer uses the retailer’s app, shares their location and allows push notifications. This is a hot topic across all sectors.
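
Technically, the geofencing check itself is straightforward. The following is a minimal sketch, with invented coordinates, of testing whether a reported position lies within a circular zone around a shop:

    from math import asin, cos, radians, sin, sqrt

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two WGS84 points in metres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6_371_000 * asin(sqrt(a))

    SHOP = (48.1370, 11.5750)   # hypothetical shop location
    FENCE_RADIUS_M = 200        # the "virtually defined area" around it

    def inside_geofence(lat, lon):
        return haversine_m(lat, lon, *SHOP) <= FENCE_RADIUS_M

    if inside_geofence(48.1375, 11.5755):
        print("Customer entered the zone -> the app may trigger a push offer")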

The use of geofencing requires compliance with several aspects of data protection law. Above all, any data processing requires a legal basis. The candidates here are consent under Article 6(1)(a) of the GDPR and the retailer’s legitimate interests, but also the conclusion of a contract:

1. Consent

Obtaining consent in compliance with the law when processing location data is by no means an easy undertaking. Consent in accordance with Article 7 of the GDPR is only valid if it has been given voluntarily, for the specific case, on an informed basis and unambiguously. In practice, this means that the individual processing steps must be described in as much detail as possible so that the customer knows precisely what is happening to their data. Furthermore, this information must be directly (temporally) associated with the granting of consent. If a data subject gives their consent, they must also be entitled to revoke it at any time. This leads to increased organisational effort on the part of the controller.

2. Legitimate interests

The route via legitimate interests in accordance with Article 6(1)(f) of the GDPR is simpler and less time-consuming. Controllers can justify the planned data processing, i.e. the use of geofencing, by invoking overriding legitimate interests. A balancing must be carried out between the retailer’s interests in the use and the data subject’s interests worthy of protection. If the former predominate, the use of geofencing is justified from a data protection viewpoint and therefore permissible without consent. If the personal data is also pseudonymised, this will in principle weigh in the controller’s favour in the balancing of interests, as will limiting the addressees to existing customers.

3. Contract

However, the greatest legal certainty is provided by the conclusion of a contract as per Article 6(1)(b) of the GDPR. The installation of many apps regularly includes the conclusion of a service contract. Should a company (also) intend to individualise the users with the help of geofencing and its in-house app in order to be able to create profiles, it is recommended that this data processing be explicitly included in the contract as part of the service.

Depending on the app design, the processing is to be described either as an integral part of the services or as an additional service. In this way, for example, the use of geofencing and the creation of a user profile become part of the contract.

The benefits for the app user should be emphasised, but without concealing any further analysis: above all, individualisation enables the company to deliver tailored offers and coupons, for example, but also to improve its own services and marketing measures by evaluating the profiles.

The guarantee of transparency is of particular importance to the validity of the contract in this context. Information about the planned data processing should therefore be provided separately, in as much detail as possible and in language which can be easily understood. For this reason, inclusion in the General Terms and Conditions (GTC) is prohibited.

No matter what legal basis a company chooses, working with pseudonymised data is always desirable, as its abuse is much more difficult. In addition, companies also fulfil the data protection principle “privacy by design” in this way and it is easier for customers to develop trust in the technology used.

Beyond data protection law, it should also be borne in mind that, depending on its design, geofencing may be “unfair” within the meaning of the German Act against Unfair Competition (UWG). In practice, geofencing zones are set up not only in and around a retailer’s own shops, but also in the immediate vicinity of competitors’ branches. At what point this qualifies as deliberate obstruction under section 4(4) of the UWG has not yet been clarified by the courts. Qualification as unfair enticement of customers is also conceivable.

Retailers should ask themselves the following questions when introducing new technologies:

  • Does use of the technology require the processing of personal data?
  • Is it also possible to use the technology with pseudonymised or even anonymised data (the latter falls outside the scope of the GDPR)?
  • What is the legal basis for the processing?
  • What are the specific requirements given by the relevant legal basis?
  • Can these requirements be fulfilled within the scope of the use of the technology?
  • Are customers comprehensively and transparently informed about processing operations?
  • Can fulfilment of the rights of the data subjects be guaranteed (right of access, right of erasure etc.)?
  • Is the use of the new technology likely to result in a high risk to the rights and freedoms of natural persons?
  • Is it possible that the actual use of the technology may be qualified as “unfair” within the meaning of the UWG?
  • What measures are required in order to avoid such a qualification?


GDPR as a brake on innovation for technological developments?

The central question now arising is: Does the GDPR kill off innovations such as geofencing? If you ask company management, the answer is often “yes” due to the increased costs. The Chairman of Bitkom also warns: “If we take data protection too far, we will impede the use of artificial intelligence.”

In any event, the fact is that geofencing is not possible without the processing of location data. The fact is also that the GDPR regulates the processing and use of personal data. In this respect, there are definitely restrictions on the use of innovations. This in turn means that controllers may have to seek out new, data-protection-compliant solutions before new technology can be used without hesitation – a loss of time.

The claim that the GDPR is a brake on innovation is absolutely warranted in certain cases, but it is definitely not an innovation killer. The GDPR may even stimulate innovation.

This is illustrated by the following example:

In the event of a request for erasure as per Article 17 of the GDPR, the controller must completely delete all personal data of a data subject (with the exception of data for which there is a retention obligation).
If AI is used which is fed with large quantities of personal data, this erasure obligation certainly also affects the AI used. It must therefore learn to “forget”.
This is not an easy task, particularly if the system is a so-called “black box” – that is, when individual decisions of the AI, or the criteria on which those decisions are based, can no longer be (fully) reconstructed because the algorithm has developed further on its own.
To fulfil the erasure obligations, it is therefore indispensable to be able to isolate and delete individual data records without compromising the (further developed) algorithm. Smart minds are needed here to enable such erasure without hindering the use of AI.
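
As a purely conceptual sketch (not a statement of how any particular system works), the most robust way to honour such a request today is to drop the data subject’s records and retrain from scratch; “machine unlearning” research aims to achieve the same result without full retraining:

    def erase_and_retrain(training_data, train_model, subject_pseudonym):
        """Remove all records of one data subject, then retrain.

        train_model is a placeholder for the actual training routine; the
        retrained model has provably never seen the erased records.
        """
        remaining = [r for r in training_data if r["subject"] != subject_pseudonym]
        return train_model(remaining)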

Conclusion

Retailers certainly face a number of challenges when it comes to using new technologies such as geofencing or artificial intelligence. The use of personal location data, as with geofencing, must therefore be well thought out. In addition to consent as a legal basis, the processing of personal data may also be based on legitimate interests, depending on the individual case. In practice, however, the most relevant option is likely to be a contract, as this offers the most legal certainty.

The GDPR has recently brought data protection law to the attention of the public. Customers are becoming more and more cautious about disclosing their data. Retailers should therefore prioritise the subject and invest in building trust by providing transparent information about all relevant data processing operations. This approach may turn out to be a competitive advantage, and data protection itself may become a marketing measure. This is why Tim Cook (CEO of Apple) is also calling for stricter data protection regulations: in his view, this is the only way to strengthen customers’ confidence in new products.

The GDPR does sometimes slow down technical progress, but it does not prevent it; there is only a delay. And this delay should be a price worth paying for the protection of personal data and the loyalty of customers.

Use of cloud-based software solutions: legal dos & don’ts

Cloud computing

According to a study by Bitkom published in June 2018, two out of three companies are already using cloud computing. In the case of large companies (a workforce of 2000 or more), this figure is as high as 83%.

So cloud computing is a major part of business – but what exactly is it? The German Federal Office for Information Security (BSI) describes cloud computing as “a model for enabling convenient, on-demand network access to a shared pool of configurable computer resources (e.g. networks, servers, storage systems, applications and services) that can be provisioned rapidly and released with minimum management effort or service provider interaction”.

The best-known cloud computing provider with a market share of approximately 32% is Amazon Web Services (AWS), which recorded a turnover of USD 25.4 billion in 2018. Microsoft Azure was second last year with a market share of around 17% and a turnover of USD 13.5 billion. According to a February 2019 analysis by Canalys, AWS recorded annual growth of 47% and Microsoft Azure 82% compared with the previous year.

Alongside the practical benefits of cloud computing, however, companies must also address the legal specifics and challenges. What are the legal risks associated with the use of cloud computing? What kinds of IT contracts are there and what contract types are involved? With which special data protection regulations must companies comply?

IT contracts and contract types in cloud computing

Cloud computing is offered in various forms – Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).

IaaS, PaaS or SaaS – what am I dealing with?

IaaS concerns the provision of IT resources such as computing power, data storage or networks. IaaS offerings can be differentiated by content: the provision of storage capacity as “data storage as a service”, for example, exists as an IT contract in its own right, but is merely a special case of IaaS.

The same is true of the other cloud computing groups. PaaS offers a complete infrastructure to the customer, who can develop and execute their own programs through standardised interfaces.

SaaS is a licensing and distribution model which allows customers to use software for a fee for an agreed period of time without having to install it on their own devices.

Determining the contract type is important

The civil law classification of these contracts depends on their respective content; a lease contract is to be assumed, for example, when storage capacity is used through cloud computing. The purchase of standard software is usually a simple purchase contract, whereas individually tailored software solutions point to a work contract. The civil law classification of an IT contract influences the warranty and liability claims of the contracting parties.

Cloud-based software offers in the form of SaaS (Software as a Service) include elements of work, service and lease contract law, which prevents a simple assignment to a single contract type. The Application Service Provider (ASP) contract also contains elements of different contract types, but the Federal Court of Justice has classified it as a lease contract with regard to its main service (judgement of 15 November 2006, case no. XII ZR 120/04).

For the assignment to a contract type, the focus of the performance owed under the respective part of the contract must be determined. The qualification of ASP contracts as lease contracts leads to a result that reflects the parties’ interests.

Classification of SaaS cloud computing contracts as work contracts does not seem appropriate, since a work contract is geared towards a specific result, which cannot be transferred to standardised software applications with shared responsibility. The rules on the due date of remuneration, acceptance and completion of the work are inappropriate, as the provider merely provides the option of access. Classifying SaaS as a service contract (section 611 et seq. of the German Civil Code) seems unfair to the user, because only the performance would be owed, not its success: the user could be obliged to pay even though the application cannot actually be used. The initial situation is comparable with tenancy law and allows at least an analogous application of its rules, so that the position of the Federal Court of Justice can continue to be followed. The common ground is that something is made available to the user for use. Furthermore, in the case of defects, the claims arising from tenancy law apply and lead to a balance between the interests of both contracting parties.

It is important to recognise that new forms of IT contract cannot be rigidly assigned to the contract types codified in the German Civil Code (BGB). For General Terms and Conditions (GTC) in both the B2C and the B2B area (the latter subject to section 310(1) BGB), the assignment to a contract type is nevertheless essential: a review of content under section 307(3) BGB only takes place where the GTC deviate from, or supplement, statutory provisions, and a clause is invalid if it deviates from essential principles of the statutory rules in a way that cannot be reconciled with them. This assessment depends on the previously determined contract typology, which serves as the basis for the legal analysis.

In addition, conventional problems of software law must of course also be solved in the drafting of contracts in order to ensure comprehensive legal protection. Thus, cloud-based applications are also subject to copyright law, so care must be taken to ensure the adequate granting of usage rights or the effectiveness of open source licences. In addition, special attention must be paid to the definition of service levels and availability in order to limit the liability risk in this respect.

Dos:

  • Clear agreement on the performance content
  • Identification of the type of IT contract
  • Civil law determination of the typology
  • Examine effectiveness of rules and clauses

Don’ts:

  • Postponing determination of the contract type until later
  • Examining liability and warranty claims only once performance failures have occurred

Risks and benefits of cloud computing

The growth of cloud computing and its popularity amongst businesses are due to various factors. First of all, cloud computing enables the real-time scalability of IT performance and can therefore be quickly adapted to individual needs. By outsourcing and utilising third-party expertise, cloud users reduce their own IT administration costs without incurring additional costs, such as for servers, thereby freeing up capital for development and investment in other areas. Another benefit is location-independent access to data and the opportunity for simultaneous processing of documents and processes by several employees. Cloud computing gives the user increased flexibility at different levels, thereby creating a competitive advantage over other companies.

When it comes to the disadvantages of cloud computing, there are both conventional and cloud-specific risks. Conventional risks when using third-party services include irresponsible handling by the provider’s employees as well as insufficient technical and organisational measures. Cloud-specific disadvantages include the user’s dependence on the provider. There is a particular risk for data security with respect to the three classic protection goals: confidentiality, integrity and availability. The dangers of data loss, data manipulation and at least temporary unavailability of the data represent potential risks. When selecting a provider, the measures taken to counter these dangers should be a crucial criterion. The BSI has published a catalogue of requirements for assessing the information security of cloud services, which can be used as a basis for decision-making.

The Bitkom study found that there were more data security incidents in companies’ internal IT than at companies using public cloud applications. Use of the cloud alone therefore does not increase the risk to data security. Although large cloud providers are more often targeted by hacking attempts, they place a particularly high emphasis on the protection of their applications and invest greater sums than most companies would or could in their own IT systems. Fifty percent of the companies surveyed state that the security of their data has increased in the cloud.

Dos:

  • Determination of the necessity for outsourcing and scope of external IT structures
  • Certifications of the cloud provider regarding data security
  • Assessment of whether cloud computing offers long-term, medium-term or only short-term benefits

Don’ts:

  • Absolute dependence on the provider
  • Neglect of in-house security standards

Data protection requirements

Cloud computing faces legal challenges in the area of data protection. When data is erased, the user cannot trace and verify whether all of it has been erased and none has remained, for example, in a backup system. System and usage logs that would inform the user of difficulties or incidents are rarely provided unless contractually agreed. With major cloud providers in particular, individually negotiating such clauses is usually not possible, leaving oversight largely to the cloud provider itself.

Also problematic in terms of data protection law is the mostly non-transparent storage and duplication of data on servers in different countries, which may have different data protection standards – causing a divergence between the level of protection required of the user and that actually afforded by the provider.

A processing contract in accordance with Art. 28(3) of the GDPR must be concluded between the user (controller as per Art. 4(7) of the GDPR) and the provider (processor as per Art. 4(8) of the GDPR). In addition to the mandatory content listed in paragraph 3, it is also recommended to include provisions on the erasure of data after the end of the contract, information obligations, the commissioning of subcontractors, audit rights and the processing of data outside the EU/EEA.

Dos:

  • Conclusion of a processing contract as per Art. 28(3) of the GDPR
  • Guarantee of control rights
  • Regulations regarding processing outside the EU/EEA and the handling of data after the end of the contract

Don’ts:

  • Infringement of obligations as a controller under the GDPR
  • “Shifting of responsibility” to the cloud provider
  • Processing data in the cloud without local backup
  • Loss of control over data

Conclusion

Cloud computing has become an indispensable part of the digital economy, providing companies with a variety of usage and development opportunities. For many users, concerns about data security have given way to an appreciation of flexibility and practicality, so the cloud computing sector is expected to keep growing in the coming years. The practical benefits are accompanied by legal issues: determining the type of contract and complying with data protection regulations represent the greatest challenges for users. As noted, the specific content of the contract and the performance owed serve as the starting point for using cloud computing in a legally compliant manner. The contract typological classification in particular may pose problems for users, but also for providers, with the result that claims are not asserted or not asserted in time.


Software as a Service (SaaS) in the process of legal implementation

Software as a Service (SaaS) is not a new term, although the business model has gained in importance in cloud computing over recent years. In 2018, German companies were expected to invest more than 20 billion euros in cloud services. A not insignificant part of this relates to SaaS solutions.

What is Software as a Service (SaaS)?

SaaS is understood as the cloud- or server-based provision of certain software services for an agreed period for a fee. As a rule, the software is used directly via the browser. Thanks to this ease of use, SaaS solutions can be used in all corporate divisions, in particular in customer relationship management (CRM), human resource management (HRM) and financial management. The SaaS provider is not necessarily the software manufacturer, but does at least hold the required copyright usage rights to the software. Issues of copyright and data protection law are therefore often raised in connection with SaaS contracts.

Application Service Providing (ASP) can be understood as the predecessor of SaaS. The main difference is that an ASP solution is usually tied to the user’s specific computer, while SaaS services can be used without any assignment to specific hardware resources, thereby offering greater flexibility.

Pros and cons of SaaS

From the customer’s viewpoint, the greatest advantage of SaaS solutions is the low acquisition cost. By outsourcing maintenance and updates of the software as well as server administration to the provider, greater cost control is achieved overall. In addition, the cloud- or server-based solution guarantees comprehensive mobility for company employees. Despite these advantages, there are risks associated with SaaS services, because customers depend on the provider for the functionality of the software. Service faults may originate not only within the provider’s sphere but also in the Internet connection. In addition, company data is usually shared during use, which requires a certain degree of trust in the provider’s handling of that data.

Service level agreements

Many legal questions around SaaS remain unresolved. Even the legal classification of the agreement between provider and customer is disputed. Determining the nature of the contract is not merely of academic value, but extremely relevant in practice. Under lease contract law, for example, the provider is liable by law, regardless of fault, for defects already present when the contract is concluded; service contract law contains no such provision. If the provider suffers server outages, for example, the assertion of claims would therefore depend on the classification of the contract. The prevailing opinion, referring to the case law of the Federal Court of Justice, assumes a lease agreement; against this it is argued, however, that the relevant decisions concern only ASP contracts and, given the technical innovations of SaaS services, cannot be transferred to them.

In order to minimise risk, it is therefore advisable to expressly regulate questions of liability and compensation for damages in the event of service faults by means of service level agreements. These should include at least the following areas:

  • Availability of the service, including a measurement method for its determination (see the sketch after this list)
  • Definition of service faults
  • Response and recovery times in the event of service faults
  • Regulations on the distribution of the burden of proof in the event of service faults
  • Contractual penalties and termination options in the event of service faults
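
To see why the agreed measurement method matters, here is a minimal sketch of a monthly availability calculation; the figures and the contractual exclusion of maintenance windows are assumptions for illustration:

    MINUTES_PER_MONTH = 30 * 24 * 60   # agreed service period: 43,200 minutes
    planned_maintenance = 120          # excluded by contract (assumption)
    unplanned_downtime = 50            # measured outage minutes

    agreed = MINUTES_PER_MONTH - planned_maintenance
    availability = (agreed - unplanned_downtime) / agreed * 100
    print(f"Availability: {availability:.3f} %")  # 99.884 % -> misses a 99.9 % target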

Copyright licence as subject of the SaaS contract

Copyright law must also be considered within the scope of SaaS contracts. Specifically, the question is whether the provider must grant the customer an (at least) simple usage right to the software. Some argue that it must, on the grounds that the provision of the software affects the right of reproduction under section 69c no. 1 of the German Copyright Act (UrhG), since in any case a “temporary reproduction” takes place in the customer’s main memory. Against this, it is countered that the essential technical reproduction takes place on the provider’s server, and that the customer’s mere use of the software applications is not sufficient.

As a result, the specific design and use of the software is probably decisive, e.g. whether the software must be installed on the user’s computer. Where access software has been provided to the customer by the provider, this may also be seen as an implied grant of a right of reproduction. A clause on usage rights is nevertheless useful and recommended in a SaaS contract in order to create legal certainty for the SaaS provider and the user.

SaaS and data protection law

In addition, the security of the customer’s data is of great importance, as it is first created and then stored on the provider’s systems. Not only the provider but also the customer must take certain provisions of data protection law into account when concluding a SaaS contract, specifically when use of the software (also) involves the processing of personal data. In these cases, a data processing agreement must be concluded, taking into account section 11 of the German Federal Data Protection Act (BDSG) or Article 28 of the GDPR, in order to avoid possible fines for the violation of data protection regulations.

Another important aspect is that the customer must satisfy itself that the organisational and technical measures are complied with when data is processed via SaaS – through a preliminary inspection and then through regular checks. Exactly what these checks should look like and which are appropriate must be determined in the individual case. Among other things, it matters which data is processed when using the SaaS solution. The provider may, for example, be obliged to comply with contractually agreed data security concepts, to present IT security certificates or to fulfil additional information obligations.

Conclusion and evaluation

In summary, SaaS services offer numerous advantages from the user’s viewpoint, but they also involve certain legal risks. Not only does the determination of the contract type relevant to liability remain controversial, but so does the question of whether the SaaS customer must be granted a usage right. In addition, the security of the customer’s data should be guaranteed and data protection regulations must be complied with when processing personal data, because the data is no longer stored on the customer’s own computer, but on the provider’s. The SaaS provider should therefore be selected with care and, if necessary, checked regularly. Special care should also be taken when preparing or examining the contracts used, in order to avoid the existing risks as far as possible.

The German Trade Secret Act: New Challenges, New Measures


The protection of business and trade secrets is of central importance to companies. Trade secrets often represent significant business value, which not only promises an advantage over competitors but can even be fundamental to the business model. This applies not only to large corporate groups, but especially to start-ups with a good idea and a promising business approach.

There are high expectations (but also justified fears) of the German Trade Secrets Act (GTSA), which entered into force on 26 April 2019. The GTSA is a new stand-alone law against the unlawful acquisition, use and disclosure of trade secrets, and it implements Directive (EU) 2016/943 (TSD) on the protection of undisclosed know-how and business information. The implementing act also repeals Sec. 17 et seq. of the Unfair Competition Act (UWG), which until now governed the (criminally sanctioned) protection of business and trade secrets.

The implementation of the directive – formerly known as the “know-how protection directive” – had long been anticipated, which makes it all the more surprising that many companies are now forced to react quickly if they want to continue protecting their business and trade secrets.

Those companies that have taken organisational, technical and legal measures in the context of the implementation of the General Data Protection Regulation (GDPR) may already have done valuable preparatory work.

Trade Secret – new definition – new challenges

Under the previous definition developed by German case law, anyone who wanted to protect their know-how could simply declare it to be a trade secret (subjective intention to secrecy). Even without an explicit declaration, it could still be argued that the intention to maintain secrecy followed “from the nature of the secret matter” (the will to secrecy was derived from an objectively legitimate economic interest). If the secret fact was also company-related and not generally known, the scope of protection applied.

However, anyone wishing to invoke trade secret protection under the new legal framework must be able to show that they have protected their know-how through outwardly recognisable (objective) and appropriate non-disclosure measures.

The new definition confronts companies with two major challenges in particular. On the one hand, there is a clear need for action wherever important know-how or other information worthy of protection is to enjoy trade secret protection. On the other hand, the question arises: what is an “appropriate measure”? This can only be answered on a case-by-case basis – depending on the information worthy of protection, different non-disclosure measures must be taken. This dynamic concept gives companies a certain room for manoeuvre regarding the measures to be taken; not all information requires strict confidentiality. However, inadequate non-disclosure measures can have drastic consequences: the information is then not protected as a secret, and the company can lose exclusive control of it.

For this reason, companies should first of all evaluate which information is to be kept secret. Anyone who has already become familiar with maintaining records of processing activities in accordance with Art. 30 GDPR can fall back on the practice gained there. Even though there is no obligation to maintain such a record (it is nevertheless recommended for evidence in litigation – see below), it is important to identify which trade secrets exist, e.g. in the form of know-how within the company, and to what extent they are to be treated as particularly sensitive. In particular, it must be considered which employees and business partners are, or should be, aware of which business secrets.

When taking stock of the status quo, particular attention should be paid to information such as customer data, balance sheets, data on suppliers, calculations, prototypes, plans, recipes, algorithms, source code and developer documentation.

Recognise the need for action and take confidentiality measures

If the information worthy of protection has been determined in the company, it is then necessary to take appropriate measures within the meaning of the German Trade Secrets Act.

Since the measures to be taken always depend on the individual case (for example, on the nature and value of the information, the circle of persons involved and the context of use), we recommend carrying out a three-part examination of the appropriate measures to be taken:

(1) Organisational measures

First, clear responsibilities for the protection of information should be established. Information worthy of protection should be marked or designated as confidential, and the company’s employees should be trained and sensitised in dealing with trade secrets. Employees should also be informed about whistleblowing in order to avoid misunderstandings. Depending on the scope of the information to be marked, the effort for these measures is limited, but it must be well documented.

(2) Technical measures

If information is determined to be secret, technical measures must be taken to protect it from unauthorised access. For most companies, the focus will be on IT security. Anyone who has already implemented appropriate technical measures under Art. 32 GDPR will not find this difficult. Here too, however, the appropriate standard of the measure depends on the information concerned.
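
By way of illustration of such a technical measure, the following sketch encrypts a document embodying a trade secret at rest. It uses the third-party Python package cryptography; key management – i.e. who may hold the key – is the organisational half of the measure:

    from cryptography.fernet import Fernet  # pip install cryptography

    key = Fernet.generate_key()   # store in a secrets manager, never in source code
    cipher = Fernet(key)

    secret = b"recipe: 42% compound A, 58% compound B"  # made-up trade secret
    token = cipher.encrypt(secret)   # ciphertext is safe to store on shared systems

    assert cipher.decrypt(token) == secret  # only key holders recover the content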

(3) Legal measures

Finally, legal measures can also make a crucial contribution to protecting the secret. In particular, the holder of the trade secret must be able to demonstrate lawful control over the information. The company’s own employees are often among the greatest risks to the protection of trade secrets. Employees are already obliged to secrecy by ancillary contractual duties, but for effective protection it is advisable to conclude specific non-disclosure agreements: the secrecy required by the GTSA can only be achieved if the recipient is also obliged to take specific confidentiality measures. Companies that previously contented themselves with a general secrecy clause in the employment contracts of all employees must now pay special attention to confidentiality clauses. There is no catch-all solution. A general clause alone is not enough to effectively maintain secrecy, and the same agreement should not be presented to every employee: the more an employee comes into contact with business secrets, the more precise the clause should be. What has been said about employees applies especially to business partners, who should likewise be contractually bound to protect the secret (via so-called NDAs: non-disclosure agreements).

The admissibility of so-called reverse engineering (obtaining a secret by deconstructing a product) has been newly introduced into German law; under the previous legal situation, it was predominantly considered inadmissible. Through contractual arrangements, however, reverse engineering can still be excluded – within certain limits. Corresponding clauses should therefore be included in cooperation agreements, e.g. with customers, licensees or partners.

Update and document confidentiality measures

Adopted confidentiality measures should be reviewed on a regular basis and the developed protection concept should be updated where needed. Establishing a know-how management system can ensure that there are clear responsibilities for this task and that appropriate measures are taken to protect trade secrets.

Detailed documentation of the know-how worth protecting and of the measures taken pays off at the latest in the event of a dispute. Anyone who wants to invoke a trade secret bears the full burden of presentation and proof regarding the confidentiality measures taken. Those who can discharge this burden of proof through detailed documentation have a clear advantage.

Protect trade secrets and enforce them in court

Even though the German Trade Secrets Act entails a great deal of work and effort for companies, it simplifies judicial intervention and the enforcement of secrecy protection.

In addition to claims for cease and desist and removal of the infringement, the trade secret holder may demand the destruction, surrender, recall, removal and withdrawal from the market of infringing products, as well as of documents, objects or files containing or embodying the trade secret. If the trade secret is violated by employees of a company, the company itself may be subject to these obligations.

In order to give the trade secret holder the most effective protection possible, the holder also has a right to information from the infringer on, for example, the origin and recipients of unlawfully obtained or disclosed trade secrets.

Anyone who violates a trade secret is also liable for damages under the GTSA. In addition to the obligation to pay civil damages, any breach of a trade secret always bears the risk of imprisonment or a fine.

Profiling: The Challenges of the GDPR


The European General Data Protection Regulation (GDPR) pursues the goal of harmonising data protection law within the European Union. It has been in force for some time now as directly applicable law in the EU Member States. Compared with national data protection law, for example in Germany (Federal Data Protection Act, FDPA), it contains only a few innovations; much has simply been restructured.

The GDPR does, however, bring substantial changes regarding profiling: for the first time, profiling is explicitly defined, and a dedicated article regulates the admissibility requirements and limits of profiling in the context of automated procedures. At the same time, some provisions bear certain similarities to earlier basic decisions of the (German) FDPA, so companies should work out on a case-by-case basis which practices can be maintained and which may need to be changed.

Profiling as a special case of automated processing of personal data

According to Article 4 No. 4 GDPR, profiling is any kind of automated processing of personal data in order to evaluate certain personal aspects based on said data. In the end, these personal aspects are used, for example, to analyse the work performance of a person, their economic situation, their personal preferences and the like and to make corresponding predictions. For this purpose, comprehensive (user) profiles are regularly created – above all in the context of web-based systems.

Article 22 GDPR should also be mentioned in this context. It prohibits the data controller from subjecting a data subject to any decision based solely on automated processing where the decision produces legal effects concerning the data subject or similarly significantly affects them. Art. 22 GDPR expressly names profiling as a special case of such processing.

Below, you can find a brief example of such an automated but inadmissible decision based on a previously created profile:

Over months, an intra-corporate system collects data about an employee – especially performance data. Using this information, the system creates a profile and evaluates the data by the date of the next feedback session. Then, it determines whether the employee is granted a salary increase or not. The persons conducting the feedback interview are only entitled to communicate the result to the employee without having a prior influence on the decision-making process.

The German Federal Data Protection Act also makes provisions for scoring

The explicit designation as profiling and accordingly also the definition in the General Data Protection Regulation are new and did not exist in this form in the former German Federal Data Protection Act.

The “new” FDPA, adapted to the General Data Protection Regulation, supplements Article 22 GDPR insofar as it regulates a subarea of profiling, namely scoring, in Sec. 31 FDPA (formerly Sec. 28b of the old FDPA). Scoring is the use of a probability value about a certain future behaviour of a natural person for the purpose of deciding whether to establish, execute or terminate a contractual relationship with that person. Sec. 31 FDPA determines when such scoring is admissible. A classic example is the calculation of a score by credit bureaus. It should be noted, however, that this score is provided to interested companies only on request, without directly exposing the data subject to any legal effect. In such cases there is no automated individual decision, as the requesting company makes its own decision based on the score value. There are, however, cases in which a score value is automatically calculated, evaluated and used as the basis of a decision by a computer program, without human intervention; the sketch below contrasts the two constellations.
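
The following sketch is meant only to separate those two constellations; the toy model and its weights are invented, as real credit bureaus do not disclose their scoring formulas:

```python
import math

def credit_score(annual_income: float, missed_payments: int) -> float:
    """Hypothetical probability value for future payment behaviour (scoring)."""
    # Invented toy model: a logistic function over two illustrative features.
    z = 0.00005 * annual_income - 0.9 * missed_payments + 1.0
    return 1 / (1 + math.exp(-z))

score = credit_score(annual_income=42_000, missed_payments=1)

# Constellation 1: the score is merely reported to the requesting company,
# which then makes its own decision -> no automated individual decision.
print(f"Score reported on request: {score:.2f}")

# Constellation 2: a program accepts or rejects the contract automatically
# based on the score, without human intervention -> Art. 22 GDPR applies.
contract_accepted = score >= 0.5
```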

In addition, Sec. 37 FDPA provides another exemption from the prohibition of automated individual decisions. It is specifically intended to take the concerns of the insurance industry into account: Sec. 37 FDPA is meant to allow, in particular, the automated settlement of claims in private health insurance. Its scope, however, is not limited to a specific branch of the insurance industry.

Worth mentioning: the regulations of the former Sec. 6a of the “old” FDPA are now, to a certain extent, reflected in Article 22 GDPR. Sec. 6a was therefore deleted.

Profiling is not prohibited by the GDPR per se

As a reminder: profiling is inadmissible if the processing of personal data takes place solely automatically and the decision based on it produces a legal effect for the data subject or similarly significantly affects them. In view of the protective purpose, this wording is to be interpreted broadly. Even under this broad interpretation, however, personalised advertising, for example, is not covered by the prohibition, because it has no legal effect and does not significantly affect the person concerned in any other way.

By contrast, according to Article 22 (2) GDPR, profiling is only admissible in the following three cases:

Conclusion or performance of contract: The (automated) decision is required for the conclusion or the fulfilment of a contract between the person affected and the data controller. Whether this is the case depends significantly on the purpose pursued by the contract and must therefore always be determined individually.
However, this generally covers cases in which the conclusion or performance of the contract corresponds to the will of the person affected, who therefore sees no violation of their rights and interests in the fully automated processing and decision. The term “required” is thus not to be understood as meaning that fully automated data processing must be essential for the performance or conclusion of the contract (as, for instance, in the field of e-commerce); it can be understood less strictly, for example as serving the interest of reducing costs or concluding contracts more swiftly, which in turn may have a positive effect on the purchase price.

Legal regulation: A national or Union law (statute, regulation, etc.) explicitly provides for one or more types of automated decisions while also providing measures to safeguard the rights and freedoms of data subjects.

Consent: The decision is made with the explicit consent of the person affected. In this case, compliance with Article 4 No. 11 GDPR (consent must be freely given, specific and informed) and with Article 7 GDPR (demonstrability, clear and plain language, etc.) is essential.
It should also be mentioned that in the first and third case the data controller must take appropriate measures to safeguard the rights, freedoms and legitimate interests of the persons concerned. Affected persons must at least have the opportunity to contest the decision, to present their own point of view and to obtain human intervention on the part of the data controller.

In addition, recital 71 sets out some specific profiling requirements: data controllers should use appropriate mathematical or statistical methods and take technical and organisational measures to ensure that inaccurate personal data is corrected and that the risk of error is generally minimised. It should also be ensured that profiling has no discriminatory effect. Children should not be subject to such decisions at all.

The processing of special categories of personal data pursuant to Article 9 (1) GDPR (such as health data) is only admissible if a legal provision permits it or if explicit consent has been given.

A violation of Article 22 GDPR carries the risk of high fines

The wording of Article 22 GDPR itself does not provide for any legal consequence. However, Article 83 (5) (b) GDPR determines that a fine of up to 20 million euros or, in the case of an undertaking, up to 4% of the total worldwide annual turnover, whichever is higher, may be imposed in the event of a breach of Article 22. A supervisory authority can therefore make full use of the increased fines introduced by the General Data Protection Regulation.
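
Since Article 83 (5) GDPR caps the fine at 20 million euros or 4% of the total worldwide annual turnover, whichever is higher, the upper limit reduces to a one-line calculation; the turnover figure below is, of course, only an example:

```python
def max_fine_art_83_5(worldwide_annual_turnover: float) -> float:
    """Upper fine limit under Art. 83 (5) GDPR: 20 million euros or 4% of
    the total worldwide annual turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover)

# Example: for a group with 1 billion euros of annual turnover, the 4%
# alternative (40 million euros) exceeds the 20-million-euro floor.
print(max_fine_art_83_5(1_000_000_000))  # -> 40000000.0
```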

Fines may also be supplemented by supervisory measures: Article 58 GDPR provides the supervisory authority with extensive remedial powers. Taking into account the purpose of the rule (the prevention of fully automated individual decisions), it would seem sensible, from a supervisory point of view, to instruct the infringing company to stop the respective processing and to undo processing already carried out or decisions based on it. It can be assumed that the competent authority will use these tools to ensure the protection of personal data.

Recommended action for companies

Companies do not have to change their entire profiling practice. As described, many of the basic ideas of the former Sec. 6a FDPA were adopted by the General Data Protection Regulation. In view of the changes described above and the threat of fines, however, data controllers are advised to review their current practice at least once, and as soon as possible, to determine whether it still complies with the legal requirements.

One reassuring note: the GDPR does not bring about a paradigm shift in the field of profiling. However, given the high degree of accountability, companies should be able to provide comprehensive information on their data processing, especially to data subjects but also to the supervisory authority.

