


Algorithms and Healthcare: the focus of the Italian Authority


published on 24 February 2023 | reading time approx. 6 minutes


The Italian Data Protection Authority ("Authority") is paying increasing attention to the use of AI and algorithms within the healthcare sector. What are the risks, and what measures can be taken?

The Authority is showing a strong focus on the use of algorithms and Artificial Intelligence within the healthcare sector. In fact, the Authority has: 
  1. Opened an investigation into the Region of Veneto, requesting clarification on the use of an algorithm for assigning priority classes for healthcare services to patients.
  2. Sanctioned three local health authorities of the Friuli-Venezia Giulia Region for unlawfully using algorithms on health databases to profile patients and identify their risk of developing complications related to Covid-19 infection.
  3. Issued a preliminary order preventing "Replika," a U.S.-based chatbot using artificial intelligence, from processing the data of Italian users, since the chatbot posed a real risk to minors due to its ability to influence a person's mood or behavior.

In light of these decisions, it is important to analyze and understand the major risks pointed out by the Authority and, consequently, the aspects to which companies wishing to use algorithmic or artificial intelligence systems in the healthcare sector must pay particular attention.

The first point that deserves analysis is the fact that the Authority is increasingly anticipating the moment of its intervention when it detects a risk relating to the large-scale processing of special categories of data through innovative technologies.

With reference to the request made to the Veneto Region, the Authority asked for clarification in order to verify the compliance of the algorithms used with the GDPR, since the algorithmic system in question raises delicate questions concerning the large-scale processing of health data.

Similar considerations were also made with reference to the Replika case. The Authority noted that the artificial intelligence was collecting, without any prior verification, data, including special categories of data (i.e., the mental and emotional state), of vulnerable subjects such as minors. 

For this reason, the Authority decided, without undue delay, to order the temporary restriction of the processing activities. Indeed, the risk was that the chatbot, acting as a virtual assistant, could negatively influence children.

The second aspect that deserves analysis relates to the specific requests addressed to the Veneto Region, Replika, and the local health authorities of Friuli-Venezia Giulia.
Specifically, regarding the Veneto Region, the Authority requested verification of:
  • The type of algorithm used and whether it is truly automated.
  • The dataset used (i.e., databases and clinical records), along with the number of patients involved.

Moreover, the Authority requested specific evidence of:
  • The Union or Member State law identified as the legal basis for the processing of health data under Article 9 of the GDPR.
  • The methods used to inform data subjects of the processing under Articles 12, 13 and 14 of the GDPR.
  • The Data Protection Impact Assessment (DPIA) carried out under Article 35 of the GDPR.

Interestingly, the requests made by the Authority to the Veneto Region mirror the grounds on which the three local health authorities of the Friuli-Venezia Giulia Region were sanctioned, namely: 
  • Incorrect identification of the legal basis of the processing. The Authority held that the predictive medicine activity (which consisted in profiling patients in order to predict the evolution of their health situation and the possible correlation with other risk factors due to Covid-19 infection) does not fall within the processing strictly necessary for the ordinary activities of preventive and occupational medicine under Article 9(2)(h) of the GDPR, and can therefore be carried out only on the basis of the explicit consent of the data subject under Article 9(2)(a) of the GDPR.
  • Violation of the principle of transparency. The Authority found that the three local health authorities had not provided data subjects with specific information regarding the predictive medicine processing activities, as required by Articles 13 and 14 of the GDPR.
  • Violation of the obligation to conduct an impact assessment. The Authority also found that the data controller had not conducted the Data Protection Impact Assessment under Article 35 of the GDPR, although at least two of the nine criteria for conducting an impact assessment were met (here, four applied), namely: i) processing of "sensitive data or data of a highly personal nature"; ii) processing of "data concerning vulnerable data subjects" (i.e., patients); iii) "data processed on a large scale"; and iv) "the innovative use of new technological solutions".

Finally, regarding the Replika decision, the Authority pointed out:
  • Lack of transparency in the information provided under Article 13 of the GDPR, particularly with reference to the processing of children's data.
  • Incorrect identification of the legal basis of the processing activities, since, for minors, this cannot be the necessity to perform a contract.
  • The absence of an age verification mechanism; indeed, the system only asks for the name, e-mail address and gender of the data subjects.

In conclusion, the analysis of the aforementioned decisions of the Italian Authority makes it possible to identify some important aspects that should be addressed by companies wishing to use artificial intelligence or algorithms to process health data or other special categories of personal data, namely:
  • Correctly identify the legal basis for the processing activities, paying particular attention where the data processed concern vulnerable subjects such as children.
  • Provide clear and transparent information to data subjects, in accordance with Articles 12, 13 and 14 of the GDPR.
  • Evaluate the need to carry out a Data Protection Impact Assessment under Article 35 of the GDPR, given that at least two of the nine criteria for conducting an impact assessment could be met, including: i) processing of "sensitive data or data of a highly personal nature"; ii) processing of "data concerning vulnerable data subjects" (i.e., patients); iii) "data processed on a large scale"; and iv) "the innovative use of new technological solutions".
  • Finally, where data of minors are processed, implement an age verification system (preferably a dynamic one) that makes it possible to correctly verify the age of the data subject, as well as a blocking mechanism for cases in which the child makes his or her age explicit.
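For illustration only, the age-gating and blocking measures described above could be sketched in code as follows. All function names, the keyword list and the age threshold are hypothetical assumptions for the sketch, not requirements taken from the Authority's decisions; the applicable age threshold depends on national law.

```python
from datetime import date
from typing import Optional

# Hypothetical threshold: the age of digital consent varies by Member State.
ADULT_AGE = 18

def age_from_birthdate(birthdate: date, today: date) -> int:
    """Compute a person's age in full years."""
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )

def may_access(birthdate: date, today: Optional[date] = None) -> bool:
    """Age gate: allow access only if the declared birthdate
    indicates the data subject has reached the adult age."""
    today = today or date.today()
    return age_from_birthdate(birthdate, today) >= ADULT_AGE

def block_if_minor_disclosed(message: str) -> bool:
    """Blocking mechanism: flag the session when the user explicitly
    discloses being a minor (naive keyword check for illustration;
    a real system would need far more robust detection)."""
    disclosures = ("i am a minor", "i'm underage", "i am 13", "i'm 13")
    return any(d in message.lower() for d in disclosures)
```

A dynamic system, as suggested by the Authority's reasoning, would combine both checks: the gate at sign-up and the blocking mechanism continuously during use, so that a false declaration at registration can still be caught later.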

In addition to the obligations expressly mentioned by the Authority, we would also like to recall the importance of:
  • Keeping the records of processing activities up to date.
  • Implementing appropriate measures for the storage of data.
  • Appointing the persons authorized to process the data, who should be specifically educated and trained.
  • Providing simple and intuitive channels for the exercise of data subjects' rights.
  • Adopting appropriate procedural and technical security measures in accordance with Article 32 of the GDPR.

DATA PROTECTION BITES

Authors

Stefano Foffani
Avvocato, Associate
+39 049 8046 911

Nadia Martini
Avvocato, Partner
+39 02 6328 841

RÖDL & PARTNER ITALY
