


Protecting your privacy with Chatbots


published on 17 January 2024 | reading time approx. 4 minutes


In the 21st century, the first place people turn to for answers or help is the Internet. Its use has been further facilitated by the advent of the well-known ‘chatbots’. Although this tool seems convenient and extremely helpful in everyday life, precautions must be taken to protect your data.

Artificial intelligence (AI)-powered chatbots can be used for a variety of tasks, such as information gathering, problem solving and text generation. However, before engaging in a conversation with AI, everyone should be aware that the use of such tools also involves risks to users' privacy.

When using a chatbot, one is not only a recipient of the service: beyond the functionality and assistance that AI provides, chatbot users are themselves a valuable resource from which information can be extracted. When a chatbot receives a request for information, it may draw not only on the Internet but also on information obtained from conversations with other users, so the information you share during conversations with the chatbot needs to be carefully evaluated.

Chatbots are divided into two types: task-oriented (declarative) chatbots and data-driven, predictive (conversational) chatbots. While the first type is programmed for specific functions and responds with scripted answers, the second type can evolve from conversations with the user, applying predictive intelligence and analytics to enable personalisation based on user profiles and past user behaviour.
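To make the distinction concrete, a declarative chatbot can be as simple as a lookup of scripted answers. The following Python sketch is purely illustrative (the intents and replies are hypothetical); a conversational chatbot would instead feed the user's input, and typically the conversation history, into a statistical model that adapts over time.

```python
# Minimal sketch of a declarative (task-oriented) chatbot:
# the user's question is matched against a fixed set of intents
# and answered with a pre-programmed response. The intents and
# answers below are hypothetical examples.

SCRIPTED_ANSWERS = {
    "opening hours": "We are open Monday to Friday, 9:00-17:00.",
    "contact": "You can reach us at info@example.com.",
    "pricing": "Please see the price list on our website.",
}

FALLBACK = ("Sorry, I can only answer questions about opening hours, "
            "contact details and pricing.")


def reply(user_message: str) -> str:
    text = user_message.lower()
    for keyword, answer in SCRIPTED_ANSWERS.items():
        if keyword in text:
            return answer   # programmed response, no learning involved
    return FALLBACK         # nothing from the conversation is stored or reused


if __name__ == "__main__":
    print(reply("What are your opening hours?"))
```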

Attention should be paid specifically to the second type of chatbots, i.e., predictive chatbots. Chatbots are often used as virtual assistants on companies’ websites to address customer issues, reducing the use of human resources as much as possible and making service delivery more efficient, as human capacity limits the number of customers that can be helped at the same time. In addition, some services are built directly around chatbots, helping businesses, for example, draft documents, write e-mail responses and translate texts. Such services make everyday life and work much easier, but their use can be unsafe from a data protection perspective.

When you ask a chatbot to draft a document, translate a text or generate an e-mail response, it is not recommended to include personal information, company information or other people's data in the text. If a company uses chatbots, for example, to send automatic e-mails to customers, the company is obliged to inform the customers about the use of their data (in this case, their e-mail addresses). In order to respect the principles of privacy and data protection, companies should restrict the collection and retention of personal information, clearly state the purpose of data collection from the outset and adhere to that purpose. Organisations also need to explain how they use customer data, with whom the data is shared, how long it is retained, etc.
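One practical way to follow this recommendation is to strip obvious identifiers from a text before it is ever submitted to a chatbot service. The sketch below only illustrates that idea, assuming a plain-text prompt and simple pattern matching for two common identifiers; a real deployment would need broader detection (names, addresses, IDs) and a legal assessment of what counts as personal data in its context.

```python
import re

# Illustrative patterns for e-mail addresses and phone numbers only;
# this is a sketch, not a complete anonymisation solution.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s\-]{7,}\d")


def redact(prompt: str) -> str:
    """Replace e-mail addresses and phone numbers with placeholders
    before the prompt leaves the organisation."""
    prompt = EMAIL_RE.sub("[EMAIL]", prompt)
    prompt = PHONE_RE.sub("[PHONE]", prompt)
    return prompt


if __name__ == "__main__":
    raw = "Please draft a reply to jane.doe@example.com, phone +371 2000 0000."
    print(redact(raw))
    # -> "Please draft a reply to [EMAIL], phone [PHONE]."
```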

There are risks of disclosure of personal data or other confidential information both when we use an AI tool to generate text and when we use it to check or edit existing text. It is therefore necessary to assess the content of the information in order not to harm yourself or your business. Providers of such chatbot services offer several options to prevent user information from being used for AI training, such as allowing the user to manually turn off the retention of conversation history or to delete conversations that have already taken place. However, even such measures are not sufficient and will not ensure full protection of personal data or of commercially valuable information. There is always a risk that information entered through such services may be leaked to third parties, or that the chatbot user becomes a target for cyber criminals. It is therefore of utmost importance to properly protect personal information when using such services and to act carefully when disclosing data.

DATA PROTECTION BITES

author


Staņislavs Sviderskis

Assistant Attorney, Certified Data Protection Specialist

Senior Associate

+371 6733 8125


RÖDL & PARTNER LATVIA

Discover more about our offices in Latvia. 