15.10.20


Hi,

In this edition, the Bulletin highlights the main movements of national data protection authorities in different countries, two academic articles, one on the civil liability of personal data processing agents and another on the influence of Big Data on technologies related to public health, a draft law on distance learning, and the TRT-4 decision on employer access to employees’ personal data.

We highlight the opinion published by the CNIL on the use of facial recognition at airports. The Authority pointed to the a priori prohibition on the collection of biometric data, which are considered sensitive, and stressed that, where collection is necessary, the principles provided for in the GDPR must be respected, including obtaining free, express and informed consent. In Brazil, the Federal Government intends to implement facial recognition at all airports through the “safe boarding” project. It should be noted in this regard that biometric data are considered sensitive under the General Data Protection Law (LGPD); their processing regime is therefore more restrictive than that of other personal data, and the safeguards to be adopted by the processing agent need to be more robust.

In this light, we consider it important and advisable, as pointed out by the French Data Protection Authority, that a Data Protection Impact Assessment of the “safe boarding” project be carried out before facial recognition technology is implemented at airports, in order to measure the possible risks and the safeguards that will be adopted to mitigate them.

We also highlight the decision of the Regional Labor Court of the 4th Region (TRT-4), which found illegal the employer’s practice of consulting databases of credit restriction records (SPC and SERASA), the employee’s tax situation before the Federal Revenue Service, civil lawsuits, police inquiries, police reports and criminal proceedings without final judgment, pointing to the violation of the constitutionally protected right to privacy.

Good reading!

Bruno Bioni, Iasmine Favaro & Mariana Rielli

Data Protection at the Authorities

Belgium

New document with frequently asked questions for small and medium-sized enterprises

The FAQ document explains certain aspects and obligations of the GDPR, with concrete examples and references to additional sources of information. It answers practical questions such as whether the GDPR applies to the organization, in which cases the organization acts as a controller or processor, and how to write a privacy policy.

Belgian authority publishes 2019 annual report

According to the document, the authority has implemented a monitoring system that allows it to actively follow technological and social developments related to data protection. The authority was thus able to react quickly when it became aware of processing operations involving possible risks to citizens’ data protection rights. According to the Authority, with the development of the inspection service, 2019 was an important year for the implementation of its supervisory task. The first investigations were initiated and, in May 2019, the Authority handed down its first fine, worth 2,000 euros, against a parliamentarian who had diverted e-mail addresses obtained during his mandate for purposes of political propaganda. In the same year, it imposed two more fines, of 5,000 euros each, for similar facts. For the Authority, public agents and authorities must set an example in terms of data protection, which is why it decided to make the public sector one of its major strategic priorities.

Czech Republic

Czech authority president Jiří Kaucký spoke about the role of DPAs at the “GDPR 2020” conference

The president reaffirmed the main mission of a national data protection authority, which is to regulate the processing of personal data. According to him, especially today, this information is a great source of energy: “the legal regulation of the processing of personal data is not an end in itself. The reason is not primarily ‘protection for protection’s sake’, but that, by protecting personal data, citizens’ privacy and dignity are protected. The threats are typically unfair practices of other individuals and, as it turns out not only in connection with the coronavirus pandemic, excessive interference by the state in our freedom,” he said. Going forward, the President of the Authority wants to focus primarily on partnerships with local governments. According to him, the relationship with municipalities and regions should not be based on confrontation, but on the principle of cooperation: “above all, we must help municipalities and their commissioners to establish processes for carrying out their necessary tasks in a way that respects the privacy of their citizens”.

Denmark

New decision on violations due to lack of security system testing

The Authority took a decision in three cases of breaches of the security of personal data. What the cases have in common is that, due to the lack of basic controls, a large amount of information was transmitted to recipients or access was granted to information concerning many citizens. In two of the cases, the breach involved employment-related information, affecting about 1.5 million citizens in one case and up to 4.2 million in the other. The Danish Data Protection Authority issued severe criticism of companies that process data for municipalities. In one case, there was unintentional disclosure of name and address when a person filled out a municipality’s digital form to apply for a supplementary or one-off benefit on behalf of another citizen. This last incident is an example of a case in which the municipality that controls the data did not carry out a basic check of what information was passed on when an IT system was put into operation. Likewise, in the other cases, the data processors did not carry out basic checks on the information passed on, nor on the information made available after a system update.

Danish authority initiates investigation into a range of research activities

On June 22, 2020, the Authority announced that it would take a closer look at the Chamber Counsel’s reports on the processing of personal data by the Statens Serum Institut (SSI) for use in research. Based on the preliminary findings of the Chamber Counsel’s reports, the Authority decided to open a case of its own motion against SSI and asked the institute to explain a number of recurring issues in the preliminary reports. The Authority is also awaiting the Chamber Counsel’s final report, due in November 2020, before deciding to what extent it gives rise to further action against SSI. In order to investigate whether the problems apparently occurring at SSI also occur with other similar controllers, the Authority has initiated a series of written inspections of public authorities conducting research. The audits focus on: (i) roles and responsibilities (responsibility for data); (ii) the legal basis for the processing; (iii) transfer of personal data to recipients in EEA countries and to recipients in countries outside the EU; (iv) supervision of data processors; (v) the Authority’s possible permission for disclosure, in accordance with the Data Protection Act, section 10, subsection 3; (vi) records of processing in accordance with Article 30 of the Data Protection Regulation; and (vii) data protection policies and guidelines in connection with the implementation of research projects.

France

CNIL publishes guidelines for collecting health data in sports

Many sports structures (associations, clubs) intend to implement appropriate measures to limit the spread of the virus and guarantee the safe resumption of sports activities and events (training, tournaments, friendlies, etc.). In this perspective, they question the conditions under which the personal data of athletes, coaches, referees or supervisors may be used, particularly regarding health: systematic temperature checks before accessing sports facilities, organization of serological tests before a sporting event, communication of a negative serological test in the event of an athlete’s absence from training, completion of a health questionnaire specifically dedicated to the risks of exposure to COVID-19, etc. To answer these questions, the CNIL recalled the principles of protection of privacy and personal data as applied to sports. First, the Authority stated that any temperature reading, any result of a serological test and any medical certificate sent to sports structures to assess a risk of exposure to COVID-19 constitutes personal health data within the meaning of the GDPR. Due to their sensitive nature, the health data of people involved in sports structures or sporting events are subject to very specific legal protection. Thus, the processing of these data, whether collection, recording, transmission or use of temperatures or the results of serological tests, is in principle prohibited. However, in the context of COVID-19, health data may, exceptionally, be processed by sports structures, provided that one of the following hypotheses applies: (i) the sports structure obtains, before collecting health data, the consent of the people involved (athletes, coaches, referees, etc.); or (ii) the collection of health data is justified for reasons of important public interest.

CNIL publishes guidelines for the use of facial recognition at airports

According to the Authority, the use of biometric facial recognition devices at airports is spreading in France and, especially, internationally. According to the “Air Transport IT Insights 2018” report by the international aeronautical telecommunications company SITA, 59% of airports and 63% of airlines plan to deploy facial recognition devices by 2021. It is in this context that several French airport operators and service providers have approached the CNIL to support them in experimenting with this type of device. For the Authority, within an airport, facial recognition can enable the automation of various control steps (such as baggage drop-off or boarding), replacing the checking of travel and identity documents, in order to speed up the traveler’s journey and improve their experience by reducing waiting time. In practice, the photograph of the passenger’s face on their identity document is compared by facial recognition to the face captured as they pass through airport checkpoints. Biometric data have the particularity of being produced by the body itself and of characterizing it definitively; they are therefore unique and permanent over time. Unlike any other personal data, biometric data are not assigned by third parties or chosen by the person; unlike a password or an identifier, they cannot be modified in case of compromise (loss, intrusion into the system, etc.). Biometric data are therefore “sensitive” data within the meaning of data protection legislation and, as such, are subject to a specific regime: their processing is prohibited in principle and can only be implemented, as an exception, in certain cases listed by the GDPR. In this sense, the main principles to be respected are: (i) justify the necessity and proportionality of the planned facial recognition device; (ii) obtain the prior consent of the passengers concerned; (iii) keep biometric data under the exclusive control of the passengers concerned; and (iv) carry out a data protection impact assessment (DPIA).

Germany

German authority criticizes data retention plans

German Authority President Professor Ulrich Kelber calls on the Federal Government to treat the European Court of Justice (ECJ) judgment on data retention as a limit for future laws: “it is incomprehensible that, a year before the federal elections, future laws in the field of telecommunications are being planned in a way that contradicts the ECJ line. Instead, Germany should work to ensure that no new regulations for data retention emerge at European level.” This applies in particular to the ePrivacy Regulation currently under discussion. The Authority has criticized unwarranted data retention for years and sees the ECJ judgment as validating its criticism. According to the publication, with this landmark decision, the unconditional and generalized retention of traffic and location data, which document who called whom, when, for how long and from where, is declared incompatible with European law. The ECJ makes it clear that data retention is still possible under certain conditions, in order to prevent serious criminal offenses and ensure national security. The respective national storage order must, however, be limited in time and subject to effective review by a court or an independent administrative authority.

Netherlands

Dutch authority publishes privacy recommendations in digital home education

Parents, students and teachers raised concerns with the Authority about the processing of personal data in distance learning. They wonder, for example, whether the systems that schools use for video calling are really secure and whether the data could fall into the wrong hands. The objective of the Authority’s research was therefore to gain insight into how educational institutions handle personal data during online video calls and online proctoring. Based on that, it made concrete recommendations on how institutions can best protect the privacy of their students and staff, now and in the future. With online video calls, it is especially important that educational institutions develop a clear policy on which applications may be used. They must also specify whether and for what purposes video images may be used. If not necessary, the educational institution must ensure that no student is identifiable when images of a digital class are captured. The educational institution must also pay attention to the contract with the software supplier, including the retention period of the images: images may not be kept for longer than strictly necessary.

Norway

European Union Court of Justice prohibits mass collection of communication data

On June 11, 2020, a new law on the intelligence service was passed in Norway, which allows so-called digital border defense, under which communication service providers may be required to provide access to metadata that the intelligence service will store for 18 months. In Privacy International (Case C-623/17) and La Quadrature du Net and Others (Case C-511/18), the Court held that general and indiscriminate collection and storage of communication data is contrary to fundamental rights and to the Directive on privacy and electronic communications (2002/58/EC), implemented in Norwegian law through the Electronic Communications Act. It follows from Article 5(1) of the Directive that Member States are obliged to guarantee the confidentiality of communications and related traffic data. The Court states that national legislation cannot impose on providers an obligation to provide information and access to mass communication data, as this would imply general and indiscriminate prior storage of such data. Targeted collection of traffic and location data may be acceptable if storage is limited to what is strictly necessary, based on a real threat and limited to specific categories of data, the media used, the people involved and the duration of collection and storage.

A similar attempt occurred in the “PL das Fake News” in Brazil, in which metadata retention was sought in order to identify accounts that spread false news. The retention of metadata was removed from deputy Orlando Silva’s (PCdoB-SP) draft after strong advocacy by civil society organizations.

United Kingdom

ICO concludes investigation into the use of personal data in political campaigns

The Authority analyzed an entire ecosystem, including data analytics companies, platforms, political parties and data brokers, and then sought changes to the way people’s personal information was being used. The action led to fines against Vote Leave, Leave.EU, Emma’s Diary and Facebook, the latter being the maximum financial penalty the ICO could impose under the law at the time. According to Commissioner Elizabeth Denham, if Cambridge Analytica had continued to exist, the ICO would also have sought to act against its harmful data protection practices. For Denham, the investigation has been completed, but the work in this area does not end here: the Authority will soon publish an audit report on the main political parties, in addition to working with the main credit reference agencies and data brokers.

Data Protection at Universities

Civil liability of processing agents under the General Data Protection Law

The article analyzes the civil liability of processing agents from the perspective of the General Law for the Protection of Personal Data (LGPD). Given the growing demand for the use of technology, it became necessary to regulate the use of personal data. The legal text thus innovated by creating “processing agents”, who are responsible for data processing and are therefore subject to a range of duties. These duties, in turn, are necessary to balance the relationship, making it possible to realize the rights of the holders of personal data, in accordance with the law. Liability thus arises from the exercise of a data processing activity that violates the commands of the current legislation, causing material or moral damage to a holder or to a community. On that basis, the LGPD separates the responsibilities of the processing agents, but stipulates hypotheses of joint and several liability, as well as situations of exclusion of liability. For this study, bibliographic research was carried out based on scientific articles from up-to-date, specialized journals on Law and Technology.

Reflections on the use of Big Data in predictive models of epidemiological surveillance in Brazil

The aim of the study is to discuss the bioethical implications of the World Health Organization’s announcement of Digital Health and of the use of big data in the production of predictive health surveillance systems in Brazil. A literature review was carried out based on a search for articles on the Scielo, Bireme and Jstor platforms and on the pages of the World Health Organization and the Brazilian Ministry of Health, using the descriptors big data, bioethics and ethics, from May to July 2020. Limits on the use of big data as a tool for predictive epidemiological surveillance were identified, notably in its use during the Covid-19 pandemic, even when justified on the basis of protection bioethics and public health ethics. The greatest limits observed were the absence of adequate data protection legislation and bias in the data obtained. In order to analyze the bioethical impacts of the use of big data in the medicine of the future, it is essential to deepen the discussion about the possible impacts that these technologies can have on society, with an emphasis on the development of surveillance capitalism, on interference in social life and on the intensification of regional inequalities.

Data Protection in the Brazilian Legislative

Bill presented on the employment relationship between educational establishments offering remote education by digital means and their teachers

Presented on October 5 by Federal Deputy Vanderlei Macris, Bill 4816/2020 intends to regulate the relationship between remote or hybrid education institutions and their respective teachers. With regard to the protection of personal data, the Bill provides that the prior and express consent of the teacher must be obtained for the production of academic activities disseminated on open, extracurricular virtual platforms in which personal data (image, voice, name) or teaching material produced by the professional are used.

Data Protection in the Brazilian Judiciary

TRT-4 rules in case of improper access to employees’ personal data

Case No. 0000241-06.2013.5.04.0802, with Judge Tânia Regina Silva Reckziegel as rapporteur, concerned the practice of investigating the private lives of drivers who provide services to the defendant, by hiring companies specialized in this type of investigation in order to create employee profiles. The rapporteur pointed to the violation of the right to privacy provided for in the Federal Constitution and found the practice illegal. The rapporteur’s vote was followed by Judge Marcelo Ambroso, and the appeal of the plaintiff union was upheld, prohibiting the defendant companies from researching, using, storing and/or passing on information about professional drivers based on consultations of credit restriction records (SPC and SERASA), tax situation before the Federal Revenue Service, civil lawsuits, police inquiries, police reports and criminal cases without final judgment, under penalty of a daily fine of no less than 10 thousand reais for each violation.
