19.08.20
Period: 08/19/2020 - 08/19/2020

In this edition you will see the Czech authority’s comments on the end of Privacy Shield, the Danish authority’s decision on the use of fingerprints for access control, the Danish authority’s inventory guidelines and much more!

Data Protection at Authorities

The Office for Personal Data Protection – Czech Republic

Czech authority commented on the effects of the end of Privacy Shield for controllers

The authority pointed out that the GDPR requires continuous monitoring and analysis of the risks to the rights and freedoms of data subjects: reviewing the threats considered, applying corrections and measuring the effectiveness of the measures applied, including through contractual review. It further states that if processors, acting as data importers, are subject to US law, the risks arising not only from the CJEU decision but also from the CLOUD Act (legislation adopted by the US Congress that clarifies the lawful use of data held overseas) need to be evaluated. Finally, the authority points out that a controller may only use a processor that provides sufficient guarantees of appropriate technical and organizational measures, as provided for in Article 28 of the GDPR.

Datatilsynet – Denmark

Danish authority fined online real estate service for leaking tenants’ information

The authority fined the company DKK 150,000 after investigating complaints about the transfer of confidential information about tenants. In 2018, PrivatBo assisted a housing fund with the sale of three properties. On that occasion, the company provided material that was distributed to the occupants of the properties. However, PrivatBo was unaware that some of the lease agreements had attachments containing confidential personal data that should not have been disclosed. The Danish Data Protection Agency therefore assessed that PrivatBo did not comply with the requirement of Article 32 of the Data Protection Regulation to implement appropriate technical and organizational security measures.

Danish authority decision on the use of fingerprints for access control

The Danish Data Protection Agency decided a case in which a trade union, on behalf of one of its members, complained that a company processes fingerprint data in order to unambiguously identify which employees arrive at and leave the workplace. The company justified the control measure on the grounds that it is essential to food safety, and to the company's export opportunities, that unauthorized people do not have access to production and that whoever took part in the production of a product can be identified at any time. The keys that employees use for identity verification do not provide sufficient assurance that whoever was present in production can be unambiguously identified, as keys may be stolen or swapped. After an overall review, the Danish Data Protection Agency concluded that there was no basis for overriding the company's assessment. The processing could therefore be carried out within the scope of the data protection rules.

Danish authority updates inventory guidelines

The requirement to keep records should be seen as an extension of the Data Protection Regulation's greater focus on accountability. Accountability implies, first, that the controller – and in some cases the processor – is responsible for ensuring compliance with the rules of the regulation. Second, the controller must also be able to demonstrate that the processing complies with those rules. In the new guidelines, the authority assessed that a record of processing activities should contain a clear link between the categories of personal data being processed and the individual categories of data subjects. If personal data is or will be disclosed in connection with a processing activity, the record must also contain information on which categories of personal data are or will be disclosed to the recipient in question, and indicate to which categories of data subjects the information in question refers.

Danish authority completed three audits focusing on records of processing activities

The inspections focused on whether three municipalities fulfilled the obligation to keep records of processing activities, including, in particular, whether the municipal records could be used for the purposes underlying the record-keeping requirement. In one case, the authority concluded that most of the municipality's records were prepared properly, as they generally provided a good overview of the municipality's processing activities. In the other two cases, the authority concluded that certain sections of the municipalities' records raised some challenges in relation to those purposes.

European Data Protection Supervisor – EDPS

EDPS publishes TechDispatch #2/2020 on quantum computers and cryptography

Quantum computers exploit the physical laws of quantum mechanics to process information in a way that is fundamentally different from today's computers. While traditional computers use bits (0 or 1) as their building block, quantum computers employ quantum bits, or qubits. One of the risks for the protection of personal data is their potential ability to break encryption: quantum computing could break many of today's classical cryptographic algorithms and, as such, severely undermine IT security. The document explains the risks posed to different types of cryptography, such as public-key and symmetric cryptography, as well as the threat of retrospective decryption of data collected today. According to the EDPS, running useful algorithms of practical relevance would require a quantum computer with more qubits and lower error rates than is possible today. The creation of a large, usable quantum computer in the next ten years is considered highly unlikely, but this is difficult to predict.
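To make the scale of the risk concrete, the short Python sketch below (not taken from the TechDispatch itself) tabulates the commonly cited rule of thumb: Grover's algorithm roughly halves the effective key strength of symmetric ciphers, while Shor's algorithm breaks RSA and elliptic-curve public-key schemes outright. The scheme names and security figures are standard textbook estimates, not values from the EDPS document.

# Illustrative sketch only: rough effect of known quantum algorithms on the
# effective security of common cryptographic schemes.
SCHEMES = [
    # (name, type, approximate classical security in bits)
    ("AES-128", "symmetric", 128),
    ("AES-256", "symmetric", 256),
    ("RSA-2048", "public-key", 112),
    ("ECC P-256", "public-key", 128),
]

def post_quantum_security(kind: str, classical_bits: int) -> int:
    """Rule-of-thumb effective security against a large quantum computer."""
    if kind == "symmetric":
        return classical_bits // 2  # Grover: quadratic speed-up of key search
    return 0                        # Shor: efficient factoring / discrete log

print(f"{'scheme':<10} {'classical':>10} {'quantum':>10}")
for name, kind, bits in SCHEMES:
    print(f"{name:<10} {bits:>10} {post_quantum_security(kind, bits):>10}")

The table this prints is why the TechDispatch treats public-key cryptography as the most exposed category: symmetric schemes can be strengthened by doubling key lengths, whereas today's public-key schemes would need to be replaced by post-quantum alternatives.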

Garante per la Protezione dei Dati Personali – Italy

Italian authority director takes a stand on the proliferation of contact tracing applications

The director pointed out that the emergence of COVID-19 does not, by itself, automatically constitute a sufficient legal basis to affect constitutionally protected rights and freedoms by legitimizing particularly invasive data processing, such as processing aimed at allowing contact tracing by any public or private entity. The authority specified that the only processing operations of personal data which, at present, enjoy an adequate legal basis are those grounded in national legislation. Any other processing aimed at contact tracing is therefore devoid of an appropriate legal basis and thus carried out in violation of European and national legislation on the protection of personal data.

Autoriteit Persoonsgegevens – Netherlands

Dutch authority publishes initial survey results on smart cities

The research focuses on the processing of personal data in public spaces with sensors and other technologies, and the authority is mapping how municipalities deal with the privacy of residents and visitors. So far, a diverse group of municipalities, varied in size and location, has provided information for the study. During the investigation, the authority requested, among other things, the Data Protection Impact Assessments (DPIAs) for smart city applications. The authority has the following tips for municipalities: (i) pay attention to the quality of your DPIA and clearly indicate what data you process in practice with, for example, sensors and cameras; (ii) keep your DPIA up to date and check from time to time whether the processing is still the same, revising the DPIA if anything changes; (iii) allow a period of 14 weeks for a possible prior consultation, so start identifying the risks in good time if, for example, you want to use a smart city application for a specific event; (iv) involve citizens in the development of smart city applications and ask for their opinion before the project starts.

Dutch authority determines that the contact tracing application developed by the government does not yet have sufficient guarantees

Aleid Wolfsen, president of the authority, pointed out: “It is clearly designed with privacy as a starting point. With all kinds of technical safeguards for users’ privacy, such as encrypting data traffic and sending false codes to prevent you from being able to read anything from data traffic.” The authority said the application depends on other technical and regulatory components, and therein lie its concerns. In the president’s words, “this application is not only what you see on the screen, but also the technology of Google and Apple, and also the servers to which you send your data. This application is part of a system. Privacy must also be in order in those other parts of the system, as well as in the application itself.”

Datatilsynet – Norway

Norwegian authority says digitization of the school sector violates student privacy

According to the article, compulsory education for children in Norway lasts 10 years. During this period, significant amounts of personal information about students are collected, such as name, birth number, academic achievements and skills, family relationships, life events and health information. The information is processed in various administrative systems and digital tools used by the school and the municipality; it is used in teaching and in communication with the student’s home, and is shared with other public bodies, such as health services and the police, when necessary. There is also a high likelihood that information collected about someone starting school in 2020 will be used for completely new purposes by the time the student finishes primary school in 2030. The authority found that the school sector has not kept pace with its own use of technological tools, and the risk of leaking personal and confidential information about students is high. The large number of personal data breaches confirms how this lack of competence leads to the leakage of information: during 2019, the authority received almost 2,000 breach notifications, with about one in ten affecting children and young people.

European Data Protection Board – EDPB

Spanish authority fines company for failing to respect an advertising exclusion system

The authority imposed a fine of 1,200 euros on a company for calling data subjects with hotel offers even though they were registered in an advertising exclusion system. By registering in this system, the data subjects exercised their right to object to processing for marketing purposes under Article 21 of the GDPR. The company, however, failed to fulfill its obligation to consult the advertising exclusion system before making marketing phone calls.

Spanish authority fines VODAFONE

The authority imposed a fine of 75,000 euros on VODAFONE for processing the complainant’s phone number for marketing purposes and sending advertising SMS messages even after the complainant had exercised the right of erasure in 2015. The controller stated that the complainant’s number, being easy to remember, had been used as a “fictitious number” by its employees.

Data Protection at Universities

Law versus technology: Blockchain, GDPR, and Tradeoffs

TATAR, Unal; GOKCE, Yasir; NUSSBAUM, Brian.

The article studies the contradictions between blockchain technology and the requirements of the GDPR. The three contradictions examined are (i) the right to be forgotten versus the irreversibility/immutability of records; (ii) data protection by design versus the inviolability and transparency of the blockchain; and (iii) the data controller versus decentralized nodes. The article argues that these conflicts can be addressed by focusing on the common points of the GDPR and blockchain, developing new approaches and interpretations, and adapting blockchain technology to the needs of data protection law.
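One adaptation often discussed in this context, sketched below purely as an illustration and not as the authors’ proposal, is to keep personal data off-chain and record only a keyed hash on the immutable ledger; an erasure request is then honoured by deleting the off-chain record and key, which makes the on-chain commitment unlinkable to the individual. The class and identifiers in the Python sketch are hypothetical.

# Toy model of off-chain storage / "crypto-shredding": the ledger stores only
# a keyed hash, while the data and key live off-chain and can be deleted.
import hmac, hashlib, os

class OffChainStore:
    def __init__(self):
        self._records = {}  # record_id -> (key, personal_data)

    def add(self, record_id: str, personal_data: bytes) -> str:
        key = os.urandom(32)
        self._records[record_id] = (key, personal_data)
        # Only this commitment would be written to the blockchain.
        return hmac.new(key, personal_data, hashlib.sha256).hexdigest()

    def erase(self, record_id: str) -> None:
        # Right-to-erasure: drop the data and key; the on-chain hash
        # can no longer be linked back to the individual.
        self._records.pop(record_id, None)

store = OffChainStore()
on_chain_commitment = store.add("tenant-42", b"Jane Doe, jane@example.com")
print("stored on chain:", on_chain_commitment)
store.erase("tenant-42")  # off-chain data gone; the chain itself stays intact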

Smart Cities, Big Data, Artificial Intelligence and Respect for the European Union Data Protection Rules

RUIZ, Francisco Javier Durán. 

The article performs a bibliographic review in order to define the concept of smart cities. It then analyzes the legal implications of smart cities against existing European legislation on the protection of personal data, in order to assess its effectiveness in regulating the use of Big Data, which is necessary for the operation of smart cities. The article concludes that the implications of implementing smart cities in Europe are many, among them the regulation of digital government, transparency and the right of access to public information. With a combination of modern regulation that protects fundamental rights and the development of technologies for smart cities, the author believes it is possible to create an ideal city model.

Data Protection in the Brazilian Legislative

Proposed Bill that provides for the disclosure of information of collective interest

Presented by Federal Deputy Ricardo Guidi, of the PSD, Bill 4189/2020 amends Law No. 12,527/2011 (Access to Information Law) to require the creation of a content search tool that allows searches in particular by the name of agencies, entities, companies, works or addresses, and that provides access to information in objective, transparent, clear and easy-to-understand language. The Bill is currently at the Board of Directors.

Proposed Bill on Telehealth

Presented by Federal Deputy Rejane Dias, of the PT, Bill 4137/2020 regulates telehealth throughout the national territory. Telehealth is defined as the provision of health services using information and communication tools and technologies, including teleconsultation, telediagnosis, formative second opinion and tele-education, among others. The Bill is currently in the Plenary.

Bill presented that creates a new employment contract on digital platforms for private individual transport or delivery of goods

Presented by Federal Deputy Henrique Fontana, of the PT, Bill 4172/2020 determines, among other things, that the codes and algorithms used by digital platforms must be regularly audited by the labor inspectorate and other control bodies of the Public Power, observing, where applicable, the provisions of the General Data Protection Law. The Bill is currently at the Board of Directors.

Bill introduced to regulate the use of algorithms by digital platforms

Presented by Federal Deputy Bosco Costa, of the PL, Bill 4120/2020 regulates the use of algorithms by digital platforms, ensuring transparency in the use of computational tools that can induce decision-making or act on users’ preferences. To this end, it defines concepts such as high-risk automated decisions and requires the preparation of impact reports with a detailed description of the systems, among other measures. The Bill is currently at the Board of Directors.
