What’s Happening in the World?

While the field of data protection develops at an accelerating pace in our country, innovations around the world remain firmly on the radar of the Personal Data Protection Authority (“Authority”).

As numerous past examples show, the Authority keeps pace with the global agenda, in particular the European General Data Protection Regulation (“GDPR”), and strives to meet the requirements of the fast-moving world of data privacy.

As GRC LEGAL, we closely follow the global agenda and present this selection of current developments for your information.

The news below covers July 2023.

GDPR Procedural Regulation

The European Commission (the “EC”) has recently proposed new rules to support the effective and efficient implementation of the GDPR in cross-border cases. The GDPR Procedural Regulation (the “Regulation”) aims to facilitate cooperation between data protection authorities by harmonising certain aspects of administrative procedures in cross-border cases.

The EC’s proposed Regulation does not affect important aspects of the GDPR, such as the rights of data subjects, the obligations of controllers and processors or the conditions for processing personal data.

In its 2020 report on the implementation of the GDPR, the EC found that procedural differences between data protection authorities prevented the smooth and effective functioning of the GDPR’s cooperation and dispute resolution mechanisms in cross-border cases (i.e. cases involving data subjects in more than one Member State).

The EC found that a more harmonised approach to issues such as the admissibility of complaints, the exercise of data subject rights and the involvement of data subjects in the procedure would improve efficiency and outcomes for citizens, businesses and data protection authorities alike. These elements have also been recognised as important by the European Parliament and the European Data Protection Board (EDPB).

The Regulation fully preserves the system whereby individuals and organisations deal with their lead/local data protection authority. Individuals will continue to benefit from the one-stop-shop system, relying on their lead/local data protection authority to protect their rights regardless of where the organisation processing their data is headquartered, while businesses will continue to benefit from the right to deal with a single data protection authority.

The proposal complements the GDPR by setting out detailed procedural rules for the cross-border enforcement system, without changing the procedural steps provided by the GDPR or the roles of data subjects, lead/local data protection authorities, relevant supervisory authorities or the EDPB in the cross-border enforcement procedure.

Information Corner

The GDPR introduces a ‘single competent authority’ system for organisations established in the European Union and engaged in cross-border processing of personal data. This system enables organisations to deal with a single lead supervisory authority for most of their processing activities. For the system to apply, the organisation must be established in the EU and engaged in cross-border processing; it is then necessary to determine where its main establishment is located. The supervisory authority of the EU Member State in which the main establishment is located will be the lead/local supervisory authority for the organisation’s processing activities.

The lead/local supervisory authority is the supervisory authority of the Member State in which the organisation has its main establishment. It has primary responsibility for overseeing the organisation’s processing activities and, in most cases, is the supervisory authority the organisation deals with in relation to cross-border processing.

Because the organisation engages in cross-border processing, supervisory authorities other than the lead/local supervisory authority (“relevant supervisory authorities”) will also have an interest in its processing activities.

In this context, a relevant supervisory authority will have an interest in the organisation’s processing activities where any of the following apply:

The organisation is established in the Member State of the relevant supervisory authority,

Relevant persons resident in the Member State of the relevant supervisory authority are, or are likely to be, significantly affected by the organisation’s processing activities,

A complaint has been lodged with the relevant supervisory authority regarding the organisation’s processing activities.

If the lead/local supervisory authority needs to investigate the organisation’s cross-border processing activities, it shall do so in accordance with the cooperation and consistency procedures of the GDPR. In such investigations, the lead/local supervisory authority shall coordinate closely with the relevant supervisory authorities as appropriate.

The GDPR is enforced by national courts as well as by independent national data protection authorities. In cases involving cross-border processing of personal data (processing taking place in more than one Member State or significantly affecting data subjects in more than one Member State), the GDPR’s “single competent authority” system applies. In such cases, the lead/local supervisory authority of the Member State in which the organisation under investigation is established conducts the investigation in cooperation with the other relevant supervisory authorities.

Under the GDPR, data protection authorities cooperate to reach consensus on the application of the GDPR. If they cannot reach consensus in a cross-border case, the GDPR provides for dispute resolution by the EDPB.

Currently, data protection authorities have fragmented approaches to the concept of complaints. The proposal ensures that, regardless of where the complaint is lodged or which data protection authority is conducting the investigation, data subjects have the same procedural rights in cross-border cases, such as the right to be heard before a decision is taken to reject a complaint in whole or in part.

Under the new rules, parties under investigation will have the right to be heard at key stages of the procedure, including dispute resolution by the EDPB. They also clarify the content of the administrative file and the parties’ rights of access to it.

European Commission x Adequacy Decision

The EC recently adopted an adequacy decision on the EU-US (European Union-United States) Data Privacy Framework. The decision concluded that the United States provides an adequate level of protection comparable to the European Union for personal data transferred from the EU to US companies under the new framework. Based on the new adequacy decision, personal data can be transferred securely from the EU to US companies participating in the Data Privacy Framework without having to take additional data protection measures.

The EU-US Data Privacy Framework introduces new binding safeguards to address all concerns raised by the European Court of Justice, including limiting US intelligence services’ access to EU data in a necessary and proportionate manner and establishing a Data Protection Review Court (DPRC) accessible to EU individuals.

The new framework brings significant improvements compared to the mechanism that existed under the Privacy Shield. For example, if the DPRC finds that data has been collected in breach of the new safeguards, it will be able to order its erasure. The new safeguards on state access to data will complement the obligations that US companies importing data from the EU must comply with.

US companies will be able to join the EU-US Data Privacy Framework by committing to comply with a detailed set of privacy obligations, such as deleting personal data when the purpose for which it was collected no longer exists and ensuring continuity of protection when personal data is shared with third parties.

EU citizens will have access to a range of remedies in the event that their data is mishandled by US companies. These include free independent dispute resolution mechanisms and an arbitration panel.

The US legal framework provides a number of safeguards regarding access by US public authorities to data transferred under the framework, in particular for law enforcement and national security purposes. Access to data will be limited to what is necessary and proportionate to protect national security.

EU citizens will have access to an independent and impartial remedy mechanism for the collection and use of their data by US intelligence agencies, including the newly created DPRC. The Court will independently investigate and resolve complaints, including by adopting binding remedial measures.

The safeguards introduced by the US are expected to facilitate transatlantic data flows more generally, as they also apply when data is transferred using other means such as standard contractual clauses and binding corporate rules.

The functioning of the EU-US Data Privacy Framework will be subject to periodic reviews by the European Commission together with representatives of the European data protection authorities and the competent US authorities. The first review will take place within one year after the entry into force of the adequacy decision, in order to verify that all relevant elements are fully implemented and functioning effectively within the US legal framework.

1 Million Euros Penalty for Use of Google Analytics

Following complaints by Austrian data privacy group Noyb about allegedly illegal data transfers between the EU and the US, the Swedish Authority for Privacy Protection (IMY) has issued decisions against four companies. As a result, telecommunications provider Tele2 was fined SEK 12 million (EUR 1 million) and online retailer CDON SEK 300,000.

Previously, many EU member states such as Austria, France and Italy had also found that the use of Google Analytics violated the GDPR. Although the Court of Justice of the European Union (CJEU) has already ruled on data transfers between the EU and the US, the IMY decision was the first to impose a fine.

In 2020, the CJEU found that data transfers between the European Union and the US were largely unlawful given the US government’s extensive surveillance powers. Despite this, many European businesses continued to use the services of Google, Meta, Microsoft, Amazon and others, disregarding the CJEU’s ruling and basing their defence on the Standard Contractual Clauses (“SCC”) combined with supplementary measures.

Information Corner

The SCC are standardised, pre-approved model data protection clauses published by the European Commission that enable data controllers and processors to comply with their obligations under EU data protection law. Their use is not mandatory, but controllers and processors may include them in contracts with third parties to demonstrate compliance with data protection requirements. The SCC come in two sets: the first governs the relationship between controllers and processors, while the second governs transfers of data to countries outside the European Union.

In the relevant decisions, the IMY emphasised that Google’s supplementary measures were not sufficient. Google had largely been directing EU business customers to its supplementary measures to compensate for the shortcomings of US law, but this approach too was rejected by an EU authority.

GRC LEGAL Comment: The judgements have again demonstrated that EU-US data transfers jeopardise personal data security. Marco Blocher, Data Protection Attorney at Noyb, commented: “Finally, a Data Protection Authority has imposed a significant fine and banned the use of a tool that transfers personal data to the US in breach of the GDPR. This is a satisfactory ruling compared to other data protection authority rulings that merely found a breach but created no incentive for future compliance. We hope that other data protection authorities will follow IMY’s lead and put an end to illegal transfers.” The IMY decision is promising.

Google x Bard

Google has announced the launch of its artificial intelligence chatbot Bard in the European Union after addressing concerns raised by the Irish Data Protection Authority (“IDPA”).

US tech giant Google delayed the launch of Bard, its competitor to OpenAI’s ChatGPT, in June after the Irish regulator stated that the company had provided insufficient information on Bard’s compliance with the GDPR, the EU’s privacy rules. The IDPA is Google’s lead data regulator in the EU, as Google’s European headquarters are in Ireland.

Google’s Senior Product Director Jack Krawczyk told reporters ahead of the launch that Google is enhancing Bard with new features to increase “transparency”, “control” and “choice” for users. According to Krawczyk, users will be able to learn how their information is being used, opt out of certain uses, and control whether their conversations with Bard are recorded or deleted by Google. The chatbot will be available in more than 40 languages, including Arabic, Chinese, Hindi, German and Spanish.

An official from the Irish regulator said in a statement that it will remain in contact with Google regarding Bard after the launch, and that Google has agreed to submit a report to the IDPA for review three months after Bard becomes operational in the EU.

Information Corner

Bard’s main competitor, ChatGPT, was temporarily banned in Italy in March over concerns that it may violate privacy standards, and is under investigation in several other countries, including Spain and Germany. European data protection authorities are currently examining various privacy issues raised by generative AI tools under the EDPB umbrella.

Meta also launched Threads as a competitor to Twitter in more than a hundred countries earlier this month, but delayed launching the platform in the European Union “due to impending regulatory uncertainty” linked to the new digital competition law, the so-called Digital Markets Act.

However, it is also thought that the development of applications such as Bard and ChatGPT will have a serious impact on unemployment rates around the world. Pengcheng Shi, Associate Dean of Computing and Information Sciences at the Rochester Institute of Technology, said that ChatGPT, which is currently banned in New York City schools in the US, could easily be trained to teach middle and high school courses.

GRC LEGAL Comment: With the proliferation of artificial intelligence, many people and leaders around the world have begun to see it as a threat to humanity. As AI technology continues to evolve, questions about the extent to which privacy will be violated or interfered with by these applications are likely to remain on the agenda.

New Application from Meta: “Threads”

While Meta’s new Threads app reached 100 million users in just a few days, making it Twitter’s strongest competitor to date, the app has privacy experts concerned about the personal data it collects. The biggest source of concern is Meta’s track record on privacy: the company has previously been fined for processing sensitive personal data without the appropriate consent of data subjects required by the GDPR.

Although Threads is a newcomer to the world of social media platforms, much is already known about how it collects, stores and shares user data. That is because Threads is subject to the same privacy policy and business model as Meta’s other platforms in terms of what it can and cannot do with the information it collects about its users; just like its sister platforms Instagram and Facebook, Threads will collect a great deal of data about its users.

Meta’s applications receive all kinds of information entered by users; according to Threads’ app store listing, this may include sensitive data such as health and fitness information, financial information, location and browsing history.

The platform provides the company with information about which posts users interact with and who they follow. According to the Threads privacy policy, this includes “the types of content you view or interact with and how you interact with it” and how long and how often you use Threads. In addition to users’ Threads activity, the company’s privacy policy states that it can also access GPS location, cameras, photos, IP information, the type of device being used, and device signals, including Bluetooth signals, nearby Wi-Fi hotspots, beacons, and cell towers.

Taken together with all the data Meta has already collected through Facebook, Instagram and the Meta Pixel, this information would make it easy to build a highly detailed and complex map of people’s lives.

You are the Product!

Meta’s massive data collection serves one purpose: selling adverts. Although Threads does not currently run adverts, experts say it undoubtedly will in the future. In the meantime, the information collected in Threads may already be used as part of the larger data ecosystem that Meta uses to serve ads on its other platforms.

“Despite public outcry, warnings from regulators and fines, Meta has not only not changed its business model; it continues to serve targeted ads, i.e. surveillance advertising,” said Carissa Veliz, associate professor at Oxford University’s Institute for Ethics in AI. The sensitivity of the data the company collects is also a concern, Veliz added: it could include sexual orientation, race and ethnicity, biometric data, union membership, pregnancy status, politics and religious beliefs, and could potentially be transferred to third parties.

These third parties include marketers and law enforcement agencies. Last year, Meta received about 240,000 requests for user data from law enforcement agencies.

In the US, Meta has recently faced particular scrutiny over its collection and sharing of users’ health data and of data that could be used in abortion-related prosecutions, following the Supreme Court’s decision to overturn federal abortion protections. Last year, a mother and her daughter were charged in Nebraska with aiding and seeking an illegal abortion on the basis of Facebook messages Meta shared with local police.

In the case of Threads, health information can be revealed, for example, by interacting with or sharing posts that may indicate whether a user is pregnant. Law enforcement can then subpoena Meta for those posts, even if the profile is private.

GRC LEGAL Comment: Although the Threads app does not currently run adverts, it is clear that Threads collects sensitive data for inevitable future advertising and marketing activities and will subject users to profiling. The fact that such analyses are performed on users’ sensitive data, such as sexual orientation, race and ethnic origin and health information, could fuel serious discrimination. Meta’s continuation of these practices despite the record administrative fine recently imposed under the GDPR by the Irish Data Protection Authority deepens concerns about privacy and raises questions about whether the penalties are a deterrent and whether the existing measures are sufficient.