What’s Happening in the World?

While the field of data protection develops at an accelerating pace in our country, developments worldwide remain on the radar of the Personal Data Protection Authority (“Authority”).

As past examples have repeatedly shown, the Authority keeps up with the global agenda, particularly the European Union’s General Data Protection Regulation (“GDPR”), and strives to meet the requirements of the fast-moving data privacy world.

As GRC Legal Law Firm, we closely follow the global agenda and, with this bulletin, present a selection of current developments for your information.

The news below covers December 2022.

Twitter x Zero-Day Attack

Twitter confirmed that a threat actor used a Zero-Day vulnerability to compile a database of user information. Twitter said the vulnerability was fixed in January 2022, but the database, which contains non-public information on more than 5 million users, was reportedly shared for free on a data marketplace forum, and another database containing potentially 17 million records was reportedly created using the same vulnerability.

The database of 5,485,635 Twitter user records, which had been offered for sale for $30,000 in July, was shared for free on Breach Forums on 24 November. The exposed data included public information such as Twitter usernames, login names and verification status, as well as private information such as phone numbers and email addresses.

According to the hacker’s disclosure, the information was collected through an Application Programming Interface (API) vulnerability. APIs allow computers to communicate with each other and reportedly account for approximately 80% of all internet traffic, making their security correspondingly important.

In recent data breaches, API weaknesses such as broken (or missing) authentication, lack of resources and rate limiting, and inadequate error detection and logging, alone or in combination, have exposed a significant amount of personal data.

In a statement in August, Twitter said: “As a result of a vulnerability, if someone submitted an email address or phone number to Twitter’s systems, those systems would tell the person which Twitter account, if any, the email address or phone number was associated with. This bug resulted from an update to our code in June 2021. When we learned about it, we immediately investigated and fixed it. At the time, we had no indication that anyone had exploited the vulnerability.”
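The flaw Twitter describes is a classic account-enumeration weakness: an endpoint that maps a submitted contact detail to an account. The sketch below is a hypothetical illustration of that mechanic only, not Twitter’s actual code; all names and data are invented.

```python
# Hypothetical sketch of the enumeration flaw described above: a lookup
# that, given an email address or phone number, reveals the linked
# account. All names and data here are invented for illustration.

ACCOUNTS = {  # server-side store: contact detail -> public handle
    "alice@example.com": "@alice",
    "+15551234567": "@bob_real_name",
}

def lookup(contact):
    """Simulates the vulnerable endpoint: no auth, no rate limiting."""
    return ACCOUNTS.get(contact)

# With a list of leaked emails/numbers, an attacker can link contact
# details to handles in bulk, de-anonymising pseudonymous accounts.
leaked_contacts = ["alice@example.com", "+15551234567", "+15550000000"]
matches = {c: lookup(c) for c in leaked_contacts if lookup(c) is not None}
```

Requiring authentication and rate limiting on such lookup endpoints, as API security guidance recommends, is what blunts this kind of bulk enumeration.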

Twitter also said, “…we cannot verify every potentially affected account, and we are particularly vigilant about people with pseudonymous accounts that could be targeted by government or other actors,” and confirmed that it would contact any users affected by the issue.

A sample of the database, containing more than one million phone numbers of French Twitter users, was obtained by Bleeping Computer, which confirmed with multiple users that the numbers were real; the data includes users from the US, European countries and Israel.

Among the concerns raised by the leak are that users who chose to remain anonymous could be unmasked, that high-profile accounts could be compromised, and that the leaked contact details could be used in phishing emails posing as messages from Twitter to extract even more personal data.

Telegram x India

India is one of Telegram’s largest markets, with around 150 million users in the South Asian country, where the platform has gained particular popularity among some users, partly because of piracy. The platform is full of channels, sometimes easily discoverable and with tens of thousands of members, where films and TV shows are widely shared.

Under a court order in India, Telegram has disclosed the names, phone numbers and IP addresses of channel operators found to be infringing copyright. The disclosure is an important example of the data the instant messaging platform stores on its users, and of how that data can be handed over to the authorities.

A teacher, Neetu Singh, discovered that her course material was being sold without authorisation on Telegram channels and sued the platform for not doing enough to prevent the unauthorised distribution; the Delhi High Court subsequently compelled Telegram to share the data.

An Indian court had previously ordered Telegram to comply with Indian law and disclose details about those operating such channels. Telegram argued that disclosing user information would violate its privacy policy and the laws of Singapore, where it locates the physical servers storing users’ data, but the defence was unsuccessful. The Indian court said copyright owners could not be left “vulnerable to actual infringers” simply because Telegram chose to locate its servers outside the country. Judge Prathiba Singh said Telegram had complied with the ruling and shared the data.

The court said the data could be disclosed to government officials and the police only for the purposes of the proceedings, with the express instruction that it not be shared with any third party.

Telegram spokesperson Remi Vaughn said in a statement: “Telegram stores very limited or no data about its users. In most cases, we cannot even access any user data without specific entry points, and we believe that is the case here. As a result, we cannot confirm that any private data was shared in this case.”

Hungary x Election Campaign

The Hungarian government abused personal data during the 2022 national election campaign, tilting the already uneven playing field in favour of the ruling party, Human Rights Watch (“HRW”) said in a report published in December.

The blurring of boundaries between government and ruling-party resources, together with the government’s capture of key institutions, led to selective application of the law to the advantage of the ruling party, Fidesz.

“Trapped in a Web: The Exploitation of Personal Data in Hungary’s 2022 Elections” examines the data-driven campaigning in Hungary’s April 2022 elections, which resulted in a fourth consecutive term for Fidesz and Prime Minister Viktor Orbán. HRW found that data collected from people applying for public services was used to disseminate Fidesz campaign messages.

“Using people’s personal data, collected to access public services, to bombard them with political campaign messages is a betrayal of trust and an abuse of power,” said HRW researcher Deborah Brown. “The Hungarian government must stop using personal data for political campaigning and ensure a level playing field for elections.”

HRW spoke to experts on privacy and data protection, electoral integrity and political campaigns, including representatives of political parties, companies involved in data-driven campaigning, and individuals whose data was misused by political campaigns.

It was found that the government misused data collected from people who signed up for the Covid-19 vaccine, applied for tax benefits or signed up for compulsory membership of a professional association to spread Fidesz’s campaign messages. For example, people who submitted their personal data to a state-run website to register for the Covid-19 vaccine received political messages intended to influence the elections in favour of the ruling party.

The 2022 elections took place after 12 years of rule by the Orbán government, which undermined judicial independence, captured public institutions, controlled the media environment, criminalised civil society organisation activities, and demonised vulnerable groups and minorities.

Hungary has a responsibility under national and international law to protect privacy and guarantee the right to participate in democratic elections. This requires a level political playing field on which all parties can act subject to the same conditions.

People whose data was misused said they did not believe they had consented to receive other government communications when they registered on the vaccine website, were angry that their registration data was being used for political and election campaigns, and felt that the government was taking advantage of them at a particularly vulnerable time during the pandemic.

Political parties in Hungary, as in other countries, have invested in data-driven campaigning: building detailed voter databases, conducting online surveys and consultations, buying adverts, deploying chatbots on social media, and reaching voters through automated calls, bulk SMS messages and emails. In doing so, they have overlooked the significant human rights consequences, for privacy in particular, that these investments can have.

HRW found that opposition parties’ handling of personal data also lacked transparency and carried privacy risks, but unlike the ruling party, HRW found no evidence that their handling of data created unfairness in the electoral process.

HRW said Hungary needs to address shortcomings in laws, policies and practices regarding the use of personal data for political campaigning. First and foremost, it said, the independence and impartiality of the judiciary and government bodies representing the electoral pillar should be ensured, supported by legislation, and independent and impartial oversight of data protection authorities should be strengthened.

Meta x IDPC

The Irish Data Protection Commission (“IDPC”) fined Meta-owned Instagram and Facebook €265 million over data scraping practices. As most Big Tech companies have their European headquarters in Ireland, the IDPC has inevitably become their lead GDPR enforcer.

What is data scraping?

Data scraping can be defined as the process of extracting data from a website. In the simplest terms, copying information from a website and pasting it elsewhere is a manual form of scraping. More generally, data scraping tools can listen to the data flow between web servers and/or applications and extract the relevant data from the responses.
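As a concrete illustration, the minimal sketch below pulls a “phone” field out of a static HTML snippet using only Python’s standard library. The page content, class name and field are invented for the example; real scraping operates on live pages or API responses at scale.

```python
# Minimal data-scraping illustration using only the standard library:
# extract every <span class="phone"> value from an HTML page.
# The page below is a made-up example, not a real site.
from html.parser import HTMLParser

class ProfileScraper(HTMLParser):
    """Collects the text of every <span class="phone"> element."""

    def __init__(self):
        super().__init__()
        self._in_phone = False
        self.phones = []

    def handle_starttag(self, tag, attrs):
        # Enter "capture" mode when the target element opens.
        if tag == "span" and ("class", "phone") in attrs:
            self._in_phone = True

    def handle_endtag(self, tag):
        if tag == "span":
            self._in_phone = False

    def handle_data(self, data):
        # Record text only while inside the target element.
        if self._in_phone:
            self.phones.append(data.strip())

html_page = '<div><span class="phone">+31 6 1234</span></div>'
scraper = ProfileScraper()
scraper.feed(html_page)
```

Run against millions of profile pages, the same trivial technique yields exactly the kind of bulk contact database at issue in the IDPC’s decision.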

The investigation was prompted by the leak of personal data on Facebook, which was posted online on a hacker forum in April 2021 and included personal data such as name, location, date of birth, telephone number and e-mail address.

The data leak reportedly concerned 533 million people in 106 countries, affecting 86 million people in the European Union (“EU”) alone. At the time, Facebook stated that mass data scraping had occurred due to a security vulnerability that the company fixed in August 2019, and that the leaked data was therefore old.

The investigation was not directly related to the leak itself, but focused on the Facebook Search tool and the contact-matching features of Facebook Messenger and Instagram. These tools enabled users to find friends and acquaintances on Facebook and Instagram through their phone numbers.

The December decision concluded that the social networks concerned had breached European privacy rules between May 2018 and September 2019 and imposed a series of corrective actions, as well as a fine of €265 million. A spokesperson for Meta said in a statement that the company had made changes to its systems, including removing the ability to scrape features using phone numbers, and that unauthorised data scraping was unacceptable.

Meta is expected to appeal the decision. The penalty is the second largest imposed on Meta, after the €405 million fine imposed on Instagram for violating children’s privacy (for detailed information, see the fourth issue of our What’s Happening in the World Bulletin, which covers September’s news), and surpasses the €225 million fine imposed on WhatsApp for failing to comply with the EU’s transparency requirements.

As is known, the earlier Instagram and WhatsApp decisions went through the GDPR dispute resolution mechanism because other European data protection authorities objected to the IDPC’s conclusions and demanded heavier fines; this decision, however, has not yet been challenged by any authority.

In total, Meta has now been fined approximately €1 billion for data protection violations under EU law. Having suffered a sharp decline in revenues in recent months and recently laid off more than 11,000 staff, Meta continues to be hit by data protection authorities.

European Union x NIS2

The new Directive, dubbed NIS2 (“the Directive”), will replace the existing directive on the security of network and information systems. Ivan Bartoš, Minister of Regional Development of the Czech Republic, said: “Cyber security will undoubtedly remain a major challenge in the coming years. The risks for our economies and citizens are too great. Today we have taken another step to increase our capacity to counter this threat.”

It is believed that NIS2 will form the basis for cyber security risk management measures and reporting obligations in all sectors covered by the directive, such as energy, transport, health and digital infrastructure.

The revised Directive aims to harmonise cybersecurity requirements and the implementation of cybersecurity measures in different Member States, and to achieve this, it sets out minimum rules for a regulatory framework, establishes mechanisms for effective cooperation between the relevant authorities in each Member State, updates the list of sectors and activities subject to cybersecurity obligations, and provides remedies and sanctions to ensure implementation.

It is also reported that the Directive will formally establish the European Cyber Crises Liaison Organisation Network EU-CyCLONe, which will support the coordinated management of large-scale cybersecurity incidents and crises.

While under the old NIS directive Member States were responsible for determining which entities met the criteria to qualify as operators of essential services, the new NIS2 Directive introduces a size threshold as the general rule for identifying regulated entities: all medium-sized and large organisations operating in the sectors, or providing the services, covered by the Directive fall within its scope.

The Directive also clarifies that it does not apply to organisations operating in areas such as defence or national security, public safety and law enforcement. The judiciary, parliaments and central banks are also excluded.

While NIS2 will also apply to public administrations at central and regional level, Member States may decide to extend it to such bodies at local level. Member States are given a 21-month period to transpose the Directive into their domestic law.

Twitter x IDPC

Data scraping practices seem to be firmly on the IDPC’s radar. The practices we reported in our “Meta x IDPC” article above have now drawn Twitter into the same scrutiny.

The IDPC, which is responsible for overseeing Twitter’s activities in the EU, has requested an explanation from Twitter about a data scraping incident in which the profile information of millions of Twitter users, including e-mail and phone numbers, was leaked online and is awaiting a response from Twitter.

Twitter acknowledged in August that hackers had exploited a vulnerability in its system to obtain Twitter profiles linked to phone numbers and emails, but said at the time that it had fixed the vulnerability.

While Twitter did not confirm the number of affected accounts, media reports citing hackers said that profile details, including email addresses and phone numbers, of 5.4 million users were freely shared on a hacker forum as recently as 24 November. According to a website, a second Twitter profile account dump, which exploited the same vulnerability, exposed the details of millions more users. The IDPC said it was awaiting responses on reports of both incidents.

There are already two investigations into Twitter in Dublin, both of which were opened before Elon Musk became CEO of the company. Helen Dixon of the IDPC said last month that her office would question the company’s executives about ongoing investigations.

While Twitter has not yet responded, the fact that this incident surfaced exactly one week after the penalty imposed on Meta, and unfolded in almost the same way, strengthens the possibility that Twitter may face a similar fine.

Meta x Record Fine

Meta looks set to soon face a big bill for the three social networks it owns, Facebook, WhatsApp and Instagram. The European Data Protection Board (EDPB) is expected to issue rulings targeting the three platforms, after which Meta’s chief regulator in Ireland will issue a final decision within a month.

Details of the fines and their potential value will be kept confidential until then, but according to Meta’s financial statements, the three combined could exceed €2 billion. That would make this the largest GDPR-related fine ever levied on a single company.

According to Irish filings, Meta has earmarked €3 billion for EU privacy fines in 2022 and 2023. Instagram was fined €405 million in September for violating children’s privacy, and Facebook has so far been fined €282 million for data breaches.

Setting the fines paid so far against that budget, Meta has roughly €2 billion left earmarked for future penalties. Among industry giants, the prevailing view seems to have become that breaches are inevitable; it is therefore increasingly reasonable to expect ever-larger budgets set aside for fines rather than greater investment in preventive measures.

Critics weigh this allocation against Meta’s layoff of 11,000 employees worldwide for economic reasons. The three expected fines may not only hurt Meta’s pocket but also affect its business model.

The decisions stem from complaints by Max Schrems, who accused the company of not having the proper legal basis for processing the data of millions of Europeans. If the final judgements invalidate Meta’s claim that it processes data as part of a contract with users, the company will have to seek another legal basis for its data-driven ad targeting model.

In a draft decision published last year, the IDPC largely supported Meta’s argument that it “needs the data to fulfil a contract with its users” in order to provide personalised ads. This viewpoint, however, has long left Ireland in the minority among its peers: the Norwegian Data Protection Authority said the Irish interpretation would render the GDPR “meaningless”, and the IDPC stood alone in voting against EU guidelines that prohibit companies from relying on contract as the legal basis for ad targeting.

As a result, Meta may need to seek new legal grounds for data processing in the wake of the rulings. Considering that a high-profile case about data transfer from Europe to the US is also ongoing, it seems inevitable that it will be under scrutiny for a long time and its steps will be followed.

Apple x Cyber Security

Apple has announced a series of security and privacy enhancements designed to help protect personal data from hackers, including measures that civil liberties and privacy advocates have long called for.

Apple is making it possible for users to choose to protect more of the data backed up to the cloud (“iCloud”) with end-to-end encryption, ensuring that no one but the user can access that information. The changes are said to help users protect their digital lives even in the unlikely event that an advanced state actor manages to infiltrate the company’s servers.

Privacy advocates say the changes could have a more immediate impact on the types of user data that law enforcement and government agencies can obtain from Apple. “This kind of protection is more valuable in that it protects not against cybercriminals, but against people who abuse the power of government to force the company to hand over data,” said Albert Fox Cahn of the Surveillance Technology Oversight Project.

For many years, Apple has been a routine source of information for the police. Apple’s Law Enforcement Guidelines explain how it can assist investigations, and the new change will provide a safe harbour for users who choose to enable the privacy feature.

The change worries government agencies that rely on user data to aid their investigations, and the FBI has already voiced its displeasure at being shut out. In a statement, the FBI said it was “deeply concerned” by Apple’s decision and that the agency needed other means or alternative solutions to access this information.

Companies like Apple have become an increasingly attractive target for both hackers and law enforcement agencies because of the vast amount of information they hold about people. According to the company’s latest transparency report, government and law enforcement requests for the data Apple collects have increased.

Recent years have seen a spike in global cyberattacks and data breaches. According to a report from the Identity Theft Resource Center, there were 404 publicly disclosed data breaches in the first quarter of 2022, up 14% from the same quarter a year earlier.

Apple’s “Advanced Data Protection for iCloud” feature, which refers to end-to-end encryption of user information stored in iCloud, is detailed below:

It will be released in the US before the end of the year and worldwide in 2023,

It will first be made available to a small test group before launch,

Information such as messages, notes and photos backed up to iCloud will be fully encrypted,

The change will not cover all data: contacts, calendar information and e-mail will not be encrypted,

Users will activate the feature voluntarily,

The encryption key and the code used to access the encrypted data will be stored on the device, not in iCloud,

The feature, which is not turned on for all users by default, remains a point of contention for privacy advocates.

“I think switching to privacy by default for iCloud is the most important step, but I’m less critical of Apple given how difficult it would be to disable so many email programs and calendar tools,” Cahn said. Apple says the system is not on by default because it requires users to take responsibility for their encryption keys and for recovering access to their information.

Finally, the company offers a code system that allows people to verify that their messages only went to the intended recipient and were not intercepted by a hacker. The process may be familiar to users of the encrypted messaging app Signal. Two people who activate the system will be able to exchange their unique codes, and their devices will automatically recognise if someone with a different code has entered the conversation.

Conversations between users who have enabled the verification feature will trigger automatic alerts if an attacker manages to breach cloud servers and plant their own devices to listen in on these encrypted communications, the press release announcing the products said.
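Conceptually, such verification codes are short fingerprints derived from both parties’ key material: if an interloper’s key is substituted, the code changes on one side. The sketch below is a simplified, hypothetical illustration of that idea (real protocols such as Signal’s safety numbers are far more elaborate); the keys are placeholders.

```python
# Simplified sketch of contact-verification codes: both parties derive
# a short code from their public keys and compare it out of band.
# Keys here are placeholder byte strings, not real cryptographic keys.
import hashlib

def verification_code(key_a, key_b):
    """Order-independent short code derived from both public keys."""
    material = b"".join(sorted([key_a, key_b]))  # sort so order is irrelevant
    digest = hashlib.sha256(material).hexdigest()
    # Present the first 16 hex characters in readable 4-character groups.
    return "-".join(digest[i:i + 4] for i in range(0, 16, 4))

alice_key = b"alice-public-key"
bob_key = b"bob-public-key"

# Both devices compute the same code from the same pair of keys...
code_on_alice = verification_code(alice_key, bob_key)
code_on_bob = verification_code(bob_key, alice_key)

# ...but if an attacker's key replaces Bob's, the code changes,
# which is how the interception is detected.
code_with_mitm = verification_code(alice_key, b"attacker-key")
```

The automatic alerts described in the press release follow the same principle: the devices recompute the code and flag any mismatch without requiring users to compare it manually.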

Clubhouse x Garante

Italy’s Data Protection Authority (Garante Per La Protezione Dei Dati Personali, “Garante”) has fined Clubhouse, a US social media app developed in 2020 and popularised during the Covid-19 lockdowns, €2 million for violating the GDPR.

In the press release issued by Garante, it was stated that the Alpha Exploration-owned app was not sufficiently transparent in its use of user data and violated numerous provisions of the GDPR. The statement emphasised that the app allowed audio to be stored and shared without authorisation, profiled and shared account information without a proper legal basis, and retained recordings for indefinite periods.

In addition to the fine, Garante ordered the app to introduce a feature that allows users to know before entering a chat room that the chat may be recorded, and to create a mechanism to inform non-users about their personal data. Clubhouse was also instructed to clarify and add to its privacy notice some information about data retention periods.

Although the app has not yet responded, it is reported to have entered the investigation radar of the French Data Protection Authority as well.

Meta x Personalised Advertising

According to The Wall Street Journal (“WSJ”), the EDPB has ruled that Meta cannot force users to accept personalised advertisements.

Meta’s Irish subsidiary had believed that it could eliminate the requirement to obtain consent from users by adding an addendum to its terms and conditions when the GDPR came into force in May 2018, and the IDPC had supported Meta in doing so. Four and a half years later, the EDPB’s decision rejected the IDPC’s view.

The ruling stated that Meta cannot justify its use of personal data for advertising by citing privacy policies or contracts, and that users must be given a yes/no choice on the basis of their explicit consent.

The EDPB decision itself has not yet been published; it is expected in January 2023, together with the IDPC’s final decision implementing the reversal directed at the IDPC. According to the WSJ, in addition to being ordered to cease personalised advertising in general, Meta was also ordered to pay a large fine, the amount of which is not yet known.

The IDPC is alleged to have favoured Meta during the procedure and even tried to influence the EDPB Guidelines in Meta’s interests. However, other European data protection authorities also insist on rejecting the IDPC’s view.

It is also on the agenda that Meta may appeal the decision to be issued in January 2023, but after the EDPB’s ruling such objections seem highly unlikely to succeed. In particular, users are thought likely to take action over the non-consensual use of their data over the past four and a half years.

Google x CJEU

The Court of Justice of the European Union (“CJEU”), the EU’s highest court, has ruled that individuals based in Europe can have Google remove search results about them if they can prove that the information is manifestly false.

The case began when two investment managers requested Google to remove the results of a search based on their names that linked to certain articles criticising the group’s investment model. Although the information in the articles was allegedly false, Google rejected the request, arguing that it had no certain knowledge of the accuracy of the information.

With its judgement, the CJEU paved the way for investment managers to successfully exercise the so-called “right to be forgotten” under the GDPR. In the press release accompanying its judgement, the court said: “Persons seeking to purge false results from search engines must provide sufficient evidence that what is said about them is false. But this evidence need not come, for example, from a lawsuit against a publisher. They need only provide such evidence as may reasonably be requested of them.”

OECD x Intergovernmental Agreement

On 14 December, OECD countries adopted the first intergovernmental agreement on common approaches to protecting privacy and other human rights and freedoms when accessing personal data for national security and law enforcement purposes. Signed by 38 OECD countries and the EU, the Declaration, which is open for other countries to join, marks a major political commitment.

The OECD Declaration on Government Access to Personal Data Held by Private Sector Entities (the “Declaration”) aims to increase confidence in cross-border data flows in the digital transformation of the global economy by clarifying how national security and law enforcement agencies may access personal data under existing legal frameworks.

The OECD’s announcement stated: “In today’s digital age, the ability to transfer data across borders is essential for everything from social media use to international trade and co-operation on global health issues. Without common principles and safeguards, sharing personal data across jurisdictions raises privacy concerns, particularly in sensitive areas such as national security.”

This landmark agreement formally recognises that OECD countries support common standards and safeguards. It is believed to help ensure the flow of data between democracies with the rule of law, with the necessary safeguards for individuals’ trust in the digital economy and mutual trust between governments regarding their citizens’ personal data.

Rejecting any approach to state access to personal data that is inconsistent with democratic values and the rule of law, the Declaration was prompted by growing concerns that the lack of common principles in sensitive areas such as law enforcement and national security could lead to unnecessary restrictions on data flows.

The Declaration complements the OECD’s Going Digital project, now in its third phase, which focuses on data governance for growth and prosperity and provides evidence-based solutions to the critical data governance challenges facing countries.

Apple x CNIL

Francois Pellegrini, Rapporteur of the French Data Protection Authority (Commission Nationale de l’Informatique et des Libertes, “CNIL”), recommended that Apple be fined 6 million euros for breaching privacy rules.

Although the CNIL is not bound by the rapporteur’s recommendations when deciding on sanctions, the rapporteur’s opinion is generally said to carry considerable weight in the authority’s decisions. In this context, Pellegrini’s recommendation, which follows the authority’s investigation into a complaint filed last year by the France Digitale lobby, is thought to strengthen the likelihood of a sanction decision from the CNIL.

The lobby, which represents a large number of digital entrepreneurs and venture capitalists in France, argued in its complaint that iPhone maker Apple’s older operating software, iOS 14, does not comply with European Union privacy requirements. It claimed that iOS 14 collected data from users without authorisation in order to advertise mobile applications installed on their phones and to run targeted advertising campaigns, and that those campaigns were carried out without any prior consent process.

Apple’s “App Tracking Transparency” privacy updates are known to offer users the option to prevent apps from tracking their activity across other companies’ apps and websites. In his statement, Pellegrini said that the earlier iOS 14.6 did not manage user permissions correctly when collecting personal data: the permissions were switched on automatically, with users offered only the option to turn them off or refuse, contrary to the privacy provisions of the European Union’s ePrivacy Directive.

In the next version of Apple’s operating system, iOS 15, such a prior-authorisation mechanism was reportedly introduced. Gary Davis, Apple’s director of privacy, disputed the rapporteur’s conclusions, saying the company is extremely sensitive to user privacy. “Even the fact that the violation is not serious indicates that the fine should be reduced,” Davis said, requesting that the amount of the fine not be made public. There is no information yet on when the CNIL will reach a decision.

Microsoft x 2023

Microsoft Corp (MSFT.O), which operates more than a dozen data centres in European countries including France, Germany, Spain and Switzerland, said in a statement that from 1 January, EU cloud customers will be able to process and store some of their data in the region.

Large businesses have become increasingly concerned about the international flow of customer data since the EU enacted GDPR in 2018. The European Commission is working on proposals to protect the privacy of European users whose data is transferred to the US.

The phased rollout of the EU Data Boundary will apply to all core cloud services, including Azure, Microsoft 365, Dynamics 365 and the Power BI platform. “As we dug deeper into this project, we learnt that we needed to take a more phased approach,” said Julie Brill, Microsoft’s Chief Privacy Officer. “The first phase will be customer data. In subsequent phases, we will move logging data, service data and other types of data into the boundary,” she said, noting that the second phase will be completed by the end of 2023 and the third in 2024.

For large companies, data storage has become so large and spread across so many countries that it is difficult for them to understand where their data is and whether it complies with rules such as GDPR. “We are building this solution to make our customers feel more secure and have clear conversations with their regulators about where their data is processed and stored,” Brill said.

Microsoft has previously said it would challenge government requests for customer data and financially compensate any customer whose data it shared in violation of GDPR.