What’s Happening in the World?

While the field of data protection is developing at an accelerating pace in our country, worldwide developments remain on the radar of the Personal Data Protection Authority (“Authority”).

As examples we have repeatedly encountered show, the Authority keeps pace with the global agenda, particularly the EU General Data Protection Regulation (“GDPR”), and strives to meet the requirements of the fast-moving data privacy world.

As GRC LEGAL, we closely follow the world agenda and present a selection of the current news for your information with this content.

The news below is from January 2024.

The Data Act

In today’s digital age, data is the new currency. With this in mind, the European Union (“EU”) has enacted The Data Act (“Data Act”), which contains a set of new rules that will revolutionise the way data generated by connected devices is shared and used.

Consumers and businesses will be able to access their devices’ data and use it for after-sales and value-added services. Commercial and industrial players will have more data and benefit from a competitive data market. After-sales service providers will be able to offer more personalised services – and compete on an equal footing with similar services offered by manufacturers – while industrial data can be combined to develop entirely new digital services. The new regulation will allow users of connected devices, ranging from smart home appliances to smart industrial machines, to gain access to the data generated by the use of these devices, which is typically only collected by manufacturers and service providers.

Knowledge Corner

Data Act: The Data Act is a new EU law that aims to create a fair and competitive data market by facilitating data sharing and reuse across sectors and stakeholders.

Obligations on data sharing: The Data Act imposes an obligation on connected device manufacturers and related service providers to make data accessible to users and third parties under certain conditions and exceptions.

Unfair contractual terms: The Data Act prohibits and provides examples of contractual terms that are unilaterally imposed and deviate substantially from good commercial practice.

Making data available to public sector organisations: The Data Act sets out the framework for making data available to public sector organisations free of charge where there is an exceptional need to use the data for a specific public interest task.

Migration between cloud and edge services: The Data Act includes new rules to allow customers to switch between different data processing providers without undue delay or cost, moving their data and digital assets to another provider or to their own infrastructure.

International transfer of non-personal data: The Data Act extends the obligations for international data transfers under the GDPR and the Schrems II decision to providers of data processing services and requires them to implement appropriate safeguards to prevent incompatible or unlawful access by third states.

In order to comply with any data sharing requests, companies should ensure that they have appropriate mechanisms and policies in place to extract any data (including metadata necessary to identify that data) from their connected devices in a fast, secure, comprehensive and structured manner. Companies should also review their contracts to ensure that there are no terms that could be considered unfair under the Data Act.
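As an illustration only, a structured, machine-readable export of connected-device data of the kind described above might look like the following sketch. The device identifier, metric and metadata fields are hypothetical examples, not terms taken from the Data Act itself:

```python
import json
from datetime import datetime, timezone

# Hypothetical reading from a connected device, together with the
# metadata needed to identify and interpret that data.
reading = {
    "device_id": "dishwasher-0042",  # hypothetical device identifier
    "timestamp": datetime(2024, 1, 15, 8, 30, tzinfo=timezone.utc).isoformat(),
    "metric": "water_usage_litres",
    "value": 11.4,
    "metadata": {
        "firmware_version": "2.1.0",
        "unit": "litre",
        "sampling_interval_seconds": 60,
    },
}

# A structured export that a user or an authorised third party could
# receive in response to a data access request under the Data Act.
export = json.dumps(reading, indent=2, sort_keys=True)
```

The point of the sketch is that the export is self-describing: the metadata travels with the value, so a recipient can interpret the data without access to the manufacturer's internal systems.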

The Data Act in the Official Journal

The Data Act entered into force on 11 January 2024 following its publication in the Official Journal of the European Union on 22 December 2023. Although some rules will come into force later, most of its provisions will start to apply from 12 September 2025 and will apply to the stakeholders listed below.

Manufacturers and providers: Manufacturers of connected products and providers of ‘related services’ placed on the market in the EU, regardless of their place of establishment.
Users: Users of connected products or related services in the EU.
Data holders: Data holders who make data available to data recipients in the EU, regardless of their place of establishment.
Data recipients: Data recipients in the EU to whom the data is made available.
Public sector bodies: Public sector bodies, the European Commission, the European Central Bank and EU bodies that request data holders to provide data where such data is exceptionally needed for the performance of a specific task carried out in the public interest, and data holders that provide such data in response to such a request.
Cloud and edge providers: Providers of data processing services that provide such services to customers in the EU, regardless of their place of establishment.
Other stakeholders: Participants in data spaces, vendors of applications using smart contracts, and persons whose trade, business or profession involves the deployment of smart contracts for others in the context of executing an agreement.

Data Sharing

The Data Act states that data holders must not make it unduly difficult for users to exercise their choices in relation to their data, for example by presenting options in a non-neutral manner or by subverting the user’s autonomy or decision-making through the structure, design, function or mode of operation of a digital interface.

The Data Act imposes certain obligations on third parties that receive data at the request of a user:

To use the received data only for the purposes agreed with the user and to delete the data once those purposes no longer apply or the data is no longer necessary.

Not to make the data available to an organisation designated as a ‘gatekeeper’ under the Digital Markets Act (“DMA”).

Not to use the data in a way that has a negative impact on the security of the connected product or related services.

Not to prevent the user, who is a consumer, from making the data available to other parties.

Large online platforms designated as gatekeepers under the Digital Markets Act are not eligible third parties.

Micro and small enterprises are excluded from the data sharing obligations. The same applies to data generated through the use of connected products or related services provided by an enterprise that has qualified as a medium-sized enterprise for less than one year, and to connected products for one year after they are placed on the market by a medium-sized enterprise.

While the concept of data sharing forms a fundamental part of the Data Act, there are limited circumstances in which users and data subjects may restrict or prohibit access to, use of or sharing of data. These limited circumstances include situations where use of the connected product could undermine security requirements and lead to a serious adverse effect on the health, safety or security of natural persons.

Obligations for Data Holders Required to Make Data Available Under EU Law

In certain business-to-business relationships, data holders must agree with the data recipient on the arrangements for making data available and must do so on fair, reasonable and non-discriminatory terms and in a transparent manner. A data holder must not discriminate between comparable categories of data recipients with regard to those arrangements. If a data recipient considers that the conditions under which data is made available to it are discriminatory, the data holder must provide information demonstrating that there has been no discrimination.

Unfair Contract Terms

The Data Act states that any contractual term on access to and use of data or on liability and remedies for breach or termination of data-related obligations imposed unilaterally on a micro, small or medium-sized organisation will not be binding if it deviates substantially from good commercial practice and is therefore considered unfair, and gives examples of contractual terms that would automatically be considered unfair.

For example, an inappropriate limitation of remedies in the event of non-fulfilment of contractual obligations would be considered an unfair contractual term and would be void.

Switching between Data Processing Services

The Data Act contains new rules that allow customers to switch effectively between different cloud and edge service providers. The aim is to remove the pre-commercial, commercial, technical, contractual and organisational barriers that hinder customers. For example, arrangements that prevent a customer from entering into new contracts with a different data processing services provider covering the same type of service fall within this scope.

The Data Act aims to tackle these issues by providing contractual requirements that should allow a customer to switch to another service provider or to migrate all exportable data and digital assets without undue delay – and in any case within 30 days after the expiry of a maximum two-month notice period – with full support and continuity of service during the transition.

In addition to its contractual measures, the Data Act sets out information obligations for providers of data processing services, as well as a framework for the gradual reduction of switching charges, and provides that, three years after the entry into force of the Data Act, switching to another service provider will be free of charge for the customer.

International Transfer of Non-Personal Data

Adopting a similar regime to the GDPR, the Data Act will extend the obligations on international data transfers under the GDPR and the Schrems II judgement of the Court of Justice of the European Union to providers of data processing services who must implement appropriate measures to prevent international transfers of industrial data or access by a third state that is incompatible with EU or national legislation.


Interoperability

The Data Act provides for the development of interoperability standards so that industrial data can be reused across sectors, and lays down basic requirements for the interoperability of industrial data, data sharing mechanisms and services, as well as basic requirements for smart contracts used for data sharing.

It also provides for open interoperability specifications, such as architectural models and technical standards, implementing rules and arrangements between parties to promote data sharing on issues such as access rights and the technical translation of consent or authorisation, as well as European standards for the interoperability of cloud service providers to promote a seamless multi-vendor cloud environment.

GRC LEGAL Commentary

As we mentioned in the fourteenth issue of our “What’s Happening in the World” series, the Data Act aims to bring about significant changes in data management in the European Union and to provide standardisation in this field. In this context, the Act strengthens the right of users to access their data by ensuring that data is made available in an easy, secure, comprehensive and structured manner and the right of users to share their data with third parties by ensuring that data can only be shared with the user’s consent.

In addition, the Data Act requires data recipients to regularly review their data processing activities and take the necessary measures; at this point, data recipients are expected to take steps to protect the security and confidentiality of data. Considering that the Data Act will be implemented gradually and that an adaptation process lies ahead, it is not possible to fully assess its effects at this stage, but it can be said that it will strengthen the control of data subjects over their data and help increase the transparency and accountability of data processing activities.

Vehicle Location Data Increases the Risk of Tracking for Domestic Abuse Victims!

According to The New York Times, privacy advocates are concerned that modern vehicles are becoming “smartphones on wheels”, putting victims of domestic violence at greater risk. This is due to the number of cameras, weight sensors and smartphone-connected devices that transmit vast amounts of personal data to unknown parties. For example, a woman in divorce proceedings was reportedly tracked by her husband using the “Mercedes Me” app, and Mercedes could not block his access to the app because the car loan and registration were in his name.

GRC LEGAL Commentary

According to the findings of Mozilla’s “Privacy Not Included” project in 2023, even the most basic privacy and security standards were not met in the new internet-connected models of major car brands, and all 25 brands examined by Mozilla failed the test.

In addition to data such as drivers’ facial expressions, weight, health information and where they drive, some of the vehicles tested collected data you would not expect a car to know, including details about sexual activity, race and immigration status.

Manufacturers are known to use a variety of data collection tools in modern cars, including microphones, cameras and the phones that drivers connect to their cars, and they also collect data through apps and websites. As the news above shows, these applications have even intruded into the divorce proceedings of a couple. It is therefore important that these brands handle their privacy processes with great sensitivity and transparency. According to Mozilla, however, many car brands resort to “privacy washing”, giving consumers the impression that they need not worry about privacy when the opposite is true. In this sense, it is quite possible to say that our privacy is violated even in our cars, where we feel at home.

Information Corner

‘Privacy washing’ may be the worst type of deceptive advertising practice. In its simplest form, it refers to a company’s efforts to create the impression that privacy is a high priority within the company without actually honouring its own privacy commitments. Companies that dress up their products and/or services as ‘privacy-friendly’ are nonetheless collecting, sharing and selling customers’ sensitive/private data.

Google Chrome x Data Tracking Cookies

Google has begun testing changes to the way companies can track users online. The new feature in the Chrome browser will disable third-party cookies, which are small files stored on your device to collect analytics data, personalise online ads and track browsing. Initially, the feature will be available to about 30 million people, 1 per cent of global users.

Google describes the changes as a test and plans a full rollout phasing out third-party cookies later this year. However, some advertisers say they will suffer as a result.

Cookies can be used to save a variety of data about users, including what you do on the site, your location, the device you use, and where you go online next. Rivals such as Safari and Mozilla Firefox, which have far less internet traffic than Chrome, the world’s most popular internet browser, already include options to block third-party cookies that collect this data.

Google says randomly selected users will be asked if they want to “browse with more privacy in the browser”. Google Vice President Anthony Chavez said: “We are taking a responsible approach to phasing out third-party cookies in Chrome. If a site doesn’t work without third-party cookies and Chrome notices you’re having trouble, we’ll give you the option to temporarily re-enable third-party cookies for that website.”

Google says it is working to make the internet more private, but for many websites, cookies are a vital part of selling the adverts they are linked to. Anyone who has visited a website or made a purchase and then seen related adverts appear on every site they subsequently visit will recognise that such advertising has a disturbing dimension.


Information Corner

Cookies, of which there are many types, are small text files that store certain information about users on their devices when a web page is visited. Advertising and marketing cookies track users’ online movements in order to show them advertisements targeted at their interests.
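As a minimal sketch of the mechanism, this is roughly what a third-party advertising cookie looks like at the HTTP level, built with Python’s standard http.cookies module. The domain and identifier are hypothetical examples, not taken from any real ad network:

```python
from http.cookies import SimpleCookie

# A hypothetical third-party advertising cookie, as a server would set it.
cookie = SimpleCookie()
cookie["ad_id"] = "u-84c1f2"                       # hypothetical tracking identifier
cookie["ad_id"]["domain"] = "ads.example.com"      # third-party domain, not the site visited
cookie["ad_id"]["path"] = "/"
cookie["ad_id"]["max-age"] = 60 * 60 * 24 * 365    # persists on the device for a year
cookie["ad_id"]["samesite"] = "None"               # sent on cross-site requests, enabling tracking
cookie["ad_id"]["secure"] = True                   # required by browsers when SameSite=None

# The attribute string a server would send in a Set-Cookie response header.
header = cookie["ad_id"].OutputString()
```

The long lifetime and the cross-site `SameSite=None` attribute are exactly what lets the same identifier follow a user across different websites, which is what Chrome’s change targets.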

Google’s new feature aims to disable advertising and marketing cookies. Since Google operates the world’s most widely used browser, platforms such as Meta and X, whose advertising and marketing activities rely on these cookies on a global scale, may be negatively affected; at the same time, this step will prevent users from being exposed to targeted advertising merely by visiting a website.

ICO x Businesses

According to analysis by a cybersecurity and data protection consultant, in 2023 the Information Commissioner’s Office (“ICO”) fined 18 businesses more than £14.3 million for misusing data. The ICO also reprimanded 36 companies, issued enforcement orders against 19 companies and prosecuted four businesses for failing to fulfil their information rights obligations.

Social media platform TikTok received the largest fine, £12.7 million, for breaches of data protection law, including failing to use children’s personal data in accordance with the law. The ICO estimated that up to 1.4 million under-13s in the UK were using the video-sharing app in 2020.

Three marketing firms were fined a total of £310,000 for making 483,051 unsolicited marketing calls to businesses and sending 107 million spam emails to jobseekers; two energy firms were fined a total of £250,000 for bombarding people and businesses on the UK’s ‘do not call’ register with illegal marketing calls; a business support consultancy was fined £558,354 for sending 558,354 direct marketing SMS messages without valid authorisation; and an appliance service and repair company was fined £200,000 for making more than 1.7 million unsolicited direct marketing calls.

According to research by CSS Assure, in the last six months of the year, 10 companies were fined more than £800,000 for sending a total of 4,698,841 unsolicited text messages, 39,906,342 emails and making 1,937,028 nuisance phone calls.

Charlotte Riley, Information Security Director at CSS Assure, said:

“The fines imposed by the ICO in 2023 underline the serious consequences of data misuse. The misuse of personal data not only violates data protection laws, but also undermines trust among consumers.

TikTok’s £12.7 million fine emphasises the importance of using personal data in accordance with the law and implementing appropriate safeguards, especially where children are concerned. TikTok is a large, recognisable brand and the fine was quite high due to the amount of data involved. However, much smaller SMEs were also sanctioned and fined.

Fines for unsolicited calls, text messages and spam emails, as well as for firms that ignored the ‘do not call’ register, demonstrate the significant impact of invasive marketing practices. These fines send a clear message that companies must respect individuals’ privacy preferences and avoid bombarding them with unwanted communications.”


GRC LEGAL Commentary

The ICO does not refrain from imposing administrative fines on companies in order to enforce compliance with data protection laws. The fine imposed on TikTok, a globally recognised brand, emphasises the need for stricter measures for vulnerable groups such as children, while the fines imposed on other companies aim to combat the unsolicited marketing calls, spam emails and other forms of unwanted communication to which all users are exposed today.

Considering the size of the fines, they can also be seen as a warning to small and medium-sized businesses to respect individuals’ privacy rights and comply with data protection standards.

Microsoft Allows Cloud Users to Keep Personal Data in Europe to Address Privacy Concerns

Microsoft announced in January that it was updating its cloud computing service to allow customers to store all their personal data within the European Union rather than transferring it to the United States, which has no comprehensive federal privacy law.

Cloud computing companies are moving towards localising data storage and processing due to increasing requirements in the European Union, which has strict data protection laws.

After former National Security Agency employee Edward Snowden revealed that the US government was eavesdropping on people’s online data and communications, Brussels and Washington have argued for years over the security of EU citizens’ data stored by technology companies in the US.

Microsoft said that its “EU Data Boundary solution goes beyond European harmonisation requirements”. The company had previously promised that its customers’ data would not be moved outside the EU. While Microsoft began storing and processing some data within Europe last year, it is now extending this to all personal data, including pseudonymised data found in automated system logs, which are automatically generated when online services are run. Later this year, Microsoft will begin to ensure that technical support data is kept in Europe.

Information Corner

Pseudonymisation is a data processing method, not defined in Turkish legislation, that is used to protect the confidentiality of personal data. In this method, the directly identifying elements of personal data are replaced with substitute values called pseudonyms; since the data is not anonymised, it remains subject to the Law on the Protection of Personal Data (“LPPD”).

For example, in data collected for a health survey, the names of individuals may be replaced with a specific coding system or other method instead of real values. This makes the data safer to analyse and share because the true values of the personal information are not revealed. However, researchers can still carry out their analyses and understand general trends.
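The health-survey example above can be sketched in a few lines of Python. This is a minimal illustration of keyed pseudonymisation, not a recommended production scheme; the names, key and field names are all hypothetical:

```python
import hashlib
import hmac

# Hypothetical secret key; in practice it would be stored separately
# from the data, since it is what allows re-identification.
SECRET_KEY = b"example-key"

def pseudonymise(name: str) -> str:
    """Replace a direct identifier with a deterministic pseudonym.

    The same name always maps to the same pseudonym, so records can
    still be linked for analysis, but the name itself is not revealed.
    """
    digest = hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"P-{digest[:8]}"

# Hypothetical survey records: the names are replaced, the health data is kept.
records = [
    {"name": "Ayse Yilmaz", "blood_pressure": "120/80"},
    {"name": "Mehmet Demir", "blood_pressure": "135/90"},
]
pseudonymised = [
    {"id": pseudonymise(r["name"]), "blood_pressure": r["blood_pressure"]}
    for r in records
]
```

Because whoever holds the key can regenerate the pseudonyms and re-link them to the original names, the result is pseudonymised rather than anonymised data, which is exactly why it remains within the scope of data protection law.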

GRC LEGAL Commentary

Companies are taking steps to store personal data within Europe, reflecting their efforts to comply with increasingly stringent data privacy laws in the European Union. These developments show that technology companies must comply with local regulations on the storage and processing of customer data, and they can be considered an important step in the context of personal data protection law. The adoption of the notion of personal data security by all companies in the coming period will facilitate the establishment of an infrastructure that requires less interference with personal rights.

CNIL x Yahoo

The French Data Protection Authority (Commission Nationale de l’Informatique et des Libertés, “CNIL”) fined Yahoo EMEA Limited (“Yahoo” or “the Company”) 10 million euros for failing to respect the choice of internet users to refuse cookies on the “Yahoo.com” website and for failing to allow users of the “Yahoo! Mail” messaging service to freely withdraw their consent to cookies.

What happened?

As is well known, Yahoo provides various internet services, such as a search engine and an e-mail service. Between October 2020 and June 2021, the CNIL received numerous complaints about the failure to take into account refusals of cookies and the obstacles encountered in withdrawing consent to their storage, and it carried out several online inspections of the Yahoo.com website and the Yahoo! Mail messaging service.

The violations underlying the penalty are as follows:

Cookies placed without the user’s consent

During its investigation, the CNIL found that the cookie panel displayed when an internet user visits the “Yahoo.com” site leads to a page containing a number of buttons designed to obtain consent for the storage of cookies, and that, even in the absence of any explicit consent, approximately twenty advertising cookies were placed on the user’s device.

Inability to freely withdraw consent

The CNIL noted that users of the “Yahoo! Mail” messaging service were informed that if they wished to withdraw their consent to cookies, they would no longer be able to access the Company’s services and would lose access to messaging services.

The CNIL stated that the absence or withdrawal of consent should not have a negative impact on the user, and that in this case the Company, which offered users wishing to withdraw their consent no alternative other than giving up the service altogether, undermined the freely given character of consent.

The CNIL further emphasised that an e-mail address is an element of the user’s private life, in that it allows the user to make purchases, develop their network and archive important personal or professional conversations. Since users have also built up an archive of e-mails over time, changing the e-mail address concerned or stopping use of the service are neither user-friendly nor easily implementable solutions.


GRC LEGAL Commentary

Cookie practices have recently been on the agenda of data protection authorities, and companies have faced high fines as a result of incorrect implementations. The Personal Data Protection Board recently imposed an administrative fine of 750,000 TL on the data controller that distributes an online game in Turkey due to its improper cookie practices. In this context, it is of the utmost importance that cookie panels are user-friendly, that cookies requiring explicit consent are processed in line with that consent not only visibly but also in the background, and that an opt-in mechanism is adopted.