REMEDIES TO THE PLIGHT OF PRIVACY AND EXCESSIVE DATA COLLECTION IN DIGITAL MARKETS
Akshat Kothari and Samridhi Shrimali
Data is critical not only for gaining competitive advantage but also for surviving in the modern environment and for gaining geopolitical influence. A market player can achieve near-monopoly status by amassing excessive data on its users, creating a “winner-takes-almost-all” effect. In digital markets, where data is collected both directly and indirectly, users are often not cognizant of their data being sold and shared for different purposes. This raises the question of the degree to which privacy and data security issues can be factored into competition law. Through this article, the authors suggest remedies that can counteract the challenge of privacy harms and abuse of dominance via excessive data collection.
Notably, consumers resort to online platforms like Facebook and Google for the personalized and fine-tuned services these platforms administer. Pertinently, such specialized services are provided with the help of vast amounts of data gathered by third-party trackers.[i] This third-party tracking raises numerous questions about antitrust, privacy and democracy. In contemporary times, however, the antitrust dimension of privacy breaches has become imperative to address.
COLLECTION OF DATA vis-à-vis ABUSE OF DOMINANCE
Constantly growing digital platforms, such as Google and Facebook, have started to exercise their substantial market share in a manner that infringes fundamental privacy obligations. For example, these platforms create an illusion of choice: in many cases, the services disguise the fact that users have very few actual options and that extensive data sharing is implicitly accepted simply by using the service. Users may be persuaded to share more information if they feel in control. These companies use a variety of user interface design strategies to encourage users to click and select certain alternatives. Facebook, for instance, uses dark patterns to nudge its users to subscribe to newsletters, add items to their carts, or sign up for services. Nudging itself is not the problem, but excessive nudging through dark patterns or privacy-intrusive default settings hinders an individual’s right to privacy. Dark patterns are manipulative or deceptive practices that have the effect, intentionally or unintentionally, of obscuring, subverting, or impairing consumer autonomy, decision-making, or choice. They are frequently designed with great care to influence user decision-making or trick users into performing actions they did not intend to perform.
Excessive data collection, unfair trade practices, personalised pricing, and behavioural manipulation are examples of exploitative abuses addressed through ex-post enforcement, which occur due to the vague and inconsistent privacy policies that online platforms (data collectors) provide to users (data holders). Undertakings that possess an excessive amount of data pose a challenge to undertakings that lack access to such indispensable data, creating unfair trading conditions. Thus, for excessive data collection to be treated as analogous to the unfair trading practices highlighted under Article 102(a) TFEU, a significant adaptation is needed with respect to data-driven economies.
When an undertaking possesses a large volume of data, it tends to extract still more data from users. In this way, businesses can profile individual consumers and manipulate their choices and preferences based on users’ personality traits. For example, it was once alleged that Uber knows the battery level of its users’ mobile phones and can charge them higher prices by exploiting their urgency. Further, from an antitrust perspective, this invasion of privacy degrades the quality of the platform’s services; degradation of privacy is treated as a quality-assessment parameter. For example, when Facebook was investigated by the German competition watchdog, the Bundeskartellamt, it was alleged that Facebook collected a vast amount of consumer data and amalgamated it with the particular user’s Facebook data, thereby infringing the user’s privacy. Hence, invasion of privacy results in degradation of quality. Theories of harm such as excessive pricing and unfair trading conditions thus raise both competition law and data privacy concerns. In the absence of any uniform legislation governing data privacy and competition law issues, the authors suggest the following remedies to address the challenges posed by online platforms.
Competition and privacy-friendly domain
Dominant undertakings have a significant advantage and can provide services that other competitors cannot because of their ability to collect massive quantities of data. Therefore, to minimize these anti-competitive effects, competition authorities around the globe may want to consider the approach followed by the French Competition Authority in the GDF Suez case, in which the dominant undertaking was directed to give all other competitors access to its consumer data, provided that the data was indispensable for the business in question and that withholding it would otherwise amount to exclusionary conduct by the dominant undertaking. Additionally, this arrangement could be supported by a mutual agreement to share essential data with the other undertakings in the market, empowering them to compete. However, even though such an arrangement would address the anti-competitive conduct, it would create an appalling situation for user privacy: solving dominance by spreading accumulated data to more and more companies could create havoc for consumer privacy. Therefore, as a middle ground, dominant undertakings could instead share pseudonymized or anonymised personal data. The most viable solution is pseudonymization, wherein the data is processed in such a way that it can no longer be attributed to a specific individual without the use of additional information, which is kept separately and subject to technical and organizational safeguards to ensure non-attribution to an identified or identifiable person. For example:
- Encryption with a secret key: the plain text is converted into unreadable code, and the decryption key is kept secret.
- Deterministic encryption with key deletion: a random number is chosen as a pseudonym for each attribute in a database, and the correspondence table is then erased.
- Tokenisation: card and ID numbers are replaced with values that are less useful to an attacker.
Pseudonymization could be useful for data-driven businesses processing personal data, although it can be generalized to all businesses only if implemented, in technical terms, in accordance with the corresponding laws and regulations.
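As a purely illustrative sketch of the keyed approach described above, personal identifiers in a record can be replaced with keyed-hash tokens while the key is stored separately; the field names and record below are hypothetical, and a production system would of course follow the applicable regulatory guidance:

```python
import hashlib
import hmac
import secrets

def pseudonymize(record, key, identifying_fields):
    """Return a copy of the record in which each identifying field is
    replaced by a truncated HMAC-SHA256 token. Without the secret key,
    the tokens cannot be linked back to the original values."""
    out = dict(record)
    for field in identifying_fields:
        digest = hmac.new(key, str(record[field]).encode(), hashlib.sha256)
        out[field] = digest.hexdigest()[:16]  # token stands in for the raw value
    return out

# The key is kept separate from the shared dataset, under its own safeguards.
key = secrets.token_bytes(32)
record = {"name": "Jane Doe", "email": "jane@example.com", "purchases": 7}
shared = pseudonymize(record, key, ["name", "email"])
```

Because the mapping is deterministic under a fixed key, records belonging to the same person remain linkable for analytics by the data recipient, while deleting the key afterwards would correspond to the "key deletion" variant listed above.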
‘Opt-in’ and ‘opt-out’ options in the structural framework
Under the provisions of the General Data Protection Regulation, a user’s free consent to data collection is essential: consent ought to be given freely, voluntarily and without coercion. Further, “opt-in” and “opt-out” options should be made available in the settings of individual accounts on these platforms, whereby trackers are disabled by default so that the undertakings cannot use the data automatically; this default corresponds to the “opt-out” option. The “opt-in” option, by contrast, would be enabled only in return for compensation paid to consumers by the undertakings that wish to harness the data, provided there are regulations and provisions to check any misuse of this positive payment by dominant undertakings with intense network effects. Such regulation becomes essential because there is a high possibility that these undertakings could manipulate prices, given their bargaining power vis-à-vis individual users, and impose a monopsony price (in terms of data harvesting).
Additionally, the price of the data could vary with the individual whose data is being collected. In these cases, the user should be compensated to the extent that she or he sells personal data: for instance, the user should be paid in accordance with the utilization of the data for a specific purpose, such as political campaigns or sale to third parties. This positive payment by the undertaking in return for data may lead to the emergence of a licensing market in which users opt in to share their data with undertakings that wish to harvest it, as it also allows users to transfer their data to the undertaking offering the highest return. This arrangement likewise secures better terms for users who place a higher value on their privacy. In contemporary times, this remedy will work only if national competition authorities enable users to bargain collectively with the platforms over payment rates, leaving no room for misuse by dominant undertakings on account of their market power. The value and the price of the collected data may also be settled through collective bargaining between privacy-conscious users and digital platforms, ultimately through the formation of collective bargaining societies by various user groups that can negotiate with the digital platforms, providing an effective solution. These bodies could exist in a unified form or be divided into several parts according to different preferences for data protection, assuming that preferences for privacy protection are heterogeneous.
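The default-off “opt-out” with compensated “opt-in” described above can be modelled, purely as an illustration, in a few lines of code; the class, tracker names, and rates below are hypothetical and not drawn from any regulation:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentSettings:
    """Sketch of an account's tracker settings: every tracker starts
    disabled (the 'opt-out' default), and enabling one requires an
    explicit opt-in tied to a positive compensation rate."""
    opted_in: dict = field(default_factory=dict)  # tracker name -> payment per use

    def opt_in(self, tracker: str, rate: float) -> None:
        if rate <= 0:
            raise ValueError("opt-in must be compensated with a positive payment")
        self.opted_in[tracker] = rate

    def opt_out(self, tracker: str) -> None:
        self.opted_in.pop(tracker, None)  # revert to the default: no collection

    def may_collect(self, tracker: str) -> bool:
        return tracker in self.opted_in

settings = ConsentSettings()
default_blocked = settings.may_collect("ad_tracker")  # disabled until the user opts in
settings.opt_in("ad_tracker", rate=0.05)
```

In this sketch the rate is fixed per tracker, whereas under the collective bargaining arrangement described above it would be negotiated between user groups and the platform.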
Coherent application of laws
In competition law, unfairness can be analysed where other competitors suffer from inaccessible data, causing ‘unfair’ trading conditions as well as consumer harm in terms of market ‘unfairness’. In data protection law, users face ‘unfairness’ when consent and ‘fairness’ are lacking. Lastly, the ‘unfair’ use of data can have repercussions under consumer protection law. Therefore, what is needed is a coherent application of all these areas of law based on a mutually inclusive concept of fairness, rather than separate enforcement regimes stuck in a vicious cycle of procedures and sanctions. A flexible implementation of this principle could yield better outcomes and more feasible procedures vis-à-vis privacy protection.
The importance of the competition law-data protection nexus has increased dramatically during the last decade. Data protection concerns about the acquisition and processing of personal data by prominent digital firms have grown in response to the increasing role of digital services in customers’ daily activities. Accordingly, the authors have proposed policy-based approaches to address three fundamental discrepancies: first, abuse of dominance in digital markets; second, a skewed structural framework; and third, inconsistency in the application of the law. Increasing economic concentration and modern technological breakthroughs have prompted many to demand modifications to current laws to keep up. Hence, it is high time for competition authorities around the world to develop competition policies that can prevent abuse of dominance in digital marketplaces while also striking a balance between individual privacy and enhanced services for customers.
The authors are third-year students at the Institute of Law, Nirma University.
[i] Viktoria H.S.E. Robertson, ‘Excessive data collection: Privacy considerations and abuse of dominance in the era of big data’ (2020) 57 Common Market Law Review 161-190.