Giving a new meaning to “user-friendly”: Recoupling of technological and social progress by promoting user (=human) rights, democracy and an efficient economy in the online world.
Daily Zoom conferences and remote access to work files, shopping online, video calls to friends and colleagues, online webinars instead of face-to-face teaching: the Covid-19 pandemic has considerably accelerated the use of digital solutions in both our private and our working lives. Yet for society at large to truly benefit from digitalization, the rules and regulation of this vast and complex conglomeration of industries need to change. While it is difficult, and indeed pointless, to argue against the benefits of digital products, the downsides and drawbacks of this fast-moving and largely unregulated process are increasingly in the spotlight. Digitalization creates space for fake news and cybercrime while eschewing accountability and transparency, eroding trust in democracy; and many in the tech industry find themselves unconstrained by what are standard requirements in the offline economy, such as fair competition rules, anti-monopoly regulation and appropriate taxation. On the contrary, the largest multinational corporations have been able to become true monopolies, creating inefficiencies and a dynamic system that poses great challenges for government regulation.
An uneven playing field
Digitalization encompasses a myriad of fields and sectors – big data, social media, AI, automation, robotics, platform economies – and a variety of players from industries well beyond what is traditionally considered the ‘tech industry’, all of which have the potential to disrupt society, culture, politics and the economy. They are designed to do so, to create better, more efficient, more connected spaces; but lacking democratic input and long-term planning, they often enough create severe problems. A principal source of these problems is that many digital services are provided for free, or at a significantly reduced price, in return for information about the users, which is sold to advertisers and other “influence sellers.” This “third-party-financed digital barter” – involving three-way transactions between digital service providers, data subjects and influence sellers – creates a system that is ultimately driven by the influence sellers for private gain. The objectives of the influence sellers are not well aligned with those of the data subjects, and the system generates great disparities of information and market power, further upsetting that alignment.
User consent for what is effectively ubiquitous commercial surveillance is inadequate, given the power and information asymmetries between individuals and dominant technology platforms. Data protection as currently defined and enforced is unable to secure user control of data, or freedom from commercial and even political and social manipulation. Users are not in a position to judge the value of their data, nor do they control who ends up using it; they have little to no control over their personal data or any other collective data they generate. The consequences of this data-gathering process are far-reaching and severe. Digital service providers use complex algorithms to steer our attention and our information intake in specific directions while also structuring social exchange, largely with the goal of increasing revenues for third-party influence sellers and marketing interests, and without our active awareness. This process undermines democratic, social, political and economic freedoms, to the point that fundamental human rights cannot be guaranteed in this kind of online space, endangering our society and democracy together with our economic systems. Lost tax revenue harms the welfare state and thereby social cohesion; the manipulation of information polarizes debate and endangers the democratic process, open debate and, again, social cohesion.
For all these reasons, a course correction, if not a revolutionary structural change, in digital data governance is necessary. The opportunities of the digital revolution can continue to benefit all while negative externalities are reined in or softened. Digitalization will only become truly effective and efficient for most when the inefficiencies and stark inequalities created by the current regulatory system are addressed.
A new classification to put users in control of their data
This is a job for high-level politics: policymakers are the only ones with the democratic power to change the rules so as to make the online world fairer, more efficient and more competitive. These changes cannot be made by individual firms, nor by local civil-society movements. One of the main assets of the digital world is precisely that it “knows no borders”; this also means that regulating the system cannot be undertaken by solitary actors but requires multilateral effort.
The success of digitalization reform depends on how we treat data, and it is important to distinguish between different types of data:
- Official data (O-Data): data that requires authentication by third parties for the purpose of conducting legally binding transactions and fulfilling other legal obligations in many jurisdictions. Authentication can come from the state or other legally accepted sources. Examples include a name, date of birth, professional qualifications, land registry deeds, or a passport number, to be authenticated by the legitimate parties.
- Collective data (C-Data): data that data subjects agree to share within a well-defined group for well-defined collective purposes.
- Private data (P-Data): data that requires no authentication by third parties and is not collective. It may be divided into “first-party data,” volunteered by the data subject (such as personal photographs) or generated by the data subject and observable by other parties (such as location data from mobile phones), and “second-party data,” generated by a second party about the data subject (such as a record of a transaction) or inferred about the data subject from existing data (such as psychological data deduced from web searches).
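To make the taxonomy concrete, the distinctions above can be sketched as a simple data model. This is purely an illustrative sketch in Python: the class and field names are our own invention for exposition, not part of any existing standard or legal text.

```python
from dataclasses import dataclass
from enum import Enum, auto

class DataCategory(Enum):
    O_DATA = auto()  # official: requires authentication by a third party
    C_DATA = auto()  # collective: shared within a defined group for defined purposes
    P_DATA = auto()  # private: no authentication required, not collective

@dataclass
class DataRecord:
    description: str
    requires_authentication: bool  # e.g. by the state or another legally accepted source
    shared_collectively: bool      # agreed sharing within a well-defined group

def classify(record: DataRecord) -> DataCategory:
    """Assign a record to one of the three categories described above."""
    if record.requires_authentication:
        return DataCategory.O_DATA   # e.g. passport number, land registry deed
    if record.shared_collectively:
        return DataCategory.C_DATA   # e.g. a data commons pooled by a community
    return DataCategory.P_DATA       # e.g. personal photographs, location traces

passport = DataRecord("passport number", requires_authentication=True, shared_collectively=False)
photos = DataRecord("personal photographs", requires_authentication=False, shared_collectively=False)
print(classify(passport).name)  # O_DATA
print(classify(photos).name)    # P_DATA
```

The point of the sketch is that the category a piece of data falls into, not its content, determines who may control and use it, which is what the proposals below build on.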
We therefore propose the following:
- Give individuals genuine control over O-Data and P-Data, with easy-to-use technical tools and supporting institutions;
- O-Data should be verifiable from the authenticating source (and is subject to audit);
- Second-party data may be used only in the interests of the people it is about, echoing offline relationships such as doctor–patient or lawyer–client;
- Create legal structures and the necessary institutional support (such as education programs) for the establishment of a flourishing range of “data commons” that allow people, instead of platforms, to manage and benefit from C-Data, both individually and collectively.
A key issue with the current situation is that the big digital service providers are both highly advanced in their knowledge and skills regarding data and highly non-transparent about how and why they use it. Most users do not know which data is collected about them or what happens with that data once it has been collected. But even users who make the effort to understand these things rarely have alternative options: many services do not function properly unless full data collection is enabled and rights under regulations such as the GDPR are waived by the user during setup; and opting out of these platforms or services is often not possible, given their monopoly position both at work and in private, social life. Current rules on data protection are insufficient.
Eliminating power asymmetries
A further step towards protecting personal autonomy involves enforcing shared responsibility. Digital service providers can be made responsible for whom they partner with to track or target users. Customer-facing websites and apps should be responsible for who receives access to their users’ data – whether that access comes through sale or through the placement of trackers and beacons on their sites. The regulation of data governance thus needs to be changed from the top, starting with EU data protection laws, which are at this point among the most advanced in the world, if still insufficient. We therefore propose:
- To provide and secure digital rights of association for individuals that counterbalance platform power;
- To provide effective legal protection for vulnerable digital users;
- To ensure (for example via the Digital Services Act) that competition in the online world is treated the same as that offline, with greater coherence between competition, consumer rights and data protection agencies.
Commercial and political manipulation of vulnerable users – through the non-transparent, non-accountable collection and use of their data without their knowledge – can only be stopped if we change the rules of the game dramatically, thereby securing basic human rights and weakening the attacks on democracy, such as fake news and unaccountable online hate speech, that we have witnessed in recent years. Pandemics are another very current example of how a new data governance regime would benefit the common good: only those who can trust that their data will be used for the intended purpose alone, and not be sold on, are willing to share data to track and trace the spread of a virus such as Covid-19.
Beyond distorting democratic and social norms, this is an issue concerning the fundamental rights and laws of our economic systems. Two necessary (but not sufficient) conditions for a market system to function in the interests of its participants are that (i) the participants have control over the goods they sell and gain control over the goods they buy, and (ii) the participants have the opportunity to engage in voluntary exchange, trading goods at prices they have agreed on. Neither is currently guaranteed in the digital world. Furthermore, economic decision-making rests on the perceptions, beliefs and preferences of market participants, and each of these determinants is in the hands of the digital service providers through the flow of digital information that they manage in the interests of third-party funders. Manipulation is possible only because a market actor, in this case a data broker, has intimate knowledge of what makes a target’s decision-making vulnerable. By encouraging third-party-financed digital barter, the current regime further undermines the workings of the free market system, together with the governments that rely on this system for tax revenues: the free market system works through price signals, which digital barter has eliminated.
Revisiting data governance for the recoupling of wellbeing
Only with expansive, paradigm-changing action can digitalization become a real success story and lead to societal as well as economic progress, rather than to exclusive gains for a few very large digital service providers. States and societies would profit very directly from new regulation and the redistribution of data access: profit shifting is enabled, among other things, by non-transparent third-party-financed digital barter, in which it is often unclear who gains what from the transaction; under a new regime, tax points would be easier to set. The new regime will not happen by itself; to succeed, it needs broad adoption, and for broad adoption in the EU it must be made an EU legal requirement. The new digital regime could play a central role in the creation of a European digital single market and is consistent with the GDPR. Progress on this front could put the EU at the vanguard of a new digital age, in which online and offline policy is harmonized and the growing problems of the current digital regime are overcome.