The Enigma Of Privacy: What Will You Sacrifice To Protect Your Identity?

In The Open Society and Its Enemies, Sir Karl Popper cast Plato as public enemy number one. In The Republic, Plato proposed an ideal system of government that would repel any lover of freedom and progress. Popper warned that Plato’s utopia “has done everything possible, to eradicate from our lives, everywhere and in every way, all that is private and individual … Our eyes, ears and hands seem to see, feel and act as if they belonged not to individuals, but to the community.”

Machine learning (ML), artificial intelligence (AI), data analytics and sensor networks could realize Plato’s vision and Popper’s fear. New technologies generate new concerns, but human beings tend to trade risk for benefit: pleasure, convenience or productivity. How do we face what’s coming?

Europe Opens the Way, But to Where?

The European Union’s General Data Protection Regulation (GDPR), which came into force on 25 May, is the latest and most thorough attempt to codify individual data rights. Protecting a person’s data privacy is a worthy goal. As The Economist recently argued, the regulation has some positive effects, such as the creation of a global “privacy infrastructure.” But at what cost?

The GDPR is a well-intentioned, necessary, but ambitious attempt at comprehensive regulation. Its interpretation and application will change significantly in the coming years as it collides with other regulatory regimes, technological change, consumer behavior and competition.

For example, GDPR applies to EU citizens anywhere in the world. Companies often have no way of knowing whether a particular user in Sacramento or Sydney is a European citizen, so they will rely on people’s willingness to self-identify. As an unintended consequence, clicking “European citizen” on a site could effectively become the question: “Do you prefer service or privacy?”

Consider the GDPR’s “right to be forgotten.” On request, a company must remove an individual’s data from its systems unless the data is necessary to perform contracted services or to comply with other regulations. That is easy to say. How do you resolve the conflict when the new requirement collides with financial-services rules such as anti-money-laundering record retention?
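To make the conflict concrete, here is a minimal sketch of a hypothetical erasure-request handler that checks retention obligations before deleting anything. The rule names, retention period and record fields are invented for illustration; real anti-money-laundering retention periods vary by jurisdiction, and no real compliance system is this simple.

```python
from datetime import date, timedelta

# Hypothetical retention rules: legal basis -> minimum retention period.
# (Illustrative only; real AML retention rules vary by jurisdiction.)
RETENTION_RULES = {
    "AML": timedelta(days=5 * 365),  # e.g., keep transaction records ~5 years
}

def handle_erasure_request(records, today):
    """Partition a user's records into erasable vs. must-retain."""
    erase, retain = [], []
    for rec in records:
        basis = rec.get("retention_basis")  # e.g. "AML" or None
        if basis in RETENTION_RULES:
            expires = rec["created"] + RETENTION_RULES[basis]
            if expires > today:
                # The law that required keeping this record still applies,
                # so the erasure request cannot override it yet.
                retain.append((rec, basis, expires))
                continue
        erase.append(rec)
    return erase, retain

records = [
    {"id": 1, "created": date(2017, 1, 10), "retention_basis": "AML"},
    {"id": 2, "created": date(2012, 3, 5), "retention_basis": "AML"},
    {"id": 3, "created": date(2018, 4, 1), "retention_basis": None},
]
erase, retain = handle_erasure_request(records, today=date(2018, 6, 1))
# Record 1 is still inside its AML window and must be kept;
# records 2 and 3 can be erased.
```

Even this toy version shows the real difficulty: the company must track, per record, *why* it holds the data, and the “right to be forgotten” becomes a queue of future deletions rather than an immediate one.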

How Did You Do That?

Article 22 of the GDPR requires “meaningful information about the logic involved” in automated decisions made with a person’s data. ML uses data to improve performance on particular tasks. Here ML operators are unlucky: they often cannot say how their systems generate specific recommendations.

Enter explainable AI (XAI), in which an AI system’s learning and decision processes are intelligible to human experts. XAI systems could provide the required clarity, and they are essential in applications such as security, accountability, research and defense. The United States Defense Advanced Research Projects Agency (DARPA) has a major program to develop XAI capabilities. Still, how do regulators expect companies to act when people ask to opt out after ML systems have already used their data as part of a broader training set? Will regulators bar companies from using those results? What tracking of data usage will be required to comply?
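As a minimal sketch of what “meaningful information about the logic involved” might look like, consider a fully transparent linear scorer whose decision can be decomposed into named feature contributions. The feature names, weights and threshold below are invented for illustration; real XAI research targets far more opaque models than this.

```python
# A hypothetical, fully transparent scorer: a linear model whose decision
# can be explained as a sum of named feature contributions.
WEIGHTS = {"income": 0.4, "debt": -0.6, "years_employed": 0.2}  # illustrative
THRESHOLD = 1.0

def score_and_explain(applicant):
    """Return a decision plus a ranked explanation of why it was made."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    total = sum(contributions.values())
    decision = "approve" if total >= THRESHOLD else "decline"
    # Rank features by how strongly they pushed the decision either way.
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return decision, total, ranked

decision, total, ranked = score_and_explain(
    {"income": 5.0, "debt": 2.0, "years_employed": 3.0}
)
# income contributes +2.0, debt -1.2, years_employed +0.6;
# total 1.4 clears the threshold, so the decision is "approve".
```

A linear model makes this trivial; the open problem XAI tackles is producing an equally faithful, human-readable account for deep networks, where no such per-feature decomposition falls out of the model for free.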

GDPR includes another menacing component, similar to American-style private litigation: individuals and organizations can bring claims regardless of whether regulators act. The Austrian privacy lawyer Max Schrems sued Google and Facebook on the first day of enforcement, requesting fines of almost $9 billion. It is unclear whether sanctions will ultimately be assessed, but the fact that almost anyone can file complaints under GDPR exposes companies to potentially unlimited risk. Such actions could become factors in international trade policy and negotiations beyond the direct control of regulators and diplomats. More seriously for Europe, the cost and complexity of GDPR compliance disadvantages newcomers against entrenched players, few (or none) of whom are European. Just as railways and steel production conferred economic and therefore geopolitical power in the nineteenth century, the accumulation of data and applications does so today.

Our Privacy Enigma

The GDPR and its complications are part of the wider story of privacy in a connected era. Will individual desires and competitive realities overwhelm attempts to protect, or over-protect, personal data? To what degree can we attain service and security without violating personal rights? Call this challenge the enigma of privacy.

Consider a thought experiment. (Emphasis: a thought experiment!) Imagine two companies with diametrically opposed data-privacy policies. One enforces sacrosanct protection of all personal data, under each individual’s control; call it Prudent. The other grants complete access to all personal data to any person or organization; call it Promiscuous. Setting aside ethics and preferences, which company would be more likely to gain an economic advantage?

In China, the government retains expansive and invasive data rights. China’s policies toward its troubled Xinjiang province illustrate both the potential and the danger of unchecked government control: security through Orwellian surveillance. Meanwhile, Europe’s pursuit of individual freedoms carries considerable economic cost, and America’s corporate bias can limit individual voice and choice. Human beings’ desire to have what they want, where and when they want it, and for free, generates competitive advantage for the companies that can deliver it.

The Open Society in the 21st Century

At the birth of the locomotive, some people feared that humans would suffocate if they traveled faster than 20 miles per hour. A ridiculous notion in retrospect, although speed eventually contributed to more than a million road deaths each year worldwide. What might be the 21st century’s data-driven analogue of traffic deaths?

The more prudent we are with our data, the more service, security and other benefits we sacrifice. The more promiscuous we are with our data, the greater our exposure to cyber-attack, exploitation and dependency.

In short, we have far less control than we imagine. Not only do countless data trails exist on each of us; quantum computing may eventually undermine the foundations of computer security. What could be exposed, and what unexpected opportunities might arise?

In The Open Society, Popper warned: “The responsibility for our ethical decisions is entirely ours and cannot be shifted to anybody else; neither to God, nor to nature, nor to society, nor to history.”

What do you give up and for what?
