AI and Big Data: What about Privacy Protection?

Posted by Peter Rudin on 21. April 2017 in Essay


Picture Credit: GDD Gesellschaft für Datenschutz und Datensicherheit e.V.

Introduction

Imagine a world where an authoritarian government monitors everything you do, collects huge amounts of data on almost every interaction you make, and provides you with a score that measures how “trustworthy” you are.

In this world, anything from defaulting on a loan to criticizing the ruling party, from running a red light to failing to care for your parents properly could cause you to lose points. And in this world, your score becomes the ultimate truth of who you are – determining whether you can borrow money, get your children into the best schools or travel abroad; whether you get a room in a fancy hotel, a seat in a top restaurant – or even just get a date.

This could happen in China by 2020. It is the scenario contained in China’s ambitious plans to develop a far-reaching social credit system, a plan that the Communist Party hopes will build a culture of “sincerity” and a “harmonious socialist society” where “keeping trust is glorious.”

The government hasn’t announced exactly how the plan will work – for example, how scores will be compiled and different qualities weighted against one another. But the idea is that good behavior will be rewarded and bad behavior punished, with the Communist Party acting as the ultimate judge. This is what China calls “Internet Plus,” but critics call it a 21st-century police state.

‘Reward-and-punishment’ schemes are part of Western culture as well. Violations of driving and traffic rules are digitally recorded in state-owned registries, and repeated reckless driving may result in the loss of one’s driver’s license. Criminal convictions, adverse court rulings, payment defaults and bankruptcies damage one’s reputation or credit rating, causing problems when applying for a new job or renting a home. In fact, there are attempts under way, politically as well as economically motivated, to establish central registers that efficiently identify potential ‘problem individuals’. Last but not least, calls for such action also stem from the widespread fear of terrorism.

The majority of our population consists of law-abiding citizens who are proud of having a clean record. A central register that identifies ‘problem individuals’ might therefore be considered desirable by many. The question looms, however: how does one get erased from such a register after being entered by mistake, or after having learned one’s lesson and no longer deserving the label ‘problem individual’?

In contrast to the concept of a totalitarian society, the individual – his or her personal welfare, rights and responsibilities within society – represents what we consider democratic values. These values, however, risk being undermined by companies monopolizing internet usage for monetary gain.

From Informing to Persuading

Internet giants like Google or Facebook generate the majority of their income through personalized advertising. As we use their free services, we leave a digital footprint that is used to profile us. With the help of machine learning, this data can be turned into psycho-demographic profiles of users. A Google search leaves a trace of one’s personal or professional interests; a credit card transaction reveals our shopping behavior; booking a vacation hints at our income; GPS tracking reveals our movements and mobility; subscribing to a partner service reveals our life situation; and so on. All of this information can be used to profile our personality.

While officially, as stated in the terms and conditions, the identity of the user is protected, it can in practice be inferred quite easily. Today, algorithms know pretty well what we do, what we think and how we feel – possibly even better than our friends and family. Often the recommendations we are offered fit so well that the resulting decisions feel as if they were our own. The more that is known about us, the less likely it is that our choices are truly free rather than predetermined by others.
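As a minimal sketch of the profiling idea described above, the snippet below aggregates a stream of logged actions into a crude interest profile. The action strings, category mapping and weights are all invented for illustration; real profiling systems are far more sophisticated.

```python
from collections import Counter

# Hypothetical mapping from observed actions to interest categories.
# Both the actions and the categories are invented for illustration.
ACTION_CATEGORIES = {
    "search:hiking gear": "outdoors",
    "purchase:flight ticket": "travel",
    "gps:gym": "fitness",
    "subscribe:dating service": "relationships",
    "search:mortgage rates": "finance",
}

def profile(footprint):
    """Aggregate a stream of logged actions into relative interest scores."""
    counts = Counter(ACTION_CATEGORIES[a] for a in footprint if a in ACTION_CATEGORIES)
    total = sum(counts.values())
    # Normalize raw counts into a share-of-attention score per category.
    return {cat: n / total for cat, n in counts.items()}

trace = ["search:hiking gear", "gps:gym", "gps:gym", "purchase:flight ticket"]
print(profile(trace))  # {'outdoors': 0.25, 'fitness': 0.5, 'travel': 0.25}
```

Even this toy version shows the mechanism: individually harmless traces, once aggregated, sketch a personality profile the user never explicitly disclosed.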

Some software platforms are moving towards “persuasive computing.” Using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action. Perhaps even more significant is the fact that manipulative methods change the way we make decisions. Studies by behavioral economists, for example, document that the way a questionnaire is formulated influences whether the same question receives a ‘yes’ or a ‘no’.

The right of individual self-development can only be exercised by those who have control over their lives, which implies informational self-determination. A democracy cannot work well unless those rights are respected. Constraining these rights undermines our constitution, our society and the state.

Data protection efforts

Contrary to an authoritarian government, democracy offers the opportunity to institutionalize and legalize services that maintain our constitutional rights. There are several governmental efforts under way to implement or improve existing data protection laws.

Responding to this requirement, the European Commission has established a new set of data protection rules which entered into force in May 2016 and applies directly in all EU Member States from May 2018. The objective of these rules is to give citizens back control over their personal data and to simplify the regulatory environment for business. The data protection reform, referred to as the GDPR (General Data Protection Regulation), is a key enabler of the Digital Single Market which the Commission has prioritized.

Under the previous directive, responsibility for any data “by which an individual can be identified” rested solely with the data controller, i.e. the party that determines how and why the data is processed. Under the new regulation, becoming effective next year, any company or individual that processes this data is also responsible for its protection, including third parties such as cloud providers.

With GDPR in place, personal data can only be gathered legally under strict conditions, for a legitimate purpose. Furthermore, persons or organizations which collect and manage your personal information must protect it from misuse and must respect certain rights of the data owners which are guaranteed by EU law.
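One protection measure the GDPR explicitly encourages is pseudonymization: replacing direct identifiers with tokens so that records can still be processed and linked, while re-identification requires additional information held separately. A minimal sketch using a keyed hash (HMAC-SHA256) could look like this; the key value shown is purely illustrative, not a statement about real key management.

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The same identifier always maps to the same token, so records stay
    linkable for processing, but reversing the mapping requires the key,
    which the controller must store separately and protect.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

key = b"kept-separately-by-the-controller"  # illustrative only
token = pseudonymize("alice@example.com", key)
print(token[:16])  # a stable token instead of the e-mail address
```

Note that pseudonymized data still counts as personal data under the GDPR, since re-identification remains possible with the key; it reduces risk rather than removing the data from the regulation’s scope.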

The consequences of GDPR are far reaching, for example:

  • It regulates the transfer of data across country borders.
  • It requires organizations to include data protection considerations in the core of their business when developing new solutions and services.
  • It imposes fines for non-compliance – up to €20 million or 4% of global annual turnover, whichever is higher.

The coming years will show how effective the GDPR will be in protecting private data. Transaction-based service companies such as on-line shops will have to operate under stricter privacy rules. Profiling individuals on the basis of their entire internet footprint, however, is likely to continue. We will keep using free services by periodically signing lengthy on-line ‘consent agreements’ that most people don’t read anyway.

The case for a Trusted-Data-Service-Provider

Managing the rising complexity of our internet footprint opens a business proposition for a Trusted-Data-Service-Provider offering individuals a range of services such as:

  • Continuously and securely storing selected data streams, incl. health records and financial data
  • Providing machine-learning tools to analyze this data (a personal assistant)
  • Managing access to this data for third parties such as family members, doctors and accountants
  • Anonymizing outgoing internet traffic
  • Intelligently filtering incoming internet traffic
  • Monitoring personal data collected by others
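The access-management service in the list above can be sketched as a toy store with per-category, time-limited grants for third parties. The class, the categories and the API are entirely hypothetical – no such provider or standard exists; the sketch only illustrates the idea that the individual, not the platform, controls who may read what, and for how long.

```python
from datetime import date

class TrustedDataStore:
    """Toy sketch of per-category, time-limited access grants (hypothetical API)."""

    def __init__(self):
        self._data = {}    # category -> record
        self._grants = {}  # (party, category) -> expiry date

    def store(self, category, record):
        """Save a record under a data category (e.g. 'health', 'finance')."""
        self._data[category] = record

    def grant(self, party, category, until):
        """Allow a named third party to read one category until a given date."""
        self._grants[(party, category)] = until

    def read(self, party, category, today):
        """Return the record only if the party holds an unexpired grant."""
        expiry = self._grants.get((party, category))
        if expiry is None or today > expiry:
            raise PermissionError(f"{party} has no valid grant for {category}")
        return self._data[category]

store = TrustedDataStore()
store.store("health", {"blood_pressure": "120/80"})
store.grant("dr_smith", "health", until=date(2018, 12, 31))
print(store.read("dr_smith", "health", today=date(2018, 6, 1)))
```

A real provider would add encryption, audit logs and revocation, but the design choice is the same: access is granted per recipient and per purpose, and it expires by default.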

With these services in place, a customer has the option of passing internet traffic through a trusted personal assistant and of securely storing his or her data for future analysis.

There is a strong analogy to the trust functions of financial institutions; in fact, trusted services as outlined above might represent a new business opportunity for existing banks, provided their reputation for trustworthiness is intact. As data is the currency of the 4th industrial revolution, a bank with an international reach might have the image and the policies in place to act as a Trusted-Data-Service-Provider. Because banks have a reputation for complying with government regulations, the public could credibly perceive an extension of their financial services with a Trusted-Data-Service. If I trust a bank to take care of my financial assets, I might as well trust it to manage my internet identity and privacy.
