The Dark Side of Big Data

by Jeremy Erdman, July 6th, 2018

The growing trend we need to stop

The Myers Briggs Type Indicator. The Big 5 Personality Test. Harry Potter Sorting Hat Quizzes.

Chances are, you have taken or heard of one of these tests. Even more likely, you have taken a BuzzFeed personality quiz or one of the millions that populate our Facebook newsfeed.

These tests promise to condense our many thoughts and behaviors into a framework that helps us better understand ourselves and anticipate our future behavior.

More importantly, they make us feel relatable and understood.

In undergrad, a sizeable contingent of my school’s population loved the Myers Briggs. Hundreds of students took the test, and we all shared our results amongst each other.

It provided a framework to better understand why we interpreted and responded to situations differently.

For many, it helped us better understand ourselves.

Time has moved forward, and technology has progressed. Tools like big data and artificial intelligence have allowed us to better understand the world around us and each other.

With all the data out there detailing our interests, online interactions, and purchases, more than enough exists to build accurate personality and character models of us.

So, what happens when entities amass and analyze this information to not only understand us but also influence us?

Photo by William Iven on Unsplash

Cambridge Analytica

In March 2018, Cambridge Analytica became a household name, notorious for its Facebook-based data-scraping operation.

Cambridge Analytica used a Facebook app called "This Is Your Digital Life" to acquire the data and information of 270,000 users. The app had users take a personality quiz that gave them their assessment on the Big 5 personality test, which measures levels of:

  • Openness: People who like to learn new things and enjoy new experiences usually earn higher scores.
  • Conscientiousness: People who have a high degree of conscientiousness are reliable and prompt.
  • Extraversion: Extraverts get their energy from interacting with others, while introverts get their energy from within themselves.
  • Agreeableness: These individuals are friendly, cooperative, and compassionate. People with low agreeableness may be more distant.
  • Neuroticism: This dimension relates to one’s emotional stability and degree of negative emotions.

Cambridge Analytica took this personality assessment and compared it against the 270,000 users’ Facebook information and data, drawing correlations between certain likes/interests and personality characteristics.

Now, Cambridge Analytica was a political consulting firm, so they wanted a larger impact than just the 270,000 app users. Beyond collecting the data of those 270,000 users, their app also collected the data of all of their Facebook friends.

This gave them access to 81 million users. And using the correlations from the 270,000 users, Cambridge Analytica built personality profiles for these 81 million individuals.

They then used these profiles to target specific voters for political ads. For example, if the individual had interests that correlated with high neuroticism, they would be shown ads relating to terrorism or immigration that stoked that propensity for anxiety and fear.
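
To make that two-step mechanic concrete, here is a purely illustrative sketch in Python. The pages, scores, and simple least-squares model are invented for this example; it shows only the general shape of "fit a trait model on quiz takers, then apply it to people who never took the quiz," not Cambridge Analytica's actual data or methods.

```python
# Purely illustrative sketch: estimate a personality trait from page likes.
# The pages, scores, and least-squares model are invented for this example
# and do not reflect Cambridge Analytica's actual data or methods.
import numpy as np

# Toy "quiz takers": each row is a user, each column is a page (1 = liked it).
likes = np.array([
    [1, 0, 1],   # columns: a news page, a travel page, a security page
    [0, 1, 0],
    [1, 0, 0],
    [0, 1, 1],
], dtype=float)

# The neuroticism score each quiz taker received from the personality test (0-1).
neuroticism = np.array([0.9, 0.2, 0.7, 0.4])

# Step 1: learn how likes correlate with the trait (ordinary least squares).
weights, *_ = np.linalg.lstsq(likes, neuroticism, rcond=None)

# Step 2: apply those weights to a friend who never took the quiz,
# estimating the same trait from their likes alone.
friend_likes = np.array([1, 0, 1], dtype=float)
print(f"Estimated neuroticism for a non-quiz-taker: {friend_likes @ weights:.2f}")
```

The real operation had a vastly larger feature space and more sophisticated models, but this fit-then-extrapolate structure is the part that scales from thousands of quiz takers to tens of millions of profiles.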

Cambridge Analytica used people’s data, personality traits, and preferences to influence their voting behavior. What is the next step?

What happens when this data is used not only to influence behavior but also to punish those who don’t conform?

Photo by Diem Nhi Nguyen on Unsplash

China’s Social Credit System

In June of 2014, the State Council of China released the “Planning Outline for the Construction of a Social Credit System.” It details the development and rollout of a system to measure the trustworthiness of China’s 1.3 billion citizens.

What exactly is the Social Credit System (SCS)?

Similar to the FICO credit score in the US, China’s social credit score amasses information about past transactions and interactions. It then ranks individuals based on their trustworthiness. This social credit score includes financial information, like the FICO score, but moves far beyond these types of transactions.

To build a better sense of someone’s overall moral character, one trial system, Sesame Credit, rates people on five criteria (a toy scoring sketch follows the list):

  1. Credit History — Does someone pay their bills on time?
  2. Fulfillment Capacity — Is a person able to fulfill their contract obligations?
  3. Personal Characteristics — Is a person’s information (like phone number or address) verifiable?
  4. Behavior and Preferences — Is a person’s behavior (like shopping preferences and tendencies) desirable?
  5. Interpersonal Relationships — Does a person surround themselves with good people and interact online in an appropriate way?
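
The exact weights, thresholds, and formula behind such a score are not public, so the sketch below is pure assumption. It only illustrates the general shape of the system the outline describes: rate several categories of behavior, combine them into one number, and gate privileges on that number.

```python
# Illustrative only: the weights and the blacklist threshold are assumptions
# made for this sketch; the 350-950 range mirrors what Sesame Credit is
# reported to use, but the actual formula is not public.
CRITERIA_WEIGHTS = {
    "credit_history": 0.35,
    "fulfillment_capacity": 0.25,
    "personal_characteristics": 0.15,
    "behavior_and_preferences": 0.15,
    "interpersonal_relationships": 0.10,
}

def social_credit_score(ratings):
    """Combine per-criterion ratings (each 0-1) into one score on a 350-950 scale."""
    weighted = sum(CRITERIA_WEIGHTS[name] * ratings[name] for name in CRITERIA_WEIGHTS)
    return 350 + weighted * 600

def is_blacklisted(score, threshold=600):
    """Hypothetical gate: below the threshold, travel and other privileges are denied."""
    return score < threshold

dutiful = {"credit_history": 0.9, "fulfillment_capacity": 0.9,
           "personal_characteristics": 1.0, "behavior_and_preferences": 0.8,
           "interpersonal_relationships": 0.8}
flagged = {"credit_history": 0.3, "fulfillment_capacity": 0.3,
           "personal_characteristics": 0.8, "behavior_and_preferences": 0.1,  # "wrong" purchases
           "interpersonal_relationships": 0.1}                                # low-scoring friends

for label, ratings in [("dutiful", dutiful), ("flagged", flagged)]:
    score = social_credit_score(ratings)
    print(f"{label}: score={score:.0f}, banned from trains/planes={is_blacklisted(score)}")
```

The point of the sketch is the coupling: once shopping habits and friendships feed a single number, and that number gates trains, schools, and jobs, any opacity in the weights becomes opacity in people’s lives.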

Building these profiles and the resulting scores requires an enormous amount of data: your purchases, personal information, online interactions, and connections. And like Cambridge Analytica, China’s government plans to do more than simply build and track people’s profiles.

The SCS functions to encourage trustworthiness through reward and punishment. For citizens whose scores fall too low, punishments include bans from certain schools, jobs, hotels, travel, and dating apps. Already, 6.15 million citizens have been banned from taking trains and planes due to low credit scores.

Photo by Hanny Naibaho on Unsplash

These types of blacklists pose a problem beyond serving as simply a deterrent. If the punishment system lacks transparency, people may not understand why they are excluded or how to improve their score.

Many of the listed punishments, like bans from travel, jobs, schooling, and dating, prevent individuals from bettering themselves. In effect, a low credit score can block a citizen from economic advancement.

And this block seems intentional, based upon Xi Jinping’s description of the system’s foundation using the mantra:

“Once untrustworthy, always restricted.”

Without clear guidelines on how to better one’s score, these types of systems can create a sub-class of citizens all connected by the government’s disdain for their behavior.

What do we do?

We, as people, have an innate desire to better understand the world around us. As I learned in undergrad, many of us find value in better understanding ourselves and our peers.

In the age of big data, this innate desire to better understand each other, married with the desire for power, can result in targeted manipulation and control.

So, how do we combat this growing trend?

Rather than controlling each other, how can we use big data to liberate each other?

How can we use technology to empower rather than oppress?

I do not know the answer, but I know I am not alone in wanting to find it.

Photo by Avi Richards on Unsplash