Data Privacy: Is your past being exploited to determine the future?
pri·va·cy /ˈprīvəsē/
noun
- the state or condition of being free from being observed or disturbed by other people.
“she returned to the privacy of her own home”
We all care about privacy in our personal lives. No one wants the neighbor peeping into our windows or going through our mail. We don’t want people watching us eat dinner, or our moments of Netflix & chill with our partners. We don’t want Alexa or Google eavesdropping on our conversations. Why?
Privacy is important. Without it, we’re vulnerable. Naked. When we lose privacy, we lose rights: the right to our own bodies, the right to our own thinking, our freedom of will. Privacy gives us autonomy.
Privacy is intuitive when it comes to our physical surroundings, but what about our online lives?
When our physical privacy is broken, it’s violating. We feel taken advantage of and vulnerable. While the invasion of physical privacy comes in several forms, the invasion of our data privacy is most akin to the invasion of a stalker. A stalker follows you everywhere, peering at everything you do. They learn your schedule; maybe they watch you eat on your lunch break. They know you leave for work at 0700 and that you’re home by 1800. They follow you until following you shows your patterns.
Having a physical stalker is dangerous. The stalker might break into your home, physically assault you, or worse. With technology, the damage isn’t so obvious. The digital world creates the illusion of privacy and anonymity, yet your devices see and track everything. They know your alarm is set for 0600 and that you’ll spend the first 15 minutes awake reading Facebook updates. Ever think about buying something and look it up once, only to see it every time you open anything that shows you an ad? They follow you until following you shows your patterns.
The Terms & Conditions
Every time we sign up for a new app or land on a new website, we’re asked, “Do you accept these Terms & Conditions?” “Do you accept these cookies?” What does that even mean? I can’t count how many times I’ve clicked accept just to get to the thing I want to do. We click accept to get through, and even on the rare occasion that we read them, we don’t thoroughly understand the implications they could have.
Terms & Conditions exist so that a company can tell us what it’s going to do with the digital imprint we leave when we use its website or app. Using their products lets them collect information to improve our experience while we use them. We’re finding, though, that it doesn’t stop there.
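To make that digital imprint concrete, here is a minimal sketch, in Python with Flask, of how a site could set a seemingly harmless cookie and quietly log every page you visit. The names here (visitor_id, visit_log) are hypothetical and real trackers are far more elaborate, but the basic mechanic is the same.

```python
# Minimal sketch of first-party cookie tracking, using Flask for illustration.
# "visitor_id" and "visit_log" are made-up names; real trackers are fancier.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
visit_log = {}  # hypothetical profile store: visitor_id -> list of pages seen

@app.route("/<path:page>")
def serve_page(page):
    # Reuse the visitor's tracking ID if they already carry one, else mint one.
    visitor_id = request.cookies.get("visitor_id") or str(uuid.uuid4())
    visit_log.setdefault(visitor_id, []).append(page)  # the "digital imprint"

    resp = make_response(f"Content for {page}")
    # A year-long cookie means the same ID comes back on every future visit.
    resp.set_cookie("visitor_id", visitor_id, max_age=60 * 60 * 24 * 365)
    return resp

if __name__ == "__main__":
    app.run()  # browse to any path and refresh: the log grows, per visitor
```

One long-lived identifier, handed back on every request, is all it takes to turn a pile of anonymous page views into a profile of one specific person.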
When we agree to Terms & Conditions, we are telling an organization that we know they’re stalking us and that they’re going to do what they want with the information. They say, “We’re tracking you, we’ll probably sell that info to someone else, and we’re not responsible for what they do with it,” and we say, “OK. Go for it.”
It’s for a good cause.
Don’t get me wrong. It’s for a reason: we are getting something out of it. The digital world is offering us something uniquely wonderful. It has created conveniences and distractions so good that, some day, Gen Z will say they can’t imagine a world without them. In fact, these conveniences are so good that we’re annoyed when things aren’t convenient enough. The reality is that life is stressful and fast-paced, and a little easier is a better life. Right now, as you read, digital life is keeping many of us paid, functioning, and able to reach out to friends and family during the great isolating pandemic of 2020.
These conveniences are valuable. Yet convenience comes at a price. It isn’t free. We don’t notice the cost because we don’t physically see it. It isn’t in front of us. Our lives are so fast-paced and we’re so overstretched that it’s easier to take before stopping to ask, “What am I giving away in return?”
Our quid pro quo? Privacy. We’re selling ourselves for convenience.
The Impact of Selling Our Data
In 2013, researchers published a study showing that, based on nothing more than what we liked on Facebook, they could predict details about who we are: sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age, and gender.
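The mechanics behind that kind of prediction aren’t exotic. As a rough illustration only (not the study’s actual data or model), here’s a toy classifier, assuming scikit-learn and an invented person-by-like matrix, that guesses a binary trait from which pages someone has liked.

```python
# Rough illustration in the spirit of the 2013 study, not its actual model.
# The "likes" matrix and trait labels below are entirely made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows = people, columns = pages they could like (1 = liked that page).
likes = np.array([
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 0, 0, 1],
])
trait = np.array([1, 0, 1, 0, 1, 0])  # some self-reported binary attribute

model = LogisticRegression().fit(likes, trait)

# A brand-new person's likes are enough to produce a confident-looking guess.
new_person = np.array([[1, 0, 1, 0, 0]])
print(model.predict_proba(new_person))  # [probability of 0, probability of 1]
```

Scale the rows up to millions of people and the columns up to every page on the platform, and “liking” something stops being a throwaway click.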
In 2014, a researcher named Aleksandr Kogan released a third-party Facebook application called “This Is Your Digital Life.” By agreeing to use it, users shared their data and the data of their friends: everything from profile information to full inboxes of private messages. A user who fully understood the terms & conditions, and then understood how to change their settings, could opt in and out of what data could be taken. For the friends, however? There was no opt-in or opt-out at all. All in all, the app collected data from 87 million Facebook users, and that data was then sold to Cambridge Analytica, a political consulting firm.
Kogan, however, isn’t the only person or entity collecting data from our footprints. We’re only talking about him because he got in trouble for violating his own agreement with Facebook: he was allowed to collect the data but wasn’t allowed to sell it to Cambridge Analytica. Is the moral here, then, just don’t sell the information to other people?
A “political consulting” firm like Cambridge Analytica obtains data about people so that it can make smarter decisions about how to reach and market to them. It’s part of defining a target demographic. It’s marketing psychology: learn and understand people’s patterns so you can suggestively guide them into making the decisions that get you to your goal. This works for ordinary consumer marketing, so why not politics? Except they aren’t selling us car insurance. They’re selling us politics.
Why did Cambridge Analytica want that information? Who cares what I messaged my friend on Halloween? It may not seem relevant, and one message alone isn’t going to make a political campaign. But when we say, “OK, stalk us,” it isn’t just about us anymore. It’s the collective data of all the users. It’s the extrapolations from all those data points that give companies a rainbow with a pot of gold at the end of it.
Now we’ve not only said, “Okay. Stalk me.” We’ve also said, “Okay. Persuade me.”
The Problem with Persuasion
“Ours is the first age in which many thousands of the best-trained individual minds have made it a full-time business to get inside the collective public mind. To get inside in order to manipulate, exploit, control is the object now. And to generate heat not light is the intention.” – The Mechanical Bride by Marshall McLuhan
Marketing is rooted in the basic psychology of what motivates us, how we view the world, our whims, and our background. For that marketing to be impactful, it needs to be persuasive: speaking to a need, creating fear, appealing to our emotions, or manufacturing a dissatisfaction it can then offer to fix.
In the past, marketers focused on these things and had to place their efforts where we were most likely to see them: movie theaters, commercials, mail ads, word of mouth. Digital marketing, though, is changing the game. It’s now data-driven: knowing and subtle, unnoticed in a sea of digital ads, yet it somehow ingrains itself in the brain.
Has Facebook ever asked you if you remember seeing an ad for XYZ? I never remember seeing it, but I’m guessing they’ve shown it to me. That’s why they’re asking, right? Interestingly, it doesn’t matter whether you’ve seen the ad:
In 1956, Eunice Belbin of Cambridge University attempted to understand the effects of propaganda on recall, recognition, and behavior. To do this, she displayed road safety posters in a waiting room and observed the effects the posters had on the individuals in the room.
“The effects were measured by 1.) the amount of information from the posters that was applied during the interpretation of photographs, and 2.) the amount of information included on each poster that could be recalled. The posters were viewed by a variety of people of all ages, and the results were measured. After 14 days had passed, subjects demonstrated the ability to use information from the posters even if they could not remember seeing it.” – The Psychology of Propaganda
If that isn’t concerning enough: in 2018, a study showed that with nothing more than a single Facebook like and some personality information about introversion and extroversion, the authors could sway more choices simply by writing ad copy that reflected the traits extrapolated from those data points. The study takes an interesting look at the pitfalls and considerations of this, noting that well-targeted medical ad copy could nudge individuals into making healthy life choices, or potentially send the same people spiraling into gambling and addiction. Huh.
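As a toy version of that idea (the copy, the score, and the threshold below are my own invented stand-ins, not the study’s materials), tailoring an ad to a psychographic profile can be as blunt as a one-line branch on a predicted trait score.

```python
# Toy version of psychographic ad targeting: one invented trait score,
# two invented lines of ad copy, and a threshold picked out of thin air.
def pick_ad_copy(predicted_extraversion: float) -> str:
    if predicted_extraversion > 0.5:
        return "Dance like nobody's watching (but they totally are)."
    return "Beauty doesn't have to shout."

print(pick_ad_copy(0.8))  # served to someone the model scores as extroverted
print(pick_ad_copy(0.2))  # served to someone the model scores as introverted
```

Same product, two different pitches, and you only ever see the one that was written for your profile.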
Marketing and the persuasion of behavior used to rely on recurrence, familiarity, and clever, stick-in-your-head campaigns. The more you see it, and the longer that jingle plays, the more the name becomes a familiar brand. Coca-Cola polar bears, anyone? The study above shows what can be done with barely any intimate data at all, and we’ve already established that businesses are collecting far more intimate data than that, so the impact can go even deeper. There’s no need to be loud and in-my-face with marketing anymore; it will stick because it’s specific to me and speaks to my needs, fears, emotions, and satisfaction.
The Potential Real-Life Outcomes
During the 2016 presidential race, Donald Trump’s campaign hired Cambridge Analytica to help run its political efforts. Remember them? The political consulting firm that bought your personal data from Kogan. Let me preface this: I don’t care how you feel about Donald Trump. We’re only talking about him because his campaign hired them. I don’t care how you feel about any presidential candidate. What do I care about? The whole reason I’m sitting here going on endlessly? What if carefully curated propaganda influenced who you voted for in the election? Where could that leave us?
Cambridge Analytica gathered this information to develop “psychographic” profiling tools, which it claimed could tailor political ads to users’ personality traits. “We exploited Facebook to harvest millions of people’s profiles,” whistleblower Christopher Wylie told The Observer. “And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.” – quoted from The Verge
Our egos might step up and say, “Naaah, I’m smarter than that. I would see that coming.” According to Eunice Belbin, though, you might be using information that you don’t even remember seeing. So if I were a smart marketer, and I would be if I were being paid large amounts of money by very rich people, I wouldn’t just be posting advertisements. I’d be curating content on blogs, on Twitter, on podcasts, on every platform that I thought you might share with your friends and with your friends’ friends, because … I have data that outlines how to make it personal. I know what you like and what you’ll relate to well enough that maybe, just maybe, I can make your mind up for you.
Well, that would suck.
Know what sucks even more?
It’s legal.
You accepted the Terms & Conditions.
There are currently no protections for us. No lines drawn in the sand to say which uses are okay and which are wrong. There are protections, however, for the companies that are taking our data and using it. They’re selling it, advertising with it, and giving it away. Maybe electing people with it.
We have no Terms & Conditions to make them agree to. We have the option to opt in and out of some things, if we’re fortunate enough to understand the legalese, but otherwise? Our government has sat idly by for the last twenty-some years while we have been saying we have concerns about our digital privacy.
In 2018, when the whole Cambridge Analytica debacle unfolded, we demanded more from Facebook. Our government spent a few days inconveniencing Mark Zuckerberg, slapped him on the wrist, and sent him on his way. The UK hit Facebook with a “hefty” £500,000 fine, the maximum its law allowed at the time.
Did we get any change?
The answer is no.
History Repeats
This isn’t the first time the government has failed to protect vulnerable and unsuspecting people. Our US government has a history of allowing others to take information from people without care or consideration for their well-being or for the impact on their lives.
From 1932 to 1972, a group of impoverished African American men were enrolled in a study designed to observe the natural course of untreated syphilis. There were 600 enrolled: 399 with syphilis and 201 without. The men were told that the study would last 6 months (it lasted 40 years) and that they were being treated for “bad blood.” The study promised medical care, meals, and burial insurance for participating.
The men were never told they had syphilis and were never treated with penicillin when it became the accepted treatment in 1947. The study continued, and continued, and continued, even past running out of funding, until it was finally exposed and shut down in 1972. The result: many of the men died, their partners contracted the disease, and children were born with congenital syphilis. Lifelong damage.
While we don’t have syphilis, we do have anosognosia in our digital lives: we can’t see that anything is wrong with us. The companies tell us that everything is OK, under control. The risk is managed; we’re being taken care of. Yet we suffer from a complete lack of transparency.
For decades, studies and polls have shown that we are concerned about what might be done with our data, and that we’re not comfortable giving it away. We have polls and studies showing that we clearly don’t understand what a cookie is or what organizations can do with it. We have research showing that our data can be used to extrapolate details of our lives in ways that, personally, make me uncomfortable.
And then we have recorded evidence of those in power saying that they have done immoral things with our data (Cambridge Analytica leaders caught on tape, boasting about their improprieties). We have companies acknowledging that it is their job to protect us and our privacy, and then doing effectively nothing but monopolizing their share of the market.
And now we have had our data used against us and we can’t escape. We are at their whim because we need the conveniences. For many of us, right now, it’s the only way we can pay our mortgage.
The Aftermath
It took 27 years of a cure existing, and the improprieties in human research that came with them, for the government to enact the National Research Act of 1974.
The National Research Act is why research looks the way it does today. Studies are written in a way that protects as many people as possible. Governing bodies must review and approve them. Participants are required to give “informed consent,” which today means documents written in plain language. Participants can ask questions, dictate what information is kept, and withdraw at will. Participants get compensated for their time. Researchers must follow study protocols exactly, down to how they write the data on a form.
To obtain that data, researchers have to do extensive work that effectively earns it. They have to prove to us that they’re doing the right thing. For Facebook, Cambridge Analytica, or any other Aleksandr Kogan to get even more personal data than the typical research study collects? They just need us to accept the Terms & Conditions. And when we should be getting compensated for our data? We make them rich instead.
What’s our aftermath going to be?
When we hear about physical neglect, as with the Tuskegee study, it’s easy to see and feel the abhorrence of the problem. The damage is visible; it is tied to the faces of the men, women, and children harmed by the inhumanity. When our data is used against us, there’s no face to the abuse and no accountability for the parties enacting it. The money protects the businesses. Money built on the backs of our data.
The potential damage to our lives, though, is there. Regardless of who the current President is, what does it mean if we are persuaded, manipulated even, into electing an official based not on his or her qualifications but on their ability to use our data and campaign on our fears and unconscious biases?
It is impossible to know the full impact these risks could have, but the principle is simple. You wouldn’t give your address to a stranger because of the risk, so why are we giving away where we live and who we are? We don’t trust people we don’t know. Why do we trust that businesses will be motivated to protect us when they’ve shown us, over and over again, that they simply aren’t interested? Money speaks louder than we do.
So what do we do?
“Over the last 16 months, as I’ve debated this issue around the world, every single time somebody has said to me, “I don’t really worry about invasions of privacy because I don’t have anything to hide.” I always say the same thing to them. I get out a pen, I write down my email address. I say, “Here’s my email address. What I want you to do when you get home is email me the passwords to all of your email accounts, not just the nice, respectable work one in your name, but all of them, because I want to be able to just troll through what it is you’re doing online, read what I want to read and publish whatever I find interesting. After all, if you’re not a bad person, if you’re doing nothing wrong, you should have nothing to hide.” Not a single person has taken me up on that offer.” – Glenn Greenwald in Why privacy matters – TED Talk
You may still be sitting there thinking, I’m one person. My data isn’t worth anything; there’s nothing there. To be honest, this was me for a long time too. It’s not a big deal; no one cares what I’m doing. But it isn’t just about you anymore. This is about all of us, and about forcing the bar to be raised.
Personally, I don’t think we have room for 27 years of backtracking to fix this. When HIPAA was first passed, it took 7 years before compliance began to be enforced. Technology is moving too quickly, and the code runs too deep. Backtracking into compliance is going to be time-consuming, and the longer we wait, the harder enforcement will be. So what then?
You have options.
1. Be informed.
Know what they have on you. Start by downloading your data; here’s a quick guide on downloading your data from Facebook by PC Mag. (See the sketch after this list for one way to skim what you get back.)
2. Change your social media settings.
This article from privacy app Figleaf offers some ways to stay on social media and still be more secure.
3. Change your tools.
When switching your tools is an option, start here: Privacy Tools. While you may need certain programs to stay functioning right now, you can shift to privacy-focused providers, change web browsers, switch operating systems, or even move over to self-hosted open-source software.
4. Get more secure, not just private.
A quick search will show you that data privacy is closely linked to data security, which is about protecting the data itself: credit card numbers, exposure to malware, and so on. This simplified list from PC Mag covers things you can do to be more secure online.
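If you do download your Facebook archive (item 1 above), you don’t have to click through it by hand. Below is a small Python sketch for skimming the export; the folder name, file name, and structure it assumes are guesses based on how these archives tend to be laid out, so adjust them to whatever your own download actually contains.

```python
# Sketch for skimming a downloaded Facebook JSON export.
# The folder and file names below are assumptions; exports change over time.
import json
from pathlib import Path

export_root = Path("facebook-export")  # wherever you unzipped your archive
advertisers_file = export_root / "ads_information" / "advertisers_using_your_activity_or_information.json"

if not advertisers_file.exists():
    print("Adjust 'advertisers_file' to match the layout of your own export.")
else:
    data = json.loads(advertisers_file.read_text(encoding="utf-8"))
    # Find any top-level list and print a sample of it, so you can see
    # which advertisers are holding information about you.
    values = data.values() if isinstance(data, dict) else [data]
    for value in values:
        if isinstance(value, list):
            print(f"{len(value)} entries found, first few:")
            for entry in value[:10]:
                print("  ", entry)
```

Even a rough skim like this makes the abstract idea of “your data” feel a lot less abstract.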
After you’ve done all you can to protect yourself, work to protect others too: write your legislators. Demand more from our political leaders. And when that doesn’t work? Make sure your vote goes out in November’s election. Protection won’t come unless we demand it.
Now get moving, get educated and make good choices.