Facebook’s Founding President Parker: We Hijacked Your Mind

It’s unclear how many Facebook users understand the extent to which their every status change, click, like, friend addition, message recipient, relationship-status update, and group membership is being monitored. Facebook did not become massively profitable through subscriptions; it remains a free service. Facebook makes its money by algorithmically monitoring and analyzing the thoughts and behaviors of its users, data it then sells to high-paying advertisers. Some of those algorithms are designed to keep us on Facebook as long as possible, which means collecting still more data that is handed over to companies for targeted ads without the Facebooker’s knowledge.

Even those who are aware of this unprecedented behavioral data collection tend to keep trusting Mark Zuckerberg to handle that information ethically. They seem comfortable with, or at least tolerant of, Facebook’s constant attempts to pervade our psyches by analyzing our online behavior down to the most minute detail. Even after highly publicized scandals, including the intentional infliction of emotional distress on unwitting participants in a psychological study conducted without their consent, Facebook users go about their business as if Zuckerberg were a benevolent dictator of social media’s most used, all-encompassing platform.

But, as in so many benevolent dictatorships, once you are inside the walls of Zuck’s Facebook kingdom you can never really leave, unless he is willing to make an exception for you. Which he won’t. You’re not special, but you are valuable as a being with behavioral and psychological tendencies that can be marketed to more effectively, whether you deactivate your Facebook account or not. Once you sign up, you are in the Facebook network for life. Maybe you don’t care. Yet among all these erosions of personal privacy, a violation that reaches into our minds and subconscious behaviors seems particularly defiling, not to mention unethical. I say unethical because Facebook goes out of its way not to disclose to its users the nature of the practices that drive its profits: monitoring and analyzing your every move, click, like, and comment.

One Guardian reporter’s experience with his diminishing control over his own profile’s privacy settings puts a face to the shadiness of Facebook’s data-procuring methods:

"Gradually, the highest tier of privacy settings have been removed by Facebook. You can still hide individual posts, but your Facebook account itself is now public, whether you like it or not.

In October, that all changed. Facebook rolled out an update to its internal search engine, letting users search the entire network for the first time. All public posts became searchable for everyone…

Every profile on Facebook now shows up when users search for it by name, even those, like mine, with the tightest possible settings, no friends in common, no profile picture, and no content posted. Worse, if you then click on the profile, a large amount of information is still public: any page I’ve liked, any group I’ve joined, and, if I had any, every friend I have on the site." (The Guardian)

Most of this is not news to those who do even the slightest digging into Facebook or who regularly read a broad range of news sites. But with at least 30% of U.S. citizens getting their news primarily through Facebook or Twitter, many people may still be unaware of the company’s transgressions and motives. That percentage is alarmingly high, and it reflects how completely Facebook has become ingrained in most Americans’ lives. We’ve been taught that almost every monopoly is bad, yet Facebook is attempting to monopolize nearly every aspect of our lives, from socializing to storing photos and contacts to dictating what news we do or don’t read. It turns out the company even wants to make us less productive, since it would prefer we spend that time on Facebook.

If you think these claims are far-fetched, consider the recent statements from Sean Parker, Facebook’s founding president and a co-founder of Napster. You might remember him as Justin Timberlake’s character in the movie ‘The Social Network’. In an interview with Axios, Parker essentially confirmed what many already suspected: Facebook is "exploiting a vulnerability in human psychology."

His interactions, shortly after Facebook’s inception, with people who vowed never to join a social network foreshadowed the company’s aim from the outset: to gather data on the intimate details of as many people’s lives as possible.

"When Facebook was getting going, I had these people who would come up to me and they would say, 'I'm not on social media.' And I would say, 'OK. You know, you will be.' And then they would say, 'No, no, no. I value my real-life interactions. I value the moment. I value presence. I value intimacy.' And I would say, ... 'We'll get you eventually.'" (Axios)

Parker says he had not thought through the unintended consequences of a network that would eventually grow to billions of users, but he often ponders those consequences now. Parker believes that children, in particular, will suffer developmental deficits, in part because of Facebook’s alluring algorithms, which even adults who spent most of their lives without Facebook can’t resist. It’s easy for Parker to play the good guy now that he is no longer associated with the company, essentially laying the blame on Zuckerberg, who does deserve most of it.

"I don't know if I really understood the consequences of what I was saying, because [of] the unintended consequences of a network when it grows to a billion or 2 billion people and ... it literally changes your relationship with society, with each other ... It probably interferes with productivity in weird ways. God only knows what it's doing to our children's brains."

But Parker says it was not data collection that he, Zuckerberg, and the other founders were initially concerned with, though getting you to comment more was a goal they saw as valuable. Instead, Parker and Zuckerberg’s driving question was: "How do we consume as much of your time and conscious attention as possible?"

"And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, and that's going to get you ... more likes and comments."

They were fully aware of the vulnerabilities of the human psyche, and they knowingly created a system aimed at exploiting those weaknesses, learning how to steer your behavior through algorithms. Executing that concept would ultimately make them billionaires, and make Facebook a source of mass behavioral data that has rendered Zuckerberg a demi-god across numerous fields of business, especially the tech sector. Parker and Zuckerberg knew that psychological exploitation and manipulation were the bedrock of their business’s success, and they decided it was an essential, justifiable evil within their grand plan.

"It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."

Zuckerberg has run with this original concept, perfecting algorithms aimed at consuming more and more of your time while collecting more and more data on your cognitive and behavioral processes; essentially, perfecting the exploitation of your psychological weaknesses for the gain of Facebook and its client-advertisers. You’re probably not going to stop using Facebook, even knowing the intention and effectiveness of its engineers’ hijacking and re-wiring of your mind. Because they, and their algorithms, are that good. You know it’s bad for you, they always knew it was bad for you, and you still can’t stop.

Or can you?
