BY ANGELINE CLOSE SCHEINBAUM AND KRISTEN SUSSMAN
Facebook is the most pervasive social media platform used today, reporting 2.2 billion monthly active users as of the fourth quarter of 2017 — more than any other social media platform. More of us need to consider what this means and the risks involved when one company holds data associated with 2.2 billion people.
As users, we gave consent for Facebook to share our information — but what about giving consent on behalf of others? Cambridge Analytica, a data analysis company, worked with Facebook to collect consumer information on users’ media habits, online behaviors, demographics and psychographics. In social media, this gray area is termed “second order consent.” In its Data Policy, Facebook clearly articulates the “different kinds of information” the company collects and how it uses them, but most people haven’t fully read those terms or don’t understand them.
That’s why we need more vigilance, more regulation and more laws to protect us.
News of Cambridge Analytica’s use of data was not about a data leak. Facebook, by design, is a profitable business model based on data, and as Facebook users, we have given consent. As consumers and business owners who use Facebook, we are “the product” being sold. Consider what marketing and targeted advertising are: at their heart are terms of exchange, relationship and value. Over time, when trust is earned, a series of exchanges may become a mutually beneficial relationship.
Conversely, when trust is broken — as was the case with Cambridge Analytica’s use of Facebook data to serve its political agenda — people begin to question the true motives of marketers and the companies using consumer data. Reality has set in for many of the 2.2 billion users who have probably never read Facebook’s terms, nor consciously considered the power of their information.
There is also a societal cost of social media, and scholars recently provided data-backed research about it. From a consumer psychology lens, social media has macro effects, including digital drama, overconsumption of the news, the spread of fake news, inauthenticity of our digital selves, cyberbullying, revenge pornography, social media addiction, fake accounts, trolls, and even deaths broadcast on Facebook Live. Social media carries hidden costs: too little privacy, too many comparisons, too much customization, too much information and too many temptations.
Change is in order. Ultimately, this responsibility lies with lawmakers, businesses and consumers. The current state of the industry walks a fine line of ethical business practice. All businesses — especially Facebook, Google and Amazon — should take ownership of this privacy issue by providing simpler, clearer information and by ending consumer data sharing as it is currently practiced.
It’s our duty as consumers to understand the information these companies collect, know the terms, and consciously decide how and when we want to use their services. This requires slowing down and resisting the temptation of third-party logins through Facebook.
Consumers also can submit complaints to the Federal Communications Commission, which implements and enforces America’s communications laws.
Businesses can also change their media mix allocations. Some, like Tesla, have deleted their brand Facebook pages or suspended their Facebook ads. Others are upset about the newer “pay to play” aspect of social media and the impending rise in costs of social media-based advertising.
Meaningful industry self-regulation is needed, but it may be unrealistic right now — or insufficient to produce real change. We have therefore reached a point where we need more social media laws that protect consumers, not just the social media giants.