Reassuring Consumers on the Use of Their Data

Hal Conick

Imagine that you’re waiting in a doctor’s office. You may feel sick, worried or in pain, but you know that your medical information will be safe—the doctor is legally obligated to protect your data. You feel comfortable that your information won’t be used nefariously or even questionably.

The term “online privacy” means different things to different people, says Jenna Jacobson, assistant professor of retail management at Ryerson University. Jacobson—an author of a 2019 paper titled “Social media marketing: Who is watching the watchers?” published in the Journal of Retailing and Consumer Services—says that while it’s perfectly legal for marketers to collect, analyze and use publicly available data, brands don’t often consider whether they’re making consumers feel comfortable.

The information you provide at the doctor’s office is private, but what about that which you publicly post to social media? The premise changes: Most people openly post but aren’t sure how brands use their likes, statuses and photos. For such an opaque part of the marketing process, the stakes are high. A recent survey by cloud platform Acquia found that 65% of people don’t know which brands are using their data, while 65% said that they’d stop doing business with a brand that was dishonest about how it uses their data. This public data isn’t legally protected, so it’s up to brands to determine how they’ll collect, analyze and use it, often leaving consumers in the dark on their privacy rights.


“I think that there’s a lack of understanding about how social media data can actually be used,” Jacobson says, noting that marketing, like many other professions, lacks professional and organizational norms around data use. “For us, it was important to acknowledge that just because social media data is publicly accessible doesn’t mean that individuals don’t have concerns and feel discomfort with third parties using their data.”

According to Jacobson’s research, the majority of consumers aren’t comfortable with how their data is used. The paper found that 53.1% of consumers are uncomfortable with their data being used for targeted ads, 42.3% are uncomfortable with it being used for opinion-mining and 41.9% are uncomfortable with it being used for customer relations.

This doesn’t mean that consumers want to hide. The paper found that consumers are constantly assessing the benefits and risks of sharing online and managing their data privacy boundaries by considering what they share. They continue sharing because they want the benefits of using social media, whether that benefit is communication with people or being shown something useful they didn’t know about through marketing.

Brands can collect data and use it in a way that helps consumers, but Jacobson says that they often don’t contemplate what consumers could find unsettling. Your doctor isn’t likely to go off on a negative rant about your health to a nurse in front of you—although it’d be legal—because it would likely make you feel terrible. Jacobson says that brands should consider what data practices would make consumers uncomfortable if they found out about them and brainstorm ways to change these practices. The goal should be to build consumer trust in the process and leave them without unpleasant surprises, perhaps by creating guidelines or practices on a business or industry level.

In Jacobson’s paper, she and her co-authors argue that marketing professionals have ethical responsibilities that extend beyond legal requirements. “For social media marketing to be executed effectively and ethically, the recipient of the marketing material—the consumer—needs to be comfortable with the practices,” they write. If consumers discover that a brand is using their data in a way that makes them uncomfortable, they may no longer buy from that brand.

People often think about data ethics in terms of politics. Much of the past few years has been spent debating what’s ethical and unethical when it comes to political data collection and targeted marketing.

Morten Bay, a research fellow at the Center for the Digital Future at the University of Southern California’s Annenberg School for Communication and Journalism, had his interest in the topic piqued by observing how ISIS and Russia used social media as something of a virtual warzone. His interests extended into how social media was used during the 2016 presidential campaign, especially the Cambridge Analytica scandal, in which President Donald Trump’s campaign leveraged third-party psychometric data—measures of a person’s knowledge, abilities, attitudes and traits—to target individual voters. Some compared this use of data to the microtargeting used by President Barack Obama’s campaigns, but the data procured by Cambridge Analytica was mined without user consent or knowledge that it’d be helping a political campaign, a far cry from collecting publicly available social media data.

Bay wrote a paper—titled “Social Media Ethics: A Rawlsian Approach to Hypertargeting and Psychometrics in Political and Commercial Campaigns,” published in a special issue of ACM Transactions on Social Computing—arguing under a framework of philosopher John Rawls that using hypertargeting and psychometrics in politics was not ethical, as it blocked out the competing information necessary to a democratic society. In his paper, Bay writes that persuasion on social media is a zero-sum game: “Part of a persuader’s mission is to succeed in presenting information in a way that blocks out competing, contradictive information.”

But hypertargeting is different in commercial use, Bay says, and not exactly unethical. One can argue that people aren’t well-informed enough to know about data mining and targeting in marketing or advertising, he says, but that isn’t necessarily an ethical problem if they aren’t sold a certain product. The Rawlsian framework doesn’t see targeting used in marketing and advertising as unethical, so long as consumers have the chance to both opt in and out—in politics, users can’t opt out, as social media has become a central location for political news and debate.

“I’m not sure we can do anything about it in a commercial sense unless there’s an actual breach of contract,” Bay says. “But on a political level, the stakes are much higher.”

So how can brands act within the law, and perhaps even ethically, without leaving consumers feeling uneasy?

Have Clear Policies, But Educate and Empower

Jacobson’s paper finds that it’s necessary to create a clear privacy policy, but that alone isn’t sufficient to make consumers comfortable. After all, most people tend to skim or skip these policies entirely.

Jacobson and her co-authors write that brands should “empower users with a higher level of control over what information they want to share, with whom and for what purpose.” Much of this will likely start with education on how brands use consumer data. Although most consumers are uncomfortable with targeting, people are more familiar with targeting than with opinion mining or customer relations, two tactics that are less visible to the average user. This means that brands have an opportunity to educate consumers about these marketing practices, using education to build comfort.
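The per-purpose, default-deny control the authors describe could be sketched in code as a consent record that is checked before each use of a user's data, not just at collection time. This is a minimal illustration, not any cited system's implementation; the class name, purpose labels and method names are hypothetical:

```python
from dataclasses import dataclass, field

# The three uses of social media data examined in Jacobson's paper.
PURPOSES = {"targeted_ads", "opinion_mining", "customer_relations"}

@dataclass
class ConsentPreferences:
    """Per-user, per-purpose consent record. Nothing is allowed by default."""
    granted: set = field(default_factory=set)

    def opt_in(self, purpose: str) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"Unknown purpose: {purpose}")
        self.granted.add(purpose)

    def opt_out(self, purpose: str) -> None:
        # Users can withdraw consent for a single purpose at any time.
        self.granted.discard(purpose)

    def allows(self, purpose: str) -> bool:
        # Default-deny: data is used only for purposes the user opted into.
        return purpose in self.granted

# Usage: consult the record at the point of use, for each purpose separately.
prefs = ConsentPreferences()
prefs.opt_in("customer_relations")
print(prefs.allows("customer_relations"))  # True
print(prefs.allows("targeted_ads"))        # False
```

The design choice the sketch emphasizes is granularity: consent is granted per purpose rather than as a single blanket checkbox, so a user comfortable with customer relations but not targeted ads can express exactly that.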

“Digital literacy will continue to be crucial as technologies evolve and new ways to use individuals’ data emerge,” Jacobson and her co-authors write. “The onus does not only lie with individuals; rather, third parties that use the data need to be held to higher ethical standards.”

Be Transparent

Like Jacobson, Bay believes that it’s up to marketers and advertisers to figure out how they want to represent themselves ethically. Society may tacitly accept current data collection practices—most of society is blissfully ignorant, as Acquia found—but consumers may learn more about these practices and quickly sour on how their data is being used and their privacy invaded. What was once a benign practice could quickly turn cancerous for a brand.

To prevent being caught engaging in something consumers may one day see as unethical, Bay suggests that brands be transparent with how they collect, analyze and use consumer data.

“If you say to people, ‘We would like to give you X opportunity, but it requires us to get your data and perform this analysis on it,’ then people can make up their own minds,” he says. Companies often falter here by telling consumers what data they collect but not how they’ll analyze it, “but as long as they make sure [everything] is transparent and people are informed, [data practices] can never really be completely unethical,” he says.

Do You Need the Data?

Though it may sound counterintuitive for marketers who want to be on the cutting edge of data analysis, Bay says that brands should consider what kind of data they actually need to collect.

“What’s your benefit of actually starting to collect this data?” he says. “Can you actually use it for anything useful? … I would imagine that for 40% or 50% of companies going into this field right now, they’re just doing it because of the hype.”

Consider Becoming a Privacy Champion

Brands have an opportunity to become a champion of data privacy and perhaps win new business.

Jacobson says that she could envision a brand positioning itself as an ethical leader. This would be a challenge for most marketing departments, she says, as being a champion of data privacy would require a high level of data literacy, something that doesn’t come easily and requires following industry changes. But brands that are outwardly ethical and put consumers’ comfort first could win their trust and increase sales.

Jacobson says that data privacy is constantly evolving—and it may be the new frontier for ethical practitioners. She sums it up simply: “These ethical practices are good business practices.”

Hal Conick is a freelance writer for the AMA’s magazines and e-newsletters. He can be reached on Twitter at @HalConick.