Consumer Data Privacy and the Importance of a Social Contract

by Alec Foege   |   August 6, 2013 4:22 pm

As consumers become increasingly aware of the sheer quantity of personal data collected about them online and through their mobile devices, companies are being forced to raise their levels of transparency about what information they are gathering. As a result, a new social contract is emerging between brands and shoppers. While the Federal Trade Commission and some members of Congress are pressing for more regulations to protect consumers’ digital privacy, some companies are beginning to view privacy as a currency to be traded with individuals in return for various benefits.


Ilana Westerman, CEO and cofounder of Create With Context, Inc., a Santa Clara, Calif.-based digital strategy consultancy launched in 2005, spends her days helping Fortune 500 companies figure out the best ways to hone their brands in the digital universe. Her firm’s clients include Visa, Barclaycard US, Intel, Yahoo, Accenture, Adobe, Panasonic, and Amway. Rather than approaching privacy issues from a defensive position, Westerman—who worked for Yahoo’s user experience team in its earliest days—encourages clients to create a framework of trust between themselves and consumers. Understanding what consumers want and expect from the companies they do business with is a key element of that strategy.

Alec Foege, a Data Informed contributing editor, talked to Westerman about building trust with consumers, good data privacy practices and the potential for business gain that comes from such practices. An edited transcript of the conversation follows.

Data Informed: What is Create With Context’s core mission?


Ilana Westerman: We are a user-experience research and design firm. We focus solely on digital products. User experience is what you see, but it’s also what the functions of the product are. We help our clients come up with new products in the digital space, based on users’ needs. We’re not technology-first; we’re user-first. We’re people-first. We’re about understanding the needs of people and then designing digital products that fit into their lives. We tend to work with Fortune 500 companies. We go across many sectors. We find that’s important because learning from the biotech sector will help us in the financial services sector.

One of the core areas we focus on is building trust with consumers. Your data is one area of trust, but there are other ways you create trust, too. For example, trust that the service will work. So it goes beyond big data.

How is the focus on trust in the online environment evolving?

Westerman: This is really a new area where we see a lot of opportunity for innovation, a lot of opportunity to give value to consumers. But because consumers aren’t aware of it, right now there’s a lot of hesitation. What we’re doing is understanding, first, the consumer’s perspective—what their expectations are, what they want, what they’ll use—and then trying to create products that fit into that. That is a win-win for both the business and the consumer.

What have you learned about privacy in the digital space over the last few years?

Westerman: When consumers are unaware—when they expect that they’re not being tracked, that their data isn’t being collected, or when they don’t know the breadth of the data being collected—and then they become aware of it, you really can erode trust. When we’re going through our lives, we’re not going out and reading privacy policies or figuring out what’s happening. We just have mental models of what our expectations are.

For example, if you’re on your phone and you click on Maps and you click on Directions, you have an implicit expectation that the app knows where you are. You want it to know where you are; that’s a benefit to you, and you expect it. Similarly, if you’re on your phone and you tap the mail icon, your mail comes up and you don’t expect to put in your user name and password. Same thing on your computer: when you boot up your computer and open your mail client, you expect it to know who you are.

But if you go to a browser and open your Gmail or your Yahoo.com mail account, you expect to put in your user name and password. You don’t expect it to know who you are. Or if you go to Google and you search for something, you don’t expect that it’s capturing everything you search for over time. Increasingly, people are starting to see, “Wow, I looked for this over here and I saw an ad over there.” So they’re starting to make connections that some of this stuff is happening. And usually people feel, “Well, that really creeps me out. I don’t like that.”

We’ve heard a lot recently about how brick-and-mortar retailers can track shoppers’ paths inside stores via their smartphones. How does this impact the implicit social contract?

Westerman: The key thing is to understand: what are people expecting? And if what you’re doing as a company is not what they’re expecting, [you have to be] communicating that to them and making them aware of it, which is always a challenge, because that’s not what they’re trying to do.

The current project we’re working on found that, in-store, less than 33 percent of people think that maybe data is being collected in the store. The majority of people have no concept that anything about their location is being collected via their smartphone when they’re in a store. So the first challenge for retailers is, how are you going to make people aware of this?

We’ve been doing research looking at what people pay attention to in the store, and we’re finding that they pay very little attention to signs. So even if you posted a notice in a store, that might not work. Then we’re looking at, can we provide notice on the phone itself? Well, most people don’t have their phones out when they’re shopping because, guess what? They’re shopping, their hands are full, and they’re looking at products. So we’re still working on that research, and we don’t have the answer yet, but we need to figure out a way to first communicate to people that this data is being collected.

The second thing is really giving them a value-add. There’s a reason this data is being collected. Partly it’s being collected for marketing purposes, so stores can better lay out where items are and understand what people like. But there’s also a huge opportunity to create other value-added apps for consumers in stores. So we’re doing a series of concept tests around concepts like, “Help me find something in the store.” It can be as simple as that. Then what happens is that people’s expectations shift.
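To make that concept concrete, here is a minimal sketch of what an in-store “find an item” lookup might look like. The TypeScript below is purely illustrative: the store map, item names, and findItem function are hypothetical, not anything Create With Context or any retailer has actually built.

```typescript
// Hypothetical sketch of an in-store "help me find an item" lookup.
// The store map and item names are invented for illustration; a real
// app would presumably fetch this data from the retailer's systems.

interface AisleLocation {
  aisle: number;
  section: string;
}

// Toy store map keyed by item name.
const storeMap = new Map<string, AisleLocation>([
  ["socks", { aisle: 12, section: "B" }],
  ["milk", { aisle: 3, section: "A" }],
]);

// Look up an item and return a human-readable direction.
function findItem(query: string): string {
  const loc = storeMap.get(query.toLowerCase().trim());
  return loc
    ? `"${query}" is in aisle ${loc.aisle}, section ${loc.section}.`
    : `Sorry, we couldn't find "${query}" in this store.`;
}

console.log(findItem("socks")); // "socks" is in aisle 12, section B.
```

The trade Westerman describes is visible even in this toy example: once the app demonstrably helps the shopper, sharing location with it starts to feel reasonable rather than creepy.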

What work have you been doing in the online world regarding privacy and trust?

Westerman: The majority of what we do is looking at people’s expectations online and how to create better transparency and control.

We did some work on what’s called a Privacy Icon, which would notify people when personal information is being gathered; they could adjust their settings at that point if they wanted to.

We were part of the FTC Mobile Disclosure Panels last year. The Privacy Icon sits in the upper tray of a phone. When you are doing something like going to a website or using an app that is collecting personal information, it would glow to show you this is happening. If you cared, you could check what was being collected and change your settings. This is work we fund ourselves.

We’re midway through that. Sometimes concepts work and sometimes they don’t; that one is still in flight, and I’m not sure if it’s the right way to communicate. We have a number of things we’re looking at. We’re really trying to understand what people are expecting and trying to bridge that gap between expectations and reality. And then giving people control, but giving them control when they want it. People don’t necessarily want to control everything; if they don’t really care about something, don’t force them through a series of screens. So we’re really looking at models around that. And then looking at what we can do with all this data that’s been collected about us: concepts such as “Maybe you might like,” or a concept around price. There are a lot of different things you could do that you would be willing to put your data out there for, because you get something back for it.
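As a rough illustration of the Privacy Icon concept described above, the sketch below models an indicator that “glows” whenever an app reports a data-collection event and lets the user inspect what was collected on demand. Every type and name here is an assumption made for illustration; this is not a real platform API, nor Create With Context’s actual design.

```typescript
// Illustrative model of a Privacy Icon: a persistent indicator that
// lights up when personal data is collected, with on-demand inspection.
// All names are hypothetical; no real platform exposes this API.

type DataType = "location" | "contacts" | "browsing" | "identity";

interface CollectionEvent {
  app: string;
  dataType: DataType;
  timestamp: Date;
}

class PrivacyIndicator {
  private events: CollectionEvent[] = [];

  // Called whenever an app or site collects personal information.
  report(event: CollectionEvent): void {
    this.events.push(event);
    this.glow(); // light up the tray icon; UI details omitted
  }

  // Users who care can see what was collected and by whom.
  inspect(): CollectionEvent[] {
    return [...this.events];
  }

  private glow(): void {
    console.log("Privacy icon glowing: data collection in progress");
  }
}

const indicator = new PrivacyIndicator();
indicator.report({ app: "weather", dataType: "location", timestamp: new Date() });
console.log(indicator.inspect());
```

The design point is that notice is ambient and control is optional: the icon surfaces collection without interrupting the user, matching Westerman’s observation that people want control only when they care.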

What are some ways of gathering personal information that are less likely to “creep out” customers?

Westerman: If people are aware that data is being gathered, they are much less upset. The next thing is, if it’s being gathered and it makes complete sense, like GPS location on Yelp, then they’re not bothered at all. They’re happy. But if it’s being collected and it doesn’t make sense, and they realize it’s being used to market to them in the future, they’re less happy.

That doesn’t mean that people won’t trade their information. We just recently ran a survey, and 81 percent of people would give up one piece of their personal information for 50 percent off a gallon of milk. We say we care about our privacy, but if you give us something in return, we don’t care as much about some of this stuff.

What kinds of personal information do people place more value on?

Westerman: What’s interesting is that people have very different feelings about different types of information about them: what is more personal, what is more important to them overall, and also what makes sense and how much they trust who they’re giving it to.

There are a lot of factors that go into that. One of the most interesting things is that the things we traditionally think of as personally identifiable—your name, your address, your phone number, your email—are the things that people are most willing to give up. The things like behavior—what you searched for, what apps are on your phone, things like that—people feel are more personal; they get more creeped out and are less likely to give that information up than some of these more traditional things.

For businesses, the main things are: What are you collecting? Why are you collecting it? And making sure you know how your consumers feel about it. If you’re on a job site, you want them to know what your occupation is. But you don’t want them collecting maybe your religion or your age.

What is your perspective on newly proposed legislation regarding the collection of personal data?

Westerman: Our goal is to get the word out. Our goal is also to show that legislating design is not a good idea. Letting designers design things that create transparency is important, but mandating dialog boxes could actually work against what we’re trying to achieve. We always share our information with the FTC, as we do with other companies, but we don’t really get involved on the policy side. We’re more on the product side.

Does adhering to certain privacy and trust guidelines offer companies an advantage in the marketplace?

Westerman: In the past, privacy has been a cost center. It’s a legal thing within companies. It’s not about the product; it’s not what’s making companies money. And so it’s kind of been off to the side. When I meet with chief privacy officers, they very much want the same thing that the product people do: they want to create good experiences for their customers. So it’s about having those two groups come together, and having more product groups realize, “Hey, I can build trust here.”

Brand managers know how important trust is, and they’re thinking, “Hey, I can use this data to create new value for my customers and maybe new revenue streams.” When that all comes together, we’re going to see a lot more being done around it.

The reason it’s not happening now is that those two groups aren’t in the same vertical within a company. Frequently, the privacy officers don’t even see products until right before they launch. At that point, they tell people to go back and change things, and they’re seen as the mean guys, the people who stop products.

Microsoft has done a good job of building a lot of privacy into the Xbox; they embed privacy into the product team. That kind of model is what I hope moves forward, where companies start to understand that mishandling data can be very damaging to a brand, but that handling it well can be the opposite: you can build a lot of trust. It would be nice if those groups could come together, but it’s going to take time. Nothing happens overnight.

Are you talking about Privacy By Design?

Westerman: I think of Privacy by Design as a very specific set of guidelines, most of which I very much agree with, but we try to make it a little broader and talk about creating trust: baking trust into the product. A goal of a product should be to create transparency around the data being collected, while letting designers come up with many different ways of creating that transparency, depending on the product and the company. Really having that design goal at the very beginning is the way to do it.

But incorporating concerns about trust and transparency from the beginning of the product cycle costs money. How do you justify that added expense to your clients?

Westerman: Companies invest a lot of money in building trust in other ways. They do community projects. They talk about their return policies, for example. All of that builds trust. So smart companies are going to quickly see that if they don’t do this, they create an underlying feeling of distrust with their customers, and loyalty is lost.

The challenge we want to see companies take on is to say, “All right, we’ve been collecting data for a long time, and we haven’t really been giving value back to the customer. We need to figure out new applications and things we can do that actually give them something back for what they give us.” I think it can be a win-win for everyone. Take the in-store example: if I had an app that could tell me where the socks are, all of a sudden I want you to have my location, and you can now message me about other things, knowing my location.

What is the most surprising thing you’ve learned about privacy and consumers in recent years?

Westerman: In general, Americans trust. We’ve been tracking trust over time, in specific industries like banking and healthcare, in specific brands, and in government. And in general, on a scale of 1 to 5, we’re above 3 on almost everything. We’re coming to the table from a place of trust, so don’t squander it.

Alec Foege, a contributing editor at Data Informed, is a writer and independent research professional based in Connecticut, and author of the book The Tinkerers: The Amateurs, DIYers, and Inventors Who Make America Great. He can be reached at alec@brooksideresearch.com.



