Editor’s note: The following is an excerpt from People Analytics: How Social Sensing Technology Will Transform Business and What It Tells Us about the Future of Work. Reprinted with permission from FT Press, a division of Pearson.
Sensing data can be a major threat to privacy. Whether the data is from cellphones or web browsing histories, the potential abuse of this massive trove of data is an important concern. At the same time, awareness of this issue is alarmingly low, which means that people often don’t understand the power of the data that they make available.
This problem is magnified with the sociometric badge. Think of the sociometric badge as the natural evolution of the company ID badge. No longer just a tool to open doors, this new kind of ID badge enables you to understand yourself and your company at large through data-driven reports and feedback.
This sensor technology has amazing potential. From just five minutes of badge data, you could figure out not only whether you'll win a negotiation, but how well you'll do. With this badge deployed across millions of individuals at different companies in countries all over the world, for not minutes but years or decades, imagine what we could learn about how to help people collaborate more effectively and create better organizations.
The badge technology can revolutionize our understanding of organizations and society at large, but it can also be used to create organizations where privacy is a thing of the past and managers watch every movement and every conversation, looking for inefficiencies. Ironically, this means that data abundance, rather than scarcity, becomes the biggest hurdle to overcome.
Companies can already legally:
• Watch employees via closed-circuit TV
• Log keystrokes
• Take screenshots of employees using their company computers
• Read employee e-mails
Exposing additional sensitive information from the badges, such as location and who you talked to, could lead to egregious abuses. This data could allow companies to determine when you’re in the bathroom, how much time you “wasted” talking to your friend in another department, and so on. Under current U.S. law, this kind of monitoring is completely legal.
This is a major failing of the U.S. legal system. Overreaching corporate monitoring should not only be morally distasteful, it should also be illegal. Most countries in Europe and Asia ban this activity, but they go too far, preventing most legitimate analysis of this kind of data as well. To reach a productive middle ground, individuals and companies need to agree on steps to take when dealing with this extremely sensitive data.
Projects that use the sociometric badge should adhere to the “new deal on data” championed by MIT Professor Sandy Pentland. The core concepts of this new deal boil down to these three points:
• Data collection is opt-in and uses informed consent.
• Individuals control their own data.
• Any data sent to third parties must be aggregated.
The Importance of Opt-In
Companies already collect a lot of data without your informed consent. By contrast, when the organizations I've been part of collect data with the sociometric badge, we spend weeks answering questions from participants, explaining what data we collect, and even giving them consent forms that show our actual database tables. If people don't want to participate, we also hand out "fake" badges that don't collect data but otherwise look and act just like normal badges. This prevents those uncomfortable with the technology from being singled out and, in general, makes everyone more likely to participate. Participants can also opt out at any time. In practice this happens very rarely, because after a few days people essentially forget that they're wearing the badges.
Taken together, these steps help assuage people’s concerns and help us consistently achieve more than a 90 percent participation rate in all of our projects. Compare that to surveys, where researchers are ecstatic to get a 50 percent response rate.
With such high participation rates, the data itself becomes even more valuable; and whenever something is that important, everyone is going to try to stake their claim to it.
The Need for People to Control Their Data
Modern companies are extremely protective of their data—as they should be. Google’s entire revenue stream, for example, is dependent on the data created by its users. This protectiveness extends to corporate e-mail, where courts have continually reaffirmed the rights of companies to read their employees’ e-mails as long as they are accessed through company servers.
The sensitive nature of sociometric badge data points to the necessity of a change in this model. Without individual control of data, companies would be free to use your data any way they saw fit. For example, this kind of data could predict health risks (depression can be predicted from changes in communication patterns) or your likelihood to leave the organization (people getting ready to quit start to withdraw socially before making the announcement), leading your superiors to pass you over for promotion or diminish your role.
If individuals control their own data, then potential abuses can be stopped immediately: individuals can simply deny access to their data, or delete it at will to prevent access to their information.
Overall, there generally are no good business reasons for companies to control the data of individuals. Knowing where Bob is at 2:30 on Tuesday, for example, tells you nothing about productivity. Companies should care much more about the general patterns and aggregate statistics that describe how different teams and divisions are collaborating and what behaviors and interaction patterns make people happy and effective. These aggregate statistics are also the only way to preserve privacy.
Data Aggregation Protects Individuals’ Privacy
Anonymizing sensor data is essentially impossible. Mathematically, it's incredibly unlikely that anyone else would go to the exact same places and talk to the same people as you. Even an observer with a notebook who simply wrote down a few of the times a target person talked to others could match those notes against the "anonymized" records and pick out the target's sensor data.
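The notebook attack above can be sketched in a few lines. This is a toy illustration with made-up data, not real badge output: the record names, timestamps, and observation log are all hypothetical.

```python
# Hypothetical "anonymized" badge records: person IDs stripped,
# only the times each wearer was in conversation are kept.
anonymized_records = {
    "record_a": {"09:05", "10:30", "11:45", "14:20"},
    "record_b": {"09:15", "10:30", "13:00", "15:40"},
    "record_c": {"08:50", "12:10", "14:20", "16:05"},
}

# An observer's notebook: just two times the target was seen talking.
notebook = {"09:05", "11:45"}

# Any record containing every observed time is a candidate match.
matches = [rid for rid, times in anonymized_records.items()
           if notebook <= times]  # <= is the set-subset test

print(matches)  # only "record_a" is consistent with the observations
```

Because interaction timelines are nearly unique, even this tiny notebook singles out one record; stripping names alone does not anonymize the data.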
The only way to deal with this problem is to aggregate data. Instead of allowing everyone to see information about each individual, the data is averaged over groups. This allows people to compare different teams and see how their own behavior stacks up in their group, but prevents anyone from identifying a specific person.
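Group-level averaging can be sketched as follows. This is a minimal illustration with hypothetical team names, numbers, and threshold; the excerpt does not specify a minimum group size, so the cutoff here is an assumption, in the spirit of suppressing groups too small to hide an individual.

```python
from collections import defaultdict

MIN_GROUP_SIZE = 5  # assumed threshold; smaller groups would expose individuals

# One row per badge wearer: (team, daily face-to-face minutes) -- made-up data.
records = [
    ("sales", 42), ("sales", 55), ("sales", 38), ("sales", 61), ("sales", 47),
    ("legal", 20), ("legal", 25),  # only two people: must be suppressed
]

# Collect minutes per team.
by_team = defaultdict(list)
for team, minutes in records:
    by_team[team].append(minutes)

# Report only the average for teams large enough to protect privacy.
report = {team: sum(m) / len(m)
          for team, m in by_team.items()
          if len(m) >= MIN_GROUP_SIZE}

print(report)  # {'sales': 48.6}; 'legal' is withheld
```

Individuals would still see their own raw numbers privately; only the suppressed, averaged view is shared across the organization.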
In my experience, companies usually aren’t too concerned by this restriction. People still get their individual data and can use it to improve. For example, they could see that they’re not interacting enough with another team on a project, or that compared to the happiest people at their company they tend to go to the coffee machine less frequently. The organization sees the aggregate data and general trends, which it can use to identify behaviors and collaboration patterns that make people and teams happier and more effective. This approach gives everyone what they want, even reducing liability for companies in case their servers get hacked. Because they don’t have individual data, even someone with malicious intentions couldn’t use the data to discriminate or spy on a co-worker or employee.
Companies today already struggle with similar anonymization problems, albeit on a smaller scale. How does an organization deal with salary information? What happens if you submit a complaint about a co-worker? Openness about these questions is critical not only for gaining widespread acceptance of this technology, but also for building a successful organization.
Trust and Transparency
At the core of the precepts outlined in this excerpt are the importance of trust and transparency in organizations. If you don’t trust the people you work with and work for, you’re going to be unhappy, unproductive, and generally looking to jump ship for another job as soon as you can. If organizations instead position data collection policies to increase trust and transparency, employees learn how to improve and be happier, and companies can vastly increase their success.
People also shouldn’t be overly distracted by the privacy concerns associated with the widespread adoption of sensing technologies. As discussed in other parts of this book, this technology has the potential to bring about radical, positive change in the way people work, from changing what it means to have an org chart to making management focus on people first. Ethically applying this technology and realizing these amazing possibilities is up to us.
Ben Waber is president and CEO of Sociometric Solutions, a Boston-based management services firm that uses social sensing technology to understand companies’ internal communication patterns and drive innovative transformation services. He is also a visiting scientist at the MIT Media Lab, where he received his Ph.D., and was previously a senior researcher at Harvard Business School.