Routinely clicking the “I accept” button on digital user agreements has become an inseparable part of online life. Not surprisingly, most consumers don’t read the associated legally binding contracts, many of which are lengthy, hard to understand, and written in extremely small typefaces. But as documentary filmmaker Cullen Hoback reports in his new movie, “Terms and Conditions May Apply,” the consequences of ignoring such agreements can be quite serious.
In some cases, signing off on these agreements may grant companies the unconditional right to sell and distribute an individual’s personal information. The agreements may also give governments the right to request that information. In one incident documented in the film, an Irish citizen is stopped and detained by U.S. airport security after posting a message on Twitter suggesting he wanted to “destroy” America, a tweet that was intended as a joking reference to his plans to party hard while on vacation in Los Angeles.
Hoback says his main goal in making “Terms and Conditions May Apply” (see a short clip above) was to raise awareness among the general public about what companies and governments are now capable of when it comes to collecting and analyzing personal information. The film has been shown in select theaters nationally since its release in July, and became available on September 3 on iTunes. Hoback also has launched a social advocacy site at www.trackoff.us, which encourages consumers to contact their legislators and raise awareness about privacy infringement.
In an interview with Data Informed, Hoback discusses what he learned about corporate data privacy practices during the making of his film, as well as his views about the limits of government regulations and what consumers deserve in an always-connected world.
Data Informed: What is your background and how did you become interested in the topic of privacy online?
Hoback: I’m really just a filmmaker. I happened on the subject when I wanted to figure out how technology was changing us. I felt that the deeper I went, the less it was about technology itself, but something else that was going on in the background, behind the technology. Once I started looking at all these legally binding contracts that come with everything digital now, it became apparent that this great shift had occurred.
Your film came out around the time of the Edward Snowden incident. How did that affect the public’s understanding of privacy issues?
Hoback: It was the first incident that really captured the public’s imagination. It was the first one that felt personal to people. People felt like there was this giant haystack, and the idea that their phone conversations were being cataloged and collated was kind of alarming. We had many people whistleblowing at the NSA before, but they didn’t have the documents to back it up.
How can consumers hope to keep up with the pace of technology that seems to further compromise our privacy with every new innovation?
Hoback: I think when you look back at how we got here, we didn’t embed privacy into the systems from the start. Partially, it’s because regulators are slow to act. These things are complicated, and you also have some very powerful lobbying on behalf of these companies coming in and saying that there are economic factors at play here. There are a lot of things that have kept us in a place where protections like encryption haven’t been built in from the start. But it’s also because companies want to make the most convenient product. It tends to be more convenient not to have to protect information. It’s faster. But when it comes to tracking information, that’s a whole other story. It’s very easy for companies to justify it because, from their end, they didn’t really see any major user harm. The problem is that by collating and cataloguing all of that information, you’ve got something that’s incredibly useful for ulterior motives and bad actors. I don’t think that we landed here with bad intent along the way, but I do think that 9/11 opened the door to the acceptance of industry practices that wouldn’t have been acceptable otherwise. In many ways, 9/11 gave permission for using data this way, and gave companies the ability to pass the buck.
Today, many companies talk about the value proposition — in other words, what consumers might get in return for providing their personal information to companies. Does that seem like a fair trade to you?
Hoback: I just don’t think people understood the nature of the trade. I think they understood that you don’t get anything for free, but I don’t think the deeper cost was really that clear. It was by design in many respects. If you knew you had the choice of paying with $500 worth of your data or paying $500 in cash, you would fully understand the value of what you’re giving away. Because everything is “free,” that value stays hidden. And “free” is just not a fair word to use, because there’s a significant cost.
I think part of the reason companies that had encryption first haven’t been successful is that people simply don’t understand the value of [those systems]. That’s part of what I was trying to do with the film: build awareness so that new companies that are developing privacy-first tools can be successful.
Can designing with privacy in mind from the start be used as a marketing tool?
Hoback: Privacy shouldn’t be something for the few. It can’t be something that you can get only if you can afford it. What if we started some kind of a system where everyone gets privacy, whether or not they can pay for it? I don’t think it would necessarily be a good idea to have really intense privacy legislation, like a right-to-know law, for instance, for every single Internet company, every single startup. I think it should have a cap, so it’s not actually crippling startups from getting going. But I think that would be a very good first step: just the right to know how much information a company has on you.
Didn’t we used to have laws like that in the pre-Internet world?
Hoback: Yeah, it’s weird, because everything in the digital world has an analog in the real world. Yet none of our safeties and protections carried over. That to me is the alarming part. Right now, personal data is owned by the company, but it should be the other way around. We should have control over who we give our stuff to.
Have you gotten any responses to your film from the big Internet companies, such as Google and Facebook?
Hoback: Facebook always says “no comment.” And Google hasn’t said anything, but they did do something. They’ve now updated their archive pages to include previous privacy policies — all the original privacy policies going back to 1999. It’s pretty easy to surmise that that was a direct response to the film. Which is good; I’m glad they did it. But it shouldn’t be a matter of us having to catch them doing it.
Are newer social networks changing their approach to privacy?
Hoback: I think the reason the use of Snapchat [a smartphone photo and video-sharing app] is going up is because people like the idea of impermanence. They want things to disappear.
Most consumers say they care about privacy, but few do anything about it. Generally, they’re more interested in buying things than worrying about protecting their personal information. How do you combat that kind of apathy?
Hoback: You know, it was not very long ago that we were running from lions. The danger here is different — these are very intellectual things. Every time it happens, you have to think about it. We’re busy people who don’t make enough money. It’s really hard to expect people to also care about this when they’ve got so much else going on. So if it isn’t impacting people directly, it’s difficult to get people to care. But, on the other hand, we can’t assume companies will regulate themselves.
Alec Foege, a contributing editor at Data Informed, is a writer and independent research professional based in Connecticut, and author of the book The Tinkerers: The Amateurs, DIYers, and Inventors Who Make America Great. He can be reached at email@example.com.