What You Need to Know about Surveillance Capitalism
When it comes to surveillance capitalism in the digital world, here’s what people should be concerned about: “Who knows? Who decides who knows? And, who decides who decides who knows?”
Harvard Business School professor emerita Shoshana Zuboff explores those questions in her 2019 book, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Zuboff visited campus on February 25 to talk about surveillance capitalism as part of the Babson-Olin-Wellesley (BOW) faculty-led initiative, made possible by a BOW Presidential Innovation Grant focused on artificial intelligence and machine learning, awarded to Eni Mustafaraj (Wellesley), Julie Walsh (Wellesley), Clare Gillan (Babson), Joe Ricciardi (Babson), and Erhardt Graeff (Olin). The event featured a lecture and reading by Zuboff, who also answered audience questions about the complexities of data privacy, government surveillance policy reform, and the knowledge gap between technologists and politicians.
Earlier in the day, Zuboff joined Wellesley students, Eni Mustafaraj, assistant professor of computer science, and Julie Walsh, assistant professor of philosophy, for tea and conversation. Mustafaraj and Walsh share more here about their interest in Zuboff’s work, how they are incorporating her book into their classes, and what people need to know about the impact of surveillance capitalism on our society and lives.
Q: Zuboff defines surveillance capitalism as “the unilateral claiming of private human experience as free raw material for translation into behavioral data.” How does that affect the relationship between individuals and technology’s use of information?
Julie Walsh: One important lesson from Zuboff is that we need to be aware of what’s happening and not be complacent about the surveillance we are undergoing. How this impacts or changes the way we engage with technology is a tough question. Many of us know that our data is being taken without our knowledge and then used to try to manipulate us into becoming more predictable and more reliable consumers.
But even with this knowledge, many of us find it extremely difficult to act: to delete Facebook, delete Instagram, to call our congressperson to see what they are doing to protect us when we’re online, to inform ourselves as much as possible about the use to which our data is being put. Zuboff is leading us to this kind of awareness, but it’s up to each of us to make changes in our own lives that bring our behavior into line with that awareness. It is very hard. It is perhaps especially difficult for digital natives, for whom Facebook, Instagram, and other applications have been a part of their entire lives.
Q: As you have been discussing surveillance capitalism with your classes, how have students reacted and what lessons are they taking away? Why is Zuboff’s book so relevant at this time to your work and in your fields?
Eni Mustafaraj: The Ethics of Technology class, taught by Professor Walsh, and my Artificial Intelligence class are joining together seven times throughout the semester for what we are calling AI & Ethics Labs. Zuboff's work is a cornerstone of our collaboration. Our students are also working on two joint projects: one conducting an ethics audit of voice assistants such as Amazon's Alexa, and one examining the ethical principles big tech companies have adopted for their use of AI, and whether those principles are being followed.
Walsh: Professor Zuboff’s insights really led us to develop these labs and projects, and our students are learning about epistemic inequality: the situation in which a user has little or no knowledge of what a company knows about them and what it is doing with that knowledge. Thinking through this inequality and how to eliminate it is an urgent need at the moment. The surveillance capitalists have done an excellent job of making us think that the erasure of our privacy is inevitable. Professor Zuboff is telling us that it isn’t, not yet anyway, and that there is work to do.
Zuboff’s call is a perfect illustration of the way that philosophy and computer science can speak to each other: Philosophy gives us the ethical frameworks to think through the harms involved in such surveillance, and computer science helps us see what is technically possible to do and to change.
Q: Are there any examples of surveillance capitalism tactics and practices being used for good?
Walsh: The lesson, I think, is that we are the owners of our private data, and that it is up to us to decide how to distribute it. Perhaps surveillance capitalism can be put to good use, but it is up to the capitalists to convince us that their goals are good, and then it is ultimately up to us whether we participate in it. But Professor Zuboff also issued a word of caution: Companies are very good at using fear to get us to think that their objectives are good. If we are afraid of something happening, we might be more likely to buy a product to keep us safe, even if that product is collecting our data and invading our privacy. We need to consider very carefully whether the things the surveillance capitalists tell us to fear are, indeed, threats to our safety.
Mustafaraj: There is a reason why surveillance capitalists use fear to win our acquiescence: it has worked before. Surveillance capitalism was able to thrive over the past two decades because its apparatus proved useful, and continued to be perfected, during a challenging and vulnerable time in our history: the aftermath of the 9/11 terror attacks. Zuboff tells a story in her book about how, in 2000, the Federal Trade Commission had recommended federal regulations to protect the online privacy of consumers because online companies could not be trusted to self-regulate. But once 9/11 happened, the focus of governments around the world shifted from privacy to security, and the public was made to believe that the War on Terror was worth the sacrifice of privacy. Online surveillance was instrumental in enabling governments to pursue their security agendas, and for many years (as we learned from Edward Snowden’s leaks) surveillance capitalists and governments worked together to strengthen the surveillance apparatus.
Q: What else should people know about surveillance capitalism?
Mustafaraj: Unfortunately, because surveillance capitalism, as Zuboff explains in her book, was invented at Google and has been perfected by many other tech companies, such as Amazon and Facebook, we now equate all things digital with surveillance capitalism. But that doesn’t need to be the case. The World Wide Web, which defines the technological principles and protocols on top of which Google and Facebook operate, was not invented by surveillance capitalists, but by an academic researcher, Sir Tim Berners-Lee, who shared the technology for free on the internet and never benefited financially from his invention.
Surveillance is not an intrinsic part of internet-based technologies; rather, as Zuboff explains, surveillance capitalists have captured our digital technologies and repurposed them to generate profits of various kinds. As historian Melvin Kranzberg’s famous first law of technology goes: “Technology is neither good nor bad; nor is it neutral.” It depends on what we as a society value most. If we were to shift our values so that we prioritize human freedom and dignity above economic growth and financial benefit, then digital technology would start working mostly for us rather than against us, as it feels it is doing at this particular moment in history.