4 Surveillance Capitalism

Take a second to consider the standard manila folder—not the folder icons on your computer. We’re going old school for just a moment to consider the actual folder with its simple, clean design and distinctive color. It has a tab that peeks out on the side so that you can label the contents of the folder and therefore easily organize and retrieve information, which is especially helpful if you have a lot of different types of information spread across multiple folders.

It’s a pretty simple concept, but information wasn’t always organized this way, and as we’ll discuss in this chapter, this type of data collection isn’t always beneficial to everyone. In fact, it seems fitting that the manila folder is often considered a symbol of imperialism and oppression. Invented in the United States in the early nineteenth century, manila paper originally came from a type of banana plant called abacá, which grew predominantly in the Philippines. When the United States colonized the Philippines, it took control of the abacá trade to ensure that it—not the Filipinos—would profit (Lui). The U.S. used the fiber for a variety of purposes, including manila envelopes and paper.

In its most basic form, surveillance is about collecting information about people so it can be easily retrieved and analyzed. Before filing cabinets were invented in 1890, information was recorded in ledger books or in loose bundles of paper, which made it difficult to locate a specific piece of information. The filing cabinet revolutionized record management practices by making it possible to quickly retrieve a single document (Bristow). Soon, filing cabinets were a must in businesses across the country, and large storage facilities emerged with the singular purpose of securing large amounts of data. In other words, filing cabinets became foundational to the way that institutions managed systems—and people. They were used to document identities, relationships, events, and transactions.

Now think about all of the folders out there somewhere that contain information about you—medical records, academic transcripts and other test scores, employment evaluations, bank information, credit scores, tax records. It’s obvious why privacy has become such an important issue. Everyone has personal information stored in files across various institutions. Sometimes we don’t even know these files exist. Rarely do we know all of the information that the files contain or how the information was collected. It’s a form of institutional power because it increases our visibility (and vulnerability). Individuals are provided opportunities—or not—based on the information in their files. Data collection also has the power to control behaviors. Think about standard employment practices that require workers to clock in and out, that mandate an annual employee evaluation, that put surveillance cameras in parking lots and workstations, or that require random drug testing. These are all ways of monitoring and managing the behaviors of employees, of creating systems of rewards and punishments based on the information in a person’s file.

The digital age of computers, web browsers, and cloud-based computing and data storage expanded data collection practices. It’s even easier to store and retrieve vast amounts of personal data. It’s also possible to collect all kinds of user information without the users’ explicit knowledge. That’s what “cookies” do. When you visit a website and it requires you to “accept cookies,” a small text file is stored by your browser that allows the website to collect information about your online activities and to recognize your computer when you revisit the site (Google, “How Google”). Information about your preferences, profile information, browsing history, past purchases, and even your navigation path and time spent on individual pages is collected. Third-party cookies also allow websites to track your activities across the internet, even when you aren’t on that specific website. A robust user profile is developed about you—your interests, activities, and identifying information—all so that companies can create targeted ads and online experiences that will increase your chances of buying certain products and services.

Cartoon graphic of a surveillance camera
CCTV Surveillance Notice, by Amityadav8, on Wikimedia Commons (CC BY-SA 3.0)

The term “surveillance capitalism” refers to this collection of personal data for the purpose of financial gain or some other benefit related to power and control. In the introduction of Surveillance Capitalism in America, Josh Lauer and Kenneth Lipartito define surveillance capitalism as a

broad range of strategies and techniques, both formal and informal, that commercial actors—including lenders, merchants, employers, managers, service providers, and others—deploy to observe, record, predict, and control human behavior and relationships. The targets of such commercial surveillance typically include clients and customers, borrowers and buyers, staff and laborers (free and unfree), markets and competitors (5).

It’s a broad definition because surveillance is so prevalent throughout our daily lives. We’re reminded in many parking lots, stores, schools, and government buildings that we are being recorded. Cameras also exist at many intersections and toll booths to ensure that people follow the rules. Many people have cameras on their doorbells and inside their homes so they can monitor the activities of visitors or maybe even their own pets or family members and prevent negative behaviors. Then there are the smartphones, smart watches, and GPS apps that track your location throughout the day. Our Siris, Alexas, and Google Assistants are listening devices that record our dialogue, which can be reviewed by tech companies themselves or third-party contractors (Metz). There is even some fear that our devices are listening to our conversations when we aren’t using them. For instance, one study examined 81 different Internet of Things devices and found that many of them surreptitiously transmit information to tech companies like Netflix, Amazon, and Google, even when the device isn’t being used (Ren et al.).

Advancements in digital technology have made it easier than ever for organizations to secretly collect information, which has created a number of privacy concerns. This chapter explores the history of surveillance capitalism, identifies common surveillance practices, and examines how surveillance is often used not only to predict but also to control human behaviors. The chapter ends with basic guidelines for protecting your private information.

Learning Objectives

  • Understand the history of surveillance and how different practices emerged over time as technologies became more advanced.
  • Understand the connection between surveillance and power.
  • Consider all of the different types of surveillance practices and the types of data that organizations are able to collect based on our digital devices.
  • Learn how commercialization connects with surveillance and how companies use surveillance and metadata to not only predict but influence behavior.
  • Consider why surveillance is such a concern for society.
  • Learn practical ways to protect your own private information.

The History of Surveillance Capitalism

As discussed in the previous section, the practice of compiling information into personal files has a long history that predates the advent of digital technology. Surveillance practices have an even longer, more complex history. Michel Foucault’s 1975 book Discipline and Punish: The Birth of the Prison has had a profound influence on our understanding of surveillance practices and their social consequences throughout history. In it, Foucault traces the rise of the prison system in eighteenth-century Europe. Before prisons emerged as the standard form of punishment for criminal behavior, people who were convicted of a crime were typically subjected to some sort of public torture or punishment. For instance, criminals might be put in the pillory or in stocks in the town square, which was intended to deter both the individual and everyone else in town from future crime. When prisons emerged, however, punishment shifted from being mostly physical to being mostly psychological. Within the confines of the prison, every aspect of the prisoner’s life was regimented, monitored, and controlled.

Central to this shift was the panopticon, a prison design proposed by philosopher Jeremy Bentham in the eighteenth century. Defined by its circular structure and a system of backlighting in each cell, the panopticon made it possible for a guard in a central tower to monitor every movement that the prisoners made. They were constantly visible. The guard in the tower, on the other hand, was hidden. There were no lights in the tower to let prisoners know whether or not they were being watched; however, they knew it was always a possibility. So they had to constantly be on their best behavior to avoid punishment. Over time, the prisoners began to internalize the gaze of the guard and to monitor their own behaviors.

Inside one of the prison buildings at Presidio Modelo, Isla de la Juventud, Cuba.
Presidio Modelo 2, by Friman on Wikimedia Commons (CC BY-SA)

The panopticon serves as a poignant metaphor for current surveillance practices. Just like the prisoners in the cells, everyone in society is subjected to different forms of surveillance that force their exposure—making their behaviors, identities, and relationships visible. Meanwhile, the organizations that collect this information about us (the metaphorical guards in the tower) are largely invisible. We don’t often see who specifically is collecting information about us, what that information is, or how it’s being used. Even more significant is the psychological effect this surveillance has. Like the prisoners in their cells, most people don’t know if they are being watched. They just know that it’s always possible, so they also internalize the gaze of authority to monitor their own behaviors.

A classic example might relate to traffic laws. Most of us monitor our own speed. We come to a full stop at a stop sign. We sit at an intersection waiting for the light to turn green even if there are no other cars at the intersection. It’s always possible that a cop might be stationed nearby or that we are being filmed by a traffic camera somewhere, and because we don’t want to get a speeding ticket or have our license revoked, we drive a little slower. We make the full stop. We wait for the light to turn at the empty intersection. Often we don’t even consciously make these choices because the behaviors—and often the underlying message about safety and good driving—have become ingrained in our minds. The same is true for lots of activities we do all day long, even parts of our identities that we might typically think of as inherent. Foucault would argue that a complex system of surveillance, rewards, punishments, self-monitoring, and internalization is responsible for nearly all of our behaviors.

While surveillance itself isn’t new, the term “surveillance capitalism” typically refers to more recent practices related to digital technology and data collection, which emerged gradually once the internet shifted to a commercial enterprise run by a few private companies. You might recall from the first chapter that the World Wide Web was created with a focus on information sharing. The NSFNET, developed by the National Science Foundation, was dedicated to educational advancement and had initially banned the commercial use of the internet. That changed in the mid-1990s as the number of users grew beyond what the NSF could manage. Private companies provided internet services, and more and more businesses registered their websites with the purpose of attracting new customers.

One of the key events that perpetuated the development of surveillance capitalism was the invention of cookies, which would record the unique navigation paths of individual users as they visited particular websites. According to Meg Leta Jones in “Surveillance Capitalism Online,” these paths were called “clickstreams,” and they allowed web creators to understand how users were navigating their site and to use that information to evaluate the website’s effectiveness. This technology was not built into the original design of the World Wide Web. In fact, Tim Berners-Lee intended for the web to be “stateless,” meaning that it would retrieve the information that a user was looking for, but nothing about the transaction would be recorded or documented. Companies looking to understand the user journey called this original design the “memory” problem (Jones 186). The cookie, invented by computer scientist Lou Montulli, created a unique session ID that began when a user accessed a specific website. The server for that website would place the text code into the memory of the user’s browser, where it was stored on the hard drive, allowing the website to recognize that user each time they returned to the website.
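The session-ID exchange described above can be illustrated with a short sketch. This is not real browser or server code—the function and variable names here are purely illustrative—but it captures the basic logic: on a first visit the server mints a unique ID and asks the browser to store it (the “Set-Cookie” response header); on every later visit the browser sends the ID back, so the site recognizes the returning user and can keep accumulating data about them.

```python
import uuid

# What the site remembers about each browser, keyed by session ID.
# (Illustrative only; real servers store far richer profiles.)
sessions = {}

def handle_request(cookie_header):
    """Simulate one request/response exchange with a website.

    If the browser sent back a known session cookie, the site
    recognizes it and updates its record. Otherwise the site mints
    a new session ID and tells the browser to store it.
    """
    if cookie_header and cookie_header.startswith("session_id="):
        session_id = cookie_header.split("=", 1)[1]
        if session_id in sessions:
            sessions[session_id]["visits"] += 1
            return session_id, None  # recognized; no new cookie needed
    # First visit: create a unique ID and instruct the browser to keep it
    session_id = uuid.uuid4().hex
    sessions[session_id] = {"visits": 1}
    return session_id, "Set-Cookie: session_id=" + session_id

# First visit: the server issues a cookie
sid, set_cookie = handle_request(None)
# Return visit: the browser sends the cookie back and is recognized
sid2, _ = handle_request("session_id=" + sid)
```

Notice that nothing in this exchange is visible to the person browsing—the ID lives in a header and a file on disk—which is exactly the “opaque” quality Jones describes.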

Importantly, Jones points out that the cookie is an “opaque” memory, meaning that it’s invisible to the user (187). The code could include anything and collect any sort of data without the user knowing, which connects directly to the idea of surveillance. The user becomes more visible while the companies collecting the data are largely hidden. This obviously raised concerns about user privacy, which prompted Montulli to adjust the process so that users could manage their own cookie preferences and receive notifications before a cookie was placed on their computer. They could accept or reject the cookie, though the default was set so that all cookies were accepted. Montulli argued that this technique actually protected users because it kept their identities anonymous and allowed them a measure of control over whether information was collected, though they still wouldn’t know what type of information was collected or how it was being used.

And then as the Clinton administration ushered in self-regulation policies that removed commercial restrictions online (White House) and as only a handful of large tech companies survived the dot-com crash in the early 2000s, new practices emerged that worked against the privacy of users (Salvucci). Whereas companies had originally tracked user data only when they were on a particular page, now they began tracking the users themselves as they traveled across the internet, and they began selling user data to interested third parties who could now market their products and services directly to users based on their past behaviors.

Interestingly, Jones points out that surveillance capitalism didn’t begin with the cookie. As far back as the 1970s, direct mail companies would buy mailing lists from various companies with the explicit purpose of targeting the people on those lists with mail advertisements. For instance, one medical lab sold patient information to a direct mail service, which could then target those patients with ads for specific products. One woman received diaper coupons because her medical information revealed that she was pregnant. Jones explains, “Data breaches, sexual privacy, informational asymmetries, and racial bias were issues with direct marketing expressed by members of Congress in 1970” (197). However, when given the chance to take their names off all mailing lists, very few people responded. Though direct mail was considered a minor nuisance, the opt-out method was deemed sufficient to protect consumer information, which Jones argues is what set the precedent for increased surveillance capitalism once cookies were established.

Another important component in the history of surveillance capitalism was the emergence of Google, which improved search results by creating a system that collected information about users, their search queries, and their online behaviors. As Shoshana Zuboff points out in The Age of Surveillance Capitalism, the system was initially about enhancing the user experience and providing the most relevant results. However, it didn’t take long for Google to realize the value of the massive amounts of user data it was collecting, which could be used for targeted advertising that profits off people’s personal data. Zuboff also points to Facebook as a pioneer in surveillance capitalism through its collection of user data and targeted advertising. Zuboff argues that increasingly, users are treated as commodities as every click is recorded as some sort of metric for enhanced analytics and profit. Businesses obviously profit off this sort of data, but so too do political organizations that have a stake in influencing voter opinions and behaviors. The last chapter discussed the power of media spins and echo chambers to direct people’s perceptions of reality. By collecting data about these users, political entities could target their ads to individuals who are more likely to respond favorably to their message—because they are predisposed to certain beliefs and values. For example, the Cambridge Analytica scandal in 2016 revealed that the data firm had acquired data from 50 million Facebook users and then sold that data to political campaigns so they could build voter profiles and create targeted ads that would sway voter sentiment (Confessore). This data was used to assist the Ted Cruz and Donald Trump campaigns in 2016 and to promote public support for Brexit.

Surveillance Practices

Remember that word “ubiquitous” from the first chapter? If our use of technology is ubiquitous, so too is surveillance. Hardly a moment of the day goes by when we don’t have our phones nearby, tracking our location. And as mentioned in the previous section, many companies are tracking our activities across the internet, gathering data about sites we’ve visited, hyperlinks we’ve followed, products we’ve purchased, and so on. But the ways in which we are surveilled and the (known) uses of this information extend much further. Though it’s impossible to identify every surveillance tactic and though government organizations and large tech companies aren’t always forthcoming about the types of information they are collecting, this section will look at some of the more common surveillance strategies made possible through a variety of technologies.

Video surveillance. Video cameras are increasingly common, particularly in public spaces such as busy intersections, parking lots, street corners, schools, shopping centers, and restaurants. Often, there will be a sign alerting the public that they are being recorded, but not always. Sometimes the cameras themselves are easy to spot, but not always. As you might imagine, cameras are used in many circumstances to help prevent crime and other unwanted behaviors. They’re also helpful once a crime or some other incident has occurred because they provide footage of what happened and who was involved. Facial recognition software and license plate scanners make it even easier to identify individuals in specific places and visibly track their movements.

The concern many people have relates to the abuse of video surveillance—tracking and recording someone’s movements for the sake of spying or using video surveillance in deceptive or inappropriate ways. For instance, people of color are typically monitored more closely, and video surveillance makes it easy to target and scrutinize these individuals while they are buying groceries, pumping gas, eating dinner, and so on (Lee and Chin). There have also been incidents where video cameras have been discovered in inappropriate places, demonstrating a gross abuse of power. In 2003, a middle school in Tennessee was sued by parents after they found out there were video cameras in the locker rooms that had recorded their children undressing, and because of a lack of security, the footage had been accessed by unauthorized users outside of the school system (ABC News).

Laptops and cell phones also have built-in video cameras, which is a cause for concern. During the COVID-19 pandemic as schools shifted to online learning, many people were concerned about platforms like Zoom that gave viewers access to students’ homes, including their bedrooms and other activities that occurred in the background (Lieberman). Similarly, it became common practice to use remote proctors and other surveillance technologies to monitor students’ behaviors while they were taking tests. In many instances, students felt the monitoring was unfair and overly aggressive. And then there are the typical concerns about hackers who gain access to video feeds from laptops, phones, and personal home surveillance systems, usually with the intention of gaining bank account information or spying on users’ personal activities (Erickson).

Voice recognition. Many technologies also have a microphone, including your phone, your computer, and many of your household devices like Siri or Alexa. There’s been concern for some time that the government spies on personal conversations in order to collect personal information (Weinstein). However, tech companies are also leveraging voice recognition technology in order to identify who we are and build a profile about us. Google says that it uses snippets of voice recordings in order to get aggregate data about user preferences and how the device can be improved, but it says it doesn’t connect specific conversations with specific users (Tucker). Of particular concern is the fact that these smart devices are listening all the time that they are on, even when they haven’t been activated (King). What’s more, the recordings are sometimes sent to third-party reviewers, which puts personal data at an increased risk.

Emails and texts. If you have an email account or a cell phone provided by your employer, your employer has the right to view that information. In fact, many employers have monitoring software installed on employee computers that allows them to view emails and other online activities (West). Cell phone companies retain records pertaining to text messages and when they were sent. Though most say that they retain the messages themselves for only a short time after they have been delivered, some do store the data for several days or even months after delivery (Evans). Likewise, up until 2017, Google would scan Gmail customers’ emails and analyze them for direct marketing purposes. Though Google no longer scans emails, it is still able to track users across multiple Google apps, including Gmail, Google Chrome, Google Analytics, and DoubleClick. In fact, Google recently published its privacy label for the Gmail app, which reports on all of the types of data Google will collect, including location, purchase history, contacts, user content, search history, identifiers, and “other data” (Google, “Gmail”).

Social media. Social media companies like Facebook, X (formerly Twitter), Instagram, and YouTube can also collect information about you. This includes the information you provide in your profile as well as your posts. It also includes your list of contacts and the things that you like and share. All of this is information that can be used to provide targeted ads across multiple web platforms—not just your feed on that particular site. Many social media sites also collect information about your location, phone and email contacts, text and email messages, payment information, online views, and more. These social media companies not only use the information to better their own platforms, but they also make a lot of money by selling user information to advertisers (Vigderman and Turner).

Digital records. In the digital age, most of the private information about you is stored on a digital server somewhere—your health care history, the medications you take, your bank account information, your grades, your credit score, your legal record. For instance, think about the type of information that most colleges collect about individual students—high school transcripts, income information, bank information, individual test scores and homework activities, attendance records, disciplinary files, physical or mental health information, specific disabilities, and so on. Likewise, doctor’s offices, government organizations, merchants, and others all collect private information about you.

Big data. On a larger scale, more businesses as well as educational and governmental organizations are able to collect big data—massive amounts of information from a lot of different users that help them see big-picture trends. It’s less about the individual user and more about analytics. To put it into context, an article in the Harvard Business Review reports that Walmart collects 2.5 petabytes of information every hour based solely on customer transactions (McAfee and Brynjolfsson). Since a petabyte is “20 million filing cabinets worth of text,” that means Walmart can fill 50 million filing cabinets in the course of an hour, and that doesn’t count the information collected through cookies and other surveillance tools.
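The arithmetic behind that filing-cabinet comparison is simple enough to verify yourself, using the two figures cited from McAfee and Brynjolfsson:

```python
# Back-of-the-envelope check of the Walmart comparison above:
# 2.5 petabytes collected per hour, and roughly 20 million filing
# cabinets' worth of text in a single petabyte.
petabytes_per_hour = 2.5
cabinets_per_petabyte = 20_000_000

cabinets_per_hour = petabytes_per_hour * cabinets_per_petabyte
print(int(cabinets_per_hour))  # 50000000 — fifty million cabinets an hour
```

Fifty million physical filing cabinets per hour, from a single retailer's checkout data alone, is a useful way to grasp how far digital collection has outstripped the paper-file era this chapter opened with.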

IoT. Let’s look at one last category—Internet of Things devices. These are smart devices that connect to the internet and can then be controlled by our phones. Home security systems, garage doors, heating and cooling systems, medical sensors, smart watches, digital assistants like Alexa, doorbell cameras, and so on are all IoT devices that collect data. Some of the information is logged right away when you initialize the product and create a user profile, but most of the information is created through your ongoing use of the product (McFadin). This includes information about the product and its status, the surrounding environment, and usage.

Activity 4.1

This section includes many of the most common forms of surveillance and data collection. Make a list of the surveillance technologies that you encounter each day—whether on your personal devices or in public spaces.

After you make your list, separate them into two lists—one for what you consider to be acceptable or even beneficial uses of surveillance and one for what you consider to be too intrusive or threatening in some way. Consider your rationale for what constitutes positive surveillance (if anything) and what types of activities cross the line.

Commercialization and Behavior Modification

By now it’s probably pretty clear that surveillance capitalism is about power and profit. Thinking back to Foucault’s theory of the panopticon and the psychological effects that surveillance has on an individual, surveillance is a mechanism of control because it creates “docile” bodies who have become conditioned to behave in desirable ways, typically submitting to authority and accepting certain behaviors (and identities) as “normal.” Once people are conditioned in this way, it’s easy to capitalize on their perceptions and values to create market demand for certain products and to influence buying behaviors. Organizations in every industry collect information about people’s experiences and turn it into data that can be leveraged to make predictions about what they will do. The more relevant the advertising is to an individual, based on their demographics, interests, location, and so on, the more likely they are to buy the product. In an interview about her book on surveillance capitalism, Zuboff explains,

The opportunities to align supply and demand around the needs of individuals were overtaken by a new economic logic that offered a fast track to monetization.…This economic logic has now spread beyond the tech companies to new surveillance-based ecosystems in virtually every economic sector, from insurance to automobiles to health, education, finance, to every product described as “smart” and every service described as “personalized.”

Zuboff demonstrates the pervasiveness of surveillance capitalism with the recent discovery that even breathing machines are collecting data about people with sleep apnea. Health insurance companies receive that data and then use it for their own benefit, usually to deny or reduce coverage.

It would be impossible to trace all of the ways that surveillance capitalism either makes or saves (as in the example above) companies money, but a few statistics might help put the growth of surveillance capitalism and digital advertising in perspective:

  • From 2001 to 2020, digital advertising grew by more than 40% (Ebsworth et al.).
  • In 2021, digital advertising rose nearly 30% from the previous year, marking the largest annual growth in history (Oberlo). The total spent on digital advertising that year was more than $520 billion worldwide.
  • Google, Amazon, and Facebook receive the bulk (two-thirds) of these advertising dollars.
  • Amazon’s income from advertising doubles every two years ($14.1 billion in 2019). In 2020, Google made $147 billion in advertising. In that same year, Facebook made $86 billion from advertising, accounting for 98% of its revenue (Ebsworth et al.).

Certainly, surveillance capitalism has proven to be popular, which is why it has grown so quickly. However, the major concern stems from two interrelated issues:

  • Invasion of privacy. As much of this chapter has demonstrated, surveillance capitalism is clandestine (Andrew et al.). While people create Facebook and Google accounts because they believe these services are created for their benefit, they are often unaware that their profile information and online activities are tracked so that commercial entities can commodify their experiences and inundate them with advertising. Lots of different applications and smart devices have access to data that many people couldn’t imagine, and much of it is sensitive data that they wouldn’t want to be collected—and they certainly wouldn’t want it to be shared and exploited.
  • Behavior modification. Researchers like Zuboff claim that surveillance capitalism not only predicts what we will do but can actually alter the choices that we make through consistent yet subtle messaging that rewards us for certain behaviors and thus conditions us toward preset outcomes. Zuboff puts it this way: “The shift is from monitoring to what the data scientists call ‘actuating.’ Surveillance capitalists now develop ‘economies of action,’ as they learn to tune, herd, and condition our behavior with subtle and subliminal cues, rewards, and punishments that shunt us toward their most profitable outcomes.”

Behavior modification goes hand in hand with surveillance. As companies collect more data about us, they are able to create better algorithms and ultimately make better predictions. This is largely dependent, however, on people’s increasing engagement with digital media. Nir Eyal’s book Hooked: How to Build Habit-Forming Products explains the way that companies intentionally create addictions to products through a system of “triggers” (prompting users to use the product) and “rewards” (based on the user’s compliance with the trigger). As Ebsworth et al. explain, “Increasing engagement via the hook model provides both a constant source of behavioural data and a committed audience for advertisements.” In other words, as engagement (and addiction) increase, free will and freedom of thought diminish as people are “conditioned,” “tuned,” and “herded” to make specific buying choices. As Foucault would say, they have become “docile” bodies.

Importantly, the concern about behavior modification is about more than just how people spend their money. It’s also about ideologies and political choices. Political parties and interest groups can use data collection to steer people toward certain websites and information and use a similar system of rewards and punishments to direct their attention as well as their ways of thinking. “Political partisans, including shadowy private interests and foreign governments, promise to swing voters and elections with microtargeted social media campaigns” (Lauer and Lipartito 1). Some notable examples include the Cambridge Analytica scandal noted above and also more recent attempts to influence Cuban voters in South Florida with propaganda that positions Democrats as communists (Kapnick; O’Sullivan and Sands).

Employers also use surveillance technologies to control behaviors. Systems for clocking in and out, video cameras in public workspaces, and software that monitors employee activities on work computers and cell phones all have the effect of keeping employees in line so that they can keep their jobs and perhaps be rewarded with promotions or pay increases. This is exactly the kind of surveillance and control that ties directly to Foucault’s panopticon. People modify their behaviors to avoid punishment (or gain a reward), and over time as they internalize the gaze of the “guard” (in this case, the employer, but it could also be a teacher, government official, religious leader, etc.), their ways of thinking and being begin to shift in ways that benefit the authorities in charge.

Importantly, while everyone in society is surveilled to some degree, it is always the populations with the least amount of power—those who are already marginalized—who are surveilled to the greatest extreme: welfare recipients, people of color, immigrants, working-class employees, criminals, and women. Gellman and Adler-Bell explain it like this:

Privacy scholars speak of philosophical rights and hypothetical risks; privacy-minded middle class Americans fear allowing the government too much access to their electronic trails. But there is nothing abstract about the physical, often menacing, intrusions into less fortunate neighborhoods, where mere presence in a “high-crime” area is grounds for detention, search, and questioning by police. At age sixty-five, tens of millions of Americans claim their Medicare benefits with nothing more eventful than completion of some forms. (Medicare.gov even promises to “protect your privacy by getting rid of the information you give us when you close the browser.”) An impoverished single mother on Medicaid faces mortifying questions, face-to-face with benefit managers, about her lovers, hygiene, parental shortcomings, and personal habits.

Their point is that surveillance capitalism affects everyone, but not to the same degree. Its prevalence in our society has had real consequences for minorities, subjecting them to higher levels of scrutiny, creating pretexts for “punishments,” and perpetuating discriminatory practices and injustices.

You might wonder what all of this has to do with digital writing. In some ways, it may seem more geared toward digital literacy: understanding how to use digital media and the consequences of your choices. This critical unit is about complicating overly simplistic and utopian views of digital media as unequivocally progressive and beneficial. It’s about lifting the veil to examine the more negative effects of digital media, which will hopefully lead to more informed and intentional engagement. Also, as we’ll see in the third section of the book, related to functional literacy, digital writers do participate in surveillance practices. SEO is based exclusively on behavioral data: how people search for information and what they do as they travel across the internet. Digital writers use that data to write copy that will capture users’ attention. They also use analytics to assess the effects of their SEO practices, and they use lead generation to send email campaigns and targeted social media ads to people—some of whom may have opted in to receive updates and ads, but many of whom haven’t. Critical literacy means thinking more deeply about these practices and their ethical and social—not just financial—consequences in order to make informed choices.

Protecting Your Privacy

While it’s impossible to navigate online spaces and remain completely off the grid, there are some important ways that you can control and limit the types of information that businesses collect about you. You can—and should—take important steps to protect your privacy, your data, and your accounts from cybercriminals. This final section of the chapter lists some basic things that you can do.

  • Create strong passwords. Probably the best thing you can do to protect your private information from hackers is to create passwords that they can’t guess. That means your passwords should be long and include a mix of uppercase and lowercase letters, numbers, and symbols. You should also use a different password for each account. That may be a lot of passwords to keep track of, but password managers are available that generate strong passwords, keep track of the passwords for your different accounts, and monitor your accounts for data breaches. Some common password managers are 1Password, LogMeOnce, and Dashlane. Of course, if you use a password manager, you’ll want to make sure that your “master password” for that account is strong.
  • Use two-factor authentication. Many accounts are moving toward requiring two-factor authentication, which grants you access to your account only when you provide two pieces of evidence that you own it, such as a password plus a temporary passcode, facial recognition, a fingerprint, or the answer to a security question. This is a great way to prevent unauthorized users from gaining access to your accounts. You can enable two-factor authentication on many of your accounts, including your Apple ID, Google, Facebook, X, and banking websites. Check your account settings to see what types of security features are available.
  • Install a browser extension. As this chapter discusses, companies and other organizations can collect more information about you than you might realize based on your browsing history. A browser extension like uBlock Origin is designed to prevent this data collection as well as the targeted ads generated as a result. Privacy Badger is another browser extension that blocks trackers and targeted ads.
  • Cover your camera. It’s easier than you might think to install software on your computer that will take over your camera and collect private footage. An easy way to avoid that happening is to put a sticker over your camera.
  • Opt out of targeted ads. Platforms like Apple, Facebook, Google, and X allow you to disable interest-based ads in your account settings, which will help cut down on the number of targeted ads you receive. You can also use the Simple Opt Out website to prevent similar data collection and ad targeting by numerous other organizations. Since you are “opted in” by default, you’ll have to manually opt out of any sort of data collection.
  • Consider a virtual private network (VPN). If you often connect to public Wi-Fi, a VPN can create a more secure connection by encrypting your data and hiding your IP address so that your internet service provider and other third-party organizations can’t track your activities or see your information.
  • Use antivirus software. Once your device has been “infected” with a virus, it’s incredibly difficult to recover. Antivirus software adds another layer of protection that helps prevent viruses and other malicious software. Windows Security (formerly Windows Defender) is built into Windows, and Mac computers also come with built-in security features. You can always add another layer of security, such as Norton Antivirus or Malwarebytes. It’s also important to install updates for your browser, apps, and home devices as they become available; the latest versions often include improved security features designed for the latest threats.
  • Enable remote tracking on your phone. If you lose your phone, it’s important to make sure your data doesn’t fall into the wrong hands. To enable remote tracking, you must first set up a PIN and/or a biometric login (i.e., facial or fingerprint recognition). You can then activate your phone’s remote tracking feature, which will allow you to locate your phone and, if necessary, erase all of its data remotely.
  • Beware of phishing scams. Increasingly, scammers target people through emails and text messages, often posing as someone you might know and encouraging you to click a specific link or download a file. Never click on a link in an email or text message unless you are certain it’s legitimate. You can hover over the link to see the actual URL and check whether it matches the destination it claims. Misspellings and invalid email addresses or phone numbers are also red flags. Avoid downloading unknown files unless you can verify with the sender what they are. These are common ways that criminals spread malware that can lock you out of your device.
  • Encrypt the information on your laptop. It’s also possible to enable encryption on your laptop so that if it’s ever lost or stolen, your data can’t be accessed because it will come up as gibberish. For Mac users, this encryption tool is called FileVault and can be turned on in the system preferences menu. For PC users, device encryption is available under the settings menu, under “Updates & Security” (Morse).
  • Be discerning. You want to keep your phone number and email address as private as possible. Don’t give them out unless you have to, and if you do have to give an email address to sign up for coupons or some other perk, it’s helpful to have a secondary “burner” email address devoted to spam. That way, if the account does get hacked, it doesn’t include a lot of personal information about you, and it’s not connected to your important accounts. You should also make sure your settings on social media and other accounts are set to “private” and that you don’t share sensitive information or photos over social media, email, or text messaging that could be damaging if they fell into the wrong hands.
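For readers curious about the mechanics behind the password advice above, the “long, mixed-character” rule can be sketched in a few lines of code. This is an illustrative sketch, not part of the chapter’s sources; it uses Python’s standard `secrets` module (designed for security-sensitive randomness), and the length and symbol set are arbitrary choices.

```python
import secrets
import string

SYMBOLS = "!@#$%^&*"

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing uppercase and lowercase
    letters, digits, and symbols, using a cryptographically
    secure random number generator."""
    alphabet = string.ascii_letters + string.digits + SYMBOLS
    while True:
        pwd = "".join(secrets.choice(alphabet) for _ in range(length))
        # Retry until every character class is represented.
        if (any(c.islower() for c in pwd)
                and any(c.isupper() for c in pwd)
                and any(c.isdigit() for c in pwd)
                and any(c in SYMBOLS for c in pwd)):
            return pwd

print(generate_password())  # a different 16-character password on every run
```

This is essentially what the password managers mentioned above do when they offer to generate a password for you: a long string drawn uniformly from a large alphabet is far harder to guess than anything a person invents.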
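The temporary passcodes used in two-factor authentication are typically produced by the TOTP algorithm (RFC 6238), which hashes a secret shared between you and the service together with the current 30-second time window, so the code changes constantly but both sides can compute it. A minimal sketch using only Python’s standard library (the base32 secret below is the well-known RFC test value, not a real account secret):

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238): HOTP computed over
    the number of 30-second intervals since the Unix epoch."""
    key = base64.b32decode(secret_b32)
    return hotp(key, int(time.time()) // step)

# RFC test secret "12345678901234567890", base32-encoded:
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"))  # a 6-digit code valid for ~30 seconds
```

Authenticator apps on your phone run exactly this kind of computation, which is why the codes work even when the phone is offline.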
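The advice to hover over links before clicking comes down to comparing the hostname a link actually points to with the one it appears to show. A small illustration using Python’s standard `urllib` (the bank and phishing URLs here are invented examples):

```python
from urllib.parse import urlparse

def hostname_of(url: str) -> str:
    """Extract the lowercase hostname a link actually points to."""
    return (urlparse(url).hostname or "").lower()

# A phishing message might display one address while linking to another.
displayed_text = "https://www.mybank.com/login"
actual_href = "http://www.mybank.com.example-phish.test/login"

print(hostname_of(displayed_text))  # www.mybank.com
print(hostname_of(actual_href))     # www.mybank.com.example-phish.test
```

Note that the second hostname merely begins with “www.mybank.com”; what matters is the domain it ends with, which belongs to the scammer. That is the detail to check when you hover over a link.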

As you can see, these guidelines relate to privacy in a variety of ways—from targeted ads and data collection prompted by businesses and tech companies to data breaches from hackers looking to steal your private information. Though you won’t be able to block all forms of surveillance that have become so prominent in our society, taking a few easy steps and being smart about the information you share will help you take control of your data and provide a measure of protection from the invisible forces that have a stake in tracking your digital activities.

Activity 4.2

This section provided lots of helpful, and fairly easy, ways that you can protect your information. This activity is simple: See how many of the security measures you can complete on your phone or laptop in the course of 15 or 20 minutes. Many of them require a simple security adjustment or a change to your account preferences.

For those that are more complex—researching VPNs and browser extensions, for instance—do some research about your options, including features, cost, and consumer ratings. Compile a list of your top options.

Discussion Questions

  1. Look again at the introduction to this chapter and the analogy of the manila folder as it relates to today’s more complex data collection. How did the manila folder and other organization systems make surveillance possible? Why is it fitting that the manila folder is sometimes seen as a symbol of oppression?
  2. The chapter states in several places that data collection has the power to control behaviors in various ways. What does this mean? In what ways does surveillance influence people’s behaviors?
  3. What is a panopticon, as described by Foucault? How does the metaphor of the panopticon relate to current surveillance practices in our culture?
  4. Which types of institutions perform this type of “panopticon” surveillance? Name as many as you can think of.
  5. What does the “capitalism” in “surveillance capitalism” refer to? How do companies make money off surveillance practices?
  6. What are some key historical events that perpetuated surveillance capitalism as we experience it today?
  7. This chapter mentions several ways that your data are collected. Which ones were you already familiar with? Which ones were a surprise?
  8. How do data collection and targeted advertising relate to behavior modification? What are some basic activities that organizations want to “nudge” people to do? Can you think of any specific examples of this type of nudging?
  9. Why is surveillance capitalism such a concern?
  10. What are some basic ways that you can protect your own personal data?

Sources

1Password. “The World’s Most Loved Password Manager.” 1Password.com, n.d., https://1password.com.

Andrew, Jane, et al. “Data Breaches in the Age of Surveillance Capitalism: Do Disclosures Have a New Role to Play?” Critical Perspectives on Accounting, vol. 86, 7 Dec. 2021, https://www.sciencedirect.com/science/article/pii/S1045235421001155.

Apple. “Control Personalized Ads on the App Store, Apple News, and Stocks.” Apple.com, 2022, https://support.apple.com/en-us/HT202074.

Apple Store. “Gmail-Email by Google.” Apple.com, 2022, https://apps.apple.com/us/app/gmail-email-by-google/id422689480.

Banks, Adam. Race, Rhetoric, and Technology: Searching for Higher Ground. Routledge, 2005.

Bristow, David L. “How Filing Cabinets Changed the World.” History Nebraska, 24 Feb. 2022, https://history.nebraska.gov/blog/how-filing-cabinets-changed-world.

Dashlane. “Dashlane.” https://www.dashlane.com.

Ebsworth, Jonathan, et al. “Surveillance Capitalism: The Hidden Costs of the Digital Revolution.” Evangelical Focus Europe, 7 Sept. 2021, https://evangelicalfocus.com/jubilee-centre/13158/surveillance-capitalism-the-hidden-costs-of-the-digital-revolution.

Erickson, Alexa. “The Real Likelihood You’re Being Watched through Your Laptop Computer.” Reader’s Digest, 1 Apr. 2022, https://www.rd.com/article/laptop-camera/.

Evans, Joseph B. “Cell Phone Forensics: Powerful Tools Wielded by Federal Investigators.” Fordham Journal of Corporate and Financial Law, 2 June 2016, https://news.law.fordham.edu/jcfl/2016/06/02/cell-phone-forensics-powerful-tools-wielded-by-federal-investigators/#:~:text=Cellular%20service%20providers%20retain%20records,very%20long%2C%20if%20at%20all.

Eyal, Nir. Hooked: How to Build Habit-Forming Products. Penguin Publishing Group, 2014.

Facebook. “How Can I Adjust How Ads on Facebook are Shown To Me Based on Data about My Activity from Partners?” Facebook.com, https://www.facebook.com/help/568137493302217.

Foucault, Michel. Discipline & Punish: The Birth of the Prison. Vintage Books, 1975.

Gellman, Barton, and Sam Adler-Bell. “The Disparate Impact of Surveillance.” The Century Foundation, 21 Dec. 2017, https://tcf.org/content/report/disparate-impact-surveillance/?session=1.

Google. “Block Certain Ads.” Google Support, https://support.google.com/ads/answer/2662922?hl=en.

———. “How Google Uses Cookies.” Google Terms & Privacy, https://policies.google.com/technologies/cookies?hl=en-US.

Jones, Meg Leta. “Surveillance Capitalism Online: Cookies, Notice and Choice, and Web Privacy.” Surveillance Capitalism in America, edited by Josh Lauer and Kenneth Lipartito, University of Pennsylvania Press, 2021.

King, Evan Thomas. “How Google and Amazon Are ‘Spying’ On You.” Consumer Watchdog, 15 Dec. 2017, https://consumerwatchdog.org/uncategorized/how-google-and-amazon-are-spying-you/.

Laidler, John. “In New Book, Business School Professor Emerita Says Surveillance Capitalism Undermines Autonomy—and Democracy.” The Harvard Gazette, 4 Mar. 2019, https://news.harvard.edu/gazette/story/2019/03/harvard-professor-says-surveillance-capitalism-is-undermining-democracy/.

Lauer, Josh, and Kenneth Lipartito. Introduction. Surveillance Capitalism in America, edited by Josh Lauer and Kenneth Lipartito, University of Pennsylvania Press, 2021.

Lee, Nicol, and Caitlin Chin. “Police Surveillance and Facial Recognition: Why Data Privacy is Imperative for Communities of Color.” Brookings, 12 Apr. 2022, https://www.brookings.edu/research/police-surveillance-and-facial-recognition-why-data-privacy-is-an-imperative-for-communities-of-color/#:~:text=In%20fact%2C%20surveillance%20and%20data,current%20circumstances%20and%20political%20regimes.

Lieberman, Mark. “Massive Shift to Remote Learning Prompts Big Data Privacy Concerns.” Edweek.org, 27 Mar. 2020, https://www.edweek.org/technology/massive-shift-to-remote-learning-prompts-big-data-privacy-concerns/2020/03.

LogMeOnce. “All-In-One Security Platform.” LogMeOnce.com, https://www.logmeonce.com.

Lui, Claire. “The Manila Envelope: The Inspiration Behind an Exhibition’s Graphic Identity.” Guggenheim.org, 2 Apr. 2021, https://www.guggenheim.org/blogs/checklist/a-manila-envelope-the-inspiration-behind-an-exhibitions-graphic-identity.

McAfee, Andrew, and Erik Brynjolfsson. “Big Data: The Management Revolution.” Harvard Business Review, Oct. 2012, https://hbr.org/2012/10/big-data-the-management-revolution.

McFadin, Patrick. “Internet of Things: Where Does the Data Go?” Wired, Mar. 2015, https://www.wired.com/insights/2015/03/internet-things-data-go/.

Malwarebytes. https://www.malwarebytes.com/.

Metz, Rachel. “Yes, Tech Companies May Listen When You Talk to Your Virtual Assistant. Here’s Why That’s Not Likely to Stop.” CNN.com, 19 Aug. 2019, https://www.cnn.com/2019/08/19/tech/siri-alexa-people-listening.

Microsoft. “Stay Protected with Windows Security.” Microsoft.com, https://support.microsoft.com/en-us/windows/stay-protected-with-windows-security-2ae0363d-0ada-c064-8b56-6a39afb6a963.

Morse, Jack. “How to Encrypt Your Computer (and Why You Should).” Mashable.com, 14 Aug. 2021, https://mashable.com/article/how-to-encrypt-computer-windows-mac#:~:text=How%20to%20encrypt%20a%20Windows,encryption%20option%2C%20select%20Turn%20on.

Norton Antivirus. https://us.norton.com/.

Oberlo. “Digital Ad Spend (2021–2026).” Oberlo.com, https://www.oberlo.com/statistics/digital-ad-spend#:~:text=Digital%20Advertising%20Growth,expected%20to%20continue%20in%202023.

Ong, Walter. “The Writer’s Audience is Always a Fiction.” PMLA, vol. 90, no. 1, 1975, pp. 9–21, https://www.jstor.org/stable/461344.

Ren, Jingjing, et al. “Information Exposure from Consumer IoT Devices: A Multidimensional Network-Informed Measurement Approach.” IMC, Proceedings from the Internet Measurement Conference, 2019, pp. 267–269, https://moniotrlab.ccis.neu.edu/wp-content/uploads/2019/09/ren-imc19.pdf.

Salvucci, Jeremy. “What Was the Dot-Com Bubble & Why Did It Burst?” The Street, 12 Jan 2023, https://www.thestreet.com/dictionary/d/dot-com-bubble-and-burst.

Schabner, Dean. “Locker Room Cameras Expose School to Suit.” ABCnews.com, 16 July 2003, https://abcnews.go.com/US/story?id=90486&page=1.

Simple Opt Out. “Opt Out of the Data Sharing You Wouldn’t Opt in to.” Simple OptOut.com, https://simpleoptout.com/.

Tucker, Jamey. “Voice Assistants Are Always Listening. Should You Be Worried?” What The Tech, 8 Feb. 2021, https://www.whatthetech.tv/voice-assistants-are-always-listening-should-you-be-worried/.

Twitter. “Interest-Based Opt-Out Policy.” Twitter Business, https://business.twitter.com/en/help/ads-policies/product-policies/interest-based-opt-out-policy.html.

uBlock Origin. https://github.com/gorhill/uBlock.

Vigderman, Aliza, and Gabe Turner. “How Much Would You Sell Your Social Media Data for?” Security.org, 22 July 2022, https://www.security.org/blog/how-much-would-you-sell-your-social-media-data-for/#.

Weinstein, Adam. “The Government’s Phone, Text, and Email Spying Explained.” ABCnews.com, 7 June 2013, https://abcnews.go.com/ABC_Univision/governments-phone-text-email-spying-explained/story?id=19347440.

West, Darrell M. “How Employers Use Technology to Surveil Employees.” Brookings.edu, 5 Jan. 2021, https://www.brookings.edu/blog/techtank/2021/01/05/how-employers-use-technology-to-surveil-employees/.

The White House. “Technology and Innovation.” The Clinton-Gore Administration: A Record of Progress, 2000, https://clintonwhitehouse4.archives.gov/WH/Accomplishments/technology.html.

Zuboff, Shoshana. The Age of Surveillance Capitalism. Profile Books, 2019.

License

Writing for Digital Media by Cara Miller is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.