6 Affordances and Constraints: A Critical Analysis of Digital Spaces

This final chapter in the Critical Literacy Unit ties the unit together as we delve a bit deeper into a critical perspective on technology and how some of the theories we’ve discussed so far might be applied to your daily life. The previous chapters have explored various aspects of digital technology with the goal of complicating and extending your understanding of its social, professional, educational, and civic benefits. Hopefully, by now it’s clear why the concept of a “digital panacea” is a myth and that for you to be truly intentional and effective as you navigate various digital spaces, having a deep understanding of the benefits as well as the limitations of those spaces is crucial.

Far from endorsing the more dystopian views of technology, this textbook celebrates the opportunities and incredible rewards that so often come along with online engagement—the opportunity to quickly and easily find information about a variety of topics, the ability to communicate with family, friends, and colleagues to maintain social connectedness and increase productivity, the ability to engage in civic groups and participate in meaningful dialogue to address important issues. It’s no wonder that so much of our lives revolve around digital technologies; they’ve become integral to our communication patterns, our daily workflows, our personal interests, our civic activities, and our entertainment preferences. Indeed, part of critical literacy is being aware of the various ways that you rely on digital technology. As we’ll see in this chapter, these technologies don’t just affect what you can do; they have a profound impact on your thought patterns and your sense of identity—what you can think and who you can be. And while that might sound a little chilling, particularly after our discussions about surveillance and behavior modification, it’s not necessarily a bad thing, and it’s certainly not new. Human beings have always been “shaped” by the tools that they use. Digital technology is a profound example in a long line of such tools, and it has significantly expanded the types of identities we can inhabit.

As we’ve considered in different ways throughout this first unit, digital technology isn’t automatically, inherently beneficial. Digital participation carries opportunities for reward as well as real risks. Without intentionality and careful self-monitoring, your use of digital technology can have a negative effect on your social relationships, your cognitive function, and your emotional health. (See chapter 2 for a recap of “The Bad and The Ugly.”) Without a conscious awareness of your everyday digital practices and the more subtle—often invisible—encounters that you might have with digital surveillance, commercialization, echo chambers, and underlying agendas and spins, you will most certainly be susceptible to the manipulations of other people whose values and interests may or may not align with your own. You’ll be unaware of how businesses and large, impersonal corporations motivated by profit margins influence how you spend your time and money, what you think about, and even what you believe to be true about the world. There’s also the reality that you have certain “privileges”—as evidenced by your basic ability to access this digital text (i.e., your access to a digital device and your ability to read the text)—and without a critical awareness of what those privileges are and the difficult obstacles that other people face, you’d never notice them. You wouldn’t be a positive force to combat some of the most fundamental social injustices because you wouldn’t have challenged yourself to pay attention to what they are.

This textbook is focused on helping you develop your own digital literacy so that you have the underlying skills and critical thinking required to leverage the numerous benefits of participating in digital spaces while also knowing how to identify and mitigate the risks and limitations. While the previous chapters in this unit focused on specific theories and concepts related to critical literacy, this final chapter takes a step back to look more generally at the affordances and constraints of digital tools. It will provide a framework that will help you think more deeply about the technologies, platforms, and practices that define your own participation and tease out some of the inherent complexities of the choices you make from an ideological, personal, professional, and practical perspective.

Learning Objectives

  • Understand the concept of mediation and how technologies are always used as tools to mediate human activities.
  • Adopt a broader understanding of “technology” and how many of the basic technologies that we use have become “invisible” to us.
  • Learn the concept of affordances and constraints and how this framework can be used to take a closer look at how technologies affect your ways of doing, meaning, relating, thinking, and being.
  • Understand the concept of ideology and how ideologies are embedded in the design and use of digital technologies.
  • Consider the technologies you use for personal purposes—for communication, shopping, entertainment, health care—and how the framework of affordances and constraints might be applied.
  • Consider the technologies you use (or will be required to use) in your (intended) profession and how the framework of affordances and constraints might be applied.

Affordances and Constraints of Digital Tools

Most of the time, when we think of digital technologies, we focus on the things they allow us to do. For instance, we can talk “face-to-face” with someone miles away using FaceTime or Zoom. We can curate our own music playlists and listen to the songs we like all day long with portable earbuds. We can quickly and easily see what our friends, family, and acquaintances are up to with a quick scroll through our social media feed. And on and on. Many of the technologies that we use have become so commonplace that we don’t really give much thought to what life would be like without them. However, when we’re forced to go without them—maybe because we forgot our phone at home or the Wi-Fi has gone out—then suddenly we realize just how much we rely on digital technology to accomplish even the most basic tasks.

While digital technology—like all technology—certainly does allow us to do things more quickly and easily than we might have otherwise, its effects on our lives are much more profound. As we’ll discuss in this section, it has an inherent influence on our thought patterns and our own sense of “self” as we act out identities utilizing these tools. We’ll be relying heavily on the scholarship of Rodney Jones and Christoph Hafner, whose book Understanding Digital Literacies: A Practical Introduction describes the way that technology mediates all of our activities. Remember from chapter 3 that to “mediate” means to go between two things or to “facilitate interaction” (Jones and Hafner 2). They aren’t referring only to digital technology but to any sort of technology that is human-made for the purpose of helping us accomplish certain tasks. Pencils, blankets, clothing, furniture, and toothbrushes are all examples of technologies created for very practical purposes. And as Jones and Hafner point out, they also “mediate” our activities as we interact with the world. For instance, we would be hard-pressed to keep our teeth clean and healthy without using certain technologies—a toothbrush, toothpaste, dental floss. The same goes for the quick reminder we might write ourselves on a Post-it note or the coffee that we drink to get ourselves going in the morning. Without the pencil and paper or the coffeemaker and mug, we wouldn’t be able to complete these very basic tasks, which probably seem fairly incidental in the overall scope of the day—until we have to go without them.

One of the best examples of technology and how it mediates our activities relates to our communication practices. In itself, language is a technology. It’s a human-made tool that helps us accomplish specific purposes, and it serves as a mediator between our thoughts and feelings and the people with whom we want to communicate those thoughts and feelings. Unfortunately, the Vulcan mind meld—where you simply touch your fingertips to someone else’s head in order to accomplish total mutual understanding—isn’t a real thing. You have to use language—be it spoken, written, signed, or conveyed through body language—to help someone else understand your meaning. Language is a medium that facilitates your interactions with others. As we learned in chapter 3, language is often limited in helping us fully express ourselves and understand others. Misinterpretations and misunderstandings are common, though there is something to be said for the way that communication allows people to connect in very profound ways (Balter).

The main idea is that the technologies that we use do help us accomplish a variety of activities, but beyond that, they also become “extensions of ourselves” (Jones and Hafner 2). The more that we rely on everyday technologies to quickly and easily perform tasks, the more invisible those technologies become (Steinhardt). We might be extremely aware of how newer technologies like Zoom or virtual reality equipment open up new types of experiences and possibilities, but we don’t think so much about how our coffee mug holds hot and cold liquids and lets us carry those liquids around. It’s an invisible technology that we take for granted, adopting it as part of our identity. Jones and Hafner put it like this: “In order to do anything or mean anything or have any kind of relationship with anyone else, you need to use tools. In a sense, the definition of a person is a human being plus the tools that are available for that human being to interact with the world” (2).

Drawing from Marshall McLuhan and Lev Vygotsky (Kurt), Jones and Hafner offer a framework that allows a deeper understanding of the way that technology influences our fundamental definitions of self. In his introduction to Understanding Media: The Extensions of Man, McLuhan writes, “Any extension, whether of skin, hand, or foot affects the whole psychic and social complex.” These extensions don’t just affect what we can do but who we can be and how we relate to other people. Though basic technologies become invisible and are easily taken for granted, they have significant influence over every aspect of our human experience. To examine these influences, Jones and Hafner distinguish between the affordances—the possibilities that are enabled through our uses of technologies—and the constraints—the possibilities that are foreclosed by those same technologies. We can look at both the affordances and constraints of any given technology in five different ways:

  • Doing. At a fundamental level, technology helps us do certain things that we couldn’t do otherwise (the affordances). As we adopt their use for specific activities, they also prevent us from doing other things (the constraints). Jones and Hafner give the example of a microphone—a technology that allows you to easily talk to a large group of people so they can hear you (an affordance), though it makes it impossible to have a private conversation with one or two people in a crowd (a constraint). Similarly, an email allows us to communicate important information in writing to an individual or a group (an affordance), but it’s also impersonal and it slows down the back-and-forth exchange that an in-person conversation would provide (constraints).
  • Meaning. The technologies we use also influence the types of meanings that we can make. Applications like Zoom and Facebook Live, for instance, communicate audio and visual elements and have the effect of making the audience feel like they are simultaneously experiencing an event with someone else. These messages and events take on a different meaning for the participants than they would have otherwise through text alone (affordances). However, those meanings are still limited by the vantage point of the camera or the inability of the microphone to pick up more subtle sounds (constraints). Jones and Hafner also point out that as technologies emerge, we have different ways of referring to our activities—live streaming, zooming, posting, sharing, following, texting, chatting, and so on—new kinds of meanings that we didn’t have before.
  • Relating. Our use of technology influences the types of relationships that we can have based on the ways that we communicate. Some technologies allow for one-way communication from a speaker to an audience. Others allow for private one-on-one conversations as a limited group communicates back and forth. Still other technologies allow large groups of people to communicate back and forth. For instance, a text message is great for having a more private conversation with a closed group of people (affordance). However, it’s not very effective for speaking with a large group (constraint), and the nature of the platform makes it more difficult to share lengthy, in-depth messages (another constraint). Though you also relate to people outside of digital technology in different ways, chances are that many of your interactions do take place in digital spaces, and the technologies that are available to you have a significant influence over who you are able to communicate with and the method of communication.
  • Thinking. According to Jones and Hafner, our use of technology has “the capacity to change the way we experience and think about reality” (7). They go on to say that as we use technologies to accomplish certain tasks and interact with our surroundings, “certain things about the world will be amplified or magnified, and other things will be diminished or hidden from us altogether” (7–8). They are referring not just to what we think about but also our thought patterns. Because we have access to so much information at our fingertips, we don’t spend as much time memorizing facts (which can be seen as both an affordance and a constraint). Instead, we can spend more time on more sophisticated forms of thinking, like creating new theories about the information that we have access to. We can spend more time collecting, analyzing, interpreting, synthesizing, and reasoning (affordances). On the other hand, our constant interaction with short, entertaining videos and brief online text with bullet points and hyperlinks might also affect our ability to engage for longer periods of time with other forms of information that we find “boring” or difficult (constraints).
  • Being. On a fundamental level, the technologies we use have a direct influence on our sense of self—the identities that we inhabit. Doctors and nurses use medical stethoscopes and other medical equipment. Carpenters use hammers and measuring tape and other carpentry tools. As a student, you use textbooks and word processing applications and learning management systems, which are crucial to the ways that you act out your identity as a student. The technologies that you have access to influence the type of person you can be and how you express those identities—allowing for some options through the affordances and foreclosing other options through the constraints.

It might be worth noting that Jones and Hafner don’t consider constraints to be inherently bad. In fact, without ever bumping into a constraint, people often adopt technologies without thinking about better alternatives, and the presence of a constraint can spark people to pursue more creative and innovative solutions. These concepts are simply tools that can help you think more deeply about the technologies you use, even (especially?) those that are so commonplace that you almost forget they are there. For the remainder of this chapter, we’ll use this framework to examine technologies from different perspectives in order to further extend your critical literacy and decision-making skills.

Activity 6.1

Make a list of the different technologies (beyond digital) that you use on a daily basis that have in some ways become “invisible” extensions of yourself.

Alternatively, you might take a look in your backpack or purse or even your pockets. What technologies do you carry with you each day?

For each technology that you identify, consider what those technologies allow you to do. How do they mediate your activity? How might they be considered extensions of yourself?

Now take a closer look at one or two of the technologies you identified. What are the affordances and constraints in terms of what these technologies allow you to do, mean, relate, think, and be?

An Ideological Perspective

A critical perspective of the affordances and constraints of digital technologies would certainly include an examination of the ideologies that are embedded in our assumptions about and uses of the online spaces we navigate. Everything that you do or think or believe about the world is shaped by underlying ideologies—what you have internalized as “normal” or “good” or “valuable.” Through the natural course of interacting with others—your parents, teachers, friends, pastors, and so on—you’ve developed your own ways of thinking about and acting in the world. For our purposes, there are three things that are helpful to understand about ideology:

  1. It’s based in social interaction. Value systems aren’t objective, nor are they independent of human creation. An ideology is a lens that is shaped by social, cultural, and historical forces, advocating one way of seeing and being over others.
  2. It’s often not explicit. Ideologies are implicitly embedded in the things we say and do, but they exist below the surface, often resisting a deeper interrogation. Critical literacy explicitly examines these underlying assumptions.
  3. It’s difficult to notice. Because we are so entrenched in our beliefs about the world, it’s easy to take them for granted as the objective reality—the “right” way of seeing and thinking. That’s why dominant ideologies that serve dominant groups at the expense of others are so difficult to identify and dismantle.

As mentioned in the introductory chapter, this book mirrors the framework for teaching digital literacies as outlined in Stuart Selber’s Multiliteracies for a Digital Age. He too dedicates a section of his book to “critical literacy,” which focuses on “how students might be encouraged to recognize and question the politics of computers” (75). In a similar way, the “Critical Literacy” portion of this book is dedicated to examining digital technologies and spaces that we tend to take for granted as inherently good and teasing out some of the complexities and political ramifications of our digital media practices. As we’ve seen throughout this section, there are a variety of ways that digital platforms can be utilized, just as there is a range of positive and negative effects that stem from the technologies we adopt and how we use them.

When we examine digital spaces from an ideological perspective, it means that we’re identifying the value systems and power structures embedded in those spaces that we tend to overlook. In 1985, Melvin Kranzberg, professor and president of the Society for the History of Technology (SHOT), gave a presidential address at the Henry Ford Museum, where he outlined “truisms” about how technologies develop and the effect they have on society. The first of his “Kranzberg’s Laws” was this: “Technology is neither good nor bad; nor is it neutral” (545). In other words, digital technologies are designed by human beings with their own biases and ideologies, and those biases are reflected in the ways those technologies are used, often having the effect of benefiting certain groups of people over others. For instance, a 2016 article about algorithmic bias demonstrates that even the math formulas that are used in the design of a program aren’t neutral (Kharazian). The article looks specifically at a report about the Pokémon GO app and the fact that the vast majority of the game’s stops are situated in white neighborhoods, which, in turn, made the game more accessible for white users living in those areas.

The Pokémon GO example illustrates some of the things we’ve already said about ideologies being embedded in digital technologies, usually in ways that are subtle, implicit, and easy to overlook. An ideological perspective is a way of looking at digital spaces beyond your immediate needs and uses and considering other types of questions: “What is lost as well as gained? Who profits? Who is left behind and for what reasons? What is privileged in terms of literacy and learning and cultural capital? What political and cultural values and assumptions are embedded in the software?” (Selber 81). These are all questions that relate back to the affordances and constraints that we identified earlier in the chapter and the deeper questions that consider how technologies encourage certain ways of doing, meaning, relating, thinking, and being. Selber goes on to say,

As such an uncomfortable line of questions implies, a critical approach to literacy first recognizes and then challenges the values of the status quo. Instead of reproducing the existing social and political order, which functional modes tend to do [i.e., those focused solely on the “how to”], it strives to both expose biases and provide an assemblage of cultural practices that, in a democratic spirit, might lead to the production of positive social change. (81)

In other words, the technologies we adopt can perpetuate social inequalities (Gaskell), but when we adopt an ideological perspective, we can expose biases and power structures and work toward effective solutions.

Once again, Selber provides a helpful framework to examine specific technologies and digital spaces. He identifies the following parameters:

  • Design cultures—considering the values and perspectives that influence the way that technologies are designed. Do some users benefit more than others because of embedded design bias?
  • Use contexts—looking at the ways that technologies are used in specific contexts. For instance, certain applications are sometimes required in the classroom or in the workplace in order to complete certain tasks. What are those technologies? Why are they required over others? How do the digital technologies and policies of a place affect different groups of people in the community?
  • Institutional forces—understanding the larger power structures that influence the technologies we adopt. What larger agendas are being served for a university or a company? Are these benefits at the expense of others in the community who have less power?
  • Popular representations—looking at the ways specific technologies are culturally constructed. How are our assumptions about technology embedded in cultural messages? How do those representations influence the technologies we adopt? How are larger practical and ethical concerns addressed (or not)?

From an ideological perspective, aspects of digital technology relate directly to the values that get promoted in a certain space—the beliefs, assumptions, actions, and identities that are promoted as “good” or “normal.” When paired with the critical exercise of considering the affordances and constraints—the types of identities, relationships, thought processes, and so on that are made possible by certain technologies as well as those that are excluded or devalued—we get a much better sense of the ideologies that are embedded in the technologies we use as well as the larger social consequences.

Activity 6.2

Consider a specific digital platform that is prominent in your school or workplace. It might be Canvas, Google Drive, or an application that is more specific to your major.

Now examine that platform from an ideological perspective, using the parameters named in this section:

  • Biases embedded in the design
  • Use context
  • Institutional forces
  • Popular representations

Write a response to each item and then write an overall conclusion about the social values that are embedded in that platform. What are the positive and/or negative consequences for different types of users?

A Personal Perspective

In contrast to the ideological perspective that challenges you to look more broadly at the technologies we adopt and the influences those technologies have on different groups of people and value systems, a personal perspective compels you to look at the individual—the affordances and constraints of the digital platforms that you use. Unlike the professional perspective, which we’ll consider in the next section, the personal perspective doesn’t relate directly to your professional endeavors, though admittedly the line between professional and personal is sometimes blurry (even more so because of digital technology). We’ll be referring more specifically to the technologies that you turn to for accomplishing more personal objectives—communication with family and friends, educational pursuits, health care, and so on. While we already named some of the personal benefits of digital communication technologies in chapter 2, this section challenges you to look at the affordances and constraints of the specific platforms that you utilize. We’ll look together at some basic examples, but the idea is that you would apply the affordances and constraints framework to your own everyday practices.

Communication

There are way too many digital communication platforms to name, but some of the most common are text messaging, email, teleconferencing platforms, and specific social media sites. It will be up to you to list the platforms that are most common to you. At first glance, it’s obvious that digital technologies can greatly enhance our communication practices, making it possible to stay connected with a lot more people, regardless of geographical constraints and time differences. What’s more, different platforms offer different levels of intimacy, ranging from a quick text message or “like” on someone’s Facebook post to the more personal and nuanced conversations we can have with someone over Zoom. The focus in this chapter, as we work with affordances and constraints, is to consider the trade-offs of the platforms you select for given tasks. Each platform offers benefits and opportunities, but as we learned from Jones and Hafner, it also forecloses other possibilities. Because communication is so pivotal to all of the items listed in Jones and Hafner’s framework (i.e., your ways of doing, meaning, relating, thinking, being), it’s important to think carefully about the platforms you use and the effects they have.

Let’s look at one example in particular: text messaging. Texting is a nearly universal form of communication and one of the more prominent ways that people can quickly and easily stay in touch. So let’s look at the affordances and constraints of this particular platform:

Doing. The benefits of texting are probably obvious. It’s quick to send and receive a text message. It allows for direct communication with a single person or a small group. It’s also versatile, allowing users to include not just text but also emojis, pictures, and GIFs. However, there are some definite constraints to messaging that make it less ideal for other forms of communication. For instance, it isn’t effective for having a conversation with a large group of people. Unless read receipts are enabled, you can’t tell when someone has read your message. And you can’t reliably take back a text message once it’s been sent; it can be copied and reshared in ways that are beyond your control.

Meaning. Text messages support short, basic exchanges that let people make plans, share information, and check in with one another. Because a message can contain not only text but also hyperlinks, emojis, pictures, videos, and GIFs, users are able to create deeper meanings and quick links to more developed information. Also, because of the nature of a text thread, there is a documented history of recent messages, which makes it easy to “keep up” with the conversation. In contrast, a text message wouldn’t easily support a long message, and it can’t express the full range of emotions and meaning behind the message. It doesn’t allow readers to hear the inflection or tone in the speaker’s voice, which makes it more likely to be misinterpreted.

Relating. Text messages also allow us to relate to family and friends in a convenient way, which helps facilitate healthy relationships. We can easily share information with our friends to help maintain those friendships. We can check in with family members to sustain family dynamics. Because text messages are quick, we can do this pretty easily. However, text messages would be a difficult way to sustain a deeper relationship over a long period of time. A couple in a long-distance relationship probably wouldn’t feel very deeply connected with each other if they simply sent text messages back and forth. Similarly, a parent who misses their child who has gone off to college probably won’t be fully satisfied with text messages because they can’t hear their child’s voice or see their face. It’s missing the level of intimacy that some relationships would require.

Thinking. It’s interesting that as text messaging capabilities have evolved, so too have our ways of thinking about messages. For instance, it has become more common for people to think about their responses to a message in terms of a movie scene, which can be easily grabbed as a GIF and plugged into the text thread. They’re also more likely to distill larger ideas and emotional responses into a few sentences or even an emoji. All of this is to say that text messaging also constrains our thinking by encouraging users to oversimplify their experiences and ideas.

Being. As already stated, text messaging is pretty universal, so it doesn’t strictly limit the type of identity that a person can have. However, there are some definite identity markers that a person who writes text messages takes on: They are someone who has the income and tech savvy to own a cell phone and respond to messages. They are someone with a range of knowledge about texting conventions—the acronyms and the ability to send hyperlinks and GIFs. Often the types of emojis or GIFs that a person sends are a reflection of their identity in some way, based on their interests and personalities.

Education

Of course, digital technologies are also used for educational purposes—both formal and informal. More informal spaces like Wikipedia or Reddit provide additional information about a wide variety of topics and interests. Other platforms are used in more formal educational contexts. For instance, Canvas is a learning management system commonly used to give students access to resources, policies, and assignments for a particular course. It can be used to supplement in-person instruction or as a hub where online videos and teleconferencing can take place. Similarly, Google Workspace for Education provides a range of communication tools that allow students and instructors to communicate, collaborate on projects, receive feedback, and so on.

It seems especially true that there is no magic bullet when it comes to ed tech. Though new platforms and applications are constantly being introduced with the promise to engage students, spark their creativity, and improve learning, the reality is that ed tech also has its trade-offs—aspects of a particular platform or program that are beneficial to the larger educational goal and then aspects that are limiting. Consider, for instance, that most platforms give the instructor a significant ability to surveil student activity. Canvas reports when students last logged in and for how long. It gives the time stamp for when students turn in an assignment. Similarly, Google Docs allows shared users (i.e., teachers) to see the entire process of creating a document—when it was opened, what the drafting process looked like, how long it took, who else was on the document, and so on. It puts student activities under the microscope, compelling certain behaviors over others and often creating an environment of distrust and suspicion.

Further, there has been much recent discussion about data analytics intended to measure student performance for the purpose of assessing teaching practices and student learning. While this type of ed tech is most often promoted as a helpful tool, Neil Selwyn reminds us that all technologies are imbued with ideologies and that it’s important to consider what those ideologies and political agendas are. He goes on to voice concerns about the reductionist process of data analytics, which simplifies rich student experiences to a couple of data points: “This relates to a broader suspicion of educational data inevitably being inaccurate, incomplete, poorly chosen, or simply a poor indicator of what it supposedly represents” (12). Similarly, Gert Biesta et al. question whether our focus on collecting data has undermined our underlying educational goals: “The rhetorical power of the idea of ‘what works’—and similar notions such as evidence-based practice or evidence-informed teaching—should not make us forget that things never work in an abstract sense and never work in a vacuum” (2). They go on to discuss the benefit of measuring student performance but argue that it can lead to a “perversion of what education is supposed to be about” when that becomes the primary goal. In other words, school systems can sometimes get so caught up in creating measurable learning outcomes that they distort the teaching practices in order to increase performance scores, regardless of whether those scores are connected to any meaningful skills.

Just as Selwyn argues against a “blind faith in data,” on a broader scale, it’s important to resist a “blind faith in ed tech” as we consider the range of effects of the technologies we use, their underlying ideologies, and the social consequences for different groups of people. Any digital platform that is used in the classroom should be scrutinized to understand the affordances (the things it allows in terms of doing, meaning, relating, thinking, and being) as well as the constraints (the things that are disabled and the potential negative effects).

Health Care

Another great example of how digital technology is used for personal objectives is health care. Because technologies are so often invisible, it’s easy to take for granted the ones that facilitate your medical care. In the last couple of decades, however, the medical industry has made huge shifts toward digital platforms that make it much easier for doctors to care for patients and more convenient for patients to change doctors or seek specialized care. One primary example is the digitization of medical records. While this hasn’t been an easy process (Badalucco), moving patient records from the standard manila folder to an electronic format has made it much easier for doctors to access patient information, transfer records from one doctor to another, and consult treatment information that reduces medical errors and improves patient outcomes. Another great example is the electronic prescription, which makes it much easier and faster for doctors to send a prescription to your preferred pharmacy. Not only is this more convenient, but the system is also safer: dosing information is automatically printed on the bottle, and doctors have records of all the medications a person is taking, which reduces the risk of adverse effects.

The advancements in medical technologies and the corresponding benefits are too numerous to list. A study by Alotaibi and Federico looks across a wide range of medical technology advancements, such as electronic physician’s orders, clinical decision support, electronic prescriptions, automated medicine-dispensing cabinets, and patient data management systems, and determines that “information technology improves patient’s safety by reducing medication errors, reducing adverse drug reactions, and improving compliance to practice guidelines.” Another huge shift in the health care industry came in the form of telehealth and telemedicine in response to the COVID-19 pandemic (Centers for Disease Control and Prevention). In many instances, patients can receive the same type of care from the convenience and safety of their homes. The CDC article even mentions “remote intervention,” in which surgeries can take place via robot, reducing direct contact between patient and physician. There are also more automated technologies, such as medical wearables (Insider Intelligence) and digital therapeutics (Digital Therapeutics Alliance), that provide patients with a higher level of independence as well as better health outcomes.

Regarding our framework of affordances and constraints, the affordances seem obvious. These technologies provide significant opportunities to do more things—to meet with doctors, to track medical histories and personal data, and to receive prescriptions and medical interventions. They also expand the range of meanings we can make as we adopt specialized terminology (getting your “steps” in, for instance) and shape our ways of thinking as personal and medical data become increasingly top of mind.

However, this critical unit of the book is about resisting utopian perspectives of technology in favor of those that examine deeper, more subtle consequences. It’s easy to identify all of the benefits of medical technologies, but the reality is that these technologies are often complex and difficult to learn, which can hinder doctors’ ability to focus their attention on patient needs (Gawande). What’s more, these systems aren’t immune to errors with serious medical consequences. In fact, in some instances, they can lead to an increase in mistakes when inputting or retrieving patient information and a decrease in communication and professional consultation between the doctors responsible for patient care (Coiera et al.).

Another significant challenge relates to access. Obviously, not all people have equal access to the digital technologies this type of health care requires. As these technologies and medical processes become more commonplace, often replacing older, more “hands-on” ways of doing things, certain populations get left out and miss health care information and interventions that would improve their well-being. The digital divide is certainly a cause for concern here. Additionally, some medical technologies aren’t designed for all users equally. For people who aren’t comfortable with technology or who don’t speak English well, these technologies can function as a barrier instead of a gateway to better health care.

Perhaps one of the most significant issues relevant to health care technologies is the change in how we relate to other people. Many of the advancements identified above remove the personal touch between doctors and patients; the more intimate, social, and emotional aspects of health care are sacrificed for the sake of safety and self-preservation. Similarly, the relationship between patients and doctors has shifted because of rising costs, fears of litigation, and a focus on unnecessary medical procedures, all of which diminish the credibility of the medical industry (Shmerling).

These are just a few examples of how digital technologies serve personal purposes and how the framework of affordances and constraints can be applied. As you consider your own practices and the digital technologies you employ—for entertainment, shopping, or educational purposes, for instance—a deeper consideration of how these technologies affect your ways of doing, meaning, relating, thinking, and being can be extremely helpful in uncovering the consequences of that use so that you can be intentional about the choices you make.

A Professional Perspective

From a professional perspective, the framework of affordances and constraints provides a way to examine how digital technologies are employed in various jobs. As digital technology becomes increasingly pervasive, the majority of jobs require extensive use of those technologies to accomplish even the most basic tasks. This has obvious implications for the digital divide, as we’ve discussed earlier. People who are more familiar with these technologies and have access to new advancements as they emerge are better positioned to obtain skilled jobs that require digital literacy, which, in turn, provides financial and social benefits that accrue over time. The chasm between those with digital literacy skills and those without is particularly problematic from a professional perspective, since it perpetuates the cycle of low income for certain populations. That is certainly one way of looking critically at the professional use of technology. However, this section focuses more explicitly on different types of jobs and the affordances and constraints that digital technologies bring to them—particularly for workers who benefit in many ways from the technologies that make parts of their jobs easier and more flexible. But as we’ve learned throughout this unit of the book, a critical examination of technology also considers the disadvantages—the more negative outcomes that emerge as we adopt certain technologies.

Clearly, we can’t focus on all jobs in this brief section. Instead, we’ll look at four categories of digital workers based on Ens et al.’s research on the affordances and constraints of “decent work.” Their paper “Decent Digital Work: Technology Affordances and Constraints” examines four types of digital workers—people whose jobs are defined by the technologies they use. While many people employ digital technologies in their jobs to varying degrees, not all jobs have been fundamentally changed by digital technology. As they put it, “Digital work is then better conceived of as the type of work, which is fundamentally reconfigured through the use of digital technologies embedding increasing levels of mobility and precarity” (1; emphasis original). In other words, there are aspects of technology that are beneficial to personal autonomy: more can be accomplished in a single day, it’s easier to stay connected with clients and colleagues, and it’s possible to work from almost anywhere. These are all affordances, but on the flip side, the opportunity to accomplish more at work can weaken the boundaries between work and home life and create higher levels of stress and fatigue. Being able to work and connect with colleagues from anywhere allows for greater flexibility and personal freedom, but it can also feel confining in that it’s almost impossible to get away from those connections or work obligations.

Specifically, Ens et al. look at the affordances and constraints of technology for four types of digital workers:

  • The “gig” worker. This is a freelancer who pieces together projects from a variety of employers in order to make a living. While there is flexibility in where these workers do their work and some freedom in being able to accept or reject a job (the affordances), there is also the need to keep new work constantly coming in. Gig workers are also at a disadvantage in that they lack promotional opportunities, medical or retirement benefits, and a professional community to provide support and emotional connection.
  • The “digital nomad.” Like the gig worker, the digital nomad is not tied to a single employer. They are employed through contracts and are able to travel the world while they meet digitally with clients and fulfill their contractual obligations. Unlike the gig worker, the digital nomad enjoys the freedom of being able to work from anywhere and therefore spends most of their time traveling. This is obviously a huge benefit, particularly if they can land contracts that sustain the expense of constant travel and self-indulgence. On the other hand, it’s stressful to always be looking for the next job or to sometimes take on multiple jobs at once. It’s also difficult to keep a consistent work schedule while traveling and to stay connected with clients and other people. And there is never a true “paid vacation” as there would be in a stable, corporate environment, which can lead to fatigue and burnout.
  • The “nine-to-fiver.” This is someone with a single employer, bound by the traditional eight-hour workday. While many nine-to-five jobs occur in an office or some other facility, the emergence of digital technology has made it increasingly possible for nine-to-fivers to work remotely from home, which provides more flexibility and convenience. However, these workers are still tied to the obligations of a single employer and don’t have as much flexibility regarding their work hours. The opportunity to work remotely can also weaken workers’ sense of belonging.
  • The “traveling elite.” Like a nine-to-fiver, the traveling elite has a single employer, but their role in that company is one that requires constant travel in order to meet with clients. It’s high-intensity and can be extremely rewarding (and lucrative), but it can also be exhausting, since the schedule often requires long hours. Also, these workers spend quite a bit of time in hotels and airports, which can lead to feelings of social disconnect.

While these are very broad categories that barely scratch the surface of the types of technologies different professionals use and the affordances and constraints of each one, they do provide a picture of how professional jobs have shifted as a result of digital technologies, creating new types of jobs as well as new opportunities within jobs that previously existed. They also underscore the reality that with every affordance (a new benefit that provides convenience and flexibility) comes a constraint (an aspect of the technology that can be limiting or isolating in some way). As you think more specifically about a profession—probably your own current or intended profession—and the technologies that are compelled within that field, this framework of affordances and constraints, along with the different types of digital work, can help contextualize the consequences of those technologies.

Let’s look at one example: teachers. While there are many types of teachers at different institutions, advancements in digital technology—especially during the COVID-19 pandemic—dramatically shifted the work structure for many of them. Suddenly, they had to move from in-person to online instruction while also making themselves available to answer questions and provide additional help, whether through synchronous or asynchronous instruction. Many of the technologies that emerged have made it easy for teachers to record lectures, have students do group work in digital spaces, hold individual conversations with students, and provide extra resources (all affordances). However, as many students and instructors have reported, it’s difficult to engage students in an online environment. Watching (and recording) videos can be tiring, and the format lacks the social and emotional connections that are afforded in the classroom. What’s more, teaching became far more time-consuming: instead of having in-person conversations and answering questions on the spot, teachers spent much more time recording and uploading lectures, responding to student questions via email, grading additional assignments to check student learning, and so on. While platforms like Google Workspace, Canvas, Zoom, ScreenPal, and others allowed education to continue (for those with access) during the pandemic, there were also numerous constraints as these technologies became more integral to the teaching/learning process.

A critical look at technologies in the education field would examine each of the platforms listed above and its particular affordances and constraints: some are more conducive to helping students and teachers communicate than others, but all can limit personal freedom and create additional emotional and cognitive strain. As you consider a range of other professions and the digital technologies they employ, you should likewise examine how specific platforms are used and what the affordances and constraints are for professionals in that field.

Activity 6.3

Interview someone who is currently working in your intended profession. Your interview should focus on not just the particulars of their daily tasks and responsibilities but also the digital technologies that they employ to get their job done. Ask a range of questions about how these technologies are helpful but also how they might be limiting or confining in some way. You should also ask broader questions about benefits and disadvantages of their job (which may or may not be connected to technology).

Based on the interview, write up a profile of that person and the job they do. Next, identify the primary technologies they use in their professional role and the affordances and constraints of those technologies. How do they help with productivity, communication, professional growth, and autonomy? In what ways are they limiting?

In this first unit of the book, we’ve discussed critical literacy with a focus on a wide range of advantages and disadvantages that are associated with digital technologies. The hope is that these larger considerations and frameworks become part of your own intellectual process as you evaluate the technologies you use every day that have become “invisible,” as you learn new technologies that are compelled for personal and professional reasons, and as you make choices about how to use those technologies. In fact, it’s those commonplace, overlooked technologies that often deserve more critical awareness. In her article “Technology and Literacy: A Story About The Perils of Not Paying Attention,” Cynthia Selfe argues that when technologies become “invisible” to us, they also become most dangerous because we aren’t thinking critically about their implications. While you might not always have choices about certain technologies—since some will be required in the educational or professional settings you engage—it can still be incredibly important to be aware of the broader implications of technology so that you can advocate for groups of people who have been overlooked or for processes that would be more advantageous in some way. And in those situations where you do have choices, a deeper understanding of the affordances and constraints of the technologies that are available to you will greatly enhance your effectiveness as a communicator and as someone who has a positive effect on the communities of which you are a part.

Discussion Questions

  1. What is the broader definition of technology (beyond digital technology)? How might these technologies be considered an “extension of yourself” as they “mediate” your activities?
  2. What does it mean that some technologies eventually become “invisible”? How might this be both a good and bad thing?
  3. What are “affordances” and “constraints”? How can these terms be applied to the technologies that you use? Give some examples.
  4. Jones and Hafner extend the framework of affordances and constraints to consider the effects of a particular technology on our ways of doing, meaning, relating, thinking, and being. Explain what these different categories mean and how they relate to the technologies people use.
  5. What is ideology? The chapter lists three important considerations of how ideologies are embedded in technology. What are they? Why is this an important aspect of critical literacy?
  6. What are some of the consequences of ideologies that are perpetuated in the technologies you use?
  7. Identify and explain the four elements in Selber’s framework for critically analyzing digital spaces. Can you provide an example of a technology and how you might evaluate it using at least one of those elements?
  8. What are some of the personal uses of technology that this chapter considers? How can the framework of affordances and constraints be applied to other aspects that weren’t explicitly discussed (entertainment or shopping, for instance)?
  9. What are the four types of digital workers identified in the chapter? How do these categories along with Jones and Hafner’s concept of affordances and constraints provide a framework that can be used to consider other professions and uses of technologies?

Sources

Alotaibi, Yasser K., and Frank Federico. “The Impact of Health Information Technology on Patient Safety.” Saudi Medical Journal, vol. 38, no. 12, Dec. 2017, pp. 1173–1180, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5787626/.

Badalucco, Traci. “The Battle to Move U.S. Health Care from Paper to Digital Far From Over.” USNews, 5 Nov. 2015, https://www.usnews.com/news/articles/2015/11/05/the-battle-to-move-us-health-care-from-paper-to-digital-far-from-over.

Balter, Michael. “‘Mind Meld’ Enables Good Conversation: When Two People Talk, Similar Areas of Their Brain Activate.” Science.org, 26 July 2010, https://www.science.org/content/article/mind-meld-enables-good-conversation.

Biesta, Gert, et al. “Editorial: Why Educational Research Should Not Just Solve Problems, But Cause Them as Well.” British Educational Research Journal, vol. 45, no. 1, Feb. 2019, pp. 1–4, https://doi.org/10.1002/berj.3509.

Centers for Disease Control and Prevention. “Telemedicine Access and Use.” CDC.gov. 6 Aug. 2021, https://www.cdc.gov/nchs/covid19/rands/telemedicine.htm.

Coiera, E., et al. “The Unintended Consequences of Health Information Technology Revisited.” Yearbook of Medical Informatics, no. 1, 10 Nov. 2016, pp. 163–169, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5171576.

Digital Therapeutics Alliance. https://dtxalliance.org/.

Ens, Nicola, et al. “Decent Digital Work: Technology Affordances and Constraints.” Thirty-Ninth International Conference on Information Systems, 2018, San Francisco. https://www.researchgate.net/publication/329450302_Decent_Digital_Work_Technology_Affordances_and_Constraints.

Gaskell, Adi. “Technology Isn’t Destroying Jobs, But Is Increasing Inequality.” Forbes, 3 May 2019, https://www.forbes.com/sites/adigaskell/2019/05/03/technology-isnt-destroying-jobs-but-is-increasing-inequality/?sh=4cf06c605e78.

Gawande, Atul. “Why Doctors Hate Their Computers.” The New Yorker, 5 Nov. 2018, https://www.newyorker.com/magazine/2018/11/12/why-doctors-hate-their-computers.

Insider Intelligence. “Latest Trends in Medical Monitoring Devices and Wearable Health Technology (2023).” Insider Intelligence, 13 Jan. 2023, https://www.insiderintelligence.com/insights/wearable-technology-healthcare-medical-devices/.

Jones, Rodney, and Christoph Hafner. Understanding Digital Literacies: A Practical Introduction. Routledge, 2012.

Kharazian, Zarine. “‘Technology Is Neither Good Nor Bad; Nor Is It Neutral:’ The Case of Algorithmic Biasing.” Social Science Research Methods Center, 18 Nov. 2016, https://ssrmc.wm.edu/technology-is-neither-good-nor-bad-nor-is-it-neutral-the-case-of-algorithmic-biasing/.

Kranzberg, Melvin. “Presidential Address: Technology and History: ‘Kranzberg’s Laws.’” Technology and Culture, vol. 27, no. 3, July 1986, pp. 544–560, https://www.jstor.org/stable/3105385.

Kurt, Serhat. “Lev Vygotsky—Sociocultural Theory of Cognitive Development.” Educational Technology, 7 July 2020. https://educationaltechnology.net/lev-vygotsky-sociocultural-theory-of-cognitive-development/.

McLuhan, Marshall. Understanding Media: The Extensions of Man, MIT Press, 1964.

Selber, Stuart. Multiliteracies for a Digital Age. Southern Illinois University Press, 2004.

Selfe, Cynthia. “Technology and Literacy: A Story About The Perils of Not Paying Attention.” College Composition and Communication, vol. 50, no. 3, Feb. 1999, pp. 411–436, https://doi.org/10.2307/358859.

Selwyn, Neil. “What’s the Problem with Learning Analytics?” Journal of Learning Analytics, vol. 6, no. 3, 2019, pp. 11–19, https://learning-analytics.info/index.php/JLA/article/view/6386.

Shmerling, Robert H. “Is Our Healthcare System Broken?” Harvard Health, 13 July 2021, https://www.health.harvard.edu/blog/is-our-healthcare-system-broken-202107132542.

Steinhardt, Ken. “The Best Tech Makes the Underlying Technology Invisible.” Forbes, 30 Nov. 2021, https://www.forbes.com/sites/forbestechcouncil/2021/11/30/the-best-tech-makes-the-underlying-technology-invisible/?sh=41c236f17ba5.

License

Writing for Digital Media by Cara Miller is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
