Understanding Computer Bias

Have you ever noticed how Netflix keeps recommending the same types of shows over and over? Or how voice assistants (like Siri or Alexa) almost always have a female voice by default? These little quirks might seem harmless, but they can hint at something larger behind the scenes: computer bias.

[Image: Computer Bias Illustration]

In this blog, we’ll unpack what computer bias is, share some simple real-world examples, and explore how we can create more fair and inclusive technologies.


What Is Computer Bias?

Think of “computer bias” as unfair preferences built into a computer system. Often, this bias isn’t deliberate—rather, it’s a result of how people design, test, or use these systems. Some typical ways bias sneaks in:

  • Data Issues: If you train an algorithm on data that’s missing certain types of people or situations, the system might perform poorly for those it hasn’t “seen” enough of (there’s a toy sketch of exactly this right after the takeaway below).
  • Design Choices: Maybe the developer assumed everyone uses the product the same way, leading to one-size-fits-all features that accidentally exclude some users.
  • Testing Gaps: Sometimes, teams test only with certain groups—say, colleagues or friends—missing feedback from the broader population.

Key Takeaway: Humans are behind every tech tool, so our biases can show up in the algorithms we create—even if we don’t mean for it to happen.
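
To see how the “Data Issues” point plays out, here’s a minimal sketch in Python. Everything in it is hypothetical: a toy “detector” learns a brightness cutoff from training data that is 90% one group, then fails badly on the group it barely saw.

```python
# Toy sketch of biased training data (all numbers are made up).
# A "detector" learns what a typical sample looks like from training
# data that is 90% group A, then struggles with group B.

import random

random.seed(42)

# Hypothetical brightness readings for two groups of faces.
group_a_train = [random.gauss(200, 15) for _ in range(90)]  # well represented
group_b_train = [random.gauss(80, 15) for _ in range(10)]   # barely represented
training = group_a_train + group_b_train

# Naive "training": accept anything within 2 standard deviations
# of the overall training mean.
mean = sum(training) / len(training)
std = (sum((x - mean) ** 2 for x in training) / len(training)) ** 0.5

def detects(brightness):
    return abs(brightness - mean) <= 2 * std

def detection_rate(samples):
    return sum(detects(x) for x in samples) / len(samples)

# Fresh test samples from each group.
test_a = [random.gauss(200, 15) for _ in range(100)]
test_b = [random.gauss(80, 15) for _ in range(100)]

print(f"Group A detection rate: {detection_rate(test_a):.0%}")  # near 100%
print(f"Group B detection rate: {detection_rate(test_b):.0%}")  # far lower
```

Nothing in the code is malicious; the skew in the data alone produces the unfair result.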


[Image: Computing Bias Illustration]

Popcorn Hack #1:

Provide an example of a movie, TV show, video game, or software that demonstrates bias and specify who is affected by it. Explain a potential cause of this bias.

Everyday Examples of Bias

1. Netflix Recommendations

Netflix is known for suggesting shows and movies based on what you’ve watched before. While it’s helpful, the recommendation system can unintentionally “pigeonhole” you. If you mostly watch comedies, Netflix might stop suggesting documentaries or foreign films, even if you’d actually love them. This can keep you stuck in a loop of the same types of content.

Why is that biased?

  • The algorithm heavily leans on past choices and might ignore other genres or shows that don’t fit its pattern.
  • People who share a profile (like families) might get skewed recommendations that don’t reflect everyone’s interests.
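
To make that feedback loop concrete, here’s a toy Python sketch. The titles and logic are invented for illustration (Netflix’s real system is far more complex): a recommender that only ever draws from your single most-watched genre will never surface that documentary you might love.

```python
# Minimal sketch of a "pigeonholing" recommender: it picks your
# most-watched genre and never recommends anything outside it.

from collections import Counter

catalog = {
    "comedy":      ["Laugh Riot", "Punchline", "Sitcom City"],
    "documentary": ["Deep Oceans", "City of Ants"],
    "foreign":     ["La Maison", "Monsoon Tales"],
}

def recommend(watch_history, n=3):
    """Recommend n unwatched titles from the user's single top genre."""
    top_genre, _ = Counter(genre for _, genre in watch_history).most_common(1)[0]
    seen = {title for title, _ in watch_history}
    return [t for t in catalog[top_genre] if t not in seen][:n]

history = [("Laugh Riot", "comedy"), ("Punchline", "comedy"),
           ("Deep Oceans", "documentary")]

print(recommend(history))  # ['Sitcom City']: comedies only, forever
```

One common remedy is to reserve a recommendation slot or two for genres outside the top one, trading a little predicted relevance for discovery.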

2. Virtual Assistants with Female Voices

It’s common for digital assistants (like Siri or Alexa) to default to a female voice. While some platforms offer alternatives, the default often remains female.

Why might this be a problem?

  • It can subtly reinforce stereotypes that women are “helpers” or “assistants.”
  • It may exclude people who’d prefer a different voice or feel more comfortable with another default option.

3. Social Media Age Gaps

If you look at who uses TikTok (generally younger folks) versus who prefers Facebook (often older demographics), you’ll see a clear age divide. These platforms don’t explicitly stop people of any age group from joining; however, their design, marketing, and trends can unintentionally favor one demographic over another.

[Image: Social Media Demographics]

Popcorn Hack #2:

Think about a time when you felt a technology didn't work well for you. What was the issue, and how did it make you feel? Write a short paragraph describing the experience and suggest one way the technology could be improved to be more inclusive.


The HP Camera Incident: A Closer Look

One famous example involved an HP laptop camera that couldn’t reliably track the faces of people with darker skin tones. A user posted a video calling the camera “racist” because it followed lighter-skinned faces with ease but struggled to track darker-skinned faces.

  1. Was it intentional? The user didn’t think HP deliberately designed it this way, but the final result was still unfair.
  2. Why did it happen? Likely limited test data during development. If you only test facial tracking on people with lighter skin tones, you miss potential issues for everyone else.
  3. Is it harmful? Yes. Beyond frustration, it alienates users and sends the message that the technology “isn’t made” for them.
  4. Should it be corrected? Absolutely. More comprehensive testing and more diverse datasets would make the camera work better for everyone.

Avoiding Bias in Tech

So, how do we stop bias from creeping into our algorithms and products? Here are a few practical tips:

  • Expand Your Data: Gather as many different types of samples as possible. For Netflix-like recommendations, that might mean training on a wide range of viewing histories from diverse users.
  • Encourage Diverse Teams: People from varied backgrounds ask different questions and notice different problems. This helps catch unintentional biases before a product launches.
  • Test, Test, Test: Don’t just rely on your friend group or your coworkers. Try “beta testing” with users of all ages, races, and abilities. Seek feedback and see if certain groups are having more issues (a small sketch of this kind of per-group check follows this list).
  • Document Your Assumptions: Be transparent about how the algorithm makes decisions. If you know it’s heavily focused on past user behavior, note that clearly.
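
As a sketch of what the “Test, Test, Test” tip can look like in practice, here’s a small Python audit that compares a feature’s success rate across user groups and flags any group lagging well behind the best-served one. The group names, numbers, and 10% threshold are all hypothetical.

```python
# Toy per-group testing audit (all data and thresholds are hypothetical).
# Flags any group whose success rate trails the best group by more
# than max_gap, so the team knows where to dig deeper.

test_results = {
    "ages 18-29":     {"passed": 96, "total": 100},
    "ages 60+":       {"passed": 71, "total": 100},
    "screen readers": {"passed": 64, "total": 100},
}

def audit(results, max_gap=0.10):
    rates = {group: r["passed"] / r["total"] for group, r in results.items()}
    best = max(rates.values())
    return [group for group, rate in rates.items() if best - rate > max_gap]

for group in audit(test_results):
    print(f"Investigate: feature underperforms for {group}")
```

Even a crude check like this turns “we tested it” into “we tested it for whom,” which is where bias tends to hide.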

Popcorn Hack #3:

Imagine you're designing a fitness tracking app. How could bias sneak into your app’s recommendations or performance evaluations? Think about users with different physical abilities, ages, or health conditions. What features could you add to ensure the app is fair and inclusive for all users?

[Image: Fitness App Illustration]

Why It All Matters

When biases go unchecked, technology can exclude people, reinforce negative stereotypes, or limit choices. By staying aware of potential pitfalls—from the data we collect to how we test and design our products—we can build a more inclusive digital world. After all, technology should be for everyone.

Ready to take action?

  • Try looking at your streaming platform’s recommendations. Do they all look the same? If so, shake things up: watch a documentary or a foreign-language show to nudge the algorithm toward variety.
  • Voice your preferences. If you have a smart speaker or assistant, see if you can change the default voice. Notice how that small shift might affect how you interact with it.
  • Share your experiences. If you see tech that doesn’t work well for certain groups, point it out. Often, engineers and designers aren’t aware there’s a problem until users speak up.

Join the Conversation

Have an example of bias you’ve encountered in apps or websites? Add a comment below!

Working on your own project? Test it with friends and classmates who might use it differently. You could catch issues early and create something that everyone can enjoy.

We all have a part to play in recognizing and fighting computer bias. By being informed and proactive, we can help tech become a force that truly includes rather than excludes.

Homework Hack #1:

  • Choose a digital tool, app, or website you use regularly. It could be a social media platform, a shopping site, or a streaming service.
  • Identify Potential Bias: Are there any patterns in the recommendations or interactions that might suggest bias? Does the platform cater well to different user groups (e.g., age, gender, language, accessibility)?
  • Analyze the Cause: What might be causing this bias? Consider data collection, algorithm design, or lack of diverse testing.
  • Propose a Solution: Suggest one way the developers could reduce bias and make the platform more inclusive.
  • LINK TO MC: https://forms.gle/R9iSwyZkWW9N51be6