
Computing Bias

Popcorn Hack 1

An example of a movie with computing bias is Her. The people affected by it are the humans who interact with the AI in the movie. The AI doesn’t reciprocate human emotions and doesn’t care about the people it talks to; it evolves past human interaction, leaving people dependent and alone. A cause of this bias is that the AI was designed to evolve past human emotions. If the AI had been designed around the human experience, these people wouldn’t be so hurt by it.

Popcorn Hack 2

A time when I felt a technology didn’t work for me was when I spoke Spanish into a Spanish-to-English translator, but my accent wasn’t good enough. It didn’t understand what I was saying, so I couldn’t test my vocabulary or the new words I was learning, because my accent wasn’t factored into the design. If the translator had been designed for many accents, I would have been able to use it, and others like me wouldn’t have any problems.

Popcorn Hack 3

If I were to design a fitness tracking app, some biases could slip in. If my app detected when someone started exercising based on an elevated heart rate, someone who naturally has an elevated heart rate due to a genetic condition might be detected as exercising even when they aren’t. Users with these health conditions would struggle to use this feature simply because of how they were born. I could add a feature to mark your resting heart rate, so the app knows when each person is actually exercising and these people are included; a sketch of this idea is below.
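To make the idea concrete, here is a minimal Python sketch, assuming each user records a personal resting heart rate and exercise is detected relative to that baseline instead of one fixed cutoff. The names and the 1.4x multiplier are hypothetical choices for illustration, not a real app’s API.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    name: str
    resting_hr: int  # user-recorded resting heart rate, in bpm (hypothetical field)

def is_exercising(user: UserProfile, current_hr: int, multiplier: float = 1.4) -> bool:
    """Detect exercise relative to the user's own resting baseline,
    instead of a single fixed threshold that excludes some users."""
    return current_hr > user.resting_hr * multiplier

# A user with a typical resting heart rate:
typical = UserProfile("typical", resting_hr=65)
# A user whose genetic condition gives them a naturally elevated resting rate:
elevated = UserProfile("elevated", resting_hr=95)

print(is_exercising(typical, current_hr=100))   # True: well above their own baseline
print(is_exercising(elevated, current_hr=100))  # False: near their normal resting rate
```

With a fixed threshold like 100 bpm, the second user would be flagged as exercising all day; with a per-user baseline, both users get accurate results.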

Homework Hack 1

I chose YouTube. There is a lot of bias in YouTube’s recommendation system. You can watch one video about a random topic and suddenly get recommended more content on that topic. If you are male, you are more likely to get certain videos, and if you are female, you get other videos. This could come from the data analysis: people with the same demographics get sent the same videos, even though people differ in their interests. To make this more inclusive, developers could remove recommendations based on demographics like location and gender; a sketch of that idea is below.
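As a rough illustration of that fix, here is a minimal sketch that strips demographic fields from the features a recommender sees, so suggestions are driven by watch history rather than gender or location. All field names are made up for this example; this is not YouTube’s actual system.

```python
# Hypothetical demographic fields to exclude before recommending:
DEMOGRAPHIC_FIELDS = {"gender", "location", "age"}

def strip_demographics(user_features: dict) -> dict:
    """Keep only non-demographic features (e.g. watch history, topics)."""
    return {k: v for k, v in user_features.items() if k not in DEMOGRAPHIC_FIELDS}

user = {
    "gender": "female",
    "location": "US",
    "watched_topics": ["chess", "cooking"],
}

print(strip_demographics(user))
# {'watched_topics': ['chess', 'cooking']} - recommendations would now be
# driven by what the user actually watches, not who they are.
```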