Big Idea 5.3 Computing Bias - Tanay

Overview/Definition: Computing biases are the many biases in applications that stem from human preferences, whether in the data an application uses or in the decisions of the people who build it.

Types of Computing Bias - Tarun

Explicit data vs Implicit data: - Pranavi

Explicit data: Data that users deliberately provide to an application, such as ratings, likes, or survey answers.

Implicit data: Data that an application infers from user behavior, such as watch time, clicks, or browsing history.

Example: Netflix
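A hypothetical sketch of how a service like Netflix might combine the two kinds of data: an explicit rating the user gives and an implicit signal the service observes (how much of a show was actually watched). The function name, the 1-5 rating scale, and the equal 50/50 weighting are all assumptions for illustration, not Netflix's real algorithm.

```python
def inferred_interest(explicit_rating, minutes_watched, runtime_minutes):
    """Blend an explicit 1-5 rating with an implicit completion rate
    into a single 0-1 interest score (hypothetical formula)."""
    explicit_score = (explicit_rating - 1) / 4           # normalize 1-5 to 0-1
    implicit_score = min(minutes_watched / runtime_minutes, 1.0)
    return 0.5 * explicit_score + 0.5 * implicit_score   # equal weighting (an assumption)

# A user rates a show 3/5 (explicit) but watches every minute of it (implicit):
score = inferred_interest(explicit_rating=3, minutes_watched=400, runtime_minutes=400)
print(round(score, 2))  # 0.75 -- the implicit signal pulls the score above the lukewarm rating
```

Notice that the implicit behavior can contradict the explicit rating, which is exactly where bias can creep in: the service acts on what you do, not only on what you say.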


Popcorn Hack:

What other applications could have intentional bias?

Google search results can be biased toward websites that pay Google ad money or have paid for a higher spot in the search results.

Intentional Bias vs Unintentional Bias - Tanvi

Example 1: Hypothetical Loan company

Example 2: Candy Crush vs Call of Duty

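The hypothetical loan-company example can be sketched as a simple approval rule that quietly uses zip code as a proxy for creditworthiness. Because zip codes correlate with race and income, such a rule encodes bias even though it never mentions either directly. Every zip code, threshold, and score below is invented for illustration.

```python
# Hypothetical "preferred" areas -- intentional bias baked into the rule.
FAVORED_ZIP_CODES = {"92127", "92130"}

def approve_loan(credit_score, zip_code):
    """Approve a loan using a biased, zip-code-dependent threshold."""
    if zip_code in FAVORED_ZIP_CODES:
        return credit_score >= 600   # easier bar for favored areas
    return credit_score >= 700       # harder bar for everyone else

# Two applicants with identical credit scores get different outcomes:
print(approve_loan(650, "92127"))  # True
print(approve_loan(650, "92101"))  # False
```

The bias here is intentional and exclusionary: the decision depends on where the applicant lives, not just on their credit history.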

Popcorn Hack:

How is there unintentional bias in apps such as TikTok, Instagram, or other social media apps?

There is bias in social media through the cyclic nature of recommendations: every time you watch a certain reel or like a certain post, more of the same content is fed to you through your feed.
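The feedback loop described above can be simulated in a few lines. This is a minimal sketch, not any real platform's algorithm: the topic names, the starting weights, and the 0.5 "boost" per engagement are all assumptions chosen to make the narrowing effect visible.

```python
import random

def simulate_feed(liked_topic, rounds=50, seed=0):
    """Recommend topics at random, boosting whichever topic the user engages with."""
    rng = random.Random(seed)
    weights = {"sports": 1.0, "cooking": 1.0, "music": 1.0}  # start with an even mix
    shown = []
    for _ in range(rounds):
        topics = list(weights)
        topic = rng.choices(topics, weights=[weights[t] for t in topics])[0]
        shown.append(topic)
        if topic == liked_topic:       # the user engages with this topic...
            weights[topic] += 0.5      # ...so the algorithm boosts it next time
    return shown

feed = simulate_feed("cooking")
print(feed.count("cooking"), "of", len(feed), "recommendations were cooking")
```

Each engagement raises the liked topic's weight, so later recommendations skew further toward it; no one designed the system to exclude sports or music, which is what makes this bias unintentional.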

Mitigation Strategies - Shubhay

  1. Is bias enhancing or intentionally excluding?

Bias can manifest in different ways. It can be unintentional and emerge as a result of inherent human biases in data or algorithms, enhancing certain patterns or groups. On the other hand, bias can also be intentionally introduced, leading to exclusion or discrimination based on specific characteristics. In some cases, biases are unintentional but can still result in exclusionary effects.

  2. Is bias intentionally harmful/hateful?

Bias itself may not always be intentionally harmful or hateful. It can originate from historical or societal prejudices embedded in data. However, intentional actions to introduce bias with harmful or hateful intentions can occur, especially when biases are manipulated to perpetuate discrimination or disadvantage specific groups.

  3. During software development, are you receiving feedback from a wide variety of people?

Effective software development involves receiving feedback from a diverse set of stakeholders, including users, developers, and other relevant parties. Engaging with a wide variety of people helps identify potential biases, usability issues, and ensures that the software meets the needs of a diverse user base. Feedback from different perspectives is crucial for creating inclusive and well-rounded applications.

  4. What are the different biases you can find in an application such as YouTube Kids?

Biases in applications like YouTube Kids can take various forms:

Content Bias: The recommendation algorithm may inadvertently favor certain types of content over others, leading to a lack of diversity in the content shown.
Cultural Bias: The platform may prioritize content that aligns with specific cultural norms or perspectives, excluding content from other cultures.
Gender Bias: The algorithm might unintentionally recommend content that aligns with gender stereotypes, limiting exposure to diverse perspectives.
Educational Bias: Certain educational content may be prioritized over others, impacting the variety of learning opportunities presented to children.
Language Bias: The platform may favor content in certain languages, limiting access to educational material and entertainment for speakers of other languages.
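One simple mitigation a platform like YouTube Kids could apply to content bias is measuring how concentrated a child's recommendations are in a single category and flagging skewed feeds for review. This is a hypothetical sketch: the category names and the 50% threshold are assumptions, not anything YouTube actually does.

```python
from collections import Counter

def dominant_share(recommended_categories):
    """Return the share (0-1) of the single most common category in a feed."""
    counts = Counter(recommended_categories)
    return max(counts.values()) / len(recommended_categories)

def is_skewed(recommended_categories, threshold=0.5):
    """Flag a feed where one category exceeds the threshold share."""
    return dominant_share(recommended_categories) > threshold

feed = ["cartoons", "cartoons", "cartoons", "science", "music", "cartoons"]
print(dominant_share(feed))  # 4 of the 6 recommendations are cartoons
print(is_skewed(feed))       # True -- flag this feed for review
```

A check like this does not remove bias by itself, but it makes the skew measurable, which is the first step toward the balanced, inclusive experience the section calls for.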

Addressing these biases is crucial to ensure that platforms like YouTube Kids provide a balanced and inclusive experience for all users, considering the diverse backgrounds and preferences of children and their families.