OFIS MENTOR SUGGESTIONS

  • This video is a tell-all about how algorithms can be racially biased, exemplified by a facial recognition update imposed by Uber. This bias can negatively affect the lives of Black and brown people like Pa, whose livelihoods rely on these independent occupations.

    Link: Click here | Time: 9 minutes | Media: Video | Source: The Economist | Recommended by: Aulane Mpouli, Ph.D. Student in Chemistry, Duke University

  • By reviewing over a thousand gait analysis records for children with cerebral palsy, these researchers were able to understand how movement develops in these children; as a result, they advised clinicians to avoid surgical solutions for young children. This is a crucial finding that influences my research in a major way: we can use data to advocate convincingly for cost-effective, less risky treatment for these children and their families.

    Link: Click here | Time: 25 minutes | Media: Text | Source: Developmental Medicine & Child Neurology | Recommended by: Hassan Farah, Ph.D. Candidate, Translational Biology, Virginia Tech

  • Artificial intelligence (AI) and machine learning are often used as tools for making decisions in health care and insurance coverage; however, most of these algorithms have an implicit racial bias against people of color, leading to discrepancies in Black patients' treatment and cost of health care in comparison to their non-Black counterparts.

    Link: Click here | Time: 10 minutes | Media: Text | Source: Health Affairs | Recommended by: Caitlyn Nguyen, Master of Biostatistics Student, Duke University

  • The article provides ten simple rules for responsible big data research. The first five rules focus on how to reduce the chance of harm resulting from big data research practices; the second five rules focus on ways researchers can contribute to building best practices that fit their disciplinary and methodological approaches.

    Link: Click here | Time: 10 minutes | Media: Text | Source: PLOS Computational Biology | Recommended by: Mohammed Baaoum, Ph.D. Student, Industrial and Systems Engineering, Virginia Tech

  • In Automating Inequality, Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America.

    Link: Click here | Time: 60 minutes | Media: Video | Source: The Berkman Klein Center for Internet & Society, talk by Virginia Eubanks, Associate Professor, Department of Political Science at the University at Albany, SUNY | Recommended by: Mohammed Baaoum, Ph.D. Student, Industrial and Systems Engineering, Virginia Tech

ARTIFICIAL INTELLIGENCE (AI)

  • This is part of a series highlighting how AI may affect historically underrepresented or marginalized communities.

    Link: Click here | Time: 10 minutes | Media: Text | Source: Rising Voices and the Association for Progressive Communications

  • Examination of facial-analysis software shows an error rate of 0.8 percent for light-skinned men and 34.7 percent for dark-skinned women.

    Link: Click here | Time: 5 minutes | Media: Text and Video | Source: Massachusetts Institute of Technology News

  • When applying machine learning tools, "there are enormous gains to be made from using these tools. But so is the trepidation: as with all new 'products,' there is potential for misuse. How can we maximize the benefits while minimizing the harm?"

    Link: Click here | Time: 5 minutes | Media: Text | Source: Harvard Business Review

ALGORITHMIC BIAS

  • Report on a landmark federal study that showed facial-recognition systems misidentified people of color more often than white people, casting doubts on a rapidly expanding investigative technique widely used by law enforcement across the United States.

    Link: Click here | Time: 10 minutes | Media: Text | Source: The Washington Post

  • A report from 2016 shows how software discriminates against Black people when making recommendations for early release from prison.

    Link: Click here | Time: 10 minutes | Media: Text | Source: Blog of Datalab, a virtual organization at Zurich University of Applied Sciences

  • Two experts discuss how bias and fairness in algorithms gradually shape human behavior over time.

    Link: Click here | Time: 48 minutes | Media: Video | Source: VOICES for Social Justice podcast

  • Companies like Facebook, Netflix, and Uber deploy algorithms in search of greater efficiency. But when used to evaluate the powerful systems that judge us, algorithms can spur social progress in ways nothing else can.

    Link: Click here | Time: 10 minutes | Media: Text | Source: Wired Magazine

  • MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face, because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze."

    Link: Click here | Time: 8 minutes | Media: Video | Source: TED Talk

  • Introducing Big Data: COVID-19, IRIS's exciting new project unlocking big data for secondary students using real global COVID-19 data.

    Link: Click here | Time: 6 minutes | Media: Video | Source: LabXchange

  • This article explores how AI systems can replicate and amplify biases related to race, gender, and other social factors, leading to unfair treatment and unjust outcomes.

    Link: Click here | Time: 5 minutes | Media: Text | Source: LabXchange

  • Delve into the unseen facets of tech as we explore career accessibility, racial literacy, and the dynamic impact of AI through insightful discussions with specialists in the field. Join us for a journey toward shaping an inclusive and ethically driven future.

    Link: Click here | Time: 56 minutes | Media: Video | Source: LabXchange

  • This text examines the prevalence of racial bias and discrimination in facial recognition software.

    Link: Click here | Time: 20 minutes | Media: Text | Source: LabXchange